Sample records for quantitative framework based

  1. Development and Validation of a Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Ethene Sites

    DTIC Science & Technology

    2015-12-01

    FINAL REPORT. The objective of project ER-201129 was to develop and validate a framework used to make bioremediation decisions based on site-specific physical and biogeochemical ...

  2. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework that can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model of the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively, and the simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  3. 76 FR 37620 - Risk-Based Capital Standards: Advanced Capital Adequacy Framework-Basel II; Establishment of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-28

    ... systems. E. Quantitative Methods for Comparing Capital Frameworks The NPR sought comment on how the... industry while assessing levels of capital. This commenter points out that maintaining reliable comparative data over time could make quantitative methods for this purpose difficult. For example, evaluating asset...

  4. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework that can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model of the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively, and the simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  5. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS) in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one) were carefully investigated, and the reference frameworks were quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
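
    The AHP step can be made concrete with a short sketch. The Python snippet below (illustrative only; the criteria and pairwise judgments are invented, not taken from the paper) computes AHP priority weights as the principal eigenvector of a pairwise comparison matrix and checks Saaty's consistency ratio:

      import numpy as np

      def ahp_weights(pairwise):
          """Priority weights (principal eigenvector) and Saaty consistency ratio."""
          n = pairwise.shape[0]
          eigvals, eigvecs = np.linalg.eig(pairwise)
          k = np.argmax(eigvals.real)                    # principal eigenvalue
          w = np.abs(eigvecs[:, k].real)
          w /= w.sum()                                   # normalise weights to sum to 1
          ci = (eigvals[k].real - n) / (n - 1)           # consistency index
          ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)  # Saaty's random index
          return w, ci / ri                              # CR < 0.1 is conventionally acceptable

      # Hypothetical criteria: usability, tool integration, extensibility.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      weights, cr = ahp_weights(A)
      print(np.round(weights, 3), round(cr, 3))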

  6. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can learn the relationships between biochemical reactants qualitatively and can make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can then perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
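
    As a rough illustration of the quantitative stage, the sketch below runs a generic simulated-annealing loop over a vector of kinetic rates. The Metropolis acceptance rule and log-space proposals are standard; the toy quadratic objective stands in for the model-versus-target behaviour mismatch the paper optimises:

      import math, random

      def anneal_rates(loss, k0, steps=5000, t0=1.0, cooling=0.999, move=0.05):
          """Simulated annealing over kinetic rates; log-space moves keep rates positive."""
          k, e = list(k0), loss(k0)
          best, e_best = list(k0), e
          t = t0
          for _ in range(steps):
              cand = [ki * math.exp(random.gauss(0.0, move)) for ki in k]
              e_cand = loss(cand)
              # Metropolis rule: always accept improvements, sometimes accept worse moves.
              if e_cand < e or random.random() < math.exp(-(e_cand - e) / t):
                  k, e = cand, e_cand
                  if e < e_best:
                      best, e_best = list(k), e
              t *= cooling
          return best, e_best

      # Toy objective: recover hidden rates (0.5, 2.0); stands in for a behaviour mismatch.
      loss = lambda k: (k[0] - 0.5) ** 2 + (k[1] - 2.0) ** 2
      print(anneal_rates(loss, [1.0, 1.0]))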

  7. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    Like other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Moreover, using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  8. Research on a Unique Instructional Framework for Elevating Students’ Quantitative Problem Solving Abilities

    NASA Astrophysics Data System (ADS)

    Prather, Edward E.; Wallace, Colin Scott

    2018-06-01

    We present an instructional framework that allowed a first-time physics instructor to improve students' quantitative problem-solving abilities by more than a letter grade over what was achieved by students in an experienced instructor's course. This instructional framework uses a Think-Pair-Share approach to foster collaborative quantitative problem solving during the lecture portion of a large-enrollment introductory calculus-based mechanics course. Through the development of carefully crafted and sequenced TPS questions, we engage students in rich discussions of key problem-solving issues that we typically hear about only when a student comes for help during office hours. Current work in the sophomore E&M course illustrates that this framework is generalizable to classes beyond the introductory level and to topics beyond mechanics.

  9. Misconceived Relationships between Logical Positivism and Quantitative Research: An Analysis in the Framework of Ian Hacking.

    ERIC Educational Resources Information Center

    Yu, Chong Ho

    Although quantitative research methodology is widely applied by psychological researchers, there is a common misconception that quantitative research is based on logical positivism. This paper examines the relationship between quantitative research and eight major notions of logical positivism: (1) verification; (2) pro-observation; (3)…

  10. Pattern Search in Multi-structure Data: A Framework for the Next-Generation Evidence-based Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela C

    With the advent of personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledge-bases) to predict diagnostic risks is fast emerging. Addressing this need, we pose and address the following questions: (i) How can we jointly analyze both qualitative and quantitative data? (ii) Is the fusion of multi-structure data expected to provide better insights than either of them individually? We present experiments on two bio-medical data sets - mammography and traumatic brain studies - to demonstrate architectures and tools for evidence-pattern search.

  11. A quantitative framework for the forward design of synthetic miRNA circuits.

    PubMed

    Bloom, Ryan J; Winkler, Sally M; Smolke, Christina D

    2014-11-01

    Synthetic genetic circuits incorporating regulatory components based on RNA interference (RNAi) have been used in a variety of systems. A comprehensive understanding of the parameters that determine the relationship between microRNA (miRNA) and target expression levels is lacking. We describe a quantitative framework supporting the forward engineering of gene circuits that incorporate RNAi-based regulatory components in mammalian cells. We developed a model that captures the quantitative relationship between miRNA and target gene expression levels as a function of parameters, including mRNA half-life and miRNA target-site number. We extended the model to synthetic circuits that incorporate protein-responsive miRNA switches and designed an optimized miRNA-based protein concentration detector circuit that noninvasively measures small changes in the nuclear concentration of β-catenin owing to induction of the Wnt signaling pathway. Our results highlight the importance of methods for guiding the quantitative design of genetic circuits to achieve robust, reliable and predictable behaviors in mammalian cells.
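
    A minimal sketch of the kind of relationship such a framework captures is given below, assuming a hill-type repression term per target site and first-order decay set by the mRNA half-life; this functional form and all parameter values are illustrative assumptions, not the paper's fitted model:

      import numpy as np

      def target_expression(mirna, n_sites=1, half_life=2.0, k_tx=1.0, kd=1.0):
          """Steady-state target level under miRNA repression (illustrative form).

          Each of n_sites sites contributes an independent repression factor and
          basal decay is set by the mRNA half-life; all parameters are invented.
          """
          k_deg = np.log(2) / half_life
          repression = (1.0 / (1.0 + mirna / kd)) ** n_sites
          return k_tx * repression / k_deg

      mirna_levels = np.logspace(-2, 2, 5)
      for n in (1, 2, 4):  # more target sites -> steeper knockdown
          print(n, np.round(target_expression(mirna_levels, n_sites=n), 3))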

  12. Semi-quantitative estimation by IR of framework, extraframework and defect Al species of HBEA zeolites.

    PubMed

    Marques, João P; Gener, Isabelle; Ayrault, Philippe; Lopes, José M; Ribeiro, F Ramôa; Guisnet, Michel

    2004-10-21

    A simple method based on the characterization (composition, Bronsted and Lewis acidities) of acid treated HBEA zeolites was developed for estimating the concentrations of framework, extraframework and defect Al species.

  13. Distributed Simulation as a modelling tool for the development of a simulation-based training programme for cardiovascular specialties.

    PubMed

    Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando

    2017-01-01

    Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) an initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.

  14. [Reconsidering evaluation criteria regarding health care research: toward an integrative framework of quantitative and qualitative criteria].

    PubMed

    Miyata, Hiroaki; Kai, Ichiro

    2006-05-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confused, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. It is therefore very important to reconsider evaluation criteria regarding rigor in social science. As Lincoln & Guba have already compared quantitative paradigms (validity, reliability, neutrality, generalizability) with qualitative paradigms (credibility, dependability, confirmability, transferability), we discuss the use of evaluation criteria from a pragmatic perspective. Validity/credibility concerns the observational framework, while reliability/dependability refers to the range of stability in observations, neutrality/confirmability reflects influences between observers and subjects, and generalizability/transferability captures an epistemological difference in the way findings are applied. Qualitative studies, however, do not always choose the qualitative paradigms. If we can assume stability to some extent, it is better to use the quantitative paradigm (reliability). Moreover, as a quantitative study cannot always guarantee a perfect observational framework with stability in all phases of observation, it is useful to use qualitative paradigms to enhance the rigor of the study.

  15. Development and application of a new grey dynamic hierarchy analysis system (GDHAS) for evaluating urban ecological security.

    PubMed

    Shao, Chaofeng; Tian, Xiaogang; Guan, Yang; Ju, Meiting; Xie, Qiang

    2013-05-21

    Selecting indicators based on the characteristics and development trends of a given study area is essential for building a framework for assessing urban ecological security. However, few studies have focused on how to select representative indicators systematically, and quantitative research is lacking. We developed an innovative quantitative modeling approach called the grey dynamic hierarchy analytic system (GDHAS) for both the procedures of indicator selection and quantitative assessment of urban ecological security. Next, a systematic methodology based on the GDHAS is developed to assess urban ecological security comprehensively and dynamically. This assessment includes indicator selection, driving force-pressure-state-impact-response (DPSIR) framework building, and quantitative evaluation. We applied this systematic methodology to assess the urban ecological security of Tianjin, a typical coastal megalopolis and industrial base in China. This case study highlights the key features of our approach. First, 39 representative indicators are selected for the evaluation index system from the 62 alternatives available through the GDHAS. Second, the DPSIR framework is established based on the indicators selected, and the quantitative assessment of the eco-security of Tianjin is conducted. The results illustrate the following: the urban ecological security of Tianjin in 2008 was at the alert level but not very stable; the driving force and pressure subsystems were in good condition, but the eco-security levels of the remaining subsystems were relatively low; the pressure subsystem was the key to urban ecological security; and 10 indicators are defined as the key indicators for the five subsystems. These results can be used as the basis for urban eco-environmental management.
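
    The grey-relational step at the core of such grey analysis systems can be sketched as follows. The snippet ranks candidate indicator series by their grey relational grade against a reference series; the data are invented, and the hierarchy-weighting part of GDHAS is omitted:

      import numpy as np

      def grey_relational_grades(candidates, reference, rho=0.5):
          """Grey relational grade of each candidate indicator series vs. a reference."""
          X = np.vstack([reference] + list(candidates)).astype(float)
          # Min-max normalise each series so magnitudes are comparable.
          X = (X - X.min(axis=1, keepdims=True)) / (np.ptp(X, axis=1, keepdims=True) + 1e-12)
          delta = np.abs(X[1:] - X[0])                     # deviation from the reference
          dmin, dmax = delta.min(), delta.max()
          xi = (dmin + rho * dmax) / (delta + rho * dmax)  # grey relational coefficients
          return xi.mean(axis=1)                           # grade = mean coefficient per series

      reference = [3, 4, 5, 6, 7]                          # e.g. a reference eco-security trend
      candidates = [[2, 4, 5, 7, 8], [9, 1, 4, 2, 6]]      # invented indicator series
      print(np.round(grey_relational_grades(candidates, reference), 3))  # higher = keep indicator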

  16. Development and Application of a New Grey Dynamic Hierarchy Analysis System (GDHAS) for Evaluating Urban Ecological Security

    PubMed Central

    Shao, Chaofeng; Tian, Xiaogang; Guan, Yang; Ju, Meiting; Xie, Qiang

    2013-01-01

    Selecting indicators based on the characteristics and development trends of a given study area is essential for building a framework for assessing urban ecological security. However, few studies have focused on how to select representative indicators systematically, and quantitative research is lacking. We developed an innovative quantitative modeling approach called the grey dynamic hierarchy analytic system (GDHAS) for both the procedures of indicator selection and quantitative assessment of urban ecological security. Next, a systematic methodology based on the GDHAS is developed to assess urban ecological security comprehensively and dynamically. This assessment includes indicator selection, driving force-pressure-state-impact-response (DPSIR) framework building, and quantitative evaluation. We applied this systematic methodology to assess the urban ecological security of Tianjin, a typical coastal megalopolis and industrial base in China. This case study highlights the key features of our approach. First, 39 representative indicators are selected for the evaluation index system from the 62 alternatives available through the GDHAS. Second, the DPSIR framework is established based on the indicators selected, and the quantitative assessment of the eco-security of Tianjin is conducted. The results illustrate the following: the urban ecological security of Tianjin in 2008 was at the alert level but not very stable; the driving force and pressure subsystems were in good condition, but the eco-security levels of the remaining subsystems were relatively low; the pressure subsystem was the key to urban ecological security; and 10 indicators are defined as the key indicators for the five subsystems. These results can be used as the basis for urban eco-environmental management. PMID:23698700

  17. Reference condition approach to restoration planning

    USGS Publications Warehouse

    Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.

    2010-01-01

    Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scaleable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types - best achievable condition (ABAC), measured magnitude (MMi, which can be determined at one or many times and places) and desired future condition (ADFC) - that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.

  18. Toward standardized quantitative image quality (IQ) assessment in computed tomography (CT): A comprehensive framework for automated and comparative IQ analysis based on ICRU Report 87.

    PubMed

    Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram

    2016-01-01

    Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
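
    As an illustration of one of the implemented metrics, the sketch below computes a radially averaged 1D noise-power spectrum from a stack of noise-only ROIs using the usual periodogram definition (cf. ICRU Report 87); detrending order and binning details are simplified relative to a full implementation:

      import numpy as np

      def nps_1d(noise_rois, pixel_size=0.5, n_bins=32):
          """Radially averaged 1D noise-power spectrum from noise-only 2D ROIs."""
          rois = np.array(noise_rois, dtype=float)             # copy so input is untouched
          n, ny, nx = rois.shape
          rois -= rois.mean(axis=(1, 2), keepdims=True)        # zeroth-order detrend per ROI
          nps2d = (pixel_size ** 2 / (nx * ny)) * np.abs(np.fft.fft2(rois)) ** 2
          nps2d = np.fft.fftshift(nps2d.mean(axis=0))          # ensemble average over ROIs
          fy, fx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(ny, pixel_size)),
                               np.fft.fftshift(np.fft.fftfreq(nx, pixel_size)), indexing="ij")
          fr = np.hypot(fx, fy).ravel()
          bins = np.linspace(0, fr.max(), n_bins)
          idx = np.digitize(fr, bins)
          return np.array([nps2d.ravel()[idx == i].mean()
                           for i in range(1, n_bins) if np.any(idx == i)])

      rois = np.random.normal(0, 10, size=(64, 128, 128))      # synthetic white-noise stack
      print(np.round(nps_1d(rois)[:5], 1))                     # roughly flat for white noise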

  19. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  20. Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.

    PubMed

    Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark

    2017-12-01

    A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
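
    A minimal sketch of steps 3-6 is given below: per-study quality and relevance scores (0-4) roll up into LoE strengths, which are then combined with LoE weights into an overall strength of evidence. The studies, scores and weights are hypothetical, and the paper does not prescribe this exact aggregation formula:

      from statistics import mean

      # Hypothetical scored studies grouped by line of evidence (LoE);
      # each tuple is (study quality score, data relevance score), both 0-4.
      studies = {
          "transcriptomics": [(3, 4), (2, 3)],
          "apical_endpoints": [(4, 4), (3, 2), (4, 3)],
      }
      loe_weights = {"transcriptomics": 0.4, "apical_endpoints": 0.6}  # set in step 1

      def loe_strength(scored):
          # Strength of one LoE: mean of quality x relevance, rescaled to 0-1 (max 4 x 4).
          return mean(q * r for q, r in scored) / 16.0

      strengths = {loe: loe_strength(s) for loe, s in studies.items()}     # step 5
      overall = sum(loe_weights[loe] * s for loe, s in strengths.items())  # step 6
      print(strengths, round(overall, 3))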

  1. Airborne electromagnetic mapping of the base of aquifer in areas of western Nebraska

    USGS Publications Warehouse

    Abraham, Jared D.; Cannia, James C.; Bedrosian, Paul A.; Johnson, Michaela R.; Ball, Lyndsay B.; Sibray, Steven S.

    2012-01-01

    Airborne geophysical surveys of selected areas of the North and South Platte River valleys of Nebraska, including Lodgepole Creek valley, collected data to map aquifers and bedrock topography and thus improve the understanding of groundwater/surface-water relationships for use in water-management decisions. Frequency-domain helicopter electromagnetic surveys, using a unique survey flight-line design, collected resistivity data that can be related to lithologic information for refinement of groundwater model inputs. To make the geophysical data useful to multidimensional groundwater models, numerical inversion converted the measured data into a depth-dependent subsurface resistivity model. The inverted resistivity model, along with sensitivity analyses and test-hole information, is used to identify hydrogeologic features such as bedrock highs and paleochannels and to improve estimates of groundwater storage. The two- and three-dimensional interpretations provide groundwater modelers and resource managers with high-resolution hydrogeologic frameworks and quantitative estimates of framework uncertainty. The new frameworks improve understanding of flow-path orientation by refining the locations of paleochannels and associated base-of-aquifer highs, and the improved base-of-aquifer configuration represents the hydrogeology at a level of detail not achievable with previously available data.

  2. Mathematical Tasks as a Framework for Reflection: From Research To Practice.

    ERIC Educational Resources Information Center

    Stein, Mary Kay; Smith, Margaret Schwan

    1998-01-01

    Describes the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) national reform project aimed at studying and fostering the development and implementation of enhanced mathematics instructional programs. It is a framework for reflection based on mathematical tasks used during classroom instruction and the ways in…

  3. Quantitative AOP-based predictions for two aromatase inhibitors evaluating the influence of bioaccumulation on prediction accuracy

    EPA Science Inventory

    The adverse outcome pathway (AOP) framework can be used to support the use of mechanistic toxicology data as a basis for risk assessment. For certain risk contexts this includes defining, quantitative linkages between the molecular initiating event (MIE) and subsequent key events...

  4. EmbryoMiner: A new framework for interactive knowledge discovery in large-scale cell tracking data of developing embryos.

    PubMed

    Schott, Benjamin; Traub, Manuel; Schlagenhauf, Cornelia; Takamiya, Masanari; Antritter, Thomas; Bartschat, Andreas; Löffler, Katharina; Blessing, Denis; Otte, Jens C; Kobitski, Andrei Y; Nienhaus, G Ulrich; Strähle, Uwe; Mikut, Ralf; Stegmaier, Johannes

    2018-04-01

    State-of-the-art light-sheet and confocal microscopes allow recording of entire embryos in 3D and over time (3D+t) for many hours. Fluorescently labeled structures can be segmented and tracked automatically in these terabyte-scale 3D+t images, resulting in thousands of cell migration trajectories that provide detailed insights into large-scale tissue reorganization at the cellular level. Here we present EmbryoMiner, a new interactive open-source framework suitable for in-depth analyses and comparisons of entire embryos, including an extensive set of trajectory features. Starting at the whole-embryo level, the framework can be used to iteratively focus on a region of interest within the embryo, to investigate and test specific trajectory-based hypotheses and to extract quantitative features from the isolated trajectories. Thus, the new framework provides a valuable new way to quantitatively compare corresponding anatomical regions in different embryos that were manually selected based on biological prior knowledge. As a proof of concept, we analyzed 3D+t light-sheet microscopy images of zebrafish embryos, showcasing potential user applications that can be performed using the new framework.

  5. An active monitoring method for flood events

    NASA Astrophysics Data System (ADS)

    Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya

    2018-07-01

    Timely and active detection and monitoring of a flood event are critical for quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the concrete implementation of the framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements - scheme, model, data, sensor, and auxiliary information - as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 was conducted to test the proposed framework and model. The results show that (1) the proposed active service framework is efficient for timely and automated flood monitoring; (2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and (3) as much preliminary work as possible should be done in advance to take full advantage of the active service framework and the active model.

  6. Quantitative gene-gene and gene-environment mapping for leaf shape variation using tree-based models.

    PubMed

    Fu, Guifang; Dai, Xiaotian; Symanzik, Jürgen; Bushman, Shaun

    2017-01-01

    Leaf shape traits have long been a focus of many disciplines, but the complex genetic and environmental interactive mechanisms regulating leaf shape variation have not yet been investigated in detail. The question of the respective roles of genes and environment and how they interact to modulate leaf shape is a thorny evolutionary problem, and sophisticated methodology is needed to address it. In this study, we investigated a framework-level approach that inputs shape image photographs and genetic and environmental data, and then outputs the relative importance ranks of all variables after integrating shape feature extraction, dimension reduction, and tree-based statistical models. The power of the proposed framework was confirmed by simulation and a Populus szechuanica var. tibetica data set. This new methodology resulted in the detection of novel shape characteristics, and also confirmed some previous findings. The quantitative modeling of a combination of polygenetic, plastic, epistatic, and gene-environment interactive effects, as investigated in this study, will improve the discernment of quantitative leaf shape characteristics, and the methods are ready to be applied to other leaf morphology data sets. Unlike the majority of approaches in the quantitative leaf shape literature, this framework-level approach is data-driven, without assuming any pre-known shape attributes, landmarks, or model structures. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
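
    A toy version of the tree-based ranking step is sketched below: a random forest is fit to a synthetic shape response containing a gene-environment interaction, and predictors are ranked by permutation importance. Variable names and data are invented, and the paper's image-based feature extraction and dimension reduction are omitted:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.inspection import permutation_importance

      rng = np.random.default_rng(0)
      n = 500
      genotype = rng.integers(0, 3, size=(n, 3))           # three loci coded 0/1/2
      light, water = rng.normal(size=n), rng.normal(size=n)
      X = np.column_stack([genotype, light, water])
      # Response with one main genetic effect, one gene-environment interaction, noise.
      y = 0.8 * genotype[:, 0] + 0.6 * genotype[:, 1] * light + rng.normal(0, 0.3, n)

      model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
      imp = permutation_importance(model, X, y, n_repeats=20, random_state=0)
      for name, score in zip(["locus1", "locus2", "locus3", "light", "water"],
                             imp.importances_mean):
          print(f"{name:7s} {score:.3f}")  # locus1 and the interacting pair should dominate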

  7. Mechanochemical models of processive molecular motors

    NASA Astrophysics Data System (ADS)

    Lan, Ganhui; Sun, Sean X.

    2012-05-01

    Motor proteins are the molecular engines powering the living cell. These nanometre-sized molecules convert chemical energy, both enthalpic and entropic, into useful mechanical work. High-resolution single-molecule experiments can now observe motor protein movement with increasing precision. The emerging data must be combined with structural and kinetic measurements to develop a quantitative mechanism. This article describes a modelling framework in which quantitative understanding of motor behaviour can be developed based on the protein structure. The framework is applied to myosin motors, with emphasis on how synchrony between motor domains gives rise to processive unidirectional movement. The modelling approach shows that the elasticity of protein domains is important in regulating motor function. Simple models of protein domain elasticity are presented. The framework can be generalized to other motor systems, or to an ensemble of motors such as in muscle contraction. Indeed, for hundreds of myosins, our framework reduces to the Huxley-Simmons description of muscle movement in the mean-field limit.

  8. A traits-based approach for prioritizing species for monitoring and surrogacy selection

    DOE PAGES

    Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...

    2016-11-28

    The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.

  9. A traits-based approach for prioritizing species for monitoring and surrogacy selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.

    The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.

  10. The Perceptions of U.S.-Based IT Security Professionals about the Effectiveness of IT Security Frameworks: A Quantitative Study

    ERIC Educational Resources Information Center

    Warfield, Douglas L.

    2011-01-01

    The evolution of information technology has included new methodologies that use information technology to control and manage various industries and government activities. Information Technology has also evolved as its own industry with global networks of interconnectivity, such as the Internet, and frameworks, models, and methodologies to control…

  11. An optimized framework for quantitative magnetization transfer imaging of the cervical spinal cord in vivo

    PubMed Central

    Grussu, Francesco; Ianus, Andrada; Schneider, Torben; Prados, Ferran; Fairney, James; Ourselin, Sebastien; Alexander, Daniel C.; Cercignani, Mara; Gandini Wheeler‐Kingshott, Claudia A.M.; Samson, Rebecca S.

    2017-01-01

    Purpose: To develop a framework to fully characterize quantitative magnetization transfer indices in the human cervical cord in vivo within a clinically feasible time. Methods: A dedicated spinal cord imaging protocol for quantitative magnetization transfer was developed using a reduced field-of-view approach with echo planar imaging (EPI) readout. Sequence parameters were optimized based on the Cramer-Rao lower bound. Quantitative model parameters (i.e., bound pool fraction, free and bound pool transverse relaxation times [T2F, T2B], and forward exchange rate [kFB]) were estimated by implementing a numerical model capable of dealing with the novelties of the sequence adopted. The framework was tested on five healthy subjects. Results: Cramer-Rao lower bound minimization produces optimal sampling schemes without requiring the establishment of a steady-state MT effect. The proposed framework allows quantitative voxel-wise estimation of model parameters at the resolution typically used for spinal cord imaging (i.e., 0.75 × 0.75 × 5 mm³), with a protocol duration of ∼35 min. Quantitative magnetization transfer parametric maps agree with literature values. Whole-cord mean values are: bound pool fraction = 0.11(±0.01), T2F = 46.5(±1.6) ms, T2B = 11.0(±0.2) µs, and kFB = 1.95(±0.06) Hz. Protocol optimization has a beneficial effect on reproducibility, especially for T2B and kFB. Conclusion: The framework developed enables robust characterization of spinal cord microstructure in vivo using qMT. Magn Reson Med 79:2576-2588, 2018. © 2017 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited. PMID:28921614

  12. Implementing Response to Intervention in Title I Elementary Schools: A Quantitative Study of Teacher Response Relationships

    ERIC Educational Resources Information Center

    Webster, Katina F.

    2012-01-01

    General educators and special educators in Title I elementary schools perceive the relationships between principles of RTI and their state RTI framework, the implementation of RTI, and professional development received in RTI differently. A quantitative survey-based research methodology was employed including the use of Cronbach's alpha to…

  13. Framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms in conjunction with 3D landmark localization and registration

    NASA Astrophysics Data System (ADS)

    Wörz, Stefan; Hoegen, Philipp; Liao, Wei; Müller-Eschner, Matthias; Kauczor, Hans-Ulrich; von Tengg-Kobligk, Hendrik; Rohr, Karl

    2016-03-01

    We introduce a framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms. Phantoms are designed using a CAD system and created with a 3D printer, and comprise realistic shapes including branches and pathologies such as abdominal aortic aneurysms (AAA). To transfer ground truth information to the 3D image coordinate system, we use a landmark-based registration scheme utilizing fiducial markers integrated in the phantom design. For accurate 3D localization of the markers we developed a novel 3D parametric intensity model that is directly fitted to the markers in the images. We also performed a quantitative evaluation of different vessel segmentation approaches for a phantom of an AAA.
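
    The landmark-based registration step can be sketched with the standard least-squares rigid (Kabsch) solution below; the paper's contribution additionally includes a 3D parametric intensity model for localizing the fiducial markers themselves, which is not reproduced here:

      import numpy as np

      def rigid_register(src, dst):
          """Least-squares rigid transform (Kabsch) mapping landmarks src onto dst."""
          src, dst = np.asarray(src, float), np.asarray(dst, float)
          cs, cd = src.mean(axis=0), dst.mean(axis=0)      # centroids
          H = (src - cs).T @ (dst - cd)                    # cross-covariance matrix
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
          R = Vt.T @ D @ U.T
          return R, cd - R @ cs

      # Synthetic fiducials: recover a known rotation and translation.
      rng = np.random.default_rng(1)
      pts = rng.uniform(0, 50, size=(6, 3))
      c, s = np.cos(np.pi / 7), np.sin(np.pi / 7)
      R_true = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
      R, t = rigid_register(pts, pts @ R_true.T + [1.0, -2.0, 3.0])
      print(np.allclose(R, R_true), np.round(t, 3))        # True [ 1. -2.  3.]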

  14. Pansharpening on the Narrow VNIR and SWIR Spectral Bands of Sentinel-2

    NASA Astrophysics Data System (ADS)

    Vaiopoulos, A. D.; Karantzalos, K.

    2016-06-01

    In this paper, results from the evaluation of several state-of-the-art pansharpening techniques are presented for the VNIR and SWIR bands of Sentinel-2. A pansharpening procedure is also proposed that aims to respect the closest spectral similarities between the higher- and lower-resolution bands. The evaluation included 21 different fusion algorithms and three evaluation frameworks based on both standard quantitative image-similarity indexes and qualitative evaluation by remote sensing experts. The overall analysis indicated that the remote sensing experts disagreed with the outcomes and method rankings of the quantitative assessment: the image-quality similarity indexes and the quantitative evaluation frameworks from the literature, based on both full- and reduced-resolution data, failed to properly evaluate the spatial information that was injected into the lower-resolution images. Regarding the SWIR bands, none of the methods delivered significantly better results than a standard bicubic interpolation of the original low-resolution bands.

  15. A sampling framework for incorporating quantitative mass spectrometry data in protein interaction analysis.

    PubMed

    Tucker, George; Loh, Po-Ru; Berger, Bonnie

    2013-10-04

    Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
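
    The core sampling idea can be sketched as follows, assuming a saturating count-to-probability map (an illustrative choice, not the paper's calibrated scheme): each co-purification with spectral count c is observed with probability 1 - exp(-k*c), and an ensemble of binary interaction matrices is drawn for downstream analysis:

      import numpy as np

      def sample_interaction_ensemble(counts, n_samples=1000, k=1.0, seed=0):
          """Draw an ensemble of binary interaction matrices from spectral counts.

          A co-purification with count c is treated as observed with probability
          1 - exp(-k * c), so low-count interactions flip on/off across samples.
          """
          rng = np.random.default_rng(seed)
          p = 1.0 - np.exp(-k * np.asarray(counts, float))
          return rng.random((n_samples,) + p.shape) < p    # boolean ensemble

      counts = np.array([[0, 1, 8],
                         [1, 0, 2],
                         [8, 2, 0]])                        # symmetric bait-prey counts
      ensemble = sample_interaction_ensemble(counts)
      # Downstream, run the chosen binary PPI-inference method on each sample
      # and aggregate its outputs; here we just show the observation frequencies.
      print(np.round(ensemble.mean(axis=0), 2))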

  16. A generalised individual-based algorithm for modelling the evolution of quantitative herbicide resistance in arable weed populations.

    PubMed

    Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul

    2017-02-01

    Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. Therefore, we constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can represent evolved herbicide resistance in realistic timescales. Sensitivity analyses showed that genetic and management parameters were impactful on the rate of quantitative herbicide resistance evolution, whilst biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential use of the model are discussed. © 2016 Society of Chemical Industry.
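
    A drastically simplified sketch of quantitative resistance evolution is shown below: survival is a logistic function of an individual's resistance trait, and the population mean trait advances by the breeder's equation each season. All parameter values and functional forms are illustrative and omit most of the paper's model (seed bank, pollen flow, management rotations):

      import numpy as np

      def simulate_resistance(years=20, pop=10_000, h2=0.3, z0=0.0, sigma=1.0,
                              dose=2.0, seed=0):
          """Toy recursion: logistic survival in trait z, breeder's-equation response."""
          rng = np.random.default_rng(seed)
          z_mean = z0
          for year in range(years):
              z = rng.normal(z_mean, sigma, pop)                       # this season's traits
              survive = rng.random(pop) < 1.0 / (1.0 + np.exp(dose - z))
              if not survive.any():
                  return year, z_mean                                  # population controlled
              S = z[survive].mean() - z_mean                           # selection differential
              z_mean += h2 * S                                         # response to selection
          return years, z_mean

      print(simulate_resistance())   # (seasons simulated, final mean resistance trait)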

  17. TECHNOLOGY ASSESSMENT IN HOSPITALS: LESSONS LEARNED FROM AN EMPIRICAL EXPERIMENT.

    PubMed

    Foglia, Emanuela; Lettieri, Emanuele; Ferrario, Lucrezia; Porazzi, Emanuele; Garagiola, Elisabetta; Pagani, Roberta; Bonfanti, Marzia; Lazzarotti, Valentina; Manzini, Raffaella; Masella, Cristina; Croce, Davide

    2017-01-01

    Hospital-Based Health Technology Assessment (HBHTA) practices to inform decision making at the hospital level have emerged as an urgent priority for policy makers, hospital managers, and professionals. The present study reports the results achieved by testing an original framework for HBHTA developed within the Lombardy Region: the IMPlementation of A Quick hospital-based HTA (IMPAQHTA). The study tested (i) the HBHTA framework's efficiency, (ii) its feasibility, and (iii) the tool's utility and completeness, considering dimensions and sub-dimensions. The IMPAQHTA framework deployed the regional HTA program, activated in 2008 in Lombardy, at the hospital level. The relevance and feasibility of the framework were tested over a 3-year period through a large-scale empirical experiment involving seventy-four healthcare professionals, organized in different HBHTA teams, assessing thirty-two different technologies within twenty-two different hospitals. Semi-structured interviews and self-reported questionnaires were used to collect data regarding the relevance and feasibility of the IMPAQHTA framework. The proposed HBHTA framework proved suitable for application at the hospital level in the Italian context, permitting a quick assessment (11 working days) and providing hospital decision makers with relevant, quantitative information. Performance in terms of feasibility, utility, completeness, and ease of use proved satisfactory. The IMPAQHTA was considered a complete and feasible HBHTA framework, replicable across different technologies and hospital settings, thus demonstrating the capability of a hospital to develop a complete HTA if supported by adequate and well-defined tools and quantitative metrics.

  18. State Instability and Terrorism

    DTIC Science & Technology

    2010-01-01

    instability at the country-level using a modified breakdown theoretical framework. This framework is based especially upon the work of Emile Durkheim ... terrorism is a form (Durkheim, 1930 [1951]; Useem, 1998). In addition, different types of instability ought to invite different levels of terrorism

  19. Preparation and Analysis of Cyclodextrin-Based Metal-Organic Frameworks: Laboratory Experiments Adaptable for High School through Advanced Undergraduate Students

    ERIC Educational Resources Information Center

    Smith, Merry K.; Angle, Samantha R.; Northrop, Brian H.

    2015-01-01

    γ-Cyclodextrin can assemble in the presence of KOH or RbOH into metal-organic frameworks (CD-MOFs) with applications in gas adsorption and environmental remediation. Crystalline CD-MOFs are grown by vapor diffusion and their reversible adsorption of CO2(g) is analyzed both qualitatively and quantitatively. The experiment can be…

  20. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work and thus facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  1. Physiologically based pharmacokinetic (PBPK) modeling considering methylated trivalent arsenicals

    EPA Science Inventory

    PBPK modeling provides a quantitative biologically-based framework to integrate diverse types of information for application to risk analysis. For example, genetic polymorphisms in arsenic metabolizing enzymes (AS3MT) can lead to differences in target tissue dosimetry for key tri...

  2. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371
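
    The coarse-to-fine structure of such a multigrid inversion can be sketched generically as below; the forward model here is a toy smoothing operator and the fixed-point update is a plain Richardson iteration, standing in for the paper's optical-fluence model and update scheme:

      import numpy as np
      from scipy.ndimage import zoom

      def multigrid_invert(forward, observed, shape, levels=3, iters=200, step=0.5):
          """Coarse-to-fine fixed-point inversion: solve on a coarse grid, prolongate."""
          mu = np.zeros([s // 2 ** (levels - 1) for s in shape])
          for level in range(levels):
              obs = zoom(observed, [m / o for m, o in zip(mu.shape, observed.shape)], order=1)
              for _ in range(iters):
                  mu = mu + step * (obs - forward(mu))   # Richardson / fixed-point update
              if level < levels - 1:
                  mu = zoom(mu, 2, order=1)              # prolongate estimate to finer grid
          return mu

      def forward(mu):                                   # toy forward model: local smoothing
          sm = mu.copy()
          sm[1:-1, 1:-1] = 0.25 * (mu[:-2, 1:-1] + mu[2:, 1:-1] + mu[1:-1, :-2] + mu[1:-1, 2:])
          return 0.5 * mu + 0.5 * sm

      truth = np.zeros((64, 64))
      truth[24:40, 24:40] = 1.0
      rec = multigrid_invert(forward, forward(truth), truth.shape)
      print(rec.shape, float(np.abs(rec - truth).mean()))  # error shrinks with more iterations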

  3. A quantitative framework for assessing ecological resilience

    EPA Science Inventory

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  4. HOW CAN BIOLOGICALLY-BASED MODELING OF ARSENIC KINETICS AND DYNAMICS INFORM THE RISK ASSESSMENT PROCESS?

    EPA Science Inventory

    Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic met...

  5. How Can Biologically-Based Modeling of Arsenic Kinetics and Dynamics Inform the Risk Assessment Process? -- ETD

    EPA Science Inventory

    Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic me...

  6. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco

    Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identify risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as has emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work through the application of a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  7. An optimized framework for quantitative magnetization transfer imaging of the cervical spinal cord in vivo.

    PubMed

    Battiston, Marco; Grussu, Francesco; Ianus, Andrada; Schneider, Torben; Prados, Ferran; Fairney, James; Ourselin, Sebastien; Alexander, Daniel C; Cercignani, Mara; Gandini Wheeler-Kingshott, Claudia A M; Samson, Rebecca S

    2018-05-01

    To develop a framework to fully characterize quantitative magnetization transfer indices in the human cervical cord in vivo within a clinically feasible time. A dedicated spinal cord imaging protocol for quantitative magnetization transfer was developed using a reduced field-of-view approach with an echo planar imaging (EPI) readout. Sequence parameters were optimized based on the Cramér-Rao lower bound. Quantitative model parameters (i.e., bound pool fraction, free and bound pool transverse relaxation times [T2F, T2B], and forward exchange rate [kFB]) were estimated by implementing a numerical model capable of dealing with the novelties of the adopted sequence. The framework was tested on five healthy subjects. Cramér-Rao lower bound minimization produces optimal sampling schemes without requiring the establishment of a steady-state MT effect. The proposed framework allows quantitative voxel-wise estimation of model parameters at the resolution typically used for spinal cord imaging (i.e., 0.75 × 0.75 × 5 mm³), with a protocol duration of ∼35 min. Quantitative magnetization transfer parametric maps agree with literature values. Whole-cord mean values are: bound pool fraction = 0.11(±0.01), T2F = 46.5(±1.6) ms, T2B = 11.0(±0.2) µs, and kFB = 1.95(±0.06) Hz. Protocol optimization has a beneficial effect on reproducibility, especially for T2B and kFB. The framework developed enables robust characterization of spinal cord microstructure in vivo using qMT. Magn Reson Med 79:2576-2588, 2018. © 2017 The Authors Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
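
    The optimization step can be illustrated with a generic Cramér-Rao lower bound computation: the Fisher information is built from the Jacobian of the signal model, and protocol optimization searches for sampling schemes that minimize the resulting variance bounds. The sketch below assumes i.i.d. Gaussian noise and substitutes a toy two-parameter decay for the full qMT signal equation.

    ```python
    import numpy as np

    def numeric_jacobian(model, theta, eps=1e-6):
        """Finite-difference Jacobian of a vector-valued signal model."""
        theta = np.asarray(theta, dtype=float)
        y0 = model(theta)
        J = np.zeros((y0.size, theta.size))
        for i in range(theta.size):
            t = theta.copy()
            t[i] += eps
            J[:, i] = (model(t) - y0) / eps
        return J

    def crlb(model, theta, sigma=1.0):
        """Cramér-Rao lower bounds on parameter variances for i.i.d. Gaussian noise."""
        J = numeric_jacobian(model, theta)
        fisher = J.T @ J / sigma**2
        return np.diag(np.linalg.inv(fisher))

    # Toy two-parameter decay standing in for the qMT signal equation;
    # a protocol optimizer would vary the sampling times t to shrink these bounds.
    t = np.linspace(0.01, 2.0, 32)
    model = lambda th: th[0] * np.exp(-t / th[1])
    print(crlb(model, [1.0, 0.5], sigma=0.02))
    ```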

  8. Determining open cluster membership. A Bayesian framework for quantitative member classification

    NASA Astrophysics Data System (ADS)

    Stott, Jonathan J.

    2018-01-01

    Aims: My goal is to develop a quantitative algorithm for assessing open cluster membership probabilities. The algorithm is designed to work with single-epoch observations. In its simplest form, only one set of program images and one set of reference images are required. Methods: The algorithm is based on a two-stage joint astrometric and photometric assessment of cluster membership probabilities. The probabilities were computed within a Bayesian framework using any available prior information. Where possible, the algorithm emphasizes simplicity over mathematical sophistication. Results: The algorithm was implemented and tested against three observational fields using published survey data. M 67 and NGC 654 were selected as cluster examples while a third, cluster-free, field was used for the final test data set. The algorithm shows good quantitative agreement with the existing surveys and has a false-positive rate significantly lower than the astrometric or photometric methods used individually.
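
    The core Bayesian step can be illustrated in a few lines. The sketch below shows a single-stage, astrometric-only version: cluster and field populations are modelled as two-dimensional Gaussians in proper-motion space (all numbers illustrative), and Bayes' rule converts the two likelihoods and a prior membership fraction into a posterior membership probability. The paper's algorithm applies this jointly over two stages, with photometry as well.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    def membership_probability(pm, prior=0.3,
                               cluster_mean=(2.1, -1.4), cluster_cov=((0.04, 0), (0, 0.04)),
                               field_mean=(0.0, 0.0), field_cov=((9.0, 0), (0, 9.0))):
        """Posterior membership probability from proper-motion data alone,
        modelling cluster and field as 2-D Gaussians (illustrative values)."""
        lm = multivariate_normal.pdf(pm, cluster_mean, cluster_cov)
        lf = multivariate_normal.pdf(pm, field_mean, field_cov)
        return prior * lm / (prior * lm + (1 - prior) * lf)

    print(membership_probability([2.0, -1.3]))   # near the cluster centroid -> high
    print(membership_probability([5.0, 4.0]))    # field-like motion -> low
    ```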

  9. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from ¹⁸F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
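
    The bootstrap layer of the framework is straightforward to sketch: patients (or lesions) are resampled with replacement and the figure of merit is recomputed on each resample to quantify sampling uncertainty. The helper below is an illustration under assumed inputs, not the authors' code; the NGS estimation step itself is abstracted into the caller-supplied `fom` function.

    ```python
    import numpy as np

    def bootstrap_fom(measurements, fom, n_boot=2000, seed=0):
        """Bootstrap patients to get a confidence interval on any figure of merit.
        `measurements` is an array of per-lesion values; `fom` maps a resampled
        array to a scalar figure of merit."""
        rng = np.random.default_rng(seed)
        n = len(measurements)
        stats = [fom(measurements[rng.integers(0, n, n)]) for _ in range(n_boot)]
        return np.percentile(stats, [2.5, 97.5])

    # Example: precision (1/std) of volume estimates from a hypothetical method.
    volumes = np.random.default_rng(1).normal(20.0, 3.0, size=100)
    print(bootstrap_fom(volumes, lambda v: 1.0 / v.std()))
    ```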

  10. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    NASA Astrophysics Data System (ADS)

    Walters, Charles David

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008) related to quantitative reasoning. However, this may prove challenging, as prior to entering the classroom, PSTs often have few opportunities to develop MKT by examining and reflecting on students' thinking. Videos offer one avenue through which such opportunities are possible. In this study, I report on the design of a mini-course for PSTs that featured a series of videos created as part of a proof-of-concept NSF-funded project. These MathTalk videos highlight the ways in which the quantitative reasoning of two high school students developed over time. Using a mixed approach to grounded theory, I analyzed pre- and postinterviews using an extant coding scheme based on the Silverman and Thompson (2008) framework for the development of MKT. This analysis revealed a shift in participants' affect as well as three distinct shifts in their MKT around quantitative reasoning with distances, including shifts in: (a) quantitative reasoning; (b) point of view (decentering); and (c) orientation toward problem solving. Using the four-part focusing framework (Lobato, Hohensee, & Rhodehamel, 2013), I analyzed classroom data to account for how participants' noticing was linked with the shifts in MKT. Notably, their increased noticing of aspects of MKT around quantitative reasoning with distances, which features prominently in the MathTalk videos, seemed to contribute to the emergence of the shifts in MKT. Results from this study link elements of the learning environment to the development of specific facets of MKT around quantitative reasoning with distances. These connections suggest that vicarious experiences with two students' quantitative reasoning over time was critical for participants' development of MKT.

  11. Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data

    PubMed Central

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

    As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example it identifies reactions that are subject to active allosteric or genetic regulation as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
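
    The thermodynamic constraint at the heart of NET analysis is the reaction Gibbs energy evaluated at measured metabolite concentrations, Delta_rG' = Delta_rG'° + RT ln Q; a reaction can carry forward flux only if this quantity is negative. The sketch below evaluates that check for one reaction with hypothetical numbers; NET analysis couples such constraints across the whole operating network, and reactions forced to operate far from equilibrium emerge as candidate regulation sites.

    ```python
    import numpy as np

    R = 8.314e-3   # gas constant, kJ/(mol*K)
    T = 298.15     # temperature, K

    def reaction_gibbs(dg0_prime, substrates, products):
        """Delta_rG' = Delta_rG'deg + RT*ln(Q), concentrations in mol/L."""
        Q = np.prod(list(products.values())) / np.prod(list(substrates.values()))
        return dg0_prime + R * T * np.log(Q)

    # Illustrative, phosphofructokinase-like numbers (hypothetical):
    dg = reaction_gibbs(-14.2, {"F6P": 1e-3, "ATP": 2e-3},
                        {"FBP": 5e-4, "ADP": 5e-4})
    print(dg, "kJ/mol ->", "forward flux feasible" if dg < 0 else "infeasible")
    ```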

  12. Getting started in research: designing and preparing to conduct a research study.

    PubMed

    Macfarlane, Matthew D; Kisely, Steve; Loi, Samantha; Macfarlane, Stephen; Merry, Sally; Parker, Stephen; Power, Brian; Siskind, Dan; Smith, Geoff; Looi, Jeffrey C

    2015-02-01

    To discuss common pitfalls and useful tips in designing a quantitative research study, the importance and process of ethical approval, and consideration of funding. Through careful planning, based on formulation of a research question, early career researchers can design and conduct quantitative research projects within the framework of the Scholarly Project or in their own independent projects. © The Royal Australian and New Zealand College of Psychiatrists 2014.

  13. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  14. Manuscript 116 Mechanisms: DNA Reactive Agents

    EPA Science Inventory

    ABSTRACT The U.S. Environmental Protection Agency’s Guidelines for Carcinogen Risk Assessment (2005) uses an analytical framework for conducting a quantitative cancer risk assessment that is based on mode of action/key events and human relevance. The approach stresses the enh...

  15. Quantitative design of emergency monitoring network for river chemical spills based on discrete entropy theory.

    PubMed

    Shi, Bin; Jiang, Jiping; Sivakumar, Bellie; Zheng, Yi; Wang, Peng

    2018-05-01

    Field monitoring strategy is critical for disaster preparedness and watershed emergency environmental management. However, developing such a strategy is also highly challenging. Despite the efforts and progress thus far, no definitive guidelines or solutions are available worldwide for quantitatively designing a monitoring network in response to river chemical spill incidents, beyond general rules based on administrative divisions or arbitrary interpolation on routine monitoring sections. To address this gap, a novel framework for spatial-temporal network design was proposed in this study. The framework combines contaminant transport modelling with discrete entropy theory and spectral analysis. A water quality model was applied to forecast the spatio-temporal distribution of the contaminant after spills, and then corresponding information transfer indexes (ITIs) and Fourier approximation periodic functions were estimated as critical measures for setting sampling locations and times. The results indicate that the framework can produce science-based emergency monitoring preparedness plans through scenario analysis of spill risks, and also supports rapid network design immediately after an incident for which no plan was prepared. The framework was applied to a hypothetical spill case based on a tracer experiment and to a real nitrobenzene spill incident to demonstrate its suitability and effectiveness. The newly designed temporal-spatial monitoring network captured major pollution information at relatively low cost. It showed obvious benefits for follow-up early warning and treatment as well as for aftermath recovery and assessment. The underlying drivers of ITIs as well as the limitations and uncertainty of the approach were analyzed based on the case studies. Comparison with existing monitoring network design approaches, management implications, and generalized applicability were also discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
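
    An information transfer index of the kind used to site sampling locations can be approximated by the discrete mutual information between concentration series at candidate sections. The sketch below is a proxy illustration on toy data; the paper's exact ITI definition, binning, and transport-model inputs may differ.

    ```python
    import numpy as np

    def information_transfer_index(x, y, bins=16):
        """Discrete mutual information I(X;Y), in bits, between concentration
        series at two candidate monitoring sections (a proxy for an ITI)."""
        cxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = cxy / cxy.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        mask = pxy > 0
        return float((pxy[mask] * np.log2(pxy[mask] / np.outer(px, py)[mask])).sum())

    # Concentrations a transport model might produce at two sections (toy data):
    rng = np.random.default_rng(0)
    upstream = rng.lognormal(0.0, 1.0, 500)
    downstream = 0.8 * upstream + rng.normal(0, 0.2, 500)
    print(information_transfer_index(upstream, downstream))
    ```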

  16. The Inter-Sectoral Impact Model Intercomparison Project (ISI–MIP): Project framework

    PubMed Central

    Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob

    2014-01-01

    The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316

  17. Going Beyond the Millennium Ecosystem Assessment: An Index System of Human Well-Being

    PubMed Central

    Yang, Wu; Dietz, Thomas; Kramer, Daniel Boyd; Chen, Xiaodong; Liu, Jianguo

    2013-01-01

    Understanding the linkages between ecosystem services (ES) and human well-being (HWB) is crucial to sustain the flow of ES for HWB. The Millennium Ecosystem Assessment (MA) provided a state-of-the-art synthesis of such knowledge. However, due to the complexity of the linkages between ES and HWB, there are still many knowledge gaps, and in particular a lack of quantitative indicators and integrated models based on the MA framework. To fill some of these research needs, we developed a quantitative index system to measure HWB, and assessed the impacts of an external driver – the 2008 Wenchuan Earthquake – on HWB. Our results suggest that our proposed index system of HWB is well-designed, valid and could be useful for better understanding the linkages between ES and HWB. The earthquake significantly affected households' well-being in our demonstration sites. Such impacts differed across space and across the five dimensions of the sub-index (i.e., the basic material for good life, security, health, good social relations, and freedom of choice and action). Since the conceptual framework is based on the generalizable MA framework, our methods should also be applicable to other study areas. PMID:23717635
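
    Structurally, such an index is a weighted aggregation of normalized scores over the five MA well-being dimensions. The following is a minimal sketch with purely illustrative weights and scores; the paper derives its indicators and weighting from household survey data.

    ```python
    # Hypothetical weights over the five MA dimensions (must sum to 1).
    weights = {"basic_material": 0.25, "security": 0.20, "health": 0.25,
               "social_relations": 0.15, "freedom_of_choice": 0.15}

    def hwb_index(scores):
        """Weighted human-well-being index; scores normalized to [0, 1]."""
        return sum(weights[k] * scores[k] for k in weights)

    print(hwb_index({"basic_material": 0.6, "security": 0.7, "health": 0.8,
                     "social_relations": 0.5, "freedom_of_choice": 0.65}))
    ```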

  18. A multi-scale, multi-disciplinary approach for assessing the technological, economic and environmental performance of bio-based chemicals.

    PubMed

    Herrgård, Markus; Sukumara, Sumesh; Campodonico, Miguel; Zhuang, Kai

    2015-12-01

    In recent years, bio-based chemicals have gained interest as a renewable alternative to petrochemicals. However, there is a significant need to assess the technological, biological, economic and environmental feasibility of bio-based chemicals, particularly during the early research phase. Recently, the Multi-scale framework for Sustainable Industrial Chemicals (MuSIC) was introduced to address this issue by integrating modelling approaches at different scales ranging from cellular to ecological scales. This framework can be further extended by incorporating modelling of the petrochemical value chain and the de novo prediction of metabolic pathways connecting existing host metabolism to desirable chemical products. This multi-scale, multi-disciplinary framework for quantitative assessment of bio-based chemicals will play a vital role in supporting engineering, strategy and policy decisions as we progress towards a sustainable chemical industry. © 2015 Authors; published by Portland Press Limited.

  19. On the analysis of complex biological supply chains: From Process Systems Engineering to Quantitative Systems Pharmacology.

    PubMed

    Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P

    2017-12-05

    The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.

  20. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging.

    PubMed

    Sturla, Francesco; Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-04-01

    Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow's disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparison was used to investigate the hallmarks associated to MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the appropriate MV treatment.

  1. Towards a Quantitative Endogenous Network Theory of Cancer Genesis and Progression: beyond ``cancer as diseases of genome''

    NASA Astrophysics Data System (ADS)

    Ao, Ping

    2011-03-01

    There has been tremendous progress in cancer research. However, it appears that the currently dominant framework of regarding cancer as a disease of the genome leads to an impasse. Naturally, questions have been asked as to whether it is possible to develop alternative frameworks that connect both to mutations and other genetic/genomic effects and to environmental factors. Furthermore, such a framework could be made quantitative, with experimentally testable predictions. In this talk, I will present a positive answer to this call. I will explain our construction of an endogenous network theory based on molecular-cellular agencies as dynamical variables. Such a cancer theory explicitly demonstrates a profound connection to many fundamental concepts in physics, such as stochastic non-equilibrium processes, ``energy'' landscapes, metastability, etc. It suggests that beneath cancer's daunting complexity may lie a simplicity that gives grounds for hope. The rationale behind this theory, its predictions, and its initial experimental verifications will be presented. Supported by USA NIH and China NSF.

  2. A New Object-Based Framework to Detect Shadows in High-Resolution Satellite Imagery over Urban Areas

    NASA Astrophysics Data System (ADS)

    Tatar, N.; Saadatseresht, M.; Arefi, H.; Hadavand, A.

    2015-12-01

    In this paper a new object-based framework to detect shadow areas in high-resolution satellite images is proposed. To produce the shadow map at pixel level, state-of-the-art supervised machine learning algorithms are employed. Automatic ground-truth generation, based on Otsu thresholding of shadow and non-shadow indices, is used to train the classifiers. This is followed by segmenting the image scene to create image objects. To detect shadow objects, majority voting on the pixel-based shadow detection result is applied. A GeoEye-1 multi-spectral image over an urban area in Qom city, Iran, is used in the experiments. Results show the superiority of the proposed method over traditional pixel-based approaches, both visually and quantitatively.
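
    The two distinctive steps, automatic ground-truth generation by Otsu thresholding and object-level majority voting, can be sketched compactly. The snippet below assumes a precomputed shadow-index image and a segmentation label map, and omits the supervised pixel classifier; function names are illustrative.

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu

    def shadow_pseudo_labels(shadow_index):
        """Otsu threshold on a shadow-index image -> automatic training labels,
        standing in for the paper's automatic ground-truth generation."""
        return shadow_index > threshold_otsu(shadow_index)

    def object_level_shadow_map(pixel_shadow, segments):
        """Majority vote of the pixel-level decision inside each image object."""
        out = np.zeros_like(pixel_shadow, dtype=bool)
        for seg_id in np.unique(segments):
            mask = segments == seg_id
            out[mask] = pixel_shadow[mask].mean() > 0.5
        return out

    rng = np.random.default_rng(0)
    index = rng.random((32, 32))                            # toy shadow index
    segments = (np.arange(32 * 32) // 64).reshape(32, 32)   # toy segmentation
    print(object_level_shadow_map(shadow_pseudo_labels(index), segments).sum())
    ```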

  3. Problem-Based Learning in Tertiary Education: Teaching Old "Dogs" New Tricks?

    ERIC Educational Resources Information Center

    Yeo, Roland K.

    2005-01-01

    Purpose--The paper sets out to explore the challenges of problem-based learning (PBL) in tertiary education and to propose a framework with implications for practice and learning. Design/Methodology/Approach--A total of 18 tertiary students divided into three groups participated in the focus group discussions. A quantitative instrument was used as…

  4. Reviewing Quantitative Research To Inform Educational Policy Processes. Fundamentals of Educational Planning.

    ERIC Educational Resources Information Center

    Hite, Steven J.

    Educational planners and policymakers are rarely able to base their decision-making on sound information and research, according to this book. Because the situation is even more difficult in developing countries, educational policy there is often based on research conducted in other parts of the world. This book provides a practical framework that can…

  5. A Comparative Analysis of New Governance Instruments in the Transnational Educational Space: A Shift to Knowledge-Based Instruments?

    ERIC Educational Resources Information Center

    Ioannidou, Alexandra

    2007-01-01

    In recent years, the ongoing development towards a knowledge-based society--associated with globalization, an aging population, new technologies and organizational changes--has led to a more intensive analysis of education and learning throughout life with regard to quantitative, qualitative and financial aspects. In this framework, education…

  6. Evaluation of breeding strategies for polledness in dairy cattle using a newly developed simulation framework for quantitative and Mendelian traits.

    PubMed

    Scheper, Carsten; Wensch-Dorendorf, Monika; Yin, Tong; Dressel, Holger; Swalve, Herrmann; König, Sven

    2016-06-29

    Intensified selection of polled individuals has recently gained importance in predominantly horned dairy cattle breeds as an alternative to routine dehorning. The status quo of the current polled breeding pool of genetically-closely related artificial insemination sires with lower breeding values for performance traits raises questions regarding the effects of intensified selection based on this founder pool. We developed a stochastic simulation framework that combines the stochastic simulation software QMSim and a self-designed R program named QUALsim that acts as an external extension. Two traits were simulated in a dairy cattle population for 25 generations: one quantitative (QMSim) and one qualitative trait with Mendelian inheritance (i.e. polledness, QUALsim). The assignment scheme for qualitative trait genotypes initiated realistic initial breeding situations regarding allele frequencies, true breeding values for the quantitative trait and genetic relatedness. Intensified selection for polled cattle was achieved using an approach that weights estimated breeding values in the animal best linear unbiased prediction model for the quantitative trait depending on genotypes or phenotypes for the polled trait with a user-defined weighting factor. Selection response for the polled trait was highest in the selection scheme based on genotypes. Selection based on phenotypes led to significantly lower allele frequencies for polled. The male selection path played a significantly greater role for a fast dissemination of polled alleles compared to female selection strategies. Fixation of the polled allele implies selection based on polled genotypes among males. In comparison to a base breeding scenario that does not take polledness into account, intensive selection for polled substantially reduced genetic gain for this quantitative trait after 25 generations. Reducing selection intensity for polled males while maintaining strong selection intensity among females, simultaneously decreased losses in genetic gain and achieved a final allele frequency of 0.93 for polled. A fast transition to a completely polled population through intensified selection for polled was in contradiction to the preservation of high genetic gain for the quantitative trait. Selection on male polled genotypes with moderate weighting, and selection on female polled phenotypes with high weighting, could be a suitable compromise regarding all important breeding aspects.
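
    The weighting approach can be caricatured in a few lines: candidates are ranked on their estimated breeding value multiplied by a user-defined weight for the polled genotype. The records and weights below are purely illustrative and far simpler than the QMSim/QUALsim machinery; selecting on genotype (PP/Pp/pp) rather than phenotype is what speeds fixation of the polled allele in the paper's scenarios.

    ```python
    # Toy genotype-weighted selection step (illustrative values only).
    CANDIDATES = [
        # (id, estimated breeding value, polled genotype)
        ("bull_A", 120.0, "pp"),   # horned
        ("bull_B", 105.0, "Pp"),   # heterozygous polled
        ("bull_C",  95.0, "PP"),   # homozygous polled
    ]
    GENOTYPE_WEIGHT = {"pp": 1.0, "Pp": 1.2, "PP": 1.4}   # user-defined weights

    def weighted_rank(candidates, weights):
        """Rank selection candidates by EBV scaled by the polled-genotype weight."""
        return sorted(candidates, key=lambda c: c[1] * weights[c[2]], reverse=True)

    for cid, ebv, geno in weighted_rank(CANDIDATES, GENOTYPE_WEIGHT):
        print(cid, ebv, geno)
    ```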

  7. A Simulation and Modeling Framework for Space Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  8. Comparison of Pre-Service Physics Teachers' Conceptual Understanding of Dynamics in Model-Based Scientific Inquiry and Scientific Inquiry Environments

    ERIC Educational Resources Information Center

    Arslan Buyruk, Arzu; Ogan Bekiroglu, Feral

    2018-01-01

    The focus of this study was to evaluate the impact of model-based inquiry on pre-service physics teachers' conceptual understanding of dynamics. Theoretical framework of this research was based on models-of-data theory. True-experimental design using quantitative and qualitative research methods was carried out for this research. Participants of…

  9. HCI∧2 framework: a software framework for multimodal human-computer interaction systems.

    PubMed

    Shen, Jie; Pantic, Maja

    2013-12-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a shared-memory-based data transport protocol for message delivery and a TCP-based system management protocol. The latter ensures that the integrity of system structure is maintained at runtime. With the inclusion of bridging modules, the HCI∧2 Framework is interoperable with other software frameworks including Psyclone and ActiveMQ. In addition to the core communication middleware, we also present the integrated development environment (IDE) of the HCI∧2 Framework. It provides a complete graphical environment to support every step in a typical MHCI system development process, including module development, debugging, packaging, and management, as well as the whole system management and testing. The quantitative evaluation indicates that our framework outperforms other similar tools in terms of average message latency and maximum data throughput under a typical single PC scenario. To demonstrate HCI∧2 Framework's capabilities in integrating heterogeneous modules, we present several example modules working with a variety of hardware and software. We also present an example of a full system developed using the proposed HCI∧2 Framework, which is called the CamGame system and represents a computer game based on hand-held marker(s) and low-cost camera(s).
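
    The publish/subscribe core of such a framework reduces to a broker that routes messages from publishers to topic subscribers. The sketch below is a minimal in-process illustration only; the actual HCI∧2 Framework adds shared-memory transport, a TCP-based management protocol, and bridging to Psyclone/ActiveMQ, none of which is modelled here.

    ```python
    from collections import defaultdict
    from typing import Callable

    class Broker:
        """Minimal in-process publish/subscribe core (illustrative only)."""

        def __init__(self):
            self._subs: dict[str, list[Callable]] = defaultdict(list)

        def subscribe(self, topic: str, handler: Callable) -> None:
            # Register a module's callback for a topic.
            self._subs[topic].append(handler)

        def publish(self, topic: str, message) -> None:
            # Deliver the message to every subscriber of the topic.
            for handler in self._subs[topic]:
                handler(message)

    broker = Broker()
    broker.subscribe("gesture", lambda m: print("recognizer got:", m))
    broker.publish("gesture", {"hand": "left", "pose": "point"})
    ```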

  10. Development of quantitative risk acceptance criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griesmeyer, J. M.; Okrent, D.

    Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed. They range from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in establishing a framework for the quantitative management of risk.

  11. Is there a need for a universal benefit-risk assessment framework for medicines? Regulatory and industry perspectives.

    PubMed

    Leong, James; McAuslane, Neil; Walker, Stuart; Salek, Sam

    2013-09-01

    To explore the current status of and need for a universal benefit-risk framework for medicines in regulatory agencies and pharmaceutical companies. A questionnaire was developed and sent to 14 mature regulatory agencies and 24 major companies. The data were analysed using descriptive statistics, preceded for a minority of questions by manual grouping of the responses. The overall response rate was 82%, and study participants included key decision makers from agencies and companies. None used a fully quantitative system, most companies preferring a qualitative method. The major reasons this group did not use semi-quantitative or quantitative systems were the lack of a universal and scientifically validated framework. The main advantages of a benefit-risk framework were that it provided a systematic, standardised approach to decision-making and that it acted as a tool to enhance the quality of communication. It was also reported that a framework should be of value to both agencies and companies throughout the life cycle of a product. They believed that it is possible to develop an overarching benefit-risk framework and that relevant stakeholders should be involved in the development, validation and application of a universal framework. The entire cohort indicated that the common barriers to implementing a framework were resource limitations, a lack of knowledge, and the absence of a scientifically validated and acceptable framework. Stakeholders prefer a semi-quantitative, overarching framework that incorporates a toolbox of different methodologies. A coordinating committee of relevant stakeholders should be formed to guide its development and implementation. Through engaging the stakeholders, these outcomes confirm sentiments on, and the need for, developing a universal benefit-risk assessment framework. Copyright © 2013 John Wiley & Sons, Ltd.

  12. An index-based robust decision making framework for watershed management in a changing climate.

    PubMed

    Kim, Yeonjoo; Chung, Eun-Sung

    2014-03-01

    This study developed an index-based robust decision making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts: management alternative development and alternative analysis. The first part, alternative development, consists of six steps: 1) understand the watershed components and processes using the HSPF model; 2) identify the spatial vulnerability ranking using two indices, potential streamflow depletion (PSD) and potential water quality deterioration (PWQD); 3) quantify residents' preferences on water management demands and calculate the watershed evaluation index, a weighted combination of PSD and PWQD; 4) set quantitative targets for water quantity and quality; 5) develop a list of feasible alternatives; and 6) eliminate the unacceptable alternatives. The second part, alternative analysis, has three steps: 7) analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios; 8) quantify the alternative evaluation index, which includes social and hydrologic criteria, utilizing multi-criteria decision analysis methods; and 9) prioritize all options based on a minimax regret strategy for robust decisions. This framework addresses the uncertainty inherent in climate models and climate change scenarios by utilizing the minimax regret strategy, a decision-making strategy under deep uncertainty, and thus derives a robust prioritization based on the multiple utilities of alternatives across scenarios. The proposed procedure was applied to a Korean urban watershed, which has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management. Copyright © 2013 Elsevier B.V. All rights reserved.
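
    Step 9, the minimax regret selection, is easy to show concretely: for each scenario, regret is the shortfall from the best alternative in that scenario, and the robust choice minimizes the worst-case regret. The utilities below are invented for illustration.

    ```python
    import numpy as np

    # Utility of each management alternative under each climate scenario
    # (rows: alternatives, columns: scenarios; values illustrative).
    utilities = np.array([
        [0.80, 0.55, 0.60],   # alternative A
        [0.70, 0.68, 0.65],   # alternative B
        [0.60, 0.62, 0.75],   # alternative C
    ])

    # Regret = shortfall from the best achievable utility in each scenario.
    regret = utilities.max(axis=0) - utilities
    worst_regret = regret.max(axis=1)
    best = int(np.argmin(worst_regret))       # minimax-regret (robust) choice
    print(regret)
    print(worst_regret, f"-> robust choice: alternative {'ABC'[best]}")
    ```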

  13. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues related to the reproducibility and quantitative capability of measurements using this technique persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, due to the evaluation of EBIC equipment performance and the numerical optimization of equipment items, the constant acquisition of high contrast images has become possible, improving the reproducibility as well as yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  14. A metal-organic framework based on nanosized hexagonal channels as fluorescent indicator for detection of nitroaromatic explosives

    NASA Astrophysics Data System (ADS)

    Hu, Xiao-Li; Wang, Xin-Long; Su, Zhong-Min

    2018-02-01

    A novel Zn-MOF (metal-organic framework), [Zn3(NTB)2(DMA)2]·12DMA (NTB = 4,4′,4″-nitrilotrisbenzoic acid; DMA = N,N-dimethylacetamide) (1), was obtained under solvothermal conditions. The resulting MOF, which is based on a {Zn3} SBU, displays an interesting (3,6)-connected three-dimensional net with nanosized hexagonal channels. Additionally, 1 can serve as a useful fluorescent indicator for the qualitative and quantitative detection of nitroaromatic explosives via a strong quenching effect, especially for picric acid (PA). With an increasing number of -NO2 groups, energy transfer from the electron-donating framework to the electron-deficient analyte becomes stronger, making the fluorescence quenching more pronounced. The result demonstrates that photo-induced electron transfer (PET) is responsible for the emission quenching.
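
    Quantitative detection via quenching is conventionally analysed with the Stern-Volmer relation I0/I = 1 + Ksv[Q]; the abstract does not state the paper's exact analysis, so the fit below is a generic illustration on invented data.

    ```python
    import numpy as np

    # Stern-Volmer analysis of quenching-based detection: I0/I = 1 + Ksv*[Q].
    # Concentrations and intensities below are illustrative, not the paper's data.
    conc = np.array([0.0, 10e-6, 20e-6, 40e-6, 80e-6])   # quencher (PA), mol/L
    intensity = np.array([1000.0, 720.0, 560.0, 390.0, 245.0])

    ratio = intensity[0] / intensity
    ksv, intercept = np.polyfit(conc, ratio, 1)          # slope = Ksv
    print(f"Ksv ~ {ksv:.3g} L/mol (intercept {intercept:.2f}, ideally 1)")
    ```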

  15. Maintenance = reuse-oriented software development

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1989-01-01

    Maintenance is viewed as a reuse process. In this context, a set of models that can be used to support the maintenance process is discussed. A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application. Based upon this framework, a qualitative comparison is offered of the three maintenance process models with regard to their strengths and weaknesses and the circumstances in which they are appropriate. To provide a more systematic, quantitative approach for evaluating the appropriateness of the particular maintenance model, a measurement scheme is provided, based upon the reuse framework, in the form of an organized set of questions that need to be answered. To support the reuse perspective, a set of reuse enablers are discussed.

  16. A framework to assess management performance in district health systems: a qualitative and quantitative case study in Iran.

    PubMed

    Tabrizi, Jafar Sadegh; Gholipour, Kamal; Iezadi, Shabnam; Farahbakhsh, Mostafa; Ghiasi, Akbar

    2018-01-01

    The aim was to design a district health management performance framework for Iran's healthcare system. This mixed-method study was conducted between September 2015 and May 2016 in Tabriz, Iran. The indicators of district health management performance were obtained by analyzing 45 semi-structured surveys of experts in the public health system. In the quantitative part, the content validity of the performance indicators generated in the qualitative part was reviewed and confirmed based on the content validity index (CVI), and the content validity ratio (CVR) was calculated using data acquired from a survey of 21 experts. The results indicated that 81 indicators were initially considered in the framework of district health management performance and, in the end, 53 indicators were validated and confirmed. These indicators were classified into 11 categories: human resources and organizational creativity, management and leadership, rules and ethics, planning and evaluation, district managing, health resources management and economics, community participation, quality improvement, research in the health system, health information management, and epidemiology and situation analysis. The designed framework can be used to assess district health management and facilitates performance improvement at the district level.
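
    The CVR computation in the quantitative part follows Lawshe's standard formula, CVR = (n_e − N/2)/(N/2), where n_e of the N panellists rate an indicator as essential. A minimal sketch for the 21-expert panel (the count of "essential" ratings below is hypothetical):

    ```python
    def content_validity_ratio(n_essential: int, n_experts: int) -> float:
        """Lawshe's CVR = (n_e - N/2) / (N/2)."""
        half = n_experts / 2
        return (n_essential - half) / half

    # 17 of the 21 surveyed experts rating an indicator essential:
    print(content_validity_ratio(17, 21))   # ~= 0.62
    ```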

  17. Quantitative diagnostics of soft tissue through viscoelastic characterization using time-based instrumented palpation.

    PubMed

    Palacio-Torralba, Javier; Hammer, Steven; Good, Daniel W; Alan McNeill, S; Stewart, Grant D; Reuben, Robert L; Chen, Yuhang

    2015-01-01

    Although palpation has been successfully employed for centuries to assess soft tissue quality, it is a subjective test that is therefore qualitative and depends on the experience of the practitioner. Reproducing what the medical practitioner feels requires more than a simple quasi-static stiffness measurement. This paper assesses the capacity of dynamic mechanical palpation to measure the changes in viscoelastic properties that soft tissue can exhibit under certain pathological conditions. A diagnostic framework is proposed to measure elastic and viscous behaviors simultaneously using a reduced set of viscoelastic parameters, giving a reliable index for quantitative assessment of tissue quality. The approach is illustrated on prostate models reconstructed from prostate MRI scans. The examples show that the change in viscoelastic time constant between healthy and cancerous tissue is a key index for quantitative diagnostics using point probing. The method is not limited to any particular tissue or material and is therefore useful for tissue where defining a unique time constant is not trivial. The proposed framework of quantitative assessment could become a useful tool in clinical diagnostics for soft tissue. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
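
    A common way to extract such a time constant, offered here only as a generic illustration rather than the paper's method, is to fit a single-exponential (standard-linear-solid-like) relaxation to the force recorded while an indentation is held at constant depth:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def sls_relaxation(t, f_inf, f_0, tau):
        """Single-time-constant stress relaxation: F(t) = F_inf + (F_0 - F_inf)*exp(-t/tau)."""
        return f_inf + (f_0 - f_inf) * np.exp(-t / tau)

    # Synthetic hold-phase force data (illustrative, with noise):
    t = np.linspace(0, 5, 100)
    rng = np.random.default_rng(0)
    force = sls_relaxation(t, 0.8, 2.0, 0.9) + rng.normal(0, 0.02, t.size)

    (f_inf, f_0, tau), _ = curve_fit(sls_relaxation, t, force, p0=(1.0, 2.0, 1.0))
    print(f"time constant tau ~ {tau:.2f} s")   # the diagnostic index of interest
    ```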

  18. A Statistical Framework for Protein Quantitation in Bottom-Up MS-Based Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and associated confidence measures. Challenges include the presence of low quality or incorrectly identified peptides and informative missingness. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model that carefully accounts for informative missingness in peak intensities and allows unbiased, model-based, protein-level estimation and inference. The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based algorithms for filtering of proteins and peptides as well as imputation of missing values. Two LC/MS datasets are used to illustrate the methods. In simulation studies, our methods are shown to achieve substantially more discoveries than standard alternatives. Availability: The software has been made available in the open-source proteomics platform DAnTE (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu Supplementary information: Supplementary data are available at Bioinformatics online.

  19. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis using continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Differential expression of each protein is determined from a standardized Z-statistic based on the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated with a well-known empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the Clinical Proteomic Technology Assessment for Cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with a computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
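
    The per-protein test statistic can be sketched schematically: given posterior draws of the log fold change, the standardized Z is the posterior mean over the posterior standard deviation. The draws below are simulated placeholders, not QPROT output.

    ```python
    import numpy as np

    def z_statistic(posterior_samples):
        """Standardized Z for the log fold change: posterior mean / posterior sd,
        mirroring (schematically) a per-protein differential-expression statistic."""
        s = np.asarray(posterior_samples)
        return s.mean() / s.std(ddof=1)

    # Hypothetical posterior draws of the log2 fold change for one protein:
    draws = np.random.default_rng(0).normal(1.2, 0.4, 4000)
    print(z_statistic(draws))   # large |Z| -> strong evidence of differential expression
    ```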

  20. Transforming physics educator identities: TAs help TAs become teaching professionals

    NASA Astrophysics Data System (ADS)

    Gretton, Anneke L.; Bridges, Terry; Fraser, James M.

    2017-05-01

    Research-based instructional strategies have been shown to dramatically improve student learning, but widespread adoption of these pedagogies remains limited. Post-secondary teaching assistants (TAs), with their current positions in course delivery and future roles as academic leaders, are an essential target group for teacher training. However, the literature suggests that successful TA professional development must address not only pedagogical practices but also the cultivation of physics educator identity. The primary goal of this study is to build a framework for TA professional development that strengthens the TA's identity as a physics educator. We base this framework on Etienne Wenger's model for communities of practice and Côté and Levine's personality and social structure identity perspective. We explore this framework in the context of a 12-week, low-cost, TA-led and TA-centered professional development intervention. Our qualitative and quantitative data suggest that this efficient community-based intervention strengthened TAs' identification as physics educators.

  1. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu

  2. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
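
    A toy version of the combined prioritization, with invented scenarios and 1-5 scales, might score scenarios so that high severity and likelihood raise priority while high modeling difficulty lowers it:

    ```python
    # Illustrative hazard scenarios; scales and the scoring rule are assumptions,
    # not the paper's calibrated metrics.
    SCENARIOS = [
        # (name, severity 1-5, likelihood 1-5, modeling difficulty 1-5)
        ("wake encounter on parallel approach", 5, 2, 2),
        ("runway incursion in low visibility",  5, 1, 4),
        ("missed-approach path conflict",       4, 2, 3),
    ]

    def priority(severity, likelihood, difficulty):
        # High severity*likelihood favors quantitative analysis;
        # high modeling difficulty discourages it.
        return severity * likelihood / difficulty

    for name, s, l, d in sorted(SCENARIOS, key=lambda x: -priority(*x[1:])):
        print(f"{priority(s, l, d):5.2f}  {name}")
    ```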

  3. Network Science Based Quantification of Resilience Demonstrated on the Indian Railways Network.

    PubMed

    Bhatia, Udit; Kumar, Devashish; Kodra, Evan; Ganguly, Auroop R

    2015-01-01

    The structure, interdependence, and fragility of systems ranging from power-grids and transportation to ecology, climate, biology and even human communities and the Internet have been examined through network science. While response to perturbations has been quantified, recovery strategies for perturbed networks have usually been either discussed conceptually or through anecdotal case studies. Here we develop a network science based quantitative framework for measuring, comparing and interpreting hazard responses as well as recovery strategies. The framework, motivated by the recently proposed temporal resilience paradigm, is demonstrated with the Indian Railways Network. Simulations inspired by the 2004 Indian Ocean Tsunami and the 2012 North Indian blackout as well as a cyber-physical attack scenario illustrate hazard responses and effectiveness of proposed recovery strategies. Multiple metrics are used to generate various recovery strategies, which are simply sequences in which system components should be recovered after a disruption. Quantitative evaluation of these strategies suggests that faster and more efficient recovery is possible through network centrality measures. Optimal recovery strategies may be different per hazard, per community within a network, and for different measures of partial recovery. In addition, topological characterization provides a means for interpreting the comparative performance of proposed recovery strategies. The methods can be directly extended to other Large-Scale Critical Lifeline Infrastructure Networks including transportation, water, energy and communications systems that are threatened by natural or human-induced hazards, including cascading failures. Furthermore, the quantitative framework developed here can generalize across natural, engineered and human systems, offering an actionable and generalizable approach for emergency management in particular as well as for network resilience in general.
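
    One of the centrality-based recovery strategies can be sketched with networkx: damaged nodes are repaired in decreasing order of betweenness centrality. The toy graph below stands in for the Indian Railways Network; the paper evaluates several such orderings against partial-recovery metrics.

    ```python
    import networkx as nx

    def recovery_order(graph, damaged_nodes):
        """Order damaged stations for repair by betweenness centrality,
        one plausible centrality-based recovery strategy (illustrative)."""
        centrality = nx.betweenness_centrality(graph)
        return sorted(damaged_nodes, key=lambda n: centrality[n], reverse=True)

    # Toy railway-like network:
    G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "E"), ("E", "D")])
    print(recovery_order(G, ["B", "D", "E"]))   # most central node first
    ```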

  4. Network Science Based Quantification of Resilience Demonstrated on the Indian Railways Network

    PubMed Central

    Bhatia, Udit; Kumar, Devashish; Kodra, Evan; Ganguly, Auroop R.

    2015-01-01

    The structure, interdependence, and fragility of systems ranging from power-grids and transportation to ecology, climate, biology and even human communities and the Internet have been examined through network science. While response to perturbations has been quantified, recovery strategies for perturbed networks have usually been either discussed conceptually or through anecdotal case studies. Here we develop a network science based quantitative framework for measuring, comparing and interpreting hazard responses as well as recovery strategies. The framework, motivated by the recently proposed temporal resilience paradigm, is demonstrated with the Indian Railways Network. Simulations inspired by the 2004 Indian Ocean Tsunami and the 2012 North Indian blackout as well as a cyber-physical attack scenario illustrate hazard responses and effectiveness of proposed recovery strategies. Multiple metrics are used to generate various recovery strategies, which are simply sequences in which system components should be recovered after a disruption. Quantitative evaluation of these strategies suggests that faster and more efficient recovery is possible through network centrality measures. Optimal recovery strategies may be different per hazard, per community within a network, and for different measures of partial recovery. In addition, topological characterization provides a means for interpreting the comparative performance of proposed recovery strategies. The methods can be directly extended to other Large-Scale Critical Lifeline Infrastructure Networks including transportation, water, energy and communications systems that are threatened by natural or human-induced hazards, including cascading failures. Furthermore, the quantitative framework developed here can generalize across natural, engineered and human systems, offering an actionable and generalizable approach for emergency management in particular as well as for network resilience in general. PMID:26536227

    Using enterprise architecture to analyse how organisational structure impacts motivation and learning

    NASA Astrophysics Data System (ADS)

    Närman, Pia; Johnson, Pontus; Gingnell, Liv

    2016-06-01

    When technology, environment, or strategies change, organisations need to adjust their structures accordingly. These structural changes do not always enhance organisational performance as intended, partly because organisational developers do not understand the performance consequences of structural changes. This article presents a model-based framework for quantitative analysis of the effect of organisational structure on organisational performance in terms of employee motivation and learning. The model is based on Mintzberg's work on organisational structure. The quantitative analysis is formalised using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) and implemented in an enterprise architecture tool.

  6. Least squares QR-based decomposition provides an efficient way of computing optimal regularization parameter in photoacoustic tomography.

    PubMed

    Shaw, Calvin B; Prakash, Jaya; Pramanik, Manojit; Yalavarthy, Phaneendra K

    2013-08-01

    A computationally efficient approach that computes the optimal regularization parameter for the Tikhonov minimization scheme is developed for photoacoustic imaging. The approach is based on the least squares QR (LSQR) decomposition, a well-known dimensionality-reduction technique for large systems of equations. It is shown that the proposed framework is effective in terms of quantitative and qualitative reconstruction of the initial pressure distribution, enabled by finding an optimal regularization parameter. The computational efficiency and performance of the proposed method are shown using a test case of a numerical blood-vessel phantom, where the initial pressure is exactly known for quantitative comparison.
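
    The role of the regularization parameter is easy to demonstrate with SciPy's LSQR, whose damp argument implements the Tikhonov term; the paper's contribution, not reproduced here, is choosing that parameter efficiently in the reduced bidiagonal space. The system matrix and data below are random stand-ins.

    ```python
    import numpy as np
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(0)
    A = rng.normal(size=(200, 100))            # stand-in system matrix
    x_true = np.zeros(100)
    x_true[40:60] = 1.0                        # toy initial-pressure profile
    b = A @ x_true + rng.normal(0, 0.5, 200)   # noisy measured data

    # Sweep the damping (Tikhonov) parameter and compare reconstruction error.
    for damp in (1e-3, 1e-1, 1e1):
        x = lsqr(A, b, damp=damp)[0]
        print(damp, np.linalg.norm(x - x_true))
    ```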

  7. Effects of Peer Tutoring on Reading Self-Concept

    ERIC Educational Resources Information Center

    Flores, Marta; Duran, David

    2013-01-01

    This study investigates the development of the Reading Self-Concept and of the mechanisms underlying it, within a framework of a reading programme based on peer tutoring. The multiple methodological design adopted allowed for a quantitative approach which showed statistically significant changes in the Reading Self-Concept of those students who…

  8. Leadership Strategies of Performance Measures Impacts in Public Sector Management: A National Content Analysis.

    ERIC Educational Resources Information Center

    Kubala, James Joseph

    A quantitative and qualitative study examined three leadership strategies found in performance-based management (human resource, scientific management and political strategies used in public sector management); a framework by which performance measurement (PM) supports leadership strategies; and how the strategies impact PM. It examined leadership…

  9. Developing International Managers: The Contribution of Cultural Experience to Learning

    ERIC Educational Resources Information Center

    Townsend, Peter; Regan, Padraic; Li, Liang Liang

    2015-01-01

    Purpose: The purpose of this paper is to evaluate cultural experience as a learning strategy for developing international managers. Design/methodology/approach: Using an integrated framework, two quantitative studies, based on empirical methodology, are conducted. Study 1, with an undergraduate sample situated in the Asia Pacific, aimed to examine…

  10. Parent Social Networks and Parent Responsibility: Implications for School Leadership

    ERIC Educational Resources Information Center

    Curry, Katherine A.; Adams, Curt M.

    2014-01-01

    Family-school partnerships are difficult to initiate and sustain in ways that actually promote student learning, especially in high-poverty communities. This quantitative study was designed to better understand how social forces shape parent responsibility in education. Based on social cognitive theory as the conceptual framework, the…

  11. Evaluating Computer-Related Incidents on Campus

    ERIC Educational Resources Information Center

    Rothschild, Daniel; Rezmierski, Virginia

    2004-01-01

    The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…

  12. Intercultural Education and Academic Achievement: A Framework for School-Based Policies in Multilingual Schools

    ERIC Educational Resources Information Center

    Cummins, Jim

    2015-01-01

    The paper reviews quantitative and qualitative research evidence regarding the relationship between intercultural education and academic achievement among students from socially marginalized communities. Intercultural education is conceptualized as including a focus both on generating understanding and respect for diverse cultural traditions and…

  13. Pattern search in multi-structure data: a framework for the next-generation evidence-based medicine

    NASA Astrophysics Data System (ADS)

    Sukumar, Sreenivas R.; Ainsworth, Keela C.

    2014-03-01

    With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. Addressing this need, we pose and answer the following questions: (i) How can we jointly analyze and explore measurement data in context with qualitative domain knowledge? (ii) How can we search and hypothesize patterns (not known a priori) from such multi-structure data? (iii) How can we build predictive models by integrating weakly-associated multi-relational multi-structure data? We propose a framework towards answering these questions. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.

  14. Development of a Survey to Examine the Factors That Motivate Secondary Education Teachers' Use of Problem-Based Learning (PBL)

    ERIC Educational Resources Information Center

    Lao, Huei-Chen

    2016-01-01

    In this quantitative study, a survey was developed and administered to middle and high school teachers to examine what factors motivated them to implement problem-based learning (PBL). Using Expectancy-Value Theory by Eccles et al. (1983) and Self-Determination Theory by Ryan and Deci (2000b) as the theoretical framework, this instrument measured…

  15. Evaluating Academic Scientists Collaborating in Team-Based Research: A Proposed Framework

    PubMed Central

    Mazumdar, Madhu; Messinger, Shari; Finkelstein, Dianne M.; Goldberg, Judith D.; Lindsell, Christopher J.; Morton, Sally C.; Pollock, Brad H.; Rahbar, Mohammad H.; Welty, Leah J.; Parker, Robert A.

    2015-01-01

    Criteria for evaluating faculty are traditionally based on a triad of scholarship, teaching, and service. Research scholarship is often measured by first or senior authorship on peer-reviewed scientific publications and being principal investigator on extramural grants. Yet scientific innovation increasingly requires collective rather than individual creativity, which traditional measures of achievement were not designed to capture and, thus, devalue. The authors propose a simple, flexible framework for evaluating team scientists that includes both quantitative and qualitative assessments. An approach for documenting contributions of team scientists in team-based scholarship, non-traditional education, and specialized service activities is also outlined. While biostatisticians are used for illustration, the approach is generalizable to team scientists in other disciplines. PMID:25993282

  16. Quantitative disease resistance: to better understand parasite-mediated selection on major histocompatibility complex

    PubMed Central

    Westerdahl, Helena; Asghar, Muhammad; Hasselquist, Dennis; Bensch, Staffan

    2012-01-01

    We outline a descriptive framework of how candidate alleles of the immune system associate with infectious diseases in natural populations of animals. Three kinds of alleles can be separated when both prevalence of infection and infection intensity are measured—qualitative disease resistance, quantitative disease resistance and susceptibility alleles. Our descriptive framework demonstrates why alleles for quantitative resistance and susceptibility cannot be separated based on prevalence data alone, but are distinguishable on infection intensity. We then present a case study to evaluate a previous finding of a positive association between prevalence of a severe avian malaria infection (GRW2, Plasmodium ashfordi) and a major histocompatibility complex (MHC) class I allele (B4b) in great reed warblers Acrocephalus arundinaceus. Using the same dataset, we find that individuals with allele B4b have lower GRW2 infection intensities than individuals without this allele. Therefore, allele B4b provides quantitative resistance rather than increasing susceptibility to infection. This implies that birds carrying B4b can mount an immune response that suppresses the acute-phase GRW2 infection, while birds without this allele cannot and may die. We argue that it is important to determine whether MHC alleles related to infections are advantageous (quantitative and qualitative resistance) or disadvantageous (susceptibility) to obtain a more complete picture of pathogen-mediated balancing selection. PMID:21733902

  17. Quantitative disease resistance: to better understand parasite-mediated selection on major histocompatibility complex.

    PubMed

    Westerdahl, Helena; Asghar, Muhammad; Hasselquist, Dennis; Bensch, Staffan

    2012-02-07

    We outline a descriptive framework of how candidate alleles of the immune system associate with infectious diseases in natural populations of animals. Three kinds of alleles can be separated when both prevalence of infection and infection intensity are measured--qualitative disease resistance, quantitative disease resistance and susceptibility alleles. Our descriptive framework demonstrates why alleles for quantitative resistance and susceptibility cannot be separated based on prevalence data alone, but are distinguishable on infection intensity. We then present a case study to evaluate a previous finding of a positive association between prevalence of a severe avian malaria infection (GRW2, Plasmodium ashfordi) and a major histocompatibility complex (MHC) class I allele (B4b) in great reed warblers Acrocephalus arundinaceus. Using the same dataset, we find that individuals with allele B4b have lower GRW2 infection intensities than individuals without this allele. Therefore, allele B4b provides quantitative resistance rather than increasing susceptibility to infection. This implies that birds carrying B4b can mount an immune response that suppresses the acute-phase GRW2 infection, while birds without this allele cannot and may die. We argue that it is important to determine whether MHC alleles related to infections are advantageous (quantitative and qualitative resistance) or disadvantageous (susceptibility) to obtain a more complete picture of pathogen-mediated balancing selection.

  18. Monitoring and evaluation framework for hypertension programs. A collaboration between the Pan American Health Organization and World Hypertension League.

    PubMed

    Campbell, Norm R C; Ordunez, Pedro; DiPette, Donald J; Giraldo, Gloria P; Angell, Sonia Y; Jaffe, Marc G; Lackland, Dan; Martinez, Ramón; Valdez, Yamilé; Maldonado Figueredo, Javier I; Paccot, Melanie; Santana, Maria J; Whelton, Paul K

    2018-06-01

    The Pan American Health Organization (PAHO)-World Hypertension League (WHL) Hypertension Monitoring and Evaluation Framework is summarized. Standardized indicators are provided for monitoring and evaluating national or subnational hypertension control programs. Five core indicators from the World Health Organization HEARTS initiative and a single PAHO-WHL core indicator are recommended for use in all hypertension control programs. In addition, hypertension control programs are encouraged to select from 14 optional qualitative and 33 quantitative indicators to facilitate progress towards enhanced hypertension control. The intention is for hypertension programs to select quantitative indicators based on the surveillance mechanisms currently available and what is feasible, and to use the framework process indicators as a guide to program management. Programs may wish to increase or refine the number of indicators they use over time. With adaptation, the indicators can also be implemented at the community or clinic level. The standardized indicators are being pilot tested in Cuba, Colombia, Chile, and Barbados.

  19. Towards a neuro-computational account of prism adaptation.

    PubMed

    Petitet, Pierre; O'Reilly, Jill X; O'Shea, Jacinta

    2017-12-14

    Prism adaptation has a long history as an experimental paradigm used to investigate the functional and neural processes that underlie sensorimotor control. In the neuropsychology literature, prism adaptation behaviour is typically explained by reference to a traditional cognitive psychology framework that distinguishes putative functions, such as 'strategic control' versus 'spatial realignment'. This theoretical framework lacks conceptual clarity, quantitative precision and explanatory power. Here, we advocate for an alternative computational framework that offers several advantages: 1) it gives an algorithmic explanatory account of the computations and operations that drive behaviour; 2) it is expressed in quantitative mathematical terms; 3) it is embedded within a principled theoretical framework (Bayesian decision theory, state-space modelling); and 4) it offers a means to generate and test quantitative behavioural predictions. This computational framework offers a route towards mechanistic neurocognitive explanations of prism adaptation behaviour. Thus it constitutes a conceptual advance compared to the traditional theoretical framework. In this paper, we illustrate how Bayesian decision theory and state-space models offer principled explanations for a range of behavioural phenomena in the field of prism adaptation (e.g. visual capture, magnitude of visual versus proprioceptive realignment, spontaneous recovery and dynamics of adaptation memory). We argue that this explanatory framework can advance understanding of the functional and neural mechanisms that implement prism adaptation behaviour, by enabling quantitative tests of hypotheses that go beyond merely descriptive mapping claims that 'brain area X is (somehow) involved in psychological process Y'.

  20. An ecological framework for informing permitting decisions on scientific activities in protected areas

    PubMed Central

    Saarman, Emily T.; Owens, Brian; Murray, Steven N.; Weisberg, Stephen B.; Field, John C.; Nielsen, Karina J.

    2018-01-01

    There are numerous reasons to conduct scientific research within protected areas, but research activities may also negatively impact organisms and habitats, and thus conflict with a protected area’s conservation goals. We developed a quantitative ecological decision-support framework that estimates these potential impacts so managers can weigh costs and benefits of proposed research projects and make informed permitting decisions. The framework generates quantitative estimates of the ecological impacts of the project and the cumulative impacts of the proposed project and all other projects in the protected area, and then compares the estimated cumulative impacts of all projects with policy-based acceptable impact thresholds. We use a series of simplified equations (models) to assess the impacts of proposed research to: a) the population of any targeted species, b) the major ecological assemblages that make up the community, and c) the physical habitat that supports protected area biota. These models consider both targeted and incidental impacts to the ecosystem and include consideration of the vulnerability of targeted species, assemblages, and habitats, based on their recovery time and ecological role. We parameterized the models for a wide variety of potential research activities that regularly occur in the study area using a combination of literature review and expert judgment with a precautionary approach to uncertainty. We also conducted sensitivity analyses to examine the relationships between model input parameters and estimated impacts to understand the dominant drivers of the ecological impact estimates. Although the decision-support framework was designed for and adopted by the California Department of Fish and Wildlife for permitting scientific studies in the state-wide network of marine protected areas (MPAs), the framework can readily be adapted for terrestrial and freshwater protected areas. PMID:29920527
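
    To make the framework's logic concrete, here is a deliberately simplified, hypothetical scoring sketch (not the adopted parameterization or the Department's actual models): each proposed project's impact is the product of activity intensity and the vulnerability of what it touches, cumulative impact is summed across all permitted projects, and the total is compared against a policy-based threshold.

        # Hypothetical impact scoring in the spirit of the decision-support framework.
        # All numbers, names, and the threshold are illustrative only.
        projects = {
            "kelp survey":    {"take": 10.0, "vulnerability": 0.2},  # fast-recovering assemblage
            "fish tagging":   {"take": 40.0, "vulnerability": 0.6},  # targeted species impact
            "sediment cores": {"take": 5.0,  "vulnerability": 0.9},  # slow-recovering habitat
        }

        def impact(p):
            return p["take"] * p["vulnerability"]       # per-project ecological impact score

        cumulative = sum(impact(p) for p in projects.values())
        THRESHOLD = 50.0                                # policy-based acceptable-impact threshold
        print(f"cumulative impact {cumulative:.1f}; permit new work: {cumulative < THRESHOLD}")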

  1. Preliminary evaluation of a fully automated quantitative framework for characterizing general breast tissue histology via color histogram and color texture analysis

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Gastounioti, Aimilia; Batiste, Rebecca C.; Kontos, Despina; Feldman, Michael D.

    2016-03-01

    Visual characterization of histologic specimens is known to suffer from intra- and inter-observer variability. To help address this, we developed an automated framework for characterizing digitized histology specimens based on a novel application of color histogram and color texture analysis. We perform a preliminary evaluation of this framework using a set of 73 trichrome-stained, digitized slides of normal breast tissue which were visually assessed by an expert pathologist in terms of the percentage of collagenous stroma, stromal collagen density, duct-lobular unit density and the presence of elastosis. For each slide, our algorithm automatically segments the tissue region based on the lightness channel in CIELAB colorspace. Within each tissue region, a color histogram feature vector is extracted using a common color palette for trichrome images generated with a previously described method. Then, using a whole-slide, lattice-based methodology, color texture maps are generated using a set of color co-occurrence matrix statistics: contrast, correlation, energy and homogeneity. The extracted features sets are compared to the visually assessed tissue characteristics. Overall, the extracted texture features have high correlations to both the percentage of collagenous stroma (r=0.95, p<0.001) and duct-lobular unit density (r=0.71, p<0.001) seen in the tissue samples, and several individual features were associated with either collagen density and/or the presence of elastosis (p<=0.05). This suggests that the proposed framework has promise as a means to quantitatively extract descriptors reflecting tissue-level characteristics and thus could be useful in detecting and characterizing histological processes in digitized histology specimens.
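
    A minimal sketch of the texture step, assuming scikit-image (>=0.19) is available: it quantizes the CIELAB lightness channel and computes the four co-occurrence statistics named above. The paper's framework additionally uses a trichrome color palette and whole-slide lattice tiling, which this toy omits; the file name is hypothetical.

        import numpy as np
        from skimage import io, color
        from skimage.feature import graycomatrix, graycoprops

        rgb = io.imread("slide_tile.png")[..., :3]           # hypothetical digitized-slide tile
        L = color.rgb2lab(rgb)[..., 0]                       # lightness channel in CIELAB
        gray = np.round(L / 100.0 * 255).astype(np.uint8)    # quantize lightness to 256 levels

        # Co-occurrence statistics over two offsets, averaged
        # (a grayscale stand-in for the paper's color co-occurrence matrices)
        glcm = graycomatrix(gray, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        features = {name: graycoprops(glcm, name).mean()
                    for name in ("contrast", "correlation", "energy", "homogeneity")}
        print(features)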

  2. A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care

    PubMed Central

    Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis

    2017-01-01

    This article describes a framework developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to incorporating the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan and Immersion Medical. All three source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, shows how they can be integrated, and describes the concept and application of the resulting framework. As a test of applicability, a real case study demonstrates the application of the framework, providing real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. The proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, it has several limitations that are discussed and need to be taken into consideration. PMID:28133988

  3. A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care.

    PubMed

    Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis

    2017-01-01

    This article describes a framework developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to incorporating the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan and Immersion Medical. All three source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, shows how they can be integrated, and describes the concept and application of the resulting framework. As a test of applicability, a real case study demonstrates the application of the framework, providing real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. The proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, it has several limitations that are discussed and need to be taken into consideration.
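
    For orientation, the traditional return-on-investment element of such a framework reduces to a simple ratio once benefits have been monetized (the Phillips-style ROI percentage); the figures below are purely hypothetical.

        # Hypothetical one-year figures for a simulation-based training program
        program_cost = 250_000.0         # equipment, staff time, facility
        monetized_benefit = 410_000.0    # e.g., value assigned to improved outcomes,
                                         # tangible plus monetized intangible benefits
        roi_percent = (monetized_benefit - program_cost) / program_cost * 100
        print(f"ROI = {roi_percent:.0f}%")   # -> ROI = 64%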

  4. The use of mode of action information in risk assessment: quantitative key events/dose-response framework for modeling the dose-response for key events.

    PubMed

    Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig

    2014-08-01

    The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose-Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to increasing the understanding of mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans; and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.
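
    Quantitative dose-response modeling for a key event often starts from a parametric fit such as a Hill curve; the sketch below (hypothetical doses and responses, assuming SciPy is available) shows the kind of fit from which KE dose-response parameters like the EC50 can be read off.

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(dose, top, ec50, n):
            """Hill dose-response curve, a common parametric choice for KE modeling."""
            return top * dose**n / (ec50**n + dose**n)

        dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])   # hypothetical doses
        resp = np.array([2, 5, 15, 38, 70, 88, 95]) / 100.0        # hypothetical KE response
        (top, ec50, n), _ = curve_fit(hill, dose, resp, p0=[1.0, 5.0, 1.0])
        print(f"fitted EC50 = {ec50:.2f}, Hill slope = {n:.2f}")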

  5. Temperature dependent magnon-phonon coupling in bcc Fe from theory and experiment.

    PubMed

    Körmann, F; Grabowski, B; Dutta, B; Hickel, T; Mauger, L; Fultz, B; Neugebauer, J

    2014-10-17

    An ab initio based framework for quantitatively assessing the phonon contribution due to magnon-phonon interactions and lattice expansion is developed. The theoretical results for bcc Fe are in very good agreement with high-quality phonon frequency measurements. For some phonon branches, the magnon-phonon interaction is an order of magnitude larger than the phonon shift due to lattice expansion, demonstrating the strong impact of magnetic short-range order even significantly above the Curie temperature. The framework closes the previous simulation gap between the ferro- and paramagnetic limits.

  6. Towards the design of novel cuprate-based superconductors

    NASA Astrophysics Data System (ADS)

    Yee, Chuck-Hou

    The rapid maturation of materials databases, combined with the recent development of theories seeking to quantitatively link chemical properties to superconductivity in the cuprates, provides the context to design novel superconductors. In this talk, we describe a framework designed to search for new superconductors, which combines chemical rules-of-thumb, insights into transition temperatures from dynamical mean-field theory, first-principles electronic structure tools, materials databases, and structure prediction via evolutionary algorithms. We apply the framework to design a family of copper oxysulfides and evaluate the prospects of superconductivity.

  7. Comprehensive framework for visualizing and analyzing spatio-temporal dynamics of racial diversity in the entire United States

    PubMed Central

    Netzel, Pawel

    2017-01-01

    The United States is increasingly becoming a multi-racial society. To understand the multiple consequences of this overall trend for our neighborhoods, we need a methodology capable of spatio-temporal analysis of racial diversity at the local level but also across the entire U.S. Furthermore, such methodology should be accessible to stakeholders ranging from analysts to decision makers. In this paper we present a comprehensive framework for visualizing and analyzing diversity data that fulfills such requirements. The first component of our framework is a U.S.-wide, multi-year database of race sub-population grids which is freely available for download. These 30 m resolution grids have been developed using dasymetric modeling and are available for 1990, 2000, and 2010. We summarize numerous advantages of gridded population data over commonly used Census tract-aggregated data. Using these grids frees analysts from constructing their own and allows them to focus on diversity analysis. The second component of our framework is a set of U.S.-wide, multi-year diversity maps at 30 m resolution. A diversity map is our product that classifies the gridded population into 39 communities based on their degrees of diversity, dominant race, and population density. It provides spatial information on diversity in a single, easy-to-understand map that can be utilized by analysts and end users alike. Maps based on subsequent Censuses provide information about spatio-temporal dynamics of diversity. Diversity maps are accessible through the GeoWeb application SocScape (http://sil.uc.edu/webapps/socscape_usa/) for an immediate online exploration. The third component of our framework is a proposal to quantitatively analyze diversity maps using a set of landscape metrics. Because of its form, a grid-based diversity map could be thought of as a diversity “landscape” and analyzed quantitatively using landscape metrics. We give a brief summary of the most pertinent metrics and demonstrate how they can be applied to diversity maps. PMID:28358862
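
    To illustrate the kind of per-cell computation behind such a diversity map, this sketch derives a Shannon diversity score and a dominant group for each grid cell from a stack of sub-population count grids; the data here are random stand-ins for the 30 m dasymetric grids, and the classification rule is a hypothetical simplification of the 39-community scheme.

        import numpy as np

        rng = np.random.default_rng(1)
        counts = rng.integers(0, 50, size=(5, 100, 100)).astype(float)  # 5 toy race grids

        total = counts.sum(axis=0)
        p = counts / np.maximum(total, 1.0)             # per-cell sub-population shares
        with np.errstate(divide="ignore", invalid="ignore"):
            plogp = np.where(p > 0, p * np.log(p), 0.0)
        shannon = -plogp.sum(axis=0)                    # Shannon diversity per cell
        dominant = counts.argmax(axis=0)                # dominant group per cell

        # A community classification could bin diversity, dominance, and density
        high_diversity = shannon > 0.8 * np.log(counts.shape[0])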

  8. Monitoring alert and drowsy states by modeling EEG source nonstationarity

    NASA Astrophysics Data System (ADS)

    Hsu, Sheng-Hsiou; Jung, Tzyy-Ping

    2017-10-01

    Objective. As a human brain performs various cognitive functions within ever-changing environments, states of the brain characterized by recorded brain activities such as electroencephalogram (EEG) are inevitably nonstationary. The challenges of analyzing the nonstationary EEG signals include finding neurocognitive sources that underlie different brain states and using EEG data to quantitatively assess the state changes. Approach. This study hypothesizes that brain activities under different states, e.g. levels of alertness, can be modeled as distinct compositions of statistically independent sources using independent component analysis (ICA). This study presents a framework to quantitatively assess the EEG source nonstationarity and estimate levels of alertness. The framework was tested against EEG data collected from 10 subjects performing a sustained-attention task in a driving simulator. Main results. Empirical results illustrate that EEG signals under alert versus drowsy states, indexed by reaction speeds to driving challenges, can be characterized by distinct ICA models. By quantifying the goodness-of-fit of each ICA model to the EEG data using the model deviation index (MDI), we found that MDIs were significantly correlated with the reaction speeds (r = -0.390 with alertness models and r = 0.449 with drowsiness models) and the opposite correlations indicated that the two models accounted for sources in the alert and drowsy states, respectively. Based on the observed source nonstationarity, this study also proposes an online framework using a subject-specific ICA model trained with an initial (alert) state to track the level of alertness. For classification of alert against drowsy states, the proposed online framework achieved an averaged area-under-curve of 0.745 and compared favorably with a classic power-based approach. Significance. This ICA-based framework provides a new way to study changes of brain states and can be applied to monitoring cognitive or mental states of human operators in attention-critical settings or in passive brain-computer interfaces.
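
    The sketch below mimics the core idea under stated assumptions: fit an ICA model to alert-state EEG, then score new windows by how far their unmixed sources depart from mutual decorrelation. This off-diagonal correlation score is only a crude stand-in for the paper's model deviation index (MDI), and all data here are synthetic.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(2)
        alert = rng.standard_normal((32, 5000))      # hypothetical channels x samples (alert)
        ica = FastICA(n_components=32, random_state=0, max_iter=1000)
        ica.fit(alert.T)                             # learn an alert-state unmixing model

        def model_deviation(window, ica):
            """Crude stand-in for an MDI: RMS off-diagonal correlation of the
            sources recovered by the alert-state model on a new data window."""
            s = ica.transform(window.T).T
            c = np.corrcoef(s)
            off = c - np.diag(np.diag(c))
            return np.sqrt((off**2).mean())

        new_window = rng.standard_normal((32, 1000))  # e.g., a possibly drowsy segment
        print(model_deviation(new_window, ica))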

  9. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging

    PubMed Central

    Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A.; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-01-01

    Background: Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. Methods: We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow’s disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparison was used to investigate the hallmarks associated with MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. Results: On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects, while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole, with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Conclusions: Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of the MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the appropriate MV treatment. PMID:28540065

  10. Vocal development in a Waddington landscape

    PubMed Central

    Teramoto, Yayoi; Takahashi, Daniel Y; Holmes, Philip; Ghazanfar, Asif A

    2017-01-01

    Vocal development is the adaptive coordination of the vocal apparatus, muscles, the nervous system, and social interaction. Here, we use a quantitative framework based on optimal control theory and Waddington’s landscape metaphor to provide an integrated view of this process. With a biomechanical model of the marmoset monkey vocal apparatus and behavioral developmental data, we show that only the combination of the developing vocal tract, vocal apparatus muscles and nervous system can fully account for the patterns of vocal development. Together, these elements influence the shape of the monkeys’ vocal developmental landscape, tilting, rotating or shifting it in different ways. We can thus use this framework to make quantitative predictions regarding how interfering factors or experimental perturbations can change the landscape within a species, or to explain comparative differences in vocal development across species. DOI: http://dx.doi.org/10.7554/eLife.20782.001 PMID:28092262

  11. Universal Design for Instruction in Postsecondary Education: A Systematic Review of Empirically Based Articles

    ERIC Educational Resources Information Center

    Roberts, Kelly D.; Park, Hye Jin; Brown, Steven; Cook, Bryan

    2011-01-01

    Universal Design for Instruction (UDI) in postsecondary education is a relatively new concept/framework that has generated significant support. The purpose of this literature review was to examine existing empirical research, including qualitative, quantitative, and mixed methods, on the use of UDI (and related terms) in postsecondary education.…

  12. Developing a Web-Based Hiring Resource at a State Medical College

    ERIC Educational Resources Information Center

    Drane, Daniel, III

    2017-01-01

    This study uses a sequential, quantitative-to-qualitative, mixed-method action research design. The purpose of this study was to develop a useful standardized hiring process at a state medical college that brings clarity to the hiring process and policies. Two conceptual frameworks guided the innovations in this study--communities of…

  13. The Evidence for Student-Focused Motivational Interviewing in Educational Settings: A Review of the Literature

    ERIC Educational Resources Information Center

    Snape, Laura; Atkinson, Cathy

    2016-01-01

    The current systematic literature review sought to determine the effectiveness of Motivational Interviewing (MI) in educational settings. Student-focused school-based MI (SBMI) studies were assessed using qualitative and quantitative assessment frameworks and data were reported using PRISMA guidelines. Eleven studies met the inclusion criteria,…

  14. Technology Integration in K-12 Science Classrooms: An Analysis of Barriers and Implications

    ERIC Educational Resources Information Center

    Hechter, Richard P.; Vermette, Laurie Anne

    2013-01-01

    This paper examines the barriers to technology integration for Manitoban K-12 inservice science educators (n = 430) based on a 10-item online survey; results are analyzed according to teaching stream using the Technology, Pedagogy, and Content Knowledge (TPACK) framework. Quantitative descriptive statistics indicated that the leading barriers…

  15. Medical privacy protection based on granular computing.

    PubMed

    Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng

    2004-10-01

    Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains upon receiving the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000; Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and for the previous bin-size and logical approaches. The two new criteria are implemented in the prototype system Cellsecu 2.0. A preliminary system performance evaluation is conducted and reviewed.

  16. Deep Borehole Disposal Safety Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, Geoffrey A.; Stein, Emily; Price, Laura L.

    This report presents a preliminary safety analysis for the deep borehole disposal (DBD) concept, using a safety case framework. A safety case is an integrated collection of qualitative and quantitative arguments, evidence, and analyses that substantiate the safety, and the level of confidence in the safety, of a geologic repository. This safety case framework for DBD follows the outline of the elements of a safety case, and identifies the types of information that will be required to satisfy these elements. At this very preliminary phase of development, the DBD safety case focuses on the generic feasibility of the DBD concept. It is based on potential system designs, waste forms, engineering, and geologic conditions; however, no specific site or regulatory framework exists. It will progress to a site-specific safety case as the DBD concept advances into a site-specific phase, progressing through consent-based site selection and site investigation and characterization.

  17. Quantitative diagnosis and prognosis framework for concrete degradation due to alkali-silica reaction

    NASA Astrophysics Data System (ADS)

    Mahadevan, Sankaran; Neal, Kyle; Nath, Paromita; Bao, Yanqing; Cai, Guowei; Orme, Peter; Adams, Douglas; Agarwal, Vivek

    2017-02-01

    This research seeks to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in nuclear power plants that are subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements: monitoring, data analytics, uncertainty quantification, and prognosis. The current work focuses on degradation caused by ASR (alkali-silica reaction). Controlled concrete specimens with reactive aggregate are prepared to develop accelerated ASR degradation. Different monitoring techniques — infrared thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy (NIRAS), and vibro-acoustic modulation (VAM) — are studied for ASR diagnosis of the specimens. Both DIC and mechanical measurements record the specimen deformation caused by ASR gel expansion. Thermography is used to compare the thermal response of pristine and damaged concrete specimens and generate a 2-D map of the damage (i.e., ASR gel and cracked area), thus facilitating localization and quantification of damage. NIRAS and VAM are two separate vibration-based techniques that detect nonlinear changes in dynamic properties caused by the damage. The diagnosis results from multiple techniques are then fused using a Bayesian network, which also helps to quantify the uncertainty in the diagnosis. Prognosis of ASR degradation is then performed based on the current state of degradation obtained from diagnosis, by using a coupled thermo-hydro-mechanical-chemical (THMC) model of ASR degradation. This comprehensive approach of monitoring, data analytics, and uncertainty-quantified diagnosis and prognosis will facilitate the development of a quantitative, risk-informed framework that will support continuous assessment and risk management of structural health and performance.
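
    As a toy version of the fusion step, the sketch below combines two inspection outcomes with Bayes' rule under a naive conditional-independence assumption; the sensitivities, specificities, and prior are hypothetical, and the paper's Bayesian network is richer than this.

        # Naive Bayesian fusion of two damage-detection outcomes (all numbers hypothetical)
        sens = {"thermography": 0.85, "NIRAS": 0.75}   # P(positive | ASR damage)
        spec = {"thermography": 0.90, "NIRAS": 0.80}   # P(negative | no damage)

        post = 0.20                                    # prior probability of ASR damage
        for tech, positive in [("thermography", True), ("NIRAS", True)]:
            p_obs_dam = sens[tech] if positive else 1 - sens[tech]
            p_obs_ok = (1 - spec[tech]) if positive else spec[tech]
            post = p_obs_dam * post / (p_obs_dam * post + p_obs_ok * (1 - post))
        print(f"fused damage probability: {post:.3f}")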

  18. Dependence of quantitative accuracy of CT perfusion imaging on system parameters

    NASA Astrophysics Data System (ADS)

    Li, Ke; Chen, Guang-Hong

    2017-03-01

    Deconvolution is a popular method to calculate parametric perfusion parameters from four-dimensional CT perfusion (CTP) source images. During the deconvolution process, the four-dimensional space is squeezed into three-dimensional space by removing the temporal dimension, and prior knowledge is often used to suppress noise associated with the process. These additional complexities complicate the understanding of the deconvolution-based CTP imaging system and of how its quantitative accuracy depends on the parameters and sub-operations involved in the image formation process. Meanwhile, there has been a strong clinical need to answer this question, as physicians often rely heavily on the quantitative values of perfusion parameters to make diagnostic decisions, particularly in emergent clinical situations (e.g. diagnosis of acute ischemic stroke). The purpose of this work was to develop a theoretical framework that quantitatively relates the quantification accuracy of parametric perfusion parameters to CTP acquisition and post-processing parameters. This goal was achieved with the help of a cascaded systems analysis for deconvolution-based CTP imaging systems. Based on the cascaded systems analysis, the quantitative relationship between regularization strength, source image noise, arterial input function, and the quantification accuracy of perfusion parameters was established. The theory could potentially be used to guide developments of CTP imaging technology for better quantification accuracy and lower radiation dose.
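
    A standard member of the deconvolution family analyzed here is truncated-SVD deconvolution of the tissue curve against a convolution matrix built from the arterial input function; the sketch below (synthetic curves, hypothetical truncation threshold) makes explicit where the regularization strength enters and thus where quantification accuracy is at stake.

        import numpy as np

        rng = np.random.default_rng(3)
        dt = 1.0                                          # frame interval (s), toy value
        t = np.arange(0, 40, dt)
        aif = np.exp(-((t - 10.0) / 3.0) ** 2)            # toy arterial input function
        r_true = np.exp(-t / 6.0)                         # toy flow-scaled residue function
        tissue = dt * np.convolve(aif, r_true)[: t.size]
        tissue += 0.01 * rng.standard_normal(t.size)      # source-image noise

        # Lower-triangular convolution matrix so that tissue = A @ r
        A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(t.size)]
                           for i in range(t.size)])

        U, s, Vt = np.linalg.svd(A)
        lam = 0.10 * s[0]                                 # regularization strength (truncation)
        inv_s = np.where(s > lam, 1.0 / s, 0.0)           # truncated-SVD filter
        r_est = Vt.T @ (inv_s * (U.T @ tissue))
        print("estimated flow (residue peak):", r_est.max())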

  19. AUTOMATED ANALYSIS OF QUANTITATIVE IMAGE DATA USING ISOMORPHIC FUNCTIONAL MIXED MODELS, WITH APPLICATION TO PROTEOMICS DATA.

    PubMed

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard

    2011-01-01

    Image data are increasingly encountered and are of growing importance in many areas of science. Many of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain, and the goal of the research is to combine the quantitative information across images to make inferences about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method frequently correspond to subregions of visible spots that may represent post-translational modifications or co-migrating proteins that cannot be visually resolved from adjacent, more abundant proteins on the gel image. Thus, it is possible that this image-based approach may actually improve the realized resolution of the gel, revealing differentially expressed proteins that would not have even been detected as spots by modern spot-based analyses.
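
    A bare-bones sketch of the wavelet-space idea, assuming PyWavelets is available: transform each observed profile to wavelet coefficients, model each coefficient across observations, and invert the transform to map effects back to the original domain. The data are synthetic, and a simple two-group contrast stands in for the full Bayesian functional mixed model.

        import numpy as np
        import pywt

        rng = np.random.default_rng(4)
        n_obs, n = 20, 256
        baseline = np.sin(np.linspace(0, 4 * np.pi, n))
        group = np.repeat([0, 1], n_obs // 2)                 # two-group design (hypothetical)
        bump = 0.5 * (np.linspace(0, 1, n) > 0.5)             # localized group effect
        Y = baseline + np.outer(group, bump) + 0.3 * rng.standard_normal((n_obs, n))

        # Transform every observed function to wavelet space
        coeff_rows, slices = [], None
        for y in Y:
            arr, slices = pywt.coeffs_to_array(pywt.wavedec(y, "db4", level=4))
            coeff_rows.append(arr)
        C = np.asarray(coeff_rows)

        # Model each coefficient across observations (here: a simple group contrast)
        effect = C[group == 1].mean(axis=0) - C[group == 0].mean(axis=0)

        # Map the fitted effect back to the functional (image-profile) domain
        effect_fn = pywt.waverec(
            pywt.array_to_coeffs(effect, slices, output_format="wavedec"), "db4")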

  20. Corra: Computational framework and tools for LC-MS discovery and targeted mass spectrometry-based proteomics

    PubMed Central

    Brusniak, Mi-Youn; Bodenmiller, Bernd; Campbell, David; Cooke, Kelly; Eddes, James; Garbutt, Andrew; Lau, Hollis; Letarte, Simon; Mueller, Lukas N; Sharma, Vagisha; Vitek, Olga; Zhang, Ning; Aebersold, Ruedi; Watts, Julian D

    2008-01-01

    Background: Quantitative proteomics holds great promise for identifying proteins that are differentially abundant between populations representing different physiological or disease states. A range of computational tools is now available for both isotopically labeled and label-free liquid chromatography mass spectrometry (LC-MS) based quantitative proteomics. However, they are generally not comparable to each other in terms of functionality, user interfaces, and information input/output, and they do not readily facilitate appropriate statistical data analysis. These limitations, along with the array of choices, present a daunting prospect for biologists, and other researchers not trained in bioinformatics, who wish to use LC-MS-based quantitative proteomics. Results: We have developed Corra, a computational framework and tools for discovery-based LC-MS proteomics. Corra extends and adapts existing algorithms used for LC-MS-based proteomics, and statistical algorithms, originally developed for microarray data analyses, appropriate for LC-MS data analysis. Corra also adapts software engineering technologies (e.g. Google Web Toolkit, distributed processing) so that computationally intense data processing and statistical analyses can run on a remote server, while the user controls and manages the process from their own computer via a simple web interface. Corra also allows the user to output significantly differentially abundant LC-MS-detected peptide features in a form compatible with subsequent sequence identification via tandem mass spectrometry (MS/MS). We present two case studies to illustrate the application of Corra to commonly performed LC-MS-based biological workflows: a pilot biomarker discovery study of glycoproteins isolated from human plasma samples relevant to type 2 diabetes, and a study in yeast to identify in vivo targets of the protein kinase Ark1 via phosphopeptide profiling. Conclusion: The Corra computational framework leverages computational innovation to enable biologists or other researchers to process, analyze and visualize LC-MS data with what would otherwise be a complex and not user-friendly suite of tools. Corra enables appropriate statistical analyses, with controlled false-discovery rates, ultimately to inform subsequent targeted identification of differentially abundant peptides by MS/MS. For the user not trained in bioinformatics, Corra represents a complete, customizable, free and open source computational platform enabling LC-MS-based proteomic workflows, and as such, addresses an unmet need in the LC-MS proteomics field. PMID:19087345

  1. Integrated national-scale assessment of wildfire risk to human and ecological values

    Treesearch

    Matthew P. Thompson; David E. Calkin; Mark A. Finney; Alan A. Ager; Julie W. Gilbertson-Day

    2011-01-01

    The spatial, temporal, and social dimensions of wildfire risk are challenging U.S. federal land management agencies to meet societal needs while maintaining the health of the lands they manage. In this paper we present a quantitative, geospatial wildfire risk assessment tool, developed in response to demands for improved risk-based decision frameworks. The methodology...

  2. Multiple Case Study on Cyberbullying's Impacts on Adolescent Technology Use

    ERIC Educational Resources Information Center

    Thompson, Kent W.

    2013-01-01

    This multiple case study focused on whether and how cyberbullying had an impact on students' use of technology. Analysis of the lived experiences of the participants in this study added depth to the quantitative research previously conducted by others in this area. The conceptual framework was based on social learning theory, which suggested that…

  3. Researching the Impact of Teacher Professional Development Programmes Based on Action Research, Constructivism, and Systems Theory

    ERIC Educational Resources Information Center

    Zehetmeier, Stefan; Andreitz, Irina; Erlacher, Willibald; Rauch, Franz

    2015-01-01

    This paper deals with the topic of professional development programmes' impact. Concepts and ideas of action research, constructivism, and systems theory are used as a theoretical framework and are combined to describe and analyse an exemplary professional development programme in Austria. Empirical findings from both quantitative and qualitative…

  4. The Impact of the Digital Divide on First-Year Community College Students

    ERIC Educational Resources Information Center

    Mansfield, Malinda

    2017-01-01

    Some students do not possess the learning management system (LMS) and basic computer skills needed for success in first-year experience (FYE) courses. The purpose of this quantitative study, based on the Integrative Learning Design Framework and theory of transactional distance, was to identify what basic computer skills and LMS skills are needed…

  5. Examining the Impact of Critical Feedback on Learner Engagement in Secondary Mathematics Classrooms: A Multi-Level Analysis

    ERIC Educational Resources Information Center

    Kearney, W. Sean; Webb, Michael; Goldhorn, Jeff; Peters, Michelle L.

    2013-01-01

    This article presents a quantitative study utilizing HLM to analyze classroom walkthrough data completed by principals within 87 secondary mathematics classrooms across 9 public schools in Texas. This research is based on the theoretical framework of learner engagement as established by Argyris & Schon (1996), and refined by Marks (2000). It…

  6. A methodology for evaluation of a markup-based specification of clinical guidelines.

    PubMed

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a three-phase, nine-step methodology for specification of clinical guidelines (GLs) by expert physicians, clinical editors, and knowledge engineers, and for quantitative evaluation of the specification's quality. We applied this methodology to a particular framework for incremental GL structuring (mark-up) and to GLs in three clinical domains with encouraging results.

  7. 360-degree physician performance assessment.

    PubMed

    Dubinsky, Isser; Jennings, Kelly; Greengarten, Moshe; Brans, Amy

    2010-01-01

    Few jurisdictions have a robust common approach to assessing the quantitative and qualitative dimensions of physician performance. In this article, we examine the need for 360-degree physician performance assessment and review the literature supporting comprehensive physician assessment. An evidence-based, "best practice" approach to the development of a 360-degree physician performance assessment framework is presented, including an overview of a tool kit to support implementation. The focus of the framework is to support physician career planning and to enhance the quality of patient care. Finally, the legal considerations related to implementing 360-degree physician performance assessment are explored.

  8. An “ADME Module” in the Adverse Outcome Pathway ...

    EPA Pesticide Factsheets

    The Adverse Outcome Pathway (AOP) framework has generated intense interest for its utility to organize knowledge on the toxicity mechanisms, starting from a molecular initiating event (MIE) to an adverse outcome across various levels of biological organization. While the AOP framework is designed to be chemical agnostic, it is widely recognized that considering chemicals’ absorption, distribution, metabolism, and excretion (ADME) behaviors is critical in applying the AOP framework in chemical-specific risk assessment. Currently, information being generated as part of the Organisation for Economic Co-operation and Development (OECD) AOP Development Programme is being consolidated into an AOP Knowledgebase (http://aopwiki.org). To enhance the use of this Knowledgebase in risk assessment, an ADME Module has been developed to contain the ADME information needed to connect MIEs and other key events in an AOP for specific chemicals. The conceptual structure of this module characterizes the potential of a chemical to reach the target MIE based on either its structure-based features or relative rates of ADME. The key features of this module include (1) a framework for connecting biology-based AOP to biochemical-based ADME and chemical/human activity-based exposure pathways; (2) links to qualitative tools (e.g., structure-based cheminformatic model) that screen for chemicals that could potentially reach the target MIE; (3) links to quantitative tools (e.g., dose-r

  9. The sociogeometry of inequality: Part I

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2015-05-01

    The study of socioeconomic inequality is of prime economic and social importance, and the key quantitative gauges of socioeconomic inequality are Lorenz curves and inequality indices, the most notable of the latter being the popular Gini index. In this series of papers we present a sociogeometric framework for the study of socioeconomic inequality. In this part we shift from the notion of Lorenz curves to the notion of Lorenz sets, define inequality indices in terms of Lorenz sets, and introduce and explore a collection of distance-based and width-based inequality indices stemming from the geometry of Lorenz sets. In particular, three principal diameters of Lorenz sets are established as meaningful quantitative gauges of socioeconomic inequality, thus providing a geometric quantification of socioeconomic inequality.
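
    For reference, the classical quantities this work generalizes are the Lorenz curve and the Gini index, computed below from a hypothetical income vector via the standard area-under-the-curve relation.

        import numpy as np

        def lorenz_gini(incomes):
            """Return Lorenz curve points and an area-based Gini estimate."""
            x = np.sort(np.asarray(incomes, dtype=float))
            lorenz = np.concatenate(([0.0], np.cumsum(x) / x.sum()))  # cumulative income share
            area = np.trapz(lorenz, dx=1.0 / x.size)                  # area under Lorenz curve
            return lorenz, 1.0 - 2.0 * area                           # Gini = 1 - 2 * area

        _, gini = lorenz_gini([1, 1, 2, 5, 20])
        print(f"Gini index: {gini:.3f}")   # 0 = perfect equality, -> 1 = extreme inequality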

  10. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  11. The point-of-care colorimetric detection of the biomarker of phenylamine in the human urine based on Tb3+ functionalized metal-organic framework.

    PubMed

    Qin, Si-Jia; Yan, Bing

    2018-07-05

    Phenylamine has been recognized as one of the most important industrially relevant chemicals and a crucial intermediate in chemical production. Yet detection of internal exposure in humans remains largely elusive due to the lack of a potent monitoring method. Here this issue is addressed with a probe based on the lanthanide-functionalized organic-inorganic hybrid material Al(OH)(bpydc) (1), a post-synthetically modified metal-organic framework. The as-synthesized Tb3+@1 exhibits strong Tb3+ luminescence originating from efficient energy transfer from the ligand, and can sense p-aminophenol (PAP), the biological metabolite of phenylamine, in human urine. A linear correlation between the integrated fluorescence intensity and the concentration of PAP was established, enabling quantitative analysis of PAP over physiologically relevant ranges (0.005-5 mg mL-1) with a low detection limit (5 μg mL-1). The probe demonstrates excellent sensitivity, high selectivity, good reusability, and quick response to PAP. Furthermore, a simple and rapid smartphone-based portable medical test paper was developed, whose quantitative color change can be easily distinguished visually. Hence, the PAP sensing platform can serve as a potential diagnostic tool for home monitoring of PAP.
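
    A minimal sketch of the quantitation step, with entirely hypothetical calibration data: fit the linear intensity-concentration relationship, invert it for an unknown sample, and estimate a detection limit from the residual scatter of the fit.

        import numpy as np

        conc = np.array([0.005, 0.05, 0.5, 1.0, 2.5, 5.0])    # PAP standards, mg/mL (hypothetical)
        signal = np.array([14.0, 120.0, 1190.0, 2360.0, 5910.0, 11800.0])  # intensities

        slope, intercept = np.polyfit(conc, signal, 1)         # linear calibration fit
        residual_sd = (signal - (slope * conc + intercept)).std(ddof=2)
        lod = 3.3 * residual_sd / slope                        # common 3.3*sigma/slope estimate
        sample_conc = (4650.0 - intercept) / slope             # invert calibration for a sample
        print(f"LOD ~ {lod:.4f} mg/mL; sample ~ {sample_conc:.2f} mg/mL")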

  12. An audience research study to disseminate evidence about comprehensive state mental health parity legislation to US State policymakers: protocol.

    PubMed

    Purtle, Jonathan; Lê-Scherban, Félice; Shattuck, Paul; Proctor, Enola K; Brownson, Ross C

    2017-06-26

    A large proportion of the US population has limited access to mental health treatments because insurance providers limit the utilization of mental health services in ways that are more restrictive than for physical health services. Comprehensive state mental health parity legislation (C-SMHPL) is an evidence-based policy intervention that enhances mental health insurance coverage and improves access to care. Implementation of C-SMHPL, however, is limited. State policymakers have the exclusive authority to implement C-SMHPL, but sparse guidance exists to inform the design of strategies to disseminate evidence about C-SMHPL, and more broadly, evidence-based treatments and mental illness, to this audience. The aims of this exploratory audience research study are to (1) characterize US State policymakers' knowledge and attitudes about C-SMHPL and identify individual- and state-level attributes associated with support for C-SMHPL; and (2) integrate quantitative and qualitative data to develop a conceptual framework to disseminate evidence about C-SMHPL, evidence-based treatments, and mental illness to US State policymakers. The study uses a multi-level (policymaker, state), mixed method (QUAN→qual) approach and is guided by Kingdon's Multiple Streams Framework, adapted to incorporate constructs from Aarons' Model of Evidence-Based Implementation in Public Sectors. A multi-modal survey (telephone, post-mail, e-mail) of 600 US State policymakers (500 legislative, 100 administrative) will be conducted and responses will be linked to state-level variables. The survey will span domains such as support for C-SMHPL, knowledge and attitudes about C-SMHPL and evidence-based treatments, mental illness stigma, and research dissemination preferences. State-level variables will measure factors associated with C-SMHPL implementation, such as economic climate and political environment. Multi-level regression will determine the relative strength of individual- and state-level variables on policymaker support for C-SMHPL. Informed by survey results, semi-structured interviews will be conducted with approximately 50 US State policymakers to elaborate upon quantitative findings. Then, using a systematic process, quantitative and qualitative data will be integrated and a US State policymaker-focused C-SMHPL dissemination framework will be developed. Study results will provide the foundation for hypothesis-driven, experimental studies testing the effects of different dissemination strategies on state policymakers' support for, and implementation of, evidence-based mental health policy interventions.

  13. Biphasic dose responses in biology, toxicology and medicine: accounting for their generalizability and quantitative features.

    PubMed

    Calabrese, Edward J

    2013-11-01

    The most common quantitative feature of the hormetic-biphasic dose response is its modest stimulatory response, which at maximum is only 30-60% greater than control values, an observation that is consistently independent of biological model, level of organization (i.e., cell, organ or individual), endpoint measured, chemical/physical agent studied, or mechanism. This quantitative feature suggests an underlying "upstream" mechanism common across biological systems, and therefore basic and general. Hormetic dose response relationships represent an estimate of the peak performance of integrative biological processes that are allometrically based. Hormetic responses reflect either direct stimulation or overcompensation in response to damage induced by relatively low doses of chemical or physical agents. The integration of the hormetic dose response within an allometric framework provides, for the first time, an explanation for both the generality and the quantitative features of the hormetic dose response. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Protecting group and switchable pore-discriminating adsorption properties of a hydrophilic-hydrophobic metal-organic framework.

    PubMed

    Mohideen, M Infas H; Xiao, Bo; Wheatley, Paul S; McKinlay, Alistair C; Li, Yang; Slawin, Alexandra M Z; Aldous, David W; Cessford, Naomi F; Düren, Tina; Zhao, Xuebo; Gill, Rachel; Thomas, K Mark; Griffin, John M; Ashbrook, Sharon E; Morris, Russell E

    2011-04-01

    Formed by linking metals or metal clusters through organic linkers, metal-organic frameworks are a class of solids with structural and chemical properties that mark them out as candidates for many emerging gas storage, separation, catalysis and biomedical applications. Important features of these materials include their high porosity and their flexibility in response to chemical or physical stimuli. Here, a copper-based metal-organic framework has been prepared in which the starting linker (benzene-1,3,5-tricarboxylic acid) undergoes selective monoesterification during synthesis to produce a solid with two different channel systems, lined by hydrophilic and hydrophobic surfaces, respectively. The material reacts differently to gases or vapours of dissimilar chemistry, some stimulating subtle framework flexibility or showing kinetic adsorption effects. Adsorption can be switched between the two channels by judicious choice of the conditions. The monoesterified linker is recoverable in quantitative yield, demonstrating possible uses of metal-organic frameworks in molecular synthetic chemistry as 'protecting groups' to accomplish selective transformations that are difficult using standard chemistry techniques.

  15. Learning framework of “Integrating Techniques” for Solving Problems and Its Empirical Application in Doctoral Course in Mechanical Engineering

    NASA Astrophysics Data System (ADS)

    Otsuka, Yuichi; Ohta, Kazuhide; Noguchi, Hiroshi

    The 21st Century Center of Excellence (COE) Program in the Department of Mechanical Engineering Science at Kyushu University constructed a training framework for learning "Integrating Techniques", built around research presentations to students in different majors and accident analyses of practical cases by Ph.D. course students. The training framework is composed of three processes: 1) peer review of the presentations among Ph.D. course students, 2) instruction by teachers to improve the quality of the presentations based on the results of the peer reviews, and 3) final evaluation of the improved presentations by teachers and students. This research elucidated the quantitative effectiveness of the framework through questionnaire-based evaluations of the presentations. Furthermore, a survey of the course students revealed a positive correlation between the perceived significance of integrating techniques and enthusiasm for participating in the course, which supports the efficacy of the proposed learning framework.

  16. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in the vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of the mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity-based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2-EphA3 ki/ki, Isl2-EphA3 ki/+, ephrin-A2,A3,A5 triple knock-out (TKO), and Math5 -/- (Atoh7). Two models successfully reproduced the extent of the Math5 -/- anteromedial projection, but only one of those could account for the collapse point in Isl2-EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin-A2,A3,A5 TKO phenotype, suggesting either an incomplete knock-out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641-666, 2015 PMID:25367067

  17. Leading for the long haul: a mixed-method evaluation of the Sustainment Leadership Scale (SLS).

    PubMed

    Ehrhart, Mark G; Torres, Elisa M; Green, Amy E; Trott, Elise M; Willging, Cathleen E; Moullin, Joanna C; Aarons, Gregory A

    2018-01-19

    Despite our progress in understanding the organizational context for implementation, and specifically the role of leadership in implementation, its role in sustainment has received little attention. This paper took a mixed-method approach to examine leadership during the sustainment phase of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Utilizing the Implementation Leadership Scale as a foundation, we sought to develop a short, practical measure of sustainment leadership that can be used for both applied and research purposes. Data for this study were collected as part of a larger mixed-method study of the sustainment of an evidence-based intervention, SafeCare®. Quantitative data were collected from 157 providers using web-based surveys. Confirmatory factor analysis was used to examine the factor structure of the Sustainment Leadership Scale (SLS). Qualitative data were collected from 95 providers who participated in one of 15 focus groups. A framework approach guided qualitative data analysis. Mixed-method integration was also utilized to examine convergence of quantitative and qualitative findings. Confirmatory factor analysis supported the a priori higher-order factor structure of the SLS, with subscales indicating a single higher-order sustainment leadership factor. The SLS demonstrated excellent internal consistency reliability. Qualitative analyses offered support for the dimensions of sustainment leadership captured by the quantitative measure, in addition to uncovering a fifth possible factor, available leadership. This study found qualitative and quantitative support for the pragmatic SLS measure. The SLS can be used to assess first-level leaders, to understand how staff perceive leadership during sustainment, and to suggest areas where leaders could direct more attention in order to increase the likelihood that EBIs are institutionalized into the normal functioning of the organization.

  18. A Qualitative Analysis Framework Using Natural Language Processing and Graph Theory

    ERIC Educational Resources Information Center

    Tierney, Patrick J.

    2012-01-01

    This paper introduces a method of extending natural language-based processing of qualitative data analysis with the use of a very quantitative tool--graph theory. It is not an attempt to convert qualitative research to a positivist approach with a mathematical black box, nor is it a "graphical solution". Rather, it is a method to help qualitative…
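
    The pairing of natural language processing with graph theory that the paper describes can be sketched, purely as an illustration and not as the author's method, by turning coded passages into a weighted theme co-occurrence graph and reading off graph metrics:

    ```python
    from itertools import combinations

    import networkx as nx

    # Hypothetical coded fragments: the set of themes tagged in each passage.
    fragments = [
        {"trust", "workload", "support"},
        {"workload", "burnout"},
        {"trust", "support"},
        {"burnout", "support", "trust"},
    ]

    G = nx.Graph()
    for themes in fragments:
        for a, b in combinations(sorted(themes), 2):
            # Edge weight = number of passages where the two themes co-occur.
            w = G.get_edge_data(a, b, {"weight": 0})["weight"]
            G.add_edge(a, b, weight=w + 1)

    # Centrality offers one quantitative lens on the qualitative coding.
    print(nx.degree_centrality(G))
    ```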

  19. Support for School-to-School Networks: How Networking Teachers Perceive Support Activities of a Local Coordinating Agency

    ERIC Educational Resources Information Center

    Sartory, Katharina; Jungermann, Anja-Kristin; Järvinen, Hanna

    2017-01-01

    External support by a local coordinating agency facilitates the work of school-to-school networks. This study provides an innovative theoretical framework to analyse how support provided by local education offices for school-to-school networks is perceived by the participating teachers. Based on a quantitative survey and qualitative interview data…

  20. Planning and Monitoring the Quality of Primary Education in Sub-Saharan Africa. AFTHR Technical Note No. 14.

    ERIC Educational Resources Information Center

    Heneveld, Ward

    This report is based on the conviction that improvements in the quality of education must focus on the school as the unit of change. Through a review of the qualitative research literature on school improvement and the more quantitative literature on school effectiveness, a conceptual framework that identifies generic factors that determine school…

  1. Applying national survey results for strategic planning and program improvement: the National Diabetes Education Program.

    PubMed

    Griffey, Susan; Piccinino, Linda; Gallivan, Joanne; Lotenberg, Lynne Doner; Tuncer, Diane

    2015-02-01

    Since the 1970s, the federal government has spearheaded major national education programs to reduce the burden of chronic diseases in the United States. These prevention and disease management programs communicate critical information to the public, those affected by the disease, and health care providers. The National Diabetes Education Program (NDEP), the leading federal program on diabetes sponsored by the National Institutes of Health (NIH) and the Centers for Disease Control and Prevention (CDC), uses primary and secondary quantitative data and qualitative audience research to guide program planning and evaluation. Since 2006, the NDEP has filled the gaps in existing quantitative data sources by conducting its own population-based survey, the NDEP National Diabetes Survey (NNDS). The NNDS is conducted every 2–3 years and tracks changes in knowledge, attitudes and practice indicators in key target audiences. This article describes how the NDEP has used the NNDS as a key component of its evaluation framework and how it applies the survey results for strategic planning and program improvement. The NDEP's use of the NNDS illustrates how a program evaluation framework that includes periodic population-based surveys can serve as an evaluation model for similar national health education programs.

  2. General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.

    PubMed

    Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng

    2017-05-02

    As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and a unified description of this nonlinear behaviour, together with a systematic and quantitative analysis of the underlying mechanisms of these lasers, has not been developed. Here, we present a complexity science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system designs of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.

  3. Quantitative analysis of intra-Golgi transport shows intercisternal exchange for all cargo

    PubMed Central

    Dmitrieff, Serge; Rao, Madan; Sens, Pierre

    2013-01-01

    The mechanisms controlling the transport of proteins through the Golgi stack of mammalian and plant cells are the subject of intense debate, with two models, cisternal progression and intercisternal exchange, emerging as major contenders. A variety of transport experiments have claimed support for each of these models. We reevaluate these experiments using a single quantitative coarse-grained framework of intra-Golgi transport that accounts for both transport models and their many variants. Our analysis makes a definitive case for the existence of intercisternal exchange both for small membrane proteins and for large protein complexes; this implies that membrane structures larger than the typical protein-coated vesicles must be involved in transport. Notwithstanding, we find that current observations on protein transport cannot rule out cisternal progression as contributing significantly to the transport process. To discriminate between the different models of intra-Golgi transport, we suggest experiments and an analysis based on our extended theoretical framework that compare the dynamics of transiting and resident proteins. PMID:24019488
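
    The two contending mechanisms can be caricatured in a toy model (our illustration, not the authors' coarse-grained framework): cargo in a stack of cisternae moves forward at a progression rate v and mixes between neighbours at an exchange rate k.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    N = 5      # cisternae, ordered cis -> trans
    v = 1.0    # effective cisternal progression rate
    k = 0.5    # symmetric intercisternal exchange rate

    def rhs(t, c):
        dc = np.zeros(N)
        # Directed progression: cargo moves cis -> trans, exiting at the end.
        dc -= v * c
        dc[1:] += v * c[:-1]
        # Symmetric exchange between neighbouring cisternae (diffusion-like).
        dc[:-1] += k * (c[1:] - c[:-1])
        dc[1:] += k * (c[:-1] - c[1:])
        return dc

    c0 = np.zeros(N)
    c0[0] = 1.0                       # pulse of cargo in the cis-most cisterna
    sol = solve_ivp(rhs, (0.0, 10.0), c0)
    print(sol.y[:, -1])               # residual cargo per cisterna at t = 10
    ```

    Varying k relative to v shifts the transit kinetics between pure progression (k = 0) and exchange-dominated transport, which is the qualitative distinction the authors' analysis quantifies.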

  4. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

    All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.
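
    In its simplest quantitative form, the scenario-tree analysis mentioned above reduces to propagating branch probabilities and weighting outcome impacts. The sketch below uses entirely hypothetical numbers for an imagined release assessment:

    ```python
    from math import prod

    # Each path: (branch probabilities along the tree, impact of the outcome).
    # Hypothetical release scenario for a veterinary biological.
    paths = [
        ((0.95,),            0.0),   # organism never escapes containment
        ((0.05, 0.80),       1.0),   # escapes but fails to establish
        ((0.05, 0.20, 0.70), 5.0),   # escapes, establishes, limited spread
        ((0.05, 0.20, 0.30), 20.0),  # escapes, establishes, wide spread
    ]

    for probs, impact in paths:
        print(f"P = {prod(probs):.4f}  impact = {impact}")

    expected_impact = sum(prod(probs) * impact for probs, impact in paths)
    print(f"expected impact = {expected_impact:.3f}")
    ```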

  5. A Multi-responsive Regenerable Europium-Organic Framework Luminescent Sensor for Fe3+, CrVI Anions, and Picric Acid.

    PubMed

    Liu, Wei; Huang, Xin; Xu, Cong; Chen, Chunyang; Yang, Lizi; Dou, Wei; Chen, Wanmin; Yang, Huan; Liu, Weisheng

    2016-12-23

    A novel luminescent microporous lanthanide metal-organic framework (Ln-MOF) based on a urea-containing ligand has been successfully assembled. Structural analysis revealed that the framework features two types of 1D channels, with urea N-H bonds projecting into the pores. Luminescence studies have revealed that the Ln-MOF exhibits high sensitivity, good selectivity, and a fast luminescence quenching response towards Fe3+, CrVI anions, and picric acid. In particular, in the detection of Cr2O7^2- and picric acid, the Ln-MOF can be simply and quickly regenerated, thus exhibiting excellent recyclability. To the best of our knowledge, this is the first example of a multi-responsive luminescent Ln-MOF sensor for Fe3+, CrVI anions, and picric acid based on a urea derivative. This Ln-MOF may potentially be used as a multi-responsive regenerable luminescent sensor for the quantitative detection of toxic and harmful substances. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Maximum entropy estimation of a Benzene contaminated plume using ecotoxicological assays.

    PubMed

    Wahyudi, Agung; Bartzke, Mariana; Küster, Eberhard; Bogaert, Patrick

    2013-01-01

    Ecotoxicological bioassays, e.g. those based on Danio rerio teratogenicity (DarT) or acute luminescence inhibition with Vibrio fischeri, could potentially lead to significant benefits for detecting on-site contaminations on a qualitative or semi-quantitative basis. The aim was to use the observed effects of two ecotoxicological assays to estimate the extent of a benzene groundwater contamination plume. We used a Maximum Entropy (MaxEnt) method to rebuild a bivariate probability table that links the observed toxicity from the bioassays with benzene concentrations. Compared with direct mapping of the contamination plume as obtained from groundwater samples, the MaxEnt concentration map exhibits on average slightly higher concentrations, though the global pattern is close to it. This suggests that MaxEnt is a valuable method for building a relationship between quantitative data, e.g. contaminant concentrations, and more qualitative or indirect measurements in a spatial mapping framework, which is especially useful when a clear quantitative relation is not at hand. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. A Clustering-Based Approach to Enriching Code Foraging Environment.

    PubMed

    Niu, Nan; Jin, Xiaoyu; Niu, Zhendong; Cheng, Jing-Ru C; Li, Ling; Kataev, Mikhail Yu

    2016-09-01

    Developers often spend valuable time navigating and seeking relevant code in software maintenance. Currently, there is a lack of theoretical foundations to guide tool design and evaluation to best shape the code base to developers. This paper contributes a unified code navigation theory in light of optimal food-foraging principles. We further develop a novel framework for automatically assessing the foraging mechanisms in the context of program investigation. We use the framework to examine to what extent the clustering of software entities affects code foraging. Our quantitative analysis of long-lived open-source projects suggests that clustering enriches the software environment and improves foraging efficiency. Our qualitative inquiry reveals concrete insights into real developers' behavior. Our research opens the avenue toward building a new set of ecologically valid code navigation tools.

  8. Multivariate Qst–Fst Comparisons: A Neutrality Test for the Evolution of the G Matrix in Structured Populations

    PubMed Central

    Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme

    2008-01-01

    Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
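
    The neutral expectation quoted in the abstract, restated in display form (notation only, no new content):

    ```latex
    \[
    \mathbf{D} \;=\; \frac{2F_{ST}}{1-F_{ST}}\,\mathbf{G},
    \]
    ```

    so the dual test checks (i) proportionality of D and G via the CPC framework and (ii) that the proportionality coefficient equals 2F_ST/(1 - F_ST).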

  9. Numerical simulation of the casting process of titanium removable partial denture frameworks.

    PubMed

    Wu, Menghuai; Wagner, Ingo; Sahm, Peter R; Augthun, Michael

    2002-03-01

    The objective of this work was to study filling incompleteness and porosity defects in titanium removable partial denture frameworks by means of numerical simulation. Two frameworks, one for the lower jaw and one for the upper jaw, were chosen for simulation on dentists' recommendation. The geometry of the frameworks was laser-digitized and imported into simulation software (MAGMASOFT). Both mold filling and solidification of the castings with different sprue designs (e.g. tree, ball, and runner-bar) were numerically calculated. Shrinkage porosity was quantitatively predicted by a feeding criterion; potential filling defects and gas pore sensitivity were estimated from the filling and solidification results. A satisfactory sprue design with process parameters was finally recommended for real casting trials (four replicas of each framework). All the frameworks were successfully cast. X-ray radiographic inspection found all the castings acceptably sound, except for one case in which gas bubbles were detected in the grasp region of the frame. It is concluded that numerical simulation aids understanding of the casting process and defect formation in titanium frameworks, and hence helps minimize the risk of producing defective castings by improving the sprue design and process parameters.

  10. Control volume based hydrocephalus research; analysis of human data

    NASA Astrophysics Data System (ADS)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed, along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
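
    The integral conservation laws invoked here are the standard control-volume forms from fluid mechanics (textbook equations, not formulas taken from the abstract):

    ```latex
    \[
    \frac{d}{dt}\int_{CV}\rho\,dV
      + \oint_{CS}\rho\,(\mathbf{u}\cdot\mathbf{n})\,dA = 0,
    \qquad
    \frac{d}{dt}\int_{CV}\rho\,\mathbf{u}\,dV
      + \oint_{CS}\rho\,\mathbf{u}\,(\mathbf{u}\cdot\mathbf{n})\,dA
      = \sum\mathbf{F},
    \]
    ```

    where CV is the control volume (e.g., a brain compartment), CS its bounding surface, u the fluid velocity, and the right-hand side the net applied force; the clinical flow, volume and pressure measurements supply the individual terms.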

  11. An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.

    PubMed

    Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong

    2016-01-01

    With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.

  12. How ecology shapes exploitation: a framework to predict the behavioural response of human and animal foragers along exploration-exploitation trade-offs.

    PubMed

    Monk, Christopher T; Barbier, Matthieu; Romanczuk, Pawel; Watson, James R; Alós, Josep; Nakayama, Shinnosuke; Rubenstein, Daniel I; Levin, Simon A; Arlinghaus, Robert

    2018-06-01

    Understanding how humans and other animals behave in response to changes in their environments is vital for predicting population dynamics and the trajectory of coupled social-ecological systems. Here, we present a novel framework for identifying emergent social behaviours in foragers (including humans engaged in fishing or hunting) in predator-prey contexts based on the exploration difficulty and exploitation potential of a renewable natural resource. A qualitative framework is introduced that predicts when foragers should behave territorially, search collectively, act independently or switch among these states. To validate it, we derived quantitative predictions from two models of different structure: a generic mathematical model, and a lattice-based evolutionary model emphasising exploitation and exclusion costs. These models independently identified that the exploration difficulty and exploitation potential of the natural resource controls the social behaviour of resource exploiters. Our theoretical predictions were finally compared to a diverse set of empirical cases focusing on fisheries and aquatic organisms across a range of taxa, substantiating the framework's predictions. Understanding social behaviour for given social-ecological characteristics has important implications, particularly for the design of governance structures and regulations to move exploited systems, such as fisheries, towards sustainability. Our framework provides concrete steps in this direction. © 2018 John Wiley & Sons Ltd/CNRS.

  13. Improved cardiac motion detection from ultrasound images using TDIOF: a combined B-mode/ tissue Doppler approach

    NASA Astrophysics Data System (ADS)

    Tavakoli, Vahid; Stoddard, Marcus F.; Amini, Amir A.

    2013-03-01

    Quantitative motion analysis of echocardiographic images helps clinicians with the diagnosis and therapy of patients suffering from cardiac disease. Quantitative analysis is usually based on TDI (Tissue Doppler Imaging) or speckle tracking. These methods are based on two independent techniques - the Doppler Effect and image registration, respectively. In order to increase the accuracy of the speckle tracking technique and cope with the angle dependency of TDI, herein, a combined approach dubbed TDIOF (Tissue Doppler Imaging Optical Flow) is proposed. TDIOF is formulated based on the combination of B-mode and Doppler energy terms in an optical flow framework and minimized using algebraic equations. In this paper, we report on validations with simulated, physical cardiac phantom, and in-vivo patient data. It is shown that the additional Doppler term is able to increase the accuracy of speckle tracking, the basis for several commercially available echocardiography analysis techniques.
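
    The abstract does not spell out the functional being minimized; a plausible reconstruction (our assumption, not the authors' published equations) is a Horn-Schunck-style optical flow energy augmented with a Doppler consistency term, with v = (u, w) the in-plane velocity, I the B-mode intensity, d the Doppler velocity, r̂ the beam direction, and λ, μ weighting parameters:

    ```latex
    \[
    E(\mathbf{v}) \;=\; \int_{\Omega}\bigl(I_x u + I_y w + I_t\bigr)^2\,d\Omega
      \;+\; \lambda \int_{\Omega}\bigl(\mathbf{v}\cdot\hat{\mathbf{r}} - d\bigr)^2\,d\Omega
      \;+\; \mu \int_{\Omega}\bigl(\lVert\nabla u\rVert^2 + \lVert\nabla w\rVert^2\bigr)\,d\Omega .
    \]
    ```

    Setting the variational derivatives to zero yields the kind of linear algebraic equations the abstract refers to, with the Doppler term constraining the along-beam velocity component where TDI is reliable.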

  14. Obesity prevention: Comparison of techniques and potential solution

    NASA Astrophysics Data System (ADS)

    Zulkepli, Jafri; Abidin, Norhaslinda Zainal; Zaibidi, Nerda Zura

    2014-12-01

    Over the years, obesity prevention has been a broadly studied subject among both academicians and practitioners. It is one of the most serious public health issues, as it can cause numerous chronic health and psychosocial problems. Research is needed to suggest a population-based strategy for obesity prevention. In the academic environment, the importance of obesity prevention has triggered various problem-solving approaches. A good obesity prevention model should capture and cater for all of the complex and dynamic issues involved. Hence, the main purpose of this paper is to discuss the qualitative and quantitative approaches to obesity prevention and to provide an extensive literature review of various recent modelling techniques for obesity prevention. Based on these literatures, a comparison of the quantitative and qualitative approaches is highlighted, and the justification for using the system dynamics technique to address population-level obesity is discussed. Lastly, a potential framework solution based on system dynamics modelling is proposed.

  15. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  16. Temporal efficiency evaluation and small-worldness characterization in temporal networks

    PubMed Central

    Dai, Zhongxiang; Chen, Yu; Li, Junhua; Fam, Johnson; Bezerianos, Anastasios; Sun, Yu

    2016-01-01

    Numerous real-world systems can be modeled as networks. To date, most network studies have been conducted assuming stationary network characteristics. Many systems, however, undergo topological changes over time. Temporal networks, which incorporate time into conventional network models, are therefore more accurate representations of such dynamic systems. Here, we introduce a novel generalized analytical framework for temporal networks, which enables 1) robust evaluation of the efficiency of temporal information exchange using two new network metrics and 2) quantitative inspection of the temporal small-worldness. Specifically, we define new robust temporal network efficiency measures by incorporating the time dependency of temporal distance. We propose a temporal regular network model, and based on this plus the redefined temporal efficiency metrics and widely used temporal random network models, we introduce a quantitative approach for identifying temporal small-world architectures (featuring high temporal network efficiency both globally and locally). In addition, within this framework, we can uncover network-specific dynamic structures. Applications to brain networks, international trade networks, and social networks reveal prominent temporal small-world properties with distinct dynamic network structures. We believe that the framework can provide further insight into dynamic changes in the network topology of various real-world systems and significantly promote research on temporal networks. PMID:27682314
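
    The notion of temporal distance underlying these efficiency metrics can be sketched as earliest-arrival reachability over a sequence of snapshots (a simplified stand-in for the paper's exact definitions):

    ```python
    import numpy as np

    # Hypothetical temporal network: one adjacency matrix per snapshot.
    snapshots = [
        np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]]),  # t=1: edge 0-1
        np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]]),  # t=2: edge 1-2
        np.array([[0, 0, 1], [0, 0, 0], [1, 0, 0]]),  # t=3: edge 0-2
    ]
    n = snapshots[0].shape[0]

    def temporal_distances(snapshots, n):
        """Earliest-arrival distance (in snapshots) via time-respecting paths."""
        dist = np.full((n, n), np.inf)
        np.fill_diagonal(dist, 0.0)
        reach = [{i} for i in range(n)]   # nodes reachable from i so far
        for t, A in enumerate(snapshots, start=1):
            for i in range(n):
                new = {k for j in reach[i] for k in np.flatnonzero(A[j])}
                for k in new - reach[i]:
                    dist[i, k] = t
                reach[i] |= new
        return dist

    D = temporal_distances(snapshots, n)
    with np.errstate(divide="ignore"):
        inv = 1.0 / D
    np.fill_diagonal(inv, 0.0)
    print(D)
    print("temporal global efficiency:", inv.sum() / (n * (n - 1)))
    ```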

  17. Temporal efficiency evaluation and small-worldness characterization in temporal networks

    NASA Astrophysics Data System (ADS)

    Dai, Zhongxiang; Chen, Yu; Li, Junhua; Fam, Johnson; Bezerianos, Anastasios; Sun, Yu

    2016-09-01

    Numerous real-world systems can be modeled as networks. To date, most network studies have been conducted assuming stationary network characteristics. Many systems, however, undergo topological changes over time. Temporal networks, which incorporate time into conventional network models, are therefore more accurate representations of such dynamic systems. Here, we introduce a novel generalized analytical framework for temporal networks, which enables 1) robust evaluation of the efficiency of temporal information exchange using two new network metrics and 2) quantitative inspection of the temporal small-worldness. Specifically, we define new robust temporal network efficiency measures by incorporating the time dependency of temporal distance. We propose a temporal regular network model, and based on this plus the redefined temporal efficiency metrics and widely used temporal random network models, we introduce a quantitative approach for identifying temporal small-world architectures (featuring high temporal network efficiency both globally and locally). In addition, within this framework, we can uncover network-specific dynamic structures. Applications to brain networks, international trade networks, and social networks reveal prominent temporal small-world properties with distinct dynamic network structures. We believe that the framework can provide further insight into dynamic changes in the network topology of various real-world systems and significantly promote research on temporal networks.

  18. An Architecture Framework for Orchestrating Context-Aware IT Ecosystems: A Case Study for Quantitative Evaluation †.

    PubMed

    Park, Soojin; Park, Sungyong; Park, Young B

    2018-02-12

    With the emergence of various forms of smart devices and new paradigms such as the Internet of Things (IoT) concept, the IT (Information Technology) service areas are expanding explosively compared to the provision of services by single systems. A new system operation concept that has emerged in accordance with such technical trends is the IT ecosystem. The IT ecosystem can be considered a special type of system of systems in which multiple systems with various degrees of autonomy achieve common goals while adapting to the given environment. The single systems that participate in the IT ecosystem adapt autonomously to the current situation based on collected data from sensors. Furthermore, to maintain the services supported by the whole IT ecosystem sustainably, the configuration of single systems that participate in the IT ecosystem also changes appropriately in accordance with the changed situation. In order to support the IT ecosystem, this paper proposes an architecture framework that supports dynamic configuration changes to achieve the goal of the whole IT ecosystem, while ensuring the autonomy of single systems through the collection of data from sensors so as to recognize the situational context of individual participating systems. For the feasibility evaluation of the proposed framework, a simulated example of an IT ecosystem for unmanned forest management was constructed, and the quantitative evaluation results are discussed in terms of the extent to which the proposed architecture framework can continuously provide sustainable services in response to diverse environmental context changes.

  19. An Architecture Framework for Orchestrating Context-Aware IT Ecosystems: A Case Study for Quantitative Evaluation †

    PubMed Central

    Park, Young B.

    2018-01-01

    With the emergence of various forms of smart devices and new paradigms such as the Internet of Things (IoT) concept, the IT (Information Technology) service areas are expanding explosively compared to the provision of services by single systems. A new system operation concept that has emerged in accordance with such technical trends is the IT ecosystem. The IT ecosystem can be considered a special type of system of systems in which multiple systems with various degrees of autonomy achieve common goals while adapting to the given environment. The single systems that participate in the IT ecosystem adapt autonomously to the current situation based on collected data from sensors. Furthermore, to maintain the services supported by the whole IT ecosystem sustainably, the configuration of single systems that participate in the IT ecosystem also changes appropriately in accordance with the changed situation. In order to support the IT ecosystem, this paper proposes an architecture framework that supports dynamic configuration changes to achieve the goal of the whole IT ecosystem, while ensuring the autonomy of single systems through the collection of data from sensors so as to recognize the situational context of individual participating systems. For the feasibility evaluation of the proposed framework, a simulated example of an IT ecosystem for unmanned forest management was constructed, and the quantitative evaluation results are discussed in terms of the extent to which the proposed architecture framework can continuously provide sustainable services in response to diverse environmental context changes. PMID:29439540

  20. Integrating the social determinants of health into two interprofessional courses: Findings from a pilot study.

    PubMed

    Lane, Sandra D; Keefe, Robert H; Rubinstein, Robert A; Hall, Meghan; Kelly, Kathleen A; Satterly, Lynn Beth; Shaw, Andrea; Fisher, Julian

    2018-02-07

    Five colleges and universities in Upstate New York, United States, created the 'Route-90 Collaborative' to support faculty implementing the Institute of Medicine's (IOM) Framework for Educating Health Professionals to Address the Social Determinants of Health. The two courses described herein used a flipped classroom approach in which students from 14 different nations were responsible for facilitating individual classes. This descriptive study implemented an educational intervention based on the IOM Framework in two interprofessional courses - reproductive health and global health. The evaluation used quantitative and open-ended text response data from students. Course evaluations indicated the students found the courses helped them to learn more about health issues and service delivery in various countries, expand their knowledge base on sociocultural and ecological influences on health care, and broaden their perspectives on various health topics so that they will be able to provide higher-quality healthcare. Although this is the first effort of our Collaborative to implement the Framework, given the student feedback, we believe implementing the Framework in various courses has the potential to enhance healthcare service delivery and reduce the negative impact of social determinants of health.

  1. Rethinking research in the medical humanities: a scoping review and narrative synthesis of quantitative outcome studies.

    PubMed

    Dennhardt, Silke; Apramian, Tavis; Lingard, Lorelei; Torabi, Nazi; Arntfield, Shannon

    2016-03-01

    The rise of medical humanities teaching in medical education has introduced pressure to prove efficacy and utility. Review articles on the available evidence have been criticised for poor methodology and unwarranted conclusions. To support a more nuanced discussion of how the medical humanities work, we conducted a scoping review of quantitative studies of medical humanities teaching. Using a search strategy involving MEDLINE, EMBASE and ERIC, and hand searching, our scoping review located 11 045 articles that referred to the use of medical humanities teaching in medical education. Of these, 62 studies using quantitative evaluation methods were selected for review. Three iterations of analysis were performed: descriptive, conceptual, and discursive. Descriptive analysis revealed that the medical humanities as a whole cannot be easily systematised based on simple descriptive categories. Conceptual analysis supported the development of a conceptual framework in which the foci of the arts and humanities in medical education can be mapped alongside their related epistemic functions for teaching and learning. Within the framework, art functioned as expertise, as dialogue or as a means of expression and transformation. In the discursive analysis, we found three main ways in which the relationship between the arts and humanities and medicine was constructed as, respectively, intrinsic, additive and curative. This review offers a nuanced framework of how different types of medical humanities work. The epistemological assumptions and discursive positioning of medical humanities teaching frame the forms of outcomes research that are considered relevant to curriculum decision making, and shed light on why dominant review methodologies make some functions of medical humanities teaching visible and render others invisible. We recommend the use of this framework to improve the rigor and relevance of future explorations of the efficacy and utility of medical humanities teaching. © 2016 John Wiley & Sons Ltd.

  2. Predicting Future Morphological Changes of Lesions from Radiotracer Uptake in 18F-FDG-PET Images

    PubMed Central

    Bagci, Ulas; Yao, Jianhua; Miller-Jaster, Kirsten; Chen, Xinjian; Mollura, Daniel J.

    2013-01-01

    We introduce a novel computational framework to enable automated identification of texture and shape features of lesions on 18F-FDG-PET images through a graph-based image segmentation method. The proposed framework predicts future morphological changes of lesions with high accuracy. The presented methodology has several benefits over conventional qualitative and semi-quantitative methods, due to its fully quantitative nature and high accuracy in each step of (i) detection, (ii) segmentation, and (iii) feature extraction. To evaluate our proposed computational framework, thirty patients each received two 18F-FDG-PET scans (60 scans total) at two different time points. Metastatic papillary renal cell carcinoma, cerebellar hemangioblastoma, non-small cell lung cancer, neurofibroma, lymphomatoid granulomatosis, lung neoplasm, neuroendocrine tumor, soft tissue thoracic mass, nonnecrotizing granulomatous inflammation, renal cell carcinoma with papillary and cystic features, diffuse large B-cell lymphoma, metastatic alveolar soft part sarcoma, and small cell lung cancer were included in this analysis. The radiotracer accumulation in patients' scans was automatically detected and segmented by the proposed segmentation algorithm. Delineated regions were used to extract shape and textural features, with the proposed adaptive feature extraction framework, as well as standardized uptake values (SUV) of uptake regions, to conduct a broad quantitative analysis. Evaluation of segmentation results indicates that our proposed segmentation algorithm has a mean dice similarity coefficient of 85.75±1.75%. We found that 28 of 68 extracted imaging features correlated well with SUVmax (p<0.05), and some of the textural features (such as entropy and maximum probability) were superior in predicting morphological changes of radiotracer uptake regions longitudinally, compared to a single intensity feature such as SUVmax. We also found that integrating textural features with SUV measurements significantly improves the prediction accuracy of morphological changes (Spearman correlation coefficient = 0.8715, p<2e-16). PMID:23431398
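
    The entropy and maximum-probability features named above are standard grey-level co-occurrence matrix (GLCM) statistics. A generic sketch with scikit-image on a synthetic patch (not the authors' pipeline or data) looks like:

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(1)
    # Synthetic 6-bit patch standing in for a segmented PET uptake region.
    patch = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)

    # Co-occurrence of grey levels at distance 1, horizontal direction.
    glcm = graycomatrix(patch, distances=[1], angles=[0], levels=64,
                        symmetric=True, normed=True)

    p = glcm[:, :, 0, 0]
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    max_prob = p.max()
    contrast = graycoprops(glcm, "contrast")[0, 0]

    print(f"entropy={entropy:.3f}  max_prob={max_prob:.4f}  contrast={contrast:.1f}")
    ```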

  3. A Framework for Integrating Qualitative and Quantitative Data in Knowledge, Attitude, and Practice Studies: A Case Study of Pesticide Usage in Eastern Uganda

    PubMed Central

    Muleme, James; Kankya, Clovice; Ssempebwa, John C.; Mazeri, Stella; Muwonge, Adrian

    2017-01-01

    Knowledge, attitude, and practice (KAP) studies guide the implementation of public health interventions (PHIs), and they are important tools for political persuasion. The design and implementation of PHIs assumes a linear KAP relationship, i.e., an awareness campaign results in the desirable societal behavioral change. However, there is no robust framework for testing this relationship before and after PHIs. Here, we use qualitative and quantitative data on pesticide usage to test this linear relationship, identify associated context specific factors as well as assemble a framework that could be used to guide and evaluate PHIs. We used data from a cross-sectional mixed methods study on pesticide usage. Quantitative data were collected using a structured questionnaire from 167 households representing 1,002 individuals. Qualitative data were collected from key informants and focus group discussions. Quantitative and qualitative data analysis was done in R 3.2.0 as well as qualitative thematic analysis, respectively. Our framework shows that a KAP linear relationship only existed for households with a low knowledge score, suggesting that an awareness campaign would only be effective for ~37% of the households. Context specific socioeconomic factors explain why this relationship does not hold for households with high knowledge scores. These findings are essential for developing targeted cost-effective and sustainable interventions on pesticide usage and other PHIs with context specific modifications. PMID:29276703

  4. A Framework for Integrating Qualitative and Quantitative Data in Knowledge, Attitude, and Practice Studies: A Case Study of Pesticide Usage in Eastern Uganda.

    PubMed

    Muleme, James; Kankya, Clovice; Ssempebwa, John C; Mazeri, Stella; Muwonge, Adrian

    2017-01-01

    Knowledge, attitude, and practice (KAP) studies guide the implementation of public health interventions (PHIs), and they are important tools for political persuasion. The design and implementation of PHIs assumes a linear KAP relationship, i.e., an awareness campaign results in the desirable societal behavioral change. However, there is no robust framework for testing this relationship before and after PHIs. Here, we use qualitative and quantitative data on pesticide usage to test this linear relationship, identify associated context specific factors as well as assemble a framework that could be used to guide and evaluate PHIs. We used data from a cross-sectional mixed methods study on pesticide usage. Quantitative data were collected using a structured questionnaire from 167 households representing 1,002 individuals. Qualitative data were collected from key informants and focus group discussions. Quantitative and qualitative data analysis was done in R 3.2.0 as well as qualitative thematic analysis, respectively. Our framework shows that a KAP linear relationship only existed for households with a low knowledge score, suggesting that an awareness campaign would only be effective for ~37% of the households. Context specific socioeconomic factors explain why this relationship does not hold for households with high knowledge scores. These findings are essential for developing targeted cost-effective and sustainable interventions on pesticide usage and other PHIs with context specific modifications.

  5. Metrics and Mappings: A Framework for Understanding Real-World Quantitative Estimation.

    ERIC Educational Resources Information Center

    Brown, Norman R.; Siegler, Robert S.

    1993-01-01

    A metrics and mapping framework is proposed to account for how heuristics, domain-specific reasoning, and intuitive statistical induction processes are integrated to generate estimates. Results of 4 experiments involving 188 undergraduates illustrate framework usefulness and suggest when people use heuristics and when they emphasize…

  6. Towards a quantitative description of tunneling conductance of superconductors: Application to LiFeAs

    DOE PAGES

    Kreisel, A.; Nelson, R.; Berlijn, T.; ...

    2016-12-27

    Since the discovery of iron-based superconductors, a number of theories have been put forward to explain the qualitative origin of pairing, but there have been few attempts to make quantitative, material-specific comparisons to experimental results. The spin-fluctuation theory of electronic pairing, based on first-principles electronic structure calculations, makes predictions for the superconducting gap. Within the same framework, the surface wave functions may also be calculated, allowing, e.g., for detailed comparisons between theoretical results and measured scanning tunneling topographs and spectra. We present such a comparison between theory and experiment on the Fe-based superconductor LiFeAs. Our results for the homogeneous surface as well as impurity states are presented as a benchmark test of the theory. For the homogeneous system, we argue that the maxima of topographic image intensity may be located at positions above either the As or Li atoms, depending on tip height and the setpoint current of the measurement. We further report the experimental observation of transitions between As- and Li-registered lattices as functions of both tip height and setpoint bias, in agreement with this prediction. Next, we give a detailed comparison of the simulated scanning tunneling microscopy images of transition-metal defects with experiment. Finally, we discuss possible extensions of the current framework to obtain a theory with true predictive power for scanning tunneling microscopy in Fe-based systems.

  7. Towards a quantitative description of tunneling conductance of superconductors: Application to LiFeAs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreisel, A.; Nelson, R.; Berlijn, T.

    Since the discovery of iron-based superconductors, a number of theories have been put forward to explain the qualitative origin of pairing, but there have been few attempts to make quantitative, material-specific comparisons to experimental results. The spin-fluctuation theory of electronic pairing, based on first-principles electronic structure calculations, makes predictions for the superconducting gap. Within the same framework, the surface wave functions may also be calculated, allowing, e.g., for detailed comparisons between theoretical results and measured scanning tunneling topographs and spectra. We present such a comparison between theory and experiment on the Fe-based superconductor LiFeAs. Our results for the homogeneous surface as well as impurity states are presented as a benchmark test of the theory. For the homogeneous system, we argue that the maxima of topographic image intensity may be located at positions above either the As or Li atoms, depending on tip height and the setpoint current of the measurement. We further report the experimental observation of transitions between As- and Li-registered lattices as functions of both tip height and setpoint bias, in agreement with this prediction. Next, we give a detailed comparison of the simulated scanning tunneling microscopy images of transition-metal defects with experiment. Finally, we discuss possible extensions of the current framework to obtain a theory with true predictive power for scanning tunneling microscopy in Fe-based systems.

  8. Quantitative Theoretical and Conceptual Framework Use in Agricultural Education Research

    ERIC Educational Resources Information Center

    Kitchel, Tracy; Ball, Anna L.

    2014-01-01

    The purpose of this philosophical paper was to articulate the disciplinary tenets for consideration when using theory in agricultural education quantitative research. The paper clarified terminology around the concept of theory in the social sciences and pointed out inaccuracies of theory use in agricultural education quantitative research. Finally,…

  9. Quantitative Analysis of Rat Dorsal Root Ganglion Neurons Cultured on Microelectrode Arrays Based on Fluorescence Microscopy Image Processing.

    PubMed

    Mari, João Fernando; Saito, José Hiroki; Neves, Amanda Ferreira; Lotufo, Celina Monteiro da Cruz; Destro-Filho, João-Batista; Nicoletti, Maria do Carmo

    2015-12-01

    Microelectrode Arrays (MEA) are devices for long-term electrophysiological recording of extracellular spontaneous or evoked activities of in vitro neuron cultures. This work proposes and develops a framework for quantitative and morphological analysis of neuron cultures on MEAs, by processing their corresponding images, acquired by fluorescence microscopy. The neurons are segmented from the fluorescence channel images using a combination of segmentation by thresholding, watershed transform, and object classification. The positioning of microelectrodes is obtained from the transmitted light channel images using the circular Hough transform. The proposed method was applied to images of dissociated cultures of rat dorsal root ganglion (DRG) neuronal cells. The morphological and topological quantitative analysis carried out produced information regarding the state of the culture, such as population count, neuron-to-neuron and neuron-to-microelectrode distances, soma morphologies, neuron sizes, and neuron and microelectrode spatial distributions. Most analyses of microscopy images taken from neuronal cultures on MEAs consider only simple qualitative aspects. The proposed framework aims to standardize the image processing and to compute quantitatively useful measures for integrated image-signal studies and further computational simulations. As the results show, the implemented microelectrode identification method is robust, and so are the implemented neuron segmentation and classification ones (with a correct segmentation rate of up to 84%). The quantitative information retrieved by the method is highly relevant to assist the integrated signal-image study of recorded electrophysiological signals as well as the physical aspects of the neuron culture on the MEA. Although the experiments deal with DRG cell images, cortical and hippocampal cell images could also be processed with small adjustments in the image processing parameter estimation.
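
    The segmentation recipe described (thresholding plus watershed for the neurons, circular Hough transform for the electrodes) can be sketched generically with scikit-image. Everything below, including the synthetic images, is an illustrative reconstruction rather than the authors' implementation.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage import draw
    from skimage.feature import peak_local_max
    from skimage.filters import threshold_otsu
    from skimage.segmentation import watershed
    from skimage.transform import hough_circle, hough_circle_peaks

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for the two channels (illustrative only).
    fluo = rng.normal(10.0, 2.0, (128, 128))       # fluorescence channel
    rr, cc = draw.disk((40, 40), 10)
    fluo[rr, cc] += 30.0                            # one "soma"
    rr, cc = draw.disk((52, 52), 10)
    fluo[rr, cc] += 30.0                            # a touching neighbour
    trans = np.zeros((128, 128), dtype=bool)        # transmitted-light channel
    rr, cc = draw.circle_perimeter(90, 90, 15)
    trans[rr, cc] = True                            # rim of one "electrode"

    # 1) Neuron segmentation: Otsu threshold, then watershed to split somata.
    mask = fluo > threshold_otsu(fluo)
    dist = ndi.distance_transform_edt(mask)
    coords = peak_local_max(dist, min_distance=5, labels=ndi.label(mask)[0])
    markers = np.zeros(mask.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    labels = watershed(-dist, markers, mask=mask)
    print("segmented somata:", labels.max())

    # 2) Microelectrode localization via the circular Hough transform.
    radii = np.arange(12, 20)
    h = hough_circle(trans, radii)
    _, cx, cy, r = hough_circle_peaks(h, radii, total_num_peaks=1)
    print("electrode centre (row, col):", (int(cy[0]), int(cx[0])),
          "radius:", int(r[0]))
    ```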

  10. L.E.A.D.: a framework for evidence gathering and use for the prevention of obesity and other complex public health problems.

    PubMed

    Chatterji, Madhabi; Green, Lawrence W; Kumanyika, Shiriki

    2014-02-01

    This article summarizes a comprehensive, systems-oriented framework designed to improve the use of a wide variety of evidence sources to address population-wide obesity problems. The L.E.A.D. framework (for Locate the evidence, Evaluate the evidence, Assemble the evidence, and inform Decisions), developed by an expert consensus committee convened by the Institute of Medicine, is broadly applicable to complex, community-wide health problems. The article explains how to use the framework, presenting an evidence typology that helps specify relevant research questions and includes examples of how particular research methodologies and sources of evidence relate to questions that stem from decision-maker needs. The utility of a range of quantitative, qualitative, and mixed method designs and data sources for assembling a broad and credible evidence base is discussed, with a call for ongoing "evidence generation" to fill information gaps using the recommended systems perspective.

  11. Using Bayesian regression to test hypotheses about relationships between parameters and covariates in cognitive models.

    PubMed

    Boehm, Udo; Steingroever, Helen; Wagenmakers, Eric-Jan

    2018-06-01

    Quantitative models that represent different cognitive variables in terms of model parameters are an important tool in the advancement of cognitive science. To evaluate such models, their parameters are typically tested for relationships with behavioral and physiological variables that are thought to reflect specific cognitive processes. However, many models do not come equipped with the statistical framework needed to relate model parameters to covariates. Instead, researchers often resort to classifying participants into groups depending on their values on the covariates, and subsequently comparing the estimated model parameters between these groups. Here we develop a comprehensive solution to the covariate problem in the form of a Bayesian regression framework. Our framework can easily be added to existing cognitive models and allows researchers to quantify the evidential support for relationships between covariates and model parameters using Bayes factors. Moreover, we present a simulation study that demonstrates the superiority of the Bayesian regression framework over the conventional classification-based approach.
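
    For a single parameter-covariate relationship, the Bayes factor computation can be illustrated with a conjugate normal regression and the Savage-Dickey density ratio. This is a minimal sketch under simplifying assumptions (known residual variance, a N(0, 1) prior on the regression weight), not the authors' implementation:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 100
    x = rng.normal(size=n)                 # covariate (e.g., a physiological measure)
    y = 0.3 * x + rng.normal(size=n)       # model parameter estimates per participant

    sigma2 = 1.0   # assumed known residual variance (simplification)
    tau2 = 1.0     # prior variance: beta ~ N(0, tau2)

    # Conjugate normal posterior for the regression weight beta
    post_var = 1.0 / (x @ x / sigma2 + 1.0 / tau2)
    post_mean = post_var * (x @ y / sigma2)

    # Savage-Dickey: BF01 = posterior density at beta=0 / prior density at beta=0
    bf01 = stats.norm.pdf(0, post_mean, np.sqrt(post_var)) / \
           stats.norm.pdf(0, 0, np.sqrt(tau2))
    print(f"BF01 = {bf01:.3f}  (BF10 = {1 / bf01:.2f})")
    ```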

  12. The concept of "buffering" in systems and control theory: from metaphor to math.

    PubMed

    Schmitt, Bernhard M

    2004-10-04

    The paradigm of "buffering" is used increasingly for the description of diverse "systemic" phenomena encountered in evolutionary genetics, ecology, integrative physiology, and other areas. However, in this new context, the paradigm has not yet matured into a truly quantitative concept inasmuch as it lacks a corresponding quantitative measure of "systems-level buffering strength". Here, I develop such measures on the basis of a formal and general approach to the quantitation of buffering action. "Systems-level buffering" is shown to be synonymous with "disturbance rejection" in feedback-control systems, and can be quantitated by means of dimensionless proportions between partial flows in two-partitioned systems. The units allow either the time-independent, "static" buffering properties or the time-dependent, "dynamic" ones to be measured. Analogous to this "resistance to change", one can define and measure the "conductance to change"; this quantity corresponds to "set-point tracking" in feedback-control systems. Together, these units provide a systematic framework for the quantitation of buffering action in systems biology, and reveal the common principle behind systems-level buffering, classical acid-base buffering, and multiple other manifestations of buffering.

  13. Design and analysis of quantitative differential proteomics investigations using LC-MS technology.

    PubMed

    Bukhman, Yury V; Dharsee, Moyez; Ewing, Rob; Chu, Peter; Topaloglou, Thodoros; Le Bihan, Thierry; Goh, Theo; Duewel, Henry; Stewart, Ian I; Wisniewski, Jacek R; Ng, Nancy F

    2008-02-01

    Liquid chromatography-mass spectrometry (LC-MS)-based proteomics is becoming an increasingly important tool for characterizing the abundance of proteins in biological samples of various types and across conditions. Effects of disease or drug treatments on protein abundance are of particular interest for the characterization of biological processes and the identification of biomarkers. Although state-of-the-art instrumentation can make high-quality measurements and commercial software can process the data, the complexity of the technology and data presents challenges for bioinformaticians and statisticians. Here, we describe a pipeline for the analysis of quantitative LC-MS data. Key components of this pipeline include experimental design (sample pooling, blocking, and randomization) as well as deconvolution and alignment of mass chromatograms to generate a matrix of molecular abundance profiles. An important challenge in LC-MS-based quantitation is to accurately identify and assign abundance measurements to members of protein families. To address this issue, we implement a novel statistical method for inferring the relative abundance of related members of protein families from tryptic peptide intensities. This pipeline has been used to analyze quantitative LC-MS data from multiple biomarker discovery projects. We illustrate the pipeline here with examples from two of these studies and show that it constitutes a complete workable framework for LC-MS-based differential quantitation. Supplementary material is available at http://iec01.mie.utoronto.ca/~thodoros/Bukhman/.

  14. Ion Channel ElectroPhysiology Ontology (ICEPO) - a case study of text mining assisted ontology development.

    PubMed

    Elayavilli, Ravikumar Komandur; Liu, Hongfang

    2016-01-01

    Computational modeling of biological cascades is of great interest to quantitative biologists, and biomedical text has been a rich source of quantitative information. Gathering quantitative parameters and values from biomedical text is a significant challenge in the early steps of computational modeling, as it involves huge manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain impedes the normalization of textual extractions to a standard representation, which may render them less meaningful to domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining into a formal representation that may help in constructing an ontology for ion channel events. We developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies, such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), with the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions on an independently annotated blind data set. We further made an initial attempt at formalizing the extracted quantitative assertions in a representation that offers the potential to facilitate the integration of text mining into the ontological workflow, a novel aspect of this study. This work is a case study in which we created a platform that provides formal interaction between ontology development and text mining. We achieved partial success in extracting quantitative assertions from biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
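
    A rule of the kind described might be sketched as a pattern over property names, numeric values, and units; the properties and units below are illustrative assumptions, not the system's actual rule set:

    ```python
    import re

    # Illustrative rule: capture "<property> of <value> <unit>" assertions about
    # channel properties from free text (hypothetical pattern, not ICEPO's rules).
    PATTERN = re.compile(
        r"(?P<property>conductance|half-activation|time constant)\s+of\s+"
        r"(?P<value>-?\d+(?:\.\d+)?)\s*(?P<unit>pS|mV|ms)", re.IGNORECASE)

    text = ("The single-channel conductance of 35 pS and a half-activation "
            "of -24 mV were measured for the mutant channel.")
    for m in PATTERN.finditer(text):
        print(m.group("property"), m.group("value"), m.group("unit"))
    ```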

  15. Analysing task design and students' responses to context-based problems through different analytical frameworks

    NASA Astrophysics Data System (ADS)

    Broman, Karolina; Bernholt, Sascha; Parchmann, Ilka

    2015-05-01

    Background: Context-based learning approaches are used to enhance students' interest in, and knowledge about, science. According to different empirical studies, students' interest is improved by applying these more non-conventional approaches, while effects on learning outcomes are less coherent. Hence, further insights are needed into the structure of context-based problems in comparison to traditional problems, and into students' problem-solving strategies. Therefore, a suitable framework is necessary, both for the analysis of tasks and strategies. Purpose: The aim of this paper is to explore traditional and context-based tasks, as well as students' responses to exemplary tasks, to identify a suitable framework for future design and analyses of context-based problems. The paper discusses different established frameworks and applies the Higher-Order Cognitive Skills/Lower-Order Cognitive Skills (HOCS/LOCS) taxonomy and the Model of Hierarchical Complexity in Chemistry (MHC-C) to analyse traditional tasks and students' responses. Sample: Upper secondary students (n=236) in the Natural Science Programme, i.e. possible future scientists, are investigated to explore learning outcomes when they solve chemistry tasks, both conventional and context-based chemistry problems. Design and methods: A typical chemistry examination test has been analysed, first the test items themselves (n=36), and thereafter 236 students' responses to one representative context-based problem. Content analysis using the HOCS/LOCS and MHC-C frameworks has been applied to analyse both quantitative and qualitative data, allowing us to describe different problem-solving strategies. Results: The empirical results show that both frameworks are suitable for identifying students' strategies, which mainly focus on recall of memorized facts when solving chemistry test items. Almost all test items also assessed lower-order thinking. The combination of the frameworks with the chemistry syllabus proved successful for analysing both the test items and students' responses in a systematic way. The framework can therefore be applied in the design of new tasks, in the analysis and assessment of students' responses, and as a tool for teachers to scaffold students in their problem-solving process. Conclusions: This paper gives implications for practice and for future research, both to develop new context-based problems in a structured way and to provide analytical tools for investigating students' higher-order thinking in their responses to these tasks.

  16. 1:1 Computing Programs: An Analysis of the Stages of Concerns of 1:1 Integration, Professional Development Training and Level of Classroom Use by Illinois High School Teachers

    ERIC Educational Resources Information Center

    Detering, Brad

    2017-01-01

    This research study, grounded in the theoretical framework of education change, used the Concerns-Based Adoption Model of change to examine the concerns of Illinois high school teachers and administrators regarding the implementation of 1:1 computing programs. A quantitative study of educators investigated the stages of concern and the mathematics…

  17. A Semiquantitative Framework for Gene Regulatory Networks: Increasing the Time and Quantitative Resolution of Boolean Networks

    PubMed Central

    Kerkhofs, Johan; Geris, Liesbet

    2015-01-01

    Boolean models have been instrumental in predicting general features of gene networks and, more recently, have served as explorative tools in specific biological applications. In this study we introduce a basic quantitative and a limited time resolution to a discrete (Boolean) framework. Quantitative resolution is improved through the use of normalized variables in unison with an additive approach. Increased time resolution stems from the introduction of two distinct priority classes. Through the implementation of a previously published chondrocyte network and a T helper cell network, we show that this addition of quantitative and time resolution broadens the scope of biological behaviour that can be captured by the models. Specifically, the quantitative resolution readily allows models to discern qualitative differences in dosage response to growth factors. The limited time resolution, in turn, can influence the reachability of attractors, delineating the likely long-term system behaviour. Importantly, the information required to implement these features, such as the nature of an interaction, is typically obtainable from the literature. Nonetheless, a trade-off is always present between the additional computational cost of this approach and the likelihood of extending the model's scope; indeed, in some cases the inclusion of these features does not yield additional insight. This framework, incorporating increased and readily available time and semi-quantitative resolution, can help substantiate the litmus test of dynamics for gene networks, firstly by excluding unlikely dynamics and secondly by refining falsifiable predictions of qualitative behaviour. PMID:26067297
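
    The additive, normalized update with two priority classes can be sketched on a toy network; the weights and class assignments below are invented for illustration, not taken from the chondrocyte or T helper cell models:

    ```python
    import numpy as np

    # Toy 3-node network: activities normalized to [0, 1].
    # W[i, j] = signed influence of node j on node i (+ activation, - inhibition).
    W = np.array([[0.0, 1.0, -0.5],
                  [0.8, 0.0,  0.0],
                  [0.0, 0.7,  0.0]])
    fast, slow = [0, 1], [2]   # two priority classes: fast nodes update first

    def step(state, nodes):
        new = state.copy()
        # additive approach: activity is the clipped weighted sum of inputs
        new[nodes] = np.clip(W[nodes] @ state, 0.0, 1.0)
        return new

    state = np.array([0.2, 1.0, 0.0])
    for _ in range(10):
        state = step(state, fast)   # priority class 1
        state = step(state, slow)   # priority class 2
    print(state)                    # approximate attractor of the toy network
    ```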

  18. Basic research in evolution and ecology enhances forensics.

    PubMed

    Tomberlin, Jeffery K; Benbow, M Eric; Tarone, Aaron M; Mohr, Rachel M

    2011-02-01

    In 2009, the National Research Council recommended that the forensic sciences strengthen their grounding in basic empirical research to mitigate criticism and improve accuracy and reliability. For DNA-based identification, this goal was achieved under the guidance of the population genetics community. This effort resulted in DNA analysis becoming the 'gold standard' of the forensic sciences. Elsewhere, we proposed a framework for streamlining research in decomposition ecology, which promotes quantitative approaches to collecting and applying data to forensic investigations involving decomposing human remains. To extend the ecological aspects of this approach, this review focuses on forensic entomology, although the framework can be extended to other areas of decomposition. Published by Elsevier Ltd.

  19. Nonlocal means-based speckle filtering for ultrasound images

    PubMed Central

    Coupé, Pierrick; Hellier, Pierre; Kervrann, Charles; Barillot, Christian

    2009-01-01

    In image processing, restoration is expected to improve both the qualitative inspection of the image and the performance of quantitative image analysis techniques. In this paper, an adaptation of the Non-Local (NL-) means filter is proposed for speckle reduction in ultrasound (US) images. Because NL-means was originally developed for additive white Gaussian noise, we propose a Bayesian framework to derive an NL-means filter adapted to a relevant ultrasound noise model. Quantitative results on synthetic data show the performance of the proposed method compared to well-established and state-of-the-art methods. Results on real images demonstrate that the proposed method is able to accurately preserve edges and structural details of the image. PMID:19482578
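
    Classical NL-means is available off the shelf; a minimal sketch with scikit-image follows. Note that the paper's actual contribution (a Bayesian patch distance matched to an ultrasound noise model) is not reproduced here, and the noise simulation below is only loosely speckle-like:

    ```python
    import numpy as np
    from skimage import data, img_as_float
    from skimage.restoration import denoise_nl_means, estimate_sigma

    img = img_as_float(data.camera())
    # simulate multiplicative, speckle-like noise (illustrative, not a full US model)
    rng = np.random.default_rng(1)
    noisy = np.clip(img + img * 0.15 * rng.normal(size=img.shape), 0, 1)

    sigma = float(np.mean(estimate_sigma(noisy)))
    # classical Gaussian-distance NL-means; the paper replaces this distance
    # with a Bayesian one adapted to ultrasound speckle
    denoised = denoise_nl_means(noisy, h=0.8 * sigma, sigma=sigma,
                                patch_size=5, patch_distance=6, fast_mode=True)
    ```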

  20. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.

  1. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches. PMID:23163976

  2. A framework for quantification of groundwater dynamics - concepts and hydro(geo-)logical metrics

    NASA Astrophysics Data System (ADS)

    Haaf, Ezra; Heudorfer, Benedikt; Stahl, Kerstin; Barthel, Roland

    2017-04-01

    Fluctuation patterns in groundwater hydrographs are generally assumed to contain information on aquifer characteristics, climate and environmental controls. However, attempts to disentangle this information and map the dominant controls have been few. This is due to the substantial heterogeneity and complexity of groundwater systems, which is reflected in the abundance of morphologies of groundwater time series. To describe the structure and shape of hydrographs, descriptive terms like "slow"/"fast" or "flashy"/"inert" are frequently used, which are subjective, irreproducible and limited. This lack of objective and refined concepts limits approaches for regionalization of hydrogeological characteristics as well as our understanding of the dominant processes controlling groundwater dynamics. Therefore, we propose a novel framework for groundwater hydrograph characterization in an attempt to categorize morphologies explicitly and quantitatively, based on perceptual concepts of aspects of the dynamics. This quantitative framework is inspired by the existing and operational eco-hydrological classification frameworks for streamflow. The need for a new framework for groundwater systems is justified by the fundamental differences between the state variable groundwater head and the flow variable streamflow. Conceptually, we extracted exemplars of specific dynamic patterns, attributing descriptive terms for means of systematisation. Metrics, primarily taken from the streamflow literature, were subsequently adapted to groundwater and assigned to the described patterns for means of quantification. In this study, we focused on the particularities of groundwater as a state variable. Furthermore, we investigated the descriptive skill of individual metrics as well as their usefulness for groundwater hydrographs. The ensemble of categorized metrics results in a framework which can be used to describe and quantify groundwater dynamics. It is a promising tool for the setup of a successful similarity classification framework for groundwater hydrographs. However, the overabundance of available metrics calls for a systematic redundancy analysis, which we describe in a second study (Heudorfer et al., 2017). Heudorfer, B., Haaf, E., Barthel, R., Stahl, K., 2017. A framework for quantification of groundwater dynamics - redundancy and transferability of hydro(geo-)logical metrics. EGU General Assembly 2017, Vienna, Austria.

  3. Toward a Unified Validation Framework in Mixed Methods Research

    ERIC Educational Resources Information Center

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  4. Quantitative Procedures for the Assessment of Quality in Higher Education Institutions.

    ERIC Educational Resources Information Center

    Moran, Tom; Rowse, Glenwood

    The development of procedures designed to provide quantitative assessments of quality in higher education institutions are reviewed. These procedures employ a systems framework and utilize quantitative data to compare institutions or programs of similar types with one another. Three major elements essential in the development of models focusing on…

  5. Quantitative imaging test approval and biomarker qualification: interrelated but distinct activities.

    PubMed

    Buckler, Andrew J; Bresolin, Linda; Dunnick, N Reed; Sullivan, Daniel C; Aerts, Hugo J W L; Bendriem, Bernard; Bendtsen, Claus; Boellaard, Ronald; Boone, John M; Cole, Patricia E; Conklin, James J; Dorfman, Gary S; Douglas, Pamela S; Eidsaunet, Willy; Elsinger, Cathy; Frank, Richard A; Gatsonis, Constantine; Giger, Maryellen L; Gupta, Sandeep N; Gustafson, David; Hoekstra, Otto S; Jackson, Edward F; Karam, Lisa; Kelloff, Gary J; Kinahan, Paul E; McLennan, Geoffrey; Miller, Colin G; Mozley, P David; Muller, Keith E; Patt, Rick; Raunig, David; Rosen, Mark; Rupani, Haren; Schwartz, Lawrence H; Siegel, Barry A; Sorensen, A Gregory; Wahl, Richard L; Waterton, John C; Wolf, Walter; Zahlmann, Gudrun; Zimmerman, Brian

    2011-06-01

    Quantitative imaging biomarkers could speed the development of new treatments for unmet medical needs and improve routine clinical care. However, it is not clear how the various regulatory and nonregulatory (eg, reimbursement) processes (often referred to as pathways) relate, nor is it clear which data need to be collected to support these different pathways most efficiently, given the time- and cost-intensive nature of doing so. The purpose of this article is to describe current thinking regarding these pathways emerging from diverse stakeholders interested and active in the definition, validation, and qualification of quantitative imaging biomarkers and to propose processes to facilitate the development and use of quantitative imaging biomarkers. A flexible framework is described that may be adapted for each imaging application, providing mechanisms that can be used to develop, assess, and evaluate relevant biomarkers. From this framework, processes can be mapped that would be applicable to both imaging product development and to quantitative imaging biomarker development aimed at increasing the effectiveness and availability of quantitative imaging. http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.10100800/-/DC1. RSNA, 2011

  6. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting

    PubMed Central

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen; Wald, Lawrence L.

    2017-01-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization. PMID:26915119

  7. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    PubMed

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.
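
    The conventional reconstruction referenced in both records above reduces to per-voxel dictionary matching. A toy sketch with random unit-norm atoms (all sizes arbitrary) illustrates the matching step that, per the paper, equals the first iteration of the ML algorithm when initialized by a gridding reconstruction:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T, K, V = 200, 500, 64          # time points, dictionary atoms, voxels

    D = rng.normal(size=(K, T))
    D /= np.linalg.norm(D, axis=1, keepdims=True)     # unit-norm fingerprints
    true_idx = rng.integers(0, K, size=V)
    X = D[true_idx] + 0.05 * rng.normal(size=(V, T))  # noisy voxel time courses

    # match each voxel by maximum absolute inner product (i.e., correlation)
    match = np.argmax(np.abs(X @ D.T), axis=1)
    print("fraction correctly matched:", np.mean(match == true_idx))
    ```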

  8. A new framework for evaluating the impacts of drought on net primary productivity of grassland.

    PubMed

    Lei, Tianjie; Wu, Jianjun; Li, Xiaohan; Geng, Guangpo; Shao, Changliang; Zhou, Hongkui; Wang, Qianfeng; Liu, Leizhen

    2015-12-01

    This paper presented a valuable framework for evaluating the impacts of drought (as a single factor) on grassland ecosystems. The framework quantifies the magnitude of drought impact as the unacceptable short-term and long-term effects an ecosystem may experience relative to a reference standard. Net primary productivity (NPP) was selected as the response indicator to assess the quantitative impact of drought on Inner Mongolia grassland, based on the Standardized Precipitation Index (SPI) and the BIOME-BGC model. The framework consists of six main steps: 1) clearly defining drought scenarios, such as moderate, severe and extreme drought; 2) selecting an appropriate indicator of drought impact; 3) selecting an appropriate ecosystem model, verifying its capabilities, calibrating the bias and assessing the uncertainty; 4) assigning a level of unacceptable drought impact on the indicator; 5) determining the response of the indicator to drought and to a normal weather state under global change; and 6) investigating the unacceptable impact of drought at different spatial scales. We found that NPP losses assessed using the new framework were more sensitive to drought and had higher precision than those from the long-term average method. Moreover, the total and average losses of NPP differed among grassland types during the drought years from 1961-2009. NPP loss increased significantly along a gradient of increasing drought levels, and NPP loss variation under the same drought level differed across grassland types. The operational framework is particularly suited to integrated assessment of the effects of different drought events and long-term droughts at multiple spatial scales, providing essential insights for the sciences and societies that must develop coping strategies for ecosystems facing such events. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Using multi-criteria risk ranking methodology to select case studies for a generic risk assessment framework for exotic disease incursion and spread through Europe.

    PubMed

    Horigan, V; De Nardi, M; Simons, R R L; Bertolini, S; Crescio, M I; Estrada-Peña, A; Léger, A; Maurella, C; Ru, G; Schuppers, M; Stärk, K D C; Adkin, A

    2018-05-01

    We present a novel approach that uses multi-criteria pathogen prioritisation methodology as a basis for selecting the most appropriate case studies for a generic risk assessment framework. The approach uses selective criteria to rank exotic animal health pathogens according to the likelihood of introduction and the impact of an outbreak if it occurred in the European Union (EU). Pathogens were evaluated based on their impact on production at the EU level and on international trade. A subsequent analysis included criteria of relevance to quantitative risk assessment case study selection, such as the availability of data for parameterisation, the need for further research, and the desire for the case studies to cover different routes of transmission. The framework demonstrated is flexible, with the ability to adjust both the criteria and their weightings to the user's requirements. A web-based tool has been developed using the RStudio Shiny framework to facilitate this. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
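
    The ranking mechanics can be sketched as a weighted sum over normalized criteria; the pathogens, scores, and weights below are placeholders, not the study's elicited values:

    ```python
    import numpy as np

    # Hypothetical criterion scores per pathogen (rows), each scaled to [0, 1]:
    # columns: likelihood of introduction, production impact, trade impact,
    # data availability for parameterisation
    scores = np.array([[0.9, 0.7, 0.8, 0.6],
                       [0.4, 0.9, 0.5, 0.8],
                       [0.6, 0.3, 0.9, 0.4]])
    pathogens = ["pathogen A", "pathogen B", "pathogen C"]

    weights = np.array([0.4, 0.3, 0.2, 0.1])   # user-adjustable, sums to 1
    rank_score = scores @ weights
    for name, s in sorted(zip(pathogens, rank_score), key=lambda t: -t[1]):
        print(f"{name}: {s:.2f}")
    ```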

  10. Predicting Player Position for Talent Identification in Association Football

    NASA Astrophysics Data System (ADS)

    Razali, Nazim; Mustapha, Aida; Yatim, Faiz Ahmad; Aziz, Ruhaya Ab

    2017-08-01

    This paper introduces a new framework, from the perspective of Computer Science, for identifying talent in the sport of football based on the players' individual qualities: physical, mental, and technical. The combination of qualities as assessed by coaches is then used to predict the position in a match that suits a player best in a particular team formation. Evaluation of the proposed framework is two-fold: quantitatively, via classification experiments to predict player position, and qualitatively, via a Talent Identification Site developed to achieve the same goal. Results from the classification experiments using Bayesian Networks, Decision Trees, and K-Nearest Neighbor show an average of 98% accuracy, which will promote consistency in decision-making through the elimination of personal bias in team selection. The positive reviews of the Football Identification Site based on user acceptance evaluation also indicate that the framework is sufficient to serve as the basis for developing an intelligent team management system in different sports, whereby the growth and performance of players can be monitored and identified.
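
    A minimal version of the classification experiment might look as follows; the ratings and position labels are randomly generated stand-ins, so accuracy here will be near chance rather than the reported 98%, which requires the coaches' real assessments:

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neighbors import KNeighborsClassifier

    # Hypothetical coach ratings (physical, mental, technical; 1-10 scale)
    rng = np.random.default_rng(0)
    X = rng.integers(1, 11, size=(200, 3)).astype(float)
    y = rng.integers(0, 4, size=200)   # e.g. 0=keeper, 1=defender, 2=midfielder, 3=forward

    for clf in (DecisionTreeClassifier(random_state=0), KNeighborsClassifier(5)):
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(type(clf).__name__, f"{acc:.2f}")
    ```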

  11. Non-lambertian reflectance modeling and shape recovery of faces using tensor splines.

    PubMed

    Kumar, Ritwik; Barmpoutis, Angelos; Banerjee, Arunava; Vemuri, Baba C

    2011-03-01

    Modeling illumination effects and pose variations of a face is of fundamental importance in the field of facial image analysis. Most of the conventional techniques that simultaneously address both of these problems work with the Lambertian assumption and thus fall short of accurately capturing the complex intensity variation that the facial images exhibit or recovering their 3D shape in the presence of specularities and cast shadows. In this paper, we present a novel Tensor-Spline-based framework for facial image analysis. We show that, using this framework, the facial apparent BRDF field can be accurately estimated while seamlessly accounting for cast shadows and specularities. Further, using local neighborhood information, the same framework can be exploited to recover the 3D shape of the face (to handle pose variation). We quantitatively validate the accuracy of the Tensor Spline model using a more general model based on the mixture of single-lobed spherical functions. We demonstrate the effectiveness of our technique by presenting extensive experimental results for face relighting, 3D shape recovery, and face recognition using the Extended Yale B and CMU PIE benchmark data sets.

  12. Organizational factors affecting successful adoption of innovative eHealth services: a case study employing the FITT framework.

    PubMed

    Tsiknakis, Manolis; Kouroubali, Angelina

    2009-01-01

    The paper presents an application of the "Fit between Individuals, Task and Technology" (FITT) framework to analyze the socio-organizational-technical factors that influence IT adoption in the healthcare domain. The FITT framework was employed as the theoretical instrument for a retrospective analysis of a 15-year effort in implementing IT systems and eHealth services in the context of a Regional Health Information Network in Crete. Quantitative and qualitative research methods, interviews and participant observations were employed to gather data from a case study that involved the entire region of Crete. The detailed analysis of the case study based on the FITT framework, showed common features, but also differences of IT adoption within the various health organizations. The emerging picture is a complex nexus of factors contributing to IT adoption, and multi-level interventional strategies to promote IT use. The work presented in this paper shows the applicability of the FITT framework in explaining the complexity of aspects observed in the implementation of healthcare information systems. The reported experiences reveal that fit management can be viewed as a system with a feedback loop that is never really stable, but ever changing based on external factors or deliberate interventions. Management of fit, therefore, becomes a constant and complex task for the whole life cycle of IT systems.

  13. Generalized PSF modeling for optimized quantitation in PET imaging.

    PubMed

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standardized uptake value (SUV) PET images while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced SUV quantitation. The developed framework first analytically models the true PSF, considering a range of resolution-degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying modeled PSF kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF modeling does not offer optimized PET quantitation, and that PSF overestimation may provide enhanced SUV quantitation. Furthermore, generalized PSF modeling may provide a valuable approach for quantitative tasks such as treatment-response assessment and prognostication.
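
    The matched-versus-mismatched kernel idea can be illustrated outside the OS-EM setting with simple Richardson-Lucy deconvolution; this is an analogy only, with assumed Gaussian PSFs and an invented phantom, not the paper's reconstruction pipeline:

    ```python
    import numpy as np
    from scipy.signal import fftconvolve
    from scipy.ndimage import gaussian_filter

    def gaussian_psf(sigma, size=15):
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
        return k / k.sum()

    def richardson_lucy(img, psf, n_iter=30):
        est = np.full_like(img, img.mean())
        psf_m = psf[::-1, ::-1]
        for _ in range(n_iter):
            conv = fftconvolve(est, psf, mode="same") + 1e-12
            est = est * fftconvolve(img / conv, psf_m, mode="same")
        return est

    truth = np.zeros((64, 64)); truth[28:36, 28:36] = 1.0   # small "hot" lesion
    blurred = gaussian_filter(truth, 2.0)                   # "true" system blur
    for s in (1.5, 2.0, 2.5):  # under-, exactly-, over-estimated PSF width
        rec = richardson_lucy(blurred, gaussian_psf(s))
        print(f"assumed sigma={s}: peak={rec.max():.3f}")   # overshoot grows with s
    ```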

  14. Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.

    PubMed

    He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan

    2009-07-01

    Facial paralysis is the loss of voluntary muscle movement on one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on local binary patterns (LBPs) in the temporal-spatial domain of each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. A support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrate its accuracy and efficiency.
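
    The per-region LBP histogram and the RAD symmetry measure can be sketched as follows, assuming uint8 grayscale regions; the block processing, multiresolution LBP, and SVM stages are omitted:

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_hist(region, P=8, R=1):
        # uniform LBP histogram of one facial region (2D uint8 array)
        lbp = local_binary_pattern(region, P, R, method="uniform")
        hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
        return hist

    def resistor_average_distance(p, q, eps=1e-10):
        # RAD: harmonic-mean-style combination of the two directed KL divergences
        p, q = p + eps, q + eps
        kl_pq = np.sum(p * np.log(p / q))
        kl_qp = np.sum(q * np.log(q / p))
        return 1.0 / (1.0 / kl_pq + 1.0 / kl_qp)

    # symmetry score between left and right facial regions (hypothetical inputs)
    rng = np.random.default_rng(0)
    left = rng.integers(0, 256, (60, 60), dtype=np.uint8)
    right = rng.integers(0, 256, (60, 60), dtype=np.uint8)
    print(resistor_average_distance(lbp_hist(left), lbp_hist(right)))
    ```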

  15. Fragment-based quantitative structure-activity relationship (FB-QSAR) for fragment-based drug design.

    PubMed

    Du, Qi-Shi; Huang, Ri-Bo; Wei, Yu-Tuo; Pang, Zong-Wen; Du, Li-Qin; Chou, Kuo-Chen

    2009-01-30

    In combination with fragment-based design, a new drug design method, the so-called "fragment-based quantitative structure-activity relationship" (FB-QSAR), is proposed. The essence of the new method is that the molecular framework in a family of drug candidates is divided into several fragments according to the substituents being investigated. The bioactivities of the molecules are correlated with the physicochemical properties of the molecular fragments through two sets of coefficients in linear free energy equations: one set for the physicochemical properties and the other for the weight factors of the molecular fragments. Meanwhile, an iterative double least square (IDLS) technique is developed to solve the two sets of coefficients in a training data set alternately and iteratively. The IDLS technique is a feedback procedure with machine learning ability. The standard two-dimensional quantitative structure-activity relationship (2D-QSAR) is a special case of FB-QSAR in which the whole molecule is treated as one entity. The FB-QSAR approach can remarkably enhance the predictive power and provide more structural insights into rational drug design. As an example, FB-QSAR is applied to build a predictive model of neuraminidase inhibitors for drug development against the H5N1 influenza virus. (c) 2008 Wiley Periodicals, Inc.
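
    The IDLS idea, alternating two least-squares fits of a bilinear model, can be sketched on synthetic data; the dimensions and the data-generating model are assumptions made for the sketch:

    ```python
    import numpy as np

    # Toy IDLS for FB-QSAR: activity is bilinear in fragment weights w (F,) and
    # property coefficients c (P,):  y_m = sum_f w_f * (X[m, f, :] @ c)
    rng = np.random.default_rng(0)
    M, F, P = 60, 4, 3                     # molecules, fragments, properties
    X = rng.normal(size=(M, F, P))         # fragment physicochemical properties
    w_true, c_true = rng.uniform(0.5, 1.5, F), rng.normal(size=P)
    y = np.einsum('mfp,f,p->m', X, w_true, c_true) + 0.01 * rng.normal(size=M)

    w, c = np.ones(F), np.ones(P)
    for _ in range(50):                    # alternate the two least-squares fits
        A = np.einsum('mfp,p->mf', X, c)   # fix c, solve for fragment weights w
        w = np.linalg.lstsq(A, y, rcond=None)[0]
        B = np.einsum('mfp,f->mp', X, w)   # fix w, solve for property coefficients c
        c = np.linalg.lstsq(B, y, rcond=None)[0]
    # note: w and c are identified only up to a shared scale factor
    ```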

  16. Multimodality Data Integration in Epilepsy

    PubMed Central

    Muzik, Otto; Chugani, Diane C.; Zou, Guangyu; Hua, Jing; Lu, Yi; Lu, Shiyong; Asano, Eishi; Chugani, Harry T.

    2007-01-01

    An important goal of software development in the medical field is the design of methods which are able to integrate information obtained from various imaging and nonimaging modalities into a cohesive framework in order to understand the results of qualitatively different measurements in a larger context. Moreover, it is essential to assess the various features of the data quantitatively so that relationships in anatomical and functional domains between complementing modalities can be expressed mathematically. This paper presents a clinically feasible software environment for the quantitative assessment of the relationship among biochemical functions as assessed by PET imaging and electrophysiological parameters derived from intracranial EEG. Based on the developed software tools, quantitative results obtained from individual modalities can be merged into a data structure allowing a consistent framework for advanced data mining techniques and 3D visualization. Moreover, an effort was made to derive quantitative variables (such as the spatial proximity index, SPI) characterizing the relationship between complementing modalities on a more generic level as a prerequisite for efficient data mining strategies. We describe the implementation of this software environment in twelve children (mean age 5.2 ± 4.3 years) with medically intractable partial epilepsy who underwent both high-resolution structural MR and functional PET imaging. Our experiments demonstrate that our approach will lead to a better understanding of the mechanisms of epileptogenesis and might ultimately have an impact on treatment. Moreover, our software environment holds promise to be useful in many other neurological disorders, where integration of multimodality data is crucial for a better understanding of the underlying disease mechanisms. PMID:17710251

  17. An ice sheet model validation framework for the Greenland ice sheet.

    PubMed

    Price, Stephen F; Hoffman, Matthew J; Bonin, Jennifer A; Howat, Ian M; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P; Evans, Katherine J; Kennedy, Joseph H; Lenaerts, Jan; Lipscomb, William H; Perego, Mauro; Salinger, Andrew G; Tuminaro, Raymond S; van den Broeke, Michiel R; Nowicki, Sophie M J

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.

  18. A comparison of fit of CNC-milled titanium and zirconia frameworks to implants.

    PubMed

    Abduo, Jaafar; Lyons, Karl; Waddell, Neil; Bennani, Vincent; Swain, Michael

    2012-05-01

    Computer numeric controlled (CNC) milling has been proven to be a predictable method for fabricating accurately fitting implant titanium frameworks. However, no data are available regarding the fit of CNC-milled implant zirconia frameworks. The aim was to compare the precision of fit of implant frameworks milled from titanium and zirconia and to relate it to peri-implant strain development after framework fixation. A partially edentulous epoxy resin model received two Branemark implants in the areas of the lower left second premolar and second molar. From this model, 10 identical frameworks were fabricated by means of CNC milling: half of them made from titanium and the other half from zirconia. Strain gauges were mounted close to the implants to qualitatively and quantitatively assess strain development as a result of framework fitting. In addition, the fit of the framework-implant interface was measured using an optical microscope, both when only one screw was tightened (passive fit) and when all screws were tightened (vertical fit). The data were statistically analyzed using the Mann-Whitney test. All frameworks produced measurable amounts of peri-implant strain. The zirconia frameworks produced significantly less strain than titanium. Combining the qualitative and quantitative information indicates that the implants were under vertical rather than horizontal displacement. The vertical fit was similar for zirconia (3.7 µm) and titanium (3.6 µm) frameworks; however, the zirconia frameworks exhibited a significantly finer passive fit (5.5 µm) than the titanium frameworks (13.6 µm). CNC milling produced zirconia and titanium frameworks with high accuracy. The difference between the two materials in terms of fit is expected to be of minimal clinical significance. The strain developed around the implants was related more to the framework fit than to the framework material. © 2011 Wiley Periodicals, Inc.

  19. Race, Ethnicity, and Higher Education Policy: The Use of Critical Quantitative Research

    ERIC Educational Resources Information Center

    Teranishi, Robert T.

    2007-01-01

    Cross-sectional frameworks, or between-group approaches, in quantitative research in higher education have limitations that hinder what we know about the intersection of race and educational opportunities and outcomes. (Contains 5 figures.)

  20. Post-event reviews: Using a quantitative approach for analysing incident response to demonstrate the value of business continuity programmes and increase planning efficiency.

    PubMed

    Vaidyanathan, Karthik

    2017-01-01

    Business continuity management is often thought of as a proactive planning process for minimising impact from large-scale incidents and disasters. While this is true, and it is critical to plan for the worst, consistently validating plan effectiveness against smaller disruptions can enable an organisation to gain key insights about its business continuity readiness, drive programme improvements, reduce costs and provide an opportunity to quantitatively demonstrate the value of the programme to management. This paper describes a post mortem framework which is used as a continuous improvement mechanism for tracking, reviewing and learning from real-world events at Microsoft Customer Service & Support. This approach was developed and adopted because conducting regular business continuity exercises proved difficult and expensive in a complex and distributed operations environment with high availability requirements. Using a quantitative approach to measure response to incidents, and categorising outcomes based on such responses, enables business continuity teams to provide data-driven insights to leadership, change perceptions of incident root cause, and instil a higher level of confidence towards disaster response readiness and incident management. The scope of the framework discussed here is specific to reviewing and driving improvements from operational incidents. However, the concept can be extended to learning and evolving readiness plans for other types of incidents.

  1. Investigating Adult Language Input and Young Children's Responses in Naturalistic Environments: An Observational Framework

    ERIC Educational Resources Information Center

    Marinac, Julie V.; Woodyatt, Gail C.; Ozanne, Anne E.

    2008-01-01

    This paper reports the design and trial of an original Observational Framework for quantitative investigation of young children's responses to adult language in their typical language learning environments. The Framework permits recording of both the response expectation of the adult utterances, and the degree of compliance in the child's…

  2. A framework for grouping nanoparticles based on their measurable characteristics.

    PubMed

    Sayes, Christie M; Smith, P Alex; Ivanov, Ivan V

    2013-01-01

    There is a need to take a broader look at nanotoxicological studies. Eventually, the field will demand that some generalizations be made. To begin to address this issue, we posed a question: are metal colloids on the nanometer-size scale a homogeneous group? In general, most people can agree that the physicochemical properties of nanomaterials can be linked and related to their induced toxicological responses. The focus of this study was to determine how a set of selected physicochemical properties of five specific metal-based colloidal materials on the nanometer-size scale - silver, copper, nickel, iron, and zinc - could be used as nanodescriptors that facilitate the grouping of these metal-based colloids. The example of the framework pipeline processing provided in this paper shows the utility of specific statistical and pattern recognition techniques in grouping nanoparticles based on experimental data about their physicochemical properties. Interestingly, the results of the analyses suggest that a seemingly homogeneous group of nanoparticles could be separated into sub-groups depending on interdependencies observed in their nanodescriptors. These particles represent an important category of nanomaterials that are currently mass produced. Each has been reputed to induce toxicological and/or cytotoxicological effects. Here, we propose an experimental methodology coupled with mathematical and statistical modeling that can serve as a prototype for a rigorous framework that aids in the ability to group nanomaterials together and to facilitate the subsequent analysis of trends in data based on quantitative modeling of nanoparticle-specific structure-activity relationships. The computational part of the proposed framework is rather general and can be applied to other groups of nanomaterials as well.
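
    A generic version of the grouping pipeline (standardize descriptors, reduce dimension, cluster) might look like the following; the descriptor values are hypothetical stand-ins, not the study's measurements:

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # Hypothetical nanodescriptors per colloid batch: size (nm), zeta potential (mV),
    # surface area (m2/g), dissolution rate (arbitrary units)
    X = np.array([[20, -30, 50, 0.10], [25, -28, 45, 0.12], [80, 10, 12, 0.90],
                  [75, 12, 15, 0.80], [40, -5, 30, 0.40], [42, -8, 28, 0.45]])

    Z = StandardScaler().fit_transform(X)            # put descriptors on one scale
    pcs = PCA(n_components=2).fit_transform(Z)       # capture interdependencies
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pcs)
    print(labels)   # sub-groups emerging from the descriptor interdependencies
    ```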

  3. Mapping of quantitative trait loci controlling adaptive traits in coastal Douglas-fir

    Treesearch

    Nicholas C. Wheeler; Kathleen D. Jermstad; Konstantin V. Krutovsky; Sally N. Aitken; Glenn T. Howe; Jodie Krakowski; David B. Neale

    2005-01-01

    Quantitative trait locus (QTL) analyses are used by geneticists to characterize the genetic architecture of quantitative traits, provide a foundation for marker-aided-selection (MAS), and provide a framework for positional selection of candidate genes. The most useful QTL for breeding applications are those that have been verified in time, space, and/or genetic...

  4. Transcending the Quantitative-Qualitative Divide with Mixed Methods Research: A Multidimensional Framework for Understanding Congruence and Completeness in the Study of Values

    ERIC Educational Resources Information Center

    McLafferty, Charles L., Jr.; Slate, John R.; Onwuegbuzie, Anthony J.

    2010-01-01

    Quantitative research dominates published literature in the helping professions. Mixed methods research, which integrates quantitative and qualitative methodologies, has received a lukewarm reception. The authors address the iterative separation that infuses theory, praxis, philosophy, methodology, training, and public perception and propose a…

  5. Emergence of grouping in multi-resource minority game dynamics

    NASA Astrophysics Data System (ADS)

    Huang, Zi-Gang; Zhang, Ji-Qiang; Dong, Jia-Qi; Huang, Liang; Lai, Ying-Cheng

    2012-10-01

    Complex systems arising in a modern society typically have many resources and strategies available for their dynamical evolution. To explore quantitatively the behaviors of such systems, we propose a class of models to investigate Minority Game (MG) dynamics with multiple strategies. In particular, agents tend to choose the least-used strategies based on available local information. A striking finding is the emergence of grouping states defined in terms of distinct strategies. We develop an analytic theory based on the mean-field framework to understand the "bifurcations" of the grouping states. The grouping phenomenon has also been identified in the Shanghai Stock Market system, and we discuss its prevalence in other real-world systems. Our work demonstrates that complex systems obeying the MG rules can spontaneously self-organize into certain divided states, and our model represents a basic and general mathematical framework to address this kind of phenomenon in social, economical and political systems.
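
    A toy simulation conveys the flavor of agents moving toward the least-used resource under noisy local information; the noise model and sizes are invented, and the paper's mean-field analysis is not reproduced:

    ```python
    import numpy as np

    # Toy multi-resource minority game: N agents each occupy one of K resources;
    # at each step an agent moves to the resource it perceives as least used.
    rng = np.random.default_rng(0)
    N, K, steps = 200, 3, 100
    choice = rng.integers(0, K, size=N)

    for _ in range(steps):
        counts = np.bincount(choice, minlength=K)
        # each agent observes a noisy local estimate of the global usage counts
        local = counts + rng.normal(0, 5, size=(N, K))
        choice = np.argmin(local, axis=1)        # move to the least-used resource

    print(np.bincount(choice, minlength=K))      # grouping: resource occupations
    ```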

  6. Using convolutional neural networks to explore the microbiome.

    PubMed

    Reiman, Derek; Metwally, Ahmed; Yang Dai

    2017-07-01

    The microbiome has been shown to have an impact on the development of various diseases in the host. Being able to make an accurate prediction of the phenotype of a genomic sample based on its microbial taxonomic abundance profile is an important problem for personalized medicine. In this paper, we examine the potential of using a deep learning framework, a convolutional neural network (CNN), for such a prediction. To facilitate the CNN learning, we explore the structure of abundance profiles by creating the phylogenetic tree and by designing a scheme to embed the tree into a matrix that retains the spatial relationships of nodes in the tree and their quantitative characteristics. The proposed CNN framework is highly accurate, achieving 99.47% accuracy in an evaluation on a dataset of 1,967 samples across three phenotypes. Our results demonstrate the feasibility and promise of CNNs in the classification of sample phenotypes.
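
    A minimal CNN over a tree-embedded abundance matrix might be structured as follows (PyTorch; the input layout and layer sizes are assumptions, since the paper's embedding dimensions are not given here):

    ```python
    import torch
    import torch.nn as nn

    class MicrobiomeCNN(nn.Module):
        """Toy CNN over an H x W matrix produced by a tree-to-matrix embedding."""
        def __init__(self, n_classes=3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4),
            )
            self.classifier = nn.Linear(32 * 4 * 4, n_classes)

        def forward(self, x):            # x: (batch, 1, H, W)
            z = self.features(x)
            return self.classifier(z.flatten(1))

    model = MicrobiomeCNN()
    logits = model(torch.randn(8, 1, 32, 32))   # 8 samples, 32x32 embedding
    ```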

  7. An integrative review of the impact of mobile technologies used by healthcare professionals to support education and practice.

    PubMed

    Guo, Ping; Watts, Kim; Wharrad, Heather

    2016-04-01

    The aim of this study was to provide evidence of the impact of mobile technologies among healthcare professionals in education and practice settings. Integrative literature review. Electronic databases including MEDLINE, CINAHL, PsycINFO, EMBASE, ERIC and Web of Science were searched for papers published between 2002-2012. Quantitative studies were critically evaluated based on Thomas et al.'s framework, while the consolidated criteria for reporting qualitative research were used to appraise the rigour of the qualitative studies. Seventeen quantitative and three qualitative studies were included. The findings suggest a largely positive influence of mobile technologies on various clinical practice and educational outcomes. However, robust evidence was limited. Use of mobile technologies in health care is associated with improvements in access to information, accuracy and efficiency, evidence-based decision making at the point of care and enhancement in performance, confidence and engagement in different contexts.

  8. Examination of the solventlike nature of zeolites based on a solvatochromic indicator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, P.K.; Turbeville, W.

    1991-05-16

    Zeolites are crystalline aluminosilicates with cages and channel systems that can host a variety of organic transformations. This intracrystalline space is akin to a solvent, and description of this space in terms of solventlike properties is appropriate. The concept of solvatochromic indicators has been successfully used to define the physicochemical properties of organic solvents. In this study, the authors have investigated the electronic and Raman spectroscopy of the molecule N-(2-hydroxybenzylidene)aniline and established a quantitative correlation between the spectral intensities of the benzenoid and zwitterionic forms of this molecule and the α-value of various hydroxylic solvents. The α value is a measure of the hydrogen-bond donor ability of the solvent. This correlation has been used to establish an α value scale for a series of faujasitic zeolites with varying Si/Al ratios. It was found that the α value of the zeolite increased with Si/Al ratio to reach a maximum around 7.8, followed by a decrease at higher Si/Al ratios. Since Na⁺-exchanged zeolites were examined in all cases, the interaction of the anil molecule in its zwitterionic form with Lewis acids (Na⁺) and bases (oxygen of the framework) was considered to be responsible for its formation. The Si/Al ratio of the framework determines the acid-base character of the zeolite and is reflected in a quantitative manner by the α value determined in this study.
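
    The quantitative correlation described here is, in essence, a calibration curve: fit the band-intensity ratio against known solvent α values, then invert the fit for a zeolite. A hedged numerical sketch follows; all values are invented for illustration.

```python
import numpy as np

# Hypothetical calibration data: zwitterion/benzenoid band intensity ratio
# measured in hydroxylic solvents of known hydrogen-bond-donor ability (alpha).
ratio = np.array([0.10, 0.25, 0.42, 0.61, 0.78])
alpha = np.array([0.35, 0.55, 0.75, 0.95, 1.15])

# Least-squares calibration line: alpha = m * ratio + b.
m, b = np.polyfit(ratio, alpha, 1)

# Invert the calibration for a zeolite whose measured ratio is 0.50.
alpha_zeolite = m * 0.50 + b
print(f"estimated alpha for the zeolite: {alpha_zeolite:.2f}")
```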

  9. Atlas-based liver segmentation and hepatic fat-fraction assessment for clinical trials.

    PubMed

    Yan, Zhennan; Zhang, Shaoting; Tan, Chaowei; Qin, Hongxing; Belaroussi, Boubakeur; Yu, Hui Jing; Miller, Colin; Metaxas, Dimitris N

    2015-04-01

    Automated assessment of hepatic fat-fraction is clinically important. A robust and precise segmentation would enable accurate, objective and consistent measurement of hepatic fat-fraction for disease quantification, therapy monitoring and drug development. However, segmenting the liver in clinical trials is a challenging task due to the variability of liver anatomy as well as the diverse sources the images were acquired from. In this paper, we propose an automated and robust framework for liver segmentation and assessment. It uses single statistical atlas registration to initialize a robust deformable model to obtain fine segmentation. The fat-fraction map is computed by using a chemical shift based method in the delineated region of the liver. The proposed method is validated on 14 abdominal magnetic resonance (MR) volumetric scans. The qualitative and quantitative comparisons show that our proposed method can achieve better segmentation accuracy with less variance compared with two other atlas-based methods. Experimental results demonstrate the promise of our assessment framework. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Using BCG as a framework for setting goals and communicating progress toward those goals

    EPA Science Inventory

    This 5-minute Lightning Talk will discuss the benefits of stakeholder-supported quantitative targets in measuring progress, and will describe the Biological Condition Gradient (BCG) as one way to develop these quantitative targets.

  11. Ocean Heat Content Reveals Secrets of Fish Migrations

    PubMed Central

    Luo, Jiangang; Ault, Jerald S.; Shay, Lynn K.; Hoolihan, John P.; Prince, Eric D.; Brown, Craig A.; Rooker, Jay R.

    2015-01-01

    For centuries, the mechanisms surrounding spatially complex animal migrations have intrigued scientists and the public. We present a new methodology using ocean heat content (OHC), a habitat metric that is normally a fundamental part of hurricane intensity forecasting, to estimate movements and migration of satellite-tagged marine fishes. Previous satellite-tagging research of fishes using archival depth, temperature and light data for geolocations have been too coarse to resolve detailed ocean habitat utilization. We combined tag data with OHC estimated from ocean circulation and transport models in an optimization framework that substantially improved geolocation accuracy over SST-based tracks. The OHC-based movement track provided the first quantitative evidence that many of the tagged highly migratory fishes displayed affinities for ocean fronts and eddies. The OHC method provides a new quantitative tool for studying dynamic use of ocean habitats, migration processes and responses to environmental changes by fishes, and further, improves ocean animal tracking and extends satellite-based animal tracking data for other potential physical, ecological, and fisheries applications. PMID:26484541

  12. Web-Enabled Distributed Health-Care Framework for Automated Malaria Parasite Classification: an E-Health Approach.

    PubMed

    Maity, Maitreya; Dhane, Dhiraj; Mungle, Tushar; Maiti, A K; Chakraborty, Chandan

    2017-10-26

    A web-enabled e-healthcare system, or computer-assisted disease diagnosis, has the potential to improve the quality and service of the conventional healthcare delivery approach. The article describes the design and development of a web-based distributed healthcare management system for medical information and quantitative evaluation of microscopic images using a machine learning approach for malaria. In the proposed study, all the health-care centres are connected in a distributed computer network. Each peripheral centre manages its own health-care service independently and communicates with the central server for remote assistance. The proposed methodology for automated evaluation of parasites includes pre-processing of blood smear microscopic images followed by erythrocyte segmentation. To differentiate between parasite species, a total of 138 quantitative features characterising colour, morphology, and texture are extracted from segmented erythrocytes. An integrated pattern classification framework is designed in which four feature selection methods, viz. Correlation-based Feature Selection (CFS), Chi-square, Information Gain, and RELIEF, are each employed with three different classifiers, i.e. Naive Bayes, C4.5, and Instance-Based Learning (IB1). The optimal feature subset with the best classifier is selected to achieve maximum diagnostic precision. The proposed method achieved 99.2% sensitivity and 99.6% specificity by combining CFS and C4.5, outperforming the other combinations. Moreover, the web-based tool is entirely designed using open standards like Java for the web application, ImageJ for image processing, and WEKA for data mining, considering its feasibility in rural places with minimal health care facilities.
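
    A minimal sketch of one cell of the reported selector-classifier grid, pairing the Chi-square selector with an entropy-based decision tree as a stand-in for C4.5. The synthetic features and scikit-learn usage are assumptions for illustration, not the authors' code.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 138))          # 138 colour/morphology/texture features (toy)
y = rng.integers(0, 2, size=200)    # parasite class label (toy)

# Chi-square feature selection feeding an entropy-based tree (a C4.5 stand-in).
model = make_pipeline(
    SelectKBest(chi2, k=20),
    DecisionTreeClassifier(criterion="entropy", random_state=0),
)
print(cross_val_score(model, X, y, cv=5).mean())  # mean cross-validated accuracy
```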

  13. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.

    PubMed

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
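
    The time-continuous, delayed interactions between rate-based units described above can be illustrated with a toy integrator. The exponential-Euler step, network, and parameters below are illustrative assumptions, not the reference implementation in the paper.

```python
import numpy as np

def simulate_rates(W, tau=10.0, delay_steps=5, dt=0.1, t_max=100.0, seed=0):
    """Linear rate units: tau * dr/dt = -r + W @ r(t - d) + noise."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    steps = int(t_max / dt)
    r = np.zeros((steps + delay_steps, n))
    decay = np.exp(-dt / tau)
    for t in range(delay_steps, steps + delay_steps):
        drive = W @ r[t - delay_steps]            # delayed coupling
        noise = rng.normal(0.0, 0.01, size=n)
        # Exponential-Euler update toward the delayed input.
        r[t] = r[t - 1] * decay + (1.0 - decay) * (drive + noise)
    return r[delay_steps:]

W = np.array([[0.0, 0.5], [-0.5, 0.0]])  # toy 2-unit network
print(simulate_rates(W)[-1])              # final rates of both units
```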

  14. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator

    PubMed Central

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation. PMID:28596730

  15. The design and testing of a caring teaching model based on the theoretical framework of caring in the Chinese Context: a mixed-method study.

    PubMed

    Guo, Yujie; Shen, Jie; Ye, Xuchun; Chen, Huali; Jiang, Anli

    2013-08-01

    This paper aims to report the design and test the effectiveness of an innovative caring teaching model based on the theoretical framework of caring in the Chinese context. Since the 1970's, caring has been a core value in nursing education. In a previous study, a theoretical framework of caring in the Chinese context is explored employing a grounded theory study, considered beneficial for caring education. A caring teaching model was designed theoretically and a one group pre- and post-test quasi-experimental study was administered to test its effectiveness. From Oct, 2009 to Jul, 2010, a cohort of grade-2 undergraduate nursing students (n=64) in a Chinese medical school was recruited to participate in the study. Data were gathered through quantitative and qualitative methods to evaluate the effectiveness of the caring teaching model. The caring teaching model created an esthetic situation and experiential learning style for teaching caring that was integrated within the curricula. Quantitative data from the quasi-experimental study showed that the post-test scores of each item were higher than those on the pre-test (p<0.01). Thematic analysis of 1220 narratives from students' caring journals and reports of participant class observation revealed two main thematic categories, which reflected, from the students' points of view, the development of student caring character and the impact that the caring teaching model had on this regard. The model could be used as an integrated approach to teach caring in nursing curricula. It would also be beneficial for nursing administrators in cultivating caring nurse practitioners. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Psychosocial framework for understanding psychological distress among survivors of the November 26, 2008 Mumbai terror attack: beyond traumatic experiences and emergency medical care.

    PubMed

    Joseph, Jacquleen; Jaswal, Surinder

    2014-06-01

    The field of "Public Health in Disasters and Complex Emergencies" is replete with either epidemiological studies or studies in the area of hospital preparedness and emergency care. The field is dominated by hospital-based or emergency phase-related literature, with very little attention on long-term health and mental health consequences. The social science, or the public mental health perspective, too, is largely missing. It is in this context that the case report of the November 26, 2008 Mumbai terror attack survivors is presented to bring forth the multi-dimensional and dynamic long-term impacts, and their consequences for psychological well-being, two years after the incident. Based on literature, the report formulates a theoretical framework through which the lived experiences of the survivors is analyzed and understood from a social science perspective. This report is an outcome of the ongoing work with the survivors over a period of two years. A mixed methodology was used. It quantitatively captures the experience of 231 families following the attack, and also uses a self-reporting questionnaire (SRQ), SRQ20, to understand the psychological distress. In-depth qualitative case studies constructed from the process records and in-depth interviews focus on lived experiences of the survivors and explain the patterns emerging from the quantitative analysis. This report outlines the basic profile of the survivors, the immediate consequences of the attack, the support received, psychological consequences, and the key factors contributing to psychological distress. Through analysis of the key factors and the processes emerging from the lived experiences that explain the progression of vulnerability to psychological distress, this report puts forth a psychosocial framework for understanding psychological distress among survivors of the November 26, 2008 Mumbai terror attack.

  17. Conceptual framework for drought phenotyping during molecular breeding.

    PubMed

    Salekdeh, Ghasem Hosseini; Reynolds, Matthew; Bennett, John; Boyer, John

    2009-09-01

    Drought is a major threat to agricultural production and drought tolerance is a prime target for molecular approaches to crop improvement. To achieve meaningful results, these approaches must be linked with suitable phenotyping protocols at all stages, such as the screening of germplasm collections, mutant libraries, mapping populations, transgenic lines and breeding materials and the design of OMICS and quantitative trait loci (QTLs) experiments. Here we present a conceptual framework for molecular breeding for drought tolerance based on the Passioura equation of expressing yield as the product of water use (WU), water use efficiency (WUE) and harvest index (HI). We identify phenotyping protocols that address each of these factors, describe their key features and illustrate their integration with different molecular approaches.

  18. An Evaluation Framework and Comparative Analysis of the Widely Used First Programming Languages

    PubMed Central

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

    Computer programming is the core of computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as FPL at different times. Though the selection of an appropriate FPL is very important, yet it has been a controversial issue in the presence of many choices. Many efforts have been made for designing a good FPL, however, there is no ample way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate the existing imperative, and object oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework, and have also computed their suitability scores. PMID:24586449
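
    A customizable scoring function of the kind described can be sketched as a weighted sum over evaluation criteria; the criteria names, weights, and scores below are hypothetical, not the paper's actual rubric.

```python
# Hypothetical criteria scores (0-10) for candidate first programming languages.
criteria = {
    "Python": {"readability": 9, "tool_support": 8, "paradigm_coverage": 7},
    "Java":   {"readability": 6, "tool_support": 9, "paradigm_coverage": 8},
    "C":      {"readability": 4, "tool_support": 7, "paradigm_coverage": 5},
}

def suitability(scores, weights):
    """Customizable scoring: weighted sum normalized by total weight."""
    total = sum(weights.values())
    return sum(weights[c] * scores[c] for c in weights) / total

# An instructor can re-weight criteria to match local teaching goals.
weights = {"readability": 0.5, "tool_support": 0.3, "paradigm_coverage": 0.2}
for lang, scores in criteria.items():
    print(lang, round(suitability(scores, weights), 2))
```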

  19. An evaluation framework and comparative analysis of the widely used first programming languages.

    PubMed

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

    Computer programming is the core of computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as FPL at different times. Though the selection of an appropriate FPL is very important, yet it has been a controversial issue in the presence of many choices. Many efforts have been made for designing a good FPL, however, there is no ample way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate the existing imperative, and object oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework, and have also computed their suitability scores.

  20. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
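
    The reassessment quantity named above, the coefficient of variation of coating mass, is straightforward to compute and rank; the parameter settings and simulated per-tablet masses below are invented stand-ins for pan-coater simulation output.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated per-tablet coating mass (mg) under three parameter settings.
runs = {
    "low_spray_rate": rng.normal(10.0, 0.6, 500),
    "base_case":      rng.normal(10.0, 0.9, 500),
    "high_pan_speed": rng.normal(10.0, 0.4, 500),
}

def coefficient_of_variation(x):
    return np.std(x, ddof=1) / np.mean(x)

# Rank settings: higher CV = worse coating mass uniformity = higher risk.
ranked = sorted(((k, coefficient_of_variation(v)) for k, v in runs.items()),
                key=lambda kv: -kv[1])
for name, cv in ranked:
    print(f"{name}: CV = {cv:.3f}")
```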

  1. A systematic review on how to conduct evaluations in community-based rehabilitation.

    PubMed

    Grandisson, Marie; Hébert, Michèle; Thibeault, Rachel

    2014-01-01

    Community-based rehabilitation (CBR) must prove that it is making a significant difference for people with disabilities in low- and middle-income countries. Yet, evaluation is not a common practice and the evidence for its effectiveness is fragmented and largely insufficient. The objective of this article was to review the literature on best practices in program evaluation in CBR in relation to the evaluative process, the frameworks, and the methods of data collection. A systematic search was conducted on five rehabilitation databases and the World Health Organization website with keywords associated with CBR and program evaluation. Two independent researchers selected the articles. Twenty-two documents were included. The results suggest that (1) the evaluative process needs to be conducted in close collaboration with the local community, including people with disabilities, and to be followed by sharing the findings and taking actions, (2) many frameworks have been proposed to evaluate CBR but no agreement has been reached, and (3) qualitative methodologies have dominated the scene in CBR so far, but their combination with quantitative methods has a lot of potential to better capture the effectiveness of this strategy. In order to facilitate and improve evaluations in CBR, there is an urgent need to agree on a common framework, such as the CBR matrix, and to develop best practice guidelines based on the literature available and consensus among a group of experts. These will need to demonstrate a good balance between community development and standards for effective evaluations. Implications for Rehabilitation: In the quest for evidence of the effectiveness of community-based rehabilitation (CBR), a shared program evaluation framework would better enable the combination of findings from different studies. The evaluation of CBR programs should always include sharing findings and taking action for the sake of the local community. Although qualitative methodologies have dominated the scene in CBR and remain highly relevant, there is also a call for the inclusion of quantitative indicators in order to capture the progress made by people participating in CBR programs. The production of best practice guidelines for evaluation in CBR could foster accountable and empowering program evaluations that are congruent with the principles at the heart of CBR and the standards for effective evaluations.

  2. A systematic review on how to conduct evaluations in community-based rehabilitation

    PubMed Central

    Hébert, Michèle; Thibeault, Rachel

    2014-01-01

    Purpose: Community-based rehabilitation (CBR) must prove that it is making a significant difference for people with disabilities in low- and middle-income countries. Yet, evaluation is not a common practice and the evidence for its effectiveness is fragmented and largely insufficient. The objective of this article was to review the literature on best practices in program evaluation in CBR in relation to the evaluative process, the frameworks, and the methods of data collection. Method: A systematic search was conducted on five rehabilitation databases and the World Health Organization website with keywords associated with CBR and program evaluation. Two independent researchers selected the articles. Results: Twenty-two documents were included. The results suggest that (1) the evaluative process needs to be conducted in close collaboration with the local community, including people with disabilities, and to be followed by sharing the findings and taking actions, (2) many frameworks have been proposed to evaluate CBR but no agreement has been reached, and (3) qualitative methodologies have dominated the scene in CBR so far, but their combination with quantitative methods has a lot of potential to better capture the effectiveness of this strategy. Conclusions: In order to facilitate and improve evaluations in CBR, there is an urgent need to agree on a common framework, such as the CBR matrix, and to develop best practice guidelines based on the literature available and consensus among a group of experts. These will need to demonstrate a good balance between community development and standards for effective evaluations. Implications for Rehabilitation: In the quest for evidence of the effectiveness of community-based rehabilitation (CBR), a shared program evaluation framework would better enable the combination of findings from different studies. The evaluation of CBR programs should always include sharing findings and taking action for the sake of the local community. Although qualitative methodologies have dominated the scene in CBR and remain highly relevant, there is also a call for the inclusion of quantitative indicators in order to capture the progress made by people participating in CBR programs. The production of best practice guidelines for evaluation in CBR could foster accountable and empowering program evaluations that are congruent with the principles at the heart of CBR and the standards for effective evaluations. PMID:23614357

  3. Customized Body Mapping to Facilitate the Ergonomic Design of Sportswear.

    PubMed

    Cao, Mingliang; Li, Yi; Guo, Yueping; Yao, Lei; Pan, Zhigeng

    2016-01-01

    A successful high-performance sportswear design that considers human factors should result in a significant increase in thermal comfort and reduce energy loss. The authors describe a body-mapping approach that facilitates the effective ergonomic design of sportswear. Their general framework can be customized based on the functional requirements of various sports and sportswear, the desired combination and selection of mapping areas for the human body, and customized quantitative data distribution of target physiological indicators.

  4. Integrating prediction, provenance, and optimization into high energy workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schram, M.; Bansal, V.; Friese, R. D.

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.

  5. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela; Bond, Nathaniel

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.

  6. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods.

    PubMed

    Pommier, Jeanine; Guével, Marie-Renée; Jourdan, Didier

    2010-01-28

    Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community.

  7. Effects of environmental change on agriculture, nutrition and health: A framework with a focus on fruits and vegetables

    PubMed Central

    Tuomisto, Hanna L.; Scheelbeek, Pauline F.D.; Chalabi, Zaid; Green, Rosemary; Smith, Richard D.; Haines, Andy; Dangour, Alan D.

    2017-01-01

    Environmental changes are likely to affect agricultural production over the coming decades. The interactions between environmental change, agricultural yields and crop quality, and the critical pathways to future diets and health outcomes are largely undefined. There are currently no quantitative models to test the impact of multiple environmental changes on nutrition and health outcomes. Using an interdisciplinary approach, we developed a framework to link the multiple interactions between environmental change, agricultural productivity and crop quality, population-level food availability, dietary intake and health outcomes, with a specific focus on fruits and vegetables. The main components of the framework consist of: i) socio-economic and societal factors, ii) environmental change stressors, iii) interventions and policies, iv) food system activities, v) food and nutrition security, and vi) health and well-being outcomes. The framework, based on currently available evidence, provides an overview of the multidimensional and complex interactions with feedback between environmental change, production of fruits and vegetables, diets and health, and forms the analytical basis for future modelling and scenario testing. PMID:29511740

  8. Structural Determinants for Naturally Evolving H5N1 Hemagglutinin to Switch its Receptor Specificity

    PubMed Central

    Tharakaraman, Kannan; Raman, Rahul; Viswanathan, Karthik; Stebbins, Nathan W.; Jayaraman, Akila; Krishnan, Arvind; Sasisekharan, V.; Sasisekharan, Ram

    2013-01-01

    Of the factors governing human-to-human transmission of the highly pathogenic avian-adapted H5N1 virus, the most critical is the acquisition of mutations on the viral hemagglutinin (HA) to “quantitatively switch” its binding from avian to human glycan receptors. Herein, we describe a structural framework that outlines a necessary set of H5 HA receptor binding site (RBS) features required for the H5 HA to quantitatively switch its preference to human receptors. We show here that the same RBS HA mutations that lead to aerosol transmission of A/Vietnam/1203/04 and A/Indonesia/5/05 viruses, when introduced in currently circulating H5N1, do not lead to quantitative switch in receptor preference. We demonstrate that HAs from circulating clades require as few as a single base-pair mutation to quantitatively switch their binding to human receptors. The mutations identified by this study can be used to monitor the emergence of strains having human-to-human transmission potential. PMID:23746829

  9. Students' Partitive Reasoning

    ERIC Educational Resources Information Center

    Norton, Anderson; Wilkins, Jesse L. M.

    2010-01-01

    In building models of students' fractions knowledge, two prominent frameworks have arisen: Kieren's rational number subconstructs, and Steffe's fractions schemes. The purpose of this paper is to clarify and reconcile aspects of those frameworks through a quantitative analysis. In particular, we focus on the measurement subconstruct and the…

  10. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  11. Community readiness for adopting mHealth in rural Bangladesh: A qualitative exploration.

    PubMed

    Khatun, Fatema; Heywood, Anita E; Ray, Pradeep K; Bhuiya, Abbas; Liaw, Siaw-Teng

    2016-09-01

    There are increasing numbers of mHealth initiatives in middle and low income countries aimed at improving health outcomes. Bangladesh is no exception with more than 20 mobile health (mHealth) initiatives in place. A recent study in Bangladesh examined community readiness for mHealth using a framework based on quantitative data. Given the importance of a framework and the complementary role of qualitative exploration, this paper presents data from a qualitative study which complements findings from the quantitative study. The study was conducted in the Chakaria sub-district of Bangladesh. In total, 37 in-depth interviews were conducted between December 2012 and March 2013. Participants included the general public, students, community leaders, school teachers, and formal and informal healthcare providers. Thematic analysis was used to develop a logical and relevant framework to examine community readiness. As in the quantitative exploration, this study approached the investigation with four types of readiness in mind: core readiness, technological readiness, human resource readiness and motivational readiness. Community members, community leaders and healthcare providers expressed their interest in the use of mHealth in rural Bangladesh. Awareness of mHealth and its advantages was low among uneducated people. Participants who have used mHealth were attracted to the speed of access to qualified healthcare providers, time savings and low cost. Some participants did not see the value of using mobile phones for healthcare compared to a face-to-face consultation. Illiteracy, lack of English language proficiency, lack of trust and technological incapability were identified as barriers to mHealth use. However, a sense of ownership, evidence of utility, a positive attitude to the use of mHealth, and intentions towards future use of mHealth were driving forces in the adoption of mHealth services. This study re-affirmed the mHealth readiness conceptual framework with different dimensions of readiness and identified potential barriers and possible solutions for mHealth. Moving forward, emphasis should be placed on training users, providing low-cost services and improving trust of users. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. A framework for community ownership of a text messaging programme to improve adherence to antiretroviral therapy and client-provider communication: a mixed methods study.

    PubMed

    Mbuagbaw, Lawrence; Bonono-Momnougui, Renee-Cecile; Thabane, Lehana; Kouanfack, Charles; Smieja, Marek; Ongolo-Zogo, Pierre

    2014-09-26

    Mobile phone text messaging has been shown to improve adherence to antiretroviral therapy and to improve communication between patients and health care workers. It is unclear which strategies are most appropriate for scaling up text messaging programmes. We sought to investigate acceptability and readiness for ownership (community members designing, sending and receiving text messages) of a text message programme among a community of clients living with human immunodeficiency virus (HIV) in Yaoundé, Cameroon and to develop a framework for implementation. We used the mixed-methods sequential exploratory design. In the qualitative strand we conducted 7 focus group discussions (57 participants) to elicit themes related to acceptability and readiness. In the quantitative strand we explored the generalizability of these themes in a survey of 420 clients. Qualitative and quantitative data were merged to generate meta-inferences. Both qualitative and quantitative strands showed high levels of acceptability and readiness despite low rates of participation in other community-led projects. In the qualitative strand, compared to the quantitative strand, more potential service users were willing to pay for a text messaging service, preferred participation of health personnel in managing the project and preferred that the project be based in the hospital rather than in the community. Some of the limitations identified to implementing a community-owned project were lack of management skills in the community, financial, technical and literacy challenges. Participants who were willing to pay were more likely to find the project acceptable and expressed positive feelings about community readiness to own a text messaging project. Community ownership of a text messaging programme is acceptable to the community of clients at the Yaoundé Central Hospital. Our framework for implementation includes components for community members who take on roles as services users (demonstrating clear benefits, allowing a trial period and ensuring high levels of confidentiality) or service providers (training in project management and securing sustainable funding). Such a project can be evaluated using participation rate, clinical outcomes, satisfaction with the service, cost and feedback from users.

  13. A unified material decomposition framework for quantitative dual- and triple-energy CT imaging.

    PubMed

    Zhao, Wei; Vernekohl, Don; Han, Fei; Han, Bin; Peng, Hao; Yang, Yong; Xing, Lei; Min, James K

    2018-04-21

    Many clinical applications depend critically on the accurate differentiation and classification of different types of materials in patient anatomy. This work introduces a unified framework for accurate nonlinear material decomposition and applies it, for the first time, in the concept of triple-energy CT (TECT) for enhanced material differentiation and classification as well as dual-energy CT (DECT). We express polychromatic projection into a linear combination of line integrals of material-selective images. The material decomposition is then turned into a problem of minimizing the least-squares difference between measured and estimated CT projections. The optimization problem is solved iteratively by updating the line integrals. The proposed technique is evaluated by using several numerical phantom measurements under different scanning protocols. The triple-energy data acquisition is implemented at the scales of micro-CT and clinical CT imaging with commercial "TwinBeam" dual-source DECT configuration and a fast kV switching DECT configuration. Material decomposition and quantitative comparison with a photon counting detector and with the presence of a bow-tie filter are also performed. The proposed method provides quantitative material- and energy-selective images examining realistic configurations for both DECT and TECT measurements. Compared to the polychromatic kV CT images, virtual monochromatic images show superior image quality. For the mouse phantom, quantitative measurements show that the differences between gadodiamide and iodine concentrations obtained using TECT and idealized photon counting CT (PCCT) are smaller than 8 and 1 mg/mL, respectively. TECT outperforms DECT for multicontrast CT imaging and is robust with respect to spectrum estimation. For the thorax phantom, the differences between the concentrations of the contrast map and the corresponding true reference values are smaller than 7 mg/mL for all of the realistic configurations. A unified framework for both DECT and TECT imaging has been established for the accurate extraction of material compositions using currently available commercial DECT configurations. The novel technique is promising to provide an urgently needed solution for several CT-based diagnostic and therapy applications, especially for the diagnosis of cardiovascular and abdominal diseases where multicontrast imaging is involved. © 2018 American Association of Physicists in Medicine.
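
    At its core, the decomposition minimizes the least-squares difference between measured and estimated projections. A deliberately simplified two-material, monochromatic sketch follows; the attenuation values are invented, and the paper's actual model is polychromatic and nonlinear with iterative updates of the line integrals.

```python
import numpy as np
from scipy.optimize import least_squares

# Effective attenuation (1/cm) of water and iodine at two beam energies (toy).
mu = np.array([[0.20, 4.00],    # low-kV:  [water, iodine]
               [0.18, 2.00]])   # high-kV: [water, iodine]

true_thickness = np.array([20.0, 0.05])  # cm of water, cm of iodine
measured = mu @ true_thickness           # noiseless line integrals

def residual(t):
    """Difference between estimated and measured projections."""
    return mu @ t - measured

sol = least_squares(residual, x0=np.array([10.0, 0.01]), bounds=(0, np.inf))
print(sol.x)  # recovers ~[20.0, 0.05]
```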

  14. Feasibility of the Simultaneous Determination of Monomer Concentrations and Particle Size in Emulsion Polymerization Using in Situ Raman Spectroscopy

    PubMed Central

    2015-01-01

    An immersion Raman probe was used in emulsion copolymerization reactions to measure monomer concentrations and particle sizes. Quantitative determination of monomer concentrations is feasible in two-monomer copolymerizations, but only the overall conversion could be measured by Raman spectroscopy in a four-monomer copolymerization. The feasibility of measuring monomer conversion and particle size was established using partial least-squares (PLS) calibration models. A simplified theoretical framework for the measurement of particle sizes from photon scattering is presented, based on the elastic-sphere-vibration and surface-tension models. PMID:26900256
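
    A hedged sketch of a PLS calibration of the kind used here, mapping spectra to two monomer concentrations. The synthetic band shapes, noise level, and scikit-learn usage are assumptions for illustration, not the authors' calibration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic Raman spectra: two latent monomer bands plus noise.
n_samples, n_channels = 60, 300
conc = rng.uniform(0.0, 1.0, size=(n_samples, 2))  # two monomer concentrations
chan = np.arange(n_channels)
band1 = np.exp(-0.5 * ((chan - 100) / 8) ** 2)
band2 = np.exp(-0.5 * ((chan - 200) / 8) ** 2)
spectra = conc @ np.vstack([band1, band2]) + rng.normal(0, 0.01, (n_samples, n_channels))

# Fit a two-component PLS model and report the calibration fit quality.
pls = PLSRegression(n_components=2).fit(spectra, conc)
print(pls.score(spectra, conc))  # R^2 close to 1 on this toy problem
```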

  15. Study protocol: Implementation of the Veder contact method (VCM) in daily nursing home care for people with dementia: an evaluation based on the RE-AIM framework.

    PubMed

    Boersma, Petra; Van Weert, Julia C M; van Meijel, Berno; van de Ven, Peter M; Dröes, Rose-Marie

    2017-07-01

    People with dementia in nursing homes benefit from person-centred care methods. Studies examining the effect of these methods often fail to report about the implementation of these methods. The present study aims to describe the implementation of the Veder contact method (VCM) in daily nursing home care. A process analysis will be conducted based on qualitative data from focus groups with caregivers and interviews with key figures. To investigate whether the implementation of VCM is reflected in the attitude and behaviour of caregivers and in the behaviour and quality of life of people with dementia, a controlled observational cohort study will be conducted. Six nursing home wards implementing VCM will be compared with six control wards providing Care As Usual. Quantitative data from caregivers and residents will be collected before (T0), and 9-12 months after the implementation (T1). Qualitative analysis and multilevel analyses will be carried out on the collected data and structured based on the constructs of the RE-AIM framework (Reach, Effectiveness, Adoption, Implementation, Maintenance). By using the RE-AIM framework this study introduces a structured and comprehensive way of investigating the implementation process and implementation effectiveness of person-centred care methods in daily dementia care.

  16. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  17. Development and application of the RE-AIM QuEST mixed methods framework for program evaluation.

    PubMed

    Forman, Jane; Heisler, Michele; Damschroder, Laura J; Kaselitz, Elizabeth; Kerr, Eve A

    2017-06-01

    To increase the likelihood of successful implementation of interventions and promote dissemination across real-world settings, it is essential to evaluate outcomes related to dimensions other than Effectiveness alone. Glasgow and colleagues' RE-AIM framework specifies four additional types of outcomes that are important to decision-makers: Reach, Adoption, Implementation (including cost), and Maintenance. To further strengthen RE-AIM, we propose integrating qualitative assessments in an expanded framework: RE-AIM Qualitative Evaluation for Systematic Translation (RE-AIM QuEST), a mixed methods framework. RE-AIM QuEST guides formative evaluation to identify real-time implementation barriers and explain how implementation context may influence translation to additional settings. RE-AIM QuEST was used to evaluate a pharmacist-led hypertension management intervention at 3 VA facilities in 2008-2009. We systematically reviewed each of the five RE-AIM dimensions and created open-ended companion questions to quantitative measures and identified qualitative and quantitative data sources, measures, and analyses. To illustrate use of the RE-AIM QuEST framework, we provide examples of real-time, coordinated use of quantitative process measures and qualitative methods to identify site-specific issues, and retrospective use of these data sources and analyses to understand variation across sites and explain outcomes. For example, in the Reach dimension, we conducted real-time measurement of enrollment across sites and used qualitative data to better understand and address barriers at a low-enrollment site. The RE-AIM QuEST framework may be a useful tool for improving interventions in real-time, for understanding retrospectively why an intervention did or did not work, and for enhancing its sustainability and translation to other settings.

  18. Determining Selection across Heterogeneous Landscapes: A Perturbation-Based Method and Its Application to Modeling Evolution in Space.

    PubMed

    Wickman, Jonas; Diehl, Sebastian; Blasius, Bernd; Klausmeier, Christopher A; Ryabov, Alexey B; Brännström, Åke

    2017-04-01

    Spatial structure can decisively influence the way evolutionary processes unfold. To date, several methods have been used to study evolution in spatial systems, including population genetics, quantitative genetics, moment-closure approximations, and individual-based models. Here we extend the study of spatial evolutionary dynamics to eco-evolutionary models based on reaction-diffusion equations and adaptive dynamics. Specifically, we derive expressions for the strength of directional and stabilizing/disruptive selection that apply both in continuous space and to metacommunities with symmetrical dispersal between patches. For directional selection on a quantitative trait, this yields a way to integrate local directional selection across space and determine whether the trait value will increase or decrease. The robustness of this prediction is validated against quantitative genetics. For stabilizing/disruptive selection, we show that spatial heterogeneity always contributes to disruptive selection and hence always promotes evolutionary branching. The expression for directional selection is numerically very efficient and hence lends itself to simulation studies of evolutionary community assembly. We illustrate the application and utility of the expressions for this purpose with two examples of the evolution of resource utilization. Finally, we outline the domain of applicability of reaction-diffusion equations as a modeling framework and discuss their limitations.

  19. Refining Intervention Targets in Family-Based Research: Lessons From Quantitative Behavioral Genetics

    PubMed Central

    Leve, Leslie D.; Harold, Gordon T.; Ge, Xiaojia; Neiderhiser, Jenae M.; Patterson, Gerald

    2010-01-01

    The results from a large body of family-based research studies indicate that modifying the environment (specifically dimensions of the social environment) through intervention is an effective mechanism for achieving positive outcomes. Parallel to this work is a growing body of evidence from genetically informed studies indicating that social environmental factors are central to enhancing or offsetting genetic influences. Increased precision in the understanding of the role of the social environment in offsetting genetic risk might provide new information about environmental mechanisms that could be applied to prevention science. However, at present, the multifaceted conceptualization of the environment in prevention science is mismatched with the more limited measurement of the environment in many genetically informed studies. A framework for translating quantitative behavioral genetic research to inform the development of preventive interventions is presented in this article. The measurement of environmental indices amenable to modification is discussed within the context of quantitative behavioral genetic studies. In particular, emphasis is placed on the necessary elements that lead to benefits in prevention science, specifically the development of evidence-based interventions. An example from an ongoing prospective adoption study is provided to illustrate the potential of this translational process to inform the selection of preventive intervention targets. PMID:21188273

  20. A proposed framework for assessing risk from less-than-lifetime exposures to carcinogens.

    PubMed

    Felter, Susan P; Conolly, Rory B; Bercu, Joel P; Bolger, P Michael; Boobis, Alan R; Bos, Peter M J; Carthew, Philip; Doerrer, Nancy G; Goodman, Jay I; Harrouk, Wafa A; Kirkland, David J; Lau, Serrine S; Llewellyn, G Craig; Preston, R Julian; Schoeny, Rita; Schnatter, A Robert; Tritscher, Angelika; van Velsen, Frans; Williams, Gary M

    2011-07-01

    Quantitative methods for estimation of cancer risk have been developed for daily, lifetime human exposures. There are a variety of studies or methodologies available to address less-than-lifetime exposures. However, a common framework for evaluating risk from less-than-lifetime exposures (including short-term and/or intermittent exposures) does not exist, which could result in inconsistencies in risk assessment practice. To address this risk assessment need, a committee of the International Life Sciences Institute (ILSI) Health and Environmental Sciences Institute conducted a multisector workshop in late 2009 to discuss available literature, different methodologies, and a proposed framework. The proposed framework provides a decision tree and guidance for cancer risk assessments for less-than-lifetime exposures based on current knowledge of mode of action and dose-response. Available data from rodent studies and epidemiological studies involving less-than-lifetime exposures are considered, in addition to statistical approaches described in the literature for evaluating the impact of changing the dose rate and exposure duration for exposure to carcinogens. The decision tree also provides for scenarios in which an assumption of potential carcinogenicity is appropriate (e.g., based on structural alerts or genotoxicity data), but bioassay or other data are lacking from which a chemical-specific cancer potency can be determined. This paper presents an overview of the rationale for the workshop, reviews historical background, describes the proposed framework for assessing less-than-lifetime exposures to potential human carcinogens, and suggests next steps.

  1. The SAM framework: modeling the effects of management factors on human behavior in risk analysis.

    PubMed

    Murphy, D M; Paté-Cornell, M E

    1996-08-01

    Complex engineered systems, such as nuclear reactors and chemical plants, have the potential for catastrophic failure with disastrous consequences. In recent years, human and management factors have been recognized as frequent root causes of major failures in such systems. However, classical probabilistic risk analysis (PRA) techniques do not account for the underlying causes of these errors because they focus on the physical system and do not explicitly address the link between components' performance and organizational factors. This paper describes a general approach for addressing the human and management causes of system failure, called the SAM (System-Action-Management) framework. Beginning with a quantitative risk model of the physical system, SAM expands the scope of analysis to incorporate first the decisions and actions of individuals that affect the physical system. SAM then links management factors (incentives, training, policies and procedures, selection criteria, etc.) to those decisions and actions. The focus of this paper is on four quantitative models of action that describe this last relationship. These models address the formation of intentions for action and their execution as a function of the organizational environment. Intention formation is described by three alternative models: a rational model, a bounded rationality model, and a rule-based model. The execution of intentions is then modeled separately. These four models are designed to assess the probabilities of individual actions from the perspective of management, thus reflecting the uncertainties inherent to human behavior. The SAM framework is illustrated for a hypothetical case of hazardous materials transportation. This framework can be used as a tool to increase the safety and reliability of complex technical systems by modifying the organization, rather than, or in addition to, re-designing the physical system.
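
    The SAM chain can be caricatured as nested conditional probabilities: management factors shift the probability of operator actions, which in turn shift the system failure probability. All functional forms and numbers below are illustrative assumptions, not the paper's calibrated models.

```python
def p_correct_action(training_quality, time_pressure):
    """Bounded-rationality stand-in: better training raises, time pressure
    lowers, the probability that the intended correct action is executed."""
    p = 0.90 + 0.08 * training_quality - 0.15 * time_pressure
    return max(0.0, min(1.0, p))

def p_system_failure(p_correct,
                     p_fail_given_correct=1e-5,
                     p_fail_given_error=1e-2):
    """Total-probability decomposition over the operator's action."""
    return (p_correct * p_fail_given_correct
            + (1.0 - p_correct) * p_fail_given_error)

# Compare a well-managed scenario against a poorly managed one.
for training, pressure in [(1.0, 0.2), (0.2, 0.8)]:
    pa = p_correct_action(training, pressure)
    print(f"training={training}, pressure={pressure}: "
          f"P(failure) = {p_system_failure(pa):.2e}")
```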

  2. Neurobiological model of stimulated dopamine neurotransmission to interpret fast-scan cyclic voltammetry data.

    PubMed

    Harun, Rashed; Grassi, Christine M; Munoz, Miranda J; Torres, Gonzalo E; Wagner, Amy K

    2015-03-02

    Fast-scan cyclic voltammetry (FSCV) is an electrochemical method that can assess real-time in vivo dopamine (DA) concentration changes to study the kinetics of DA neurotransmission. Electrical stimulation of dopaminergic (DAergic) pathways can elicit FSCV DA responses that largely reflect a balance of DA release and reuptake. Interpretation of these evoked DA responses requires a framework to discern the contribution of DA release and reuptake. The current, widely implemented interpretive framework for doing so is the Michaelis-Menten (M-M) model, which is grounded on two assumptions- (1) DA release rate is constant during stimulation, and (2) DA reuptake occurs through dopamine transporters (DAT) in a manner consistent with M-M enzyme kinetics. Though the M-M model can simulate evoked DA responses that rise convexly, response types that predominate in the ventral striatum, the M-M model cannot simulate dorsal striatal responses that rise concavely. Based on current neurotransmission principles and experimental FSCV data, we developed a novel, quantitative, neurobiological framework to interpret DA responses that assumes DA release decreases exponentially during stimulation and continues post-stimulation at a diminishing rate. Our model also incorporates dynamic M-M kinetics to describe DA reuptake as a process of decreasing reuptake efficiency. We demonstrate that this quantitative, neurobiological model is an extension of the traditional M-M model that can simulate heterogeneous regional DA responses following manipulation of stimulation duration, frequency, and DA pharmacology. The proposed model can advance our interpretive framework for future in vivo FSCV studies examining regional DA kinetics and their alteration by disease and DA pharmacology. Copyright © 2015 Elsevier B.V. All rights reserved.
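
    The described model reduces to a simple ODE during stimulation, dC/dt = R0·exp(-k·t) - Vmax·C/(Km + C). A hedged Euler-integration sketch with invented parameters follows; for simplicity the post-stimulation release is set to zero here, whereas the paper has it continue at a diminishing rate.

```python
import numpy as np

def simulate_da(t_stim=2.0, dt=0.001, t_max=6.0,
                r0=2.0, k=1.5, vmax=4.0, km=0.2):
    """dC/dt = release(t) - Vmax*C/(Km + C); release decays exponentially
    during stimulation and is zero afterwards (simplified)."""
    steps = int(t_max / dt)
    c = np.zeros(steps)
    for i in range(1, steps):
        t = i * dt
        release = r0 * np.exp(-k * t) if t <= t_stim else 0.0
        reuptake = vmax * c[i - 1] / (km + c[i - 1])
        c[i] = max(0.0, c[i - 1] + dt * (release - reuptake))
    return c

c = simulate_da()
print(f"peak [DA] = {c.max():.2f} uM at t = {c.argmax() * 0.001:.2f} s")
```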

  3. 'CHEATS': a generic information communication technology (ICT) evaluation framework.

    PubMed

    Shaw, Nicola T

    2002-05-01

    This paper describes a generic framework for the evaluation of information communication technologies. This framework, CHEATS, utilises both qualitative and quantitative research methods and has proved appropriate in multiple clinical settings, including telepsychiatry, teledermatology and teleeducation. The paper demonstrates how a multidisciplinary approach is essential when evaluating new and emerging technologies, particularly when such systems are implemented in real service settings as opposed to research settings.

  4. Using an Integrated, Multi-disciplinary Framework to Support Quantitative Microbial Risk Assessments

    EPA Science Inventory

    The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) provides the infrastructure to link disparate models and databases seamlessly, giving an assessor the ability to construct an appropriate conceptual site model from a host of modeling choices, so a numbe...

  5. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    NASA Astrophysics Data System (ADS)

    Han, Xue; Sandels, Claes; Zhu, Kun; Nordström, Lars

    2013-08-01

    It has frequently been claimed that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape future distribution grid operation in numerous ways. It is therefore necessary to introduce a framework to measure the extent to which power system operation will be changed by various DER parameters. This article proposes a modelling framework for an overview analysis of the correlation between DERs. Furthermore, to validate the framework, the authors describe reference models for different categories of DERs with their unique characteristics, comprising distributed generation, active demand and electric vehicles. Subsequently, a quantitative analysis was made on the basis of current and envisioned DER deployment scenarios proposed for Sweden. Simulations were performed in two typical distribution network models across four seasons. The simulation results show that, in general, DER deployment makes it possible to reduce power losses and voltage drops by compensating with power from local generation and optimizing local load profiles.

  6. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  7. MO-D-213-06: Quantitative Image Quality Metrics Are for Physicists, Not Radiologists: How to Communicate to Your Radiologists Using Their Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szczykutowicz, T; Rubert, N; Ranallo, F

    Purpose: A framework for explaining differences in image quality to non-technical audiences in medical imaging is needed. Currently, this task is something that is learned "on the job." The lack of a formal methodology for communicating optimal acquisition parameters into the clinic effectively mitigates many technological advances. As a community, medical physicists need to be held responsible not only for advancing image science, but also for ensuring its proper use in the clinic. This work outlines a framework that bridges the gap between the results from quantitative image quality metrics like detectability, MTF, and NPS and their effect on specific anatomical structures present in diagnostic imaging tasks. Methods: Specific structures of clinical importance were identified for a body, an extremity, a chest, and a temporal bone protocol. Using these structures, quantitative metrics were used to identify the parameter space that should yield optimal image quality, constrained within the confines of clinical logistics and dose considerations. The reading room workflow for presenting the proposed changes for imaging each of these structures is presented. The workflow consists of displaying images for physician review consisting of different combinations of acquisition parameters guided by quantitative metrics. Examples of using detectability index, MTF, NPS, noise, and noise non-uniformity are provided. During review, the physician was forced to judge the image quality solely on those features they need for diagnosis, not on the overall "look" of the image. Results: We found that in many cases, use of this framework settled disagreements between physicians. Once forced to judge images on the ability to detect specific structures, inter-reader agreement was obtained. Conclusion: This framework will provide consulting, research/industrial, or in-house physicists with clinically relevant imaging tasks to guide reading room image review. This framework avoids use of the overall "look" or "feel" to dictate acquisition parameter selection. Equipment grants: GE Healthcare.

  8. Brain tumor classification and segmentation using sparse coding and dictionary learning.

    PubMed

    Salman Al-Shaikhli, Saif Dawood; Yang, Michael Ying; Rosenhahn, Bodo

    2016-08-01

    This paper presents a novel fully automatic framework for multi-class brain tumor classification and segmentation using a sparse coding and dictionary learning method. The proposed framework consists of two steps: classification and segmentation. The classification of the brain tumors is based on brain topology and texture. The segmentation is based on voxel values of the image data. Using K-SVD, two types of dictionaries are learned from the training data and their associated ground truth segmentation: a feature dictionary and voxel-wise coupled dictionaries. The feature dictionary consists of global image features (topological and texture features). The coupled dictionaries consist of coupled information: grayscale voxel values of the training image data and the associated label voxel values of the ground truth segmentation of the training data. The proposed framework is evaluated quantitatively using several metrics. The segmentation results on the brain tumor segmentation (MICCAI-BraTS-2013) database are evaluated using five different metric scores, which are computed using the online evaluation tool provided by the BraTS-2013 challenge organizers. Experimental results demonstrate that the proposed approach achieves accurate brain tumor classification and segmentation and outperforms state-of-the-art methods.
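
    As an illustration of the sparse coding step, consider the hedged scikit-learn sketch below. Note that K-SVD itself is not shipped by scikit-learn; MiniBatchDictionaryLearning is used here only as a comparable dictionary learner, with orthogonal matching pursuit producing the sparse codes. The array shapes and data are synthetic placeholders, not BraTS features.

```python
# Illustrative sketch (not the paper's K-SVD pipeline): learn a dictionary from
# feature vectors, then sparse-code a new sample against it with OMP.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 64))   # stand-in for per-image feature vectors

dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0)
D = dico.fit(X_train).components_      # learned dictionary atoms, shape (32, 64)

x_new = rng.normal(size=(1, 64))       # a new sample to encode
# OMP solves min ||x - D.T @ w|| with a sparsity constraint on w
code = orthogonal_mp(D.T, x_new.T, n_nonzero_coefs=5)   # sparse coefficients
```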

  9. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether a process meets their quantitative quality requirements. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.
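
    The time-series step of the pipeline is straightforward to sketch. The following is an assumed workflow, not the authors' code: fit an ARMA model (an ARIMA with d = 0 in statsmodels) to a synthetic response-time series and convert the forecasts into firing rates for the Petri-net stage.

```python
# Sketch of the ARMA step: fit historical response times, forecast, and derive
# firing rates as reciprocal response times. Data are synthetic.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
resp_times = 120 + 0.1 * np.cumsum(rng.normal(0, 2, size=200))  # ms, synthetic

model = ARIMA(resp_times, order=(2, 0, 1))   # ARMA(2,1) == ARIMA with d = 0
fit = model.fit()
forecast = fit.forecast(steps=10)            # predicted response times (ms)
firing_rates = 1.0 / forecast                # rate = 1 / mean response time
```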

  10. New Zealand evidence for the impact of primary healthcare investment in Capital and Coast District Health Board.

    PubMed

    Tan, Lee; Carr, Julia; Reidy, Johanna

    2012-03-30

    This paper provides New Zealand evidence on the effectiveness of primary care investment, measured through the Capital and Coast District Health Board's (DHB) Primary Health Care Framework. The Framework was developed in 2002/2003 to guide funding decisions at a DHB level, and to provide a transparent basis for evaluation of the implementation of the Primary Health Care Strategy in this district. The Framework used a mixed-method approach; analysis was based on quantitative and qualitative data. This article demonstrates the link between investment in primary health care, increased access to primary care for high-need populations, workforce redistribution, and improved health outcomes. Over the study period, ambulatory-sensitive hospitalisations and emergency department use reduced for enrolled populations, and the District's immunisation coverage improved markedly. Funding and contracting which enhanced both 'mainstream' and 'niche' providers, combined with community-based health initiatives, resulted in a measurable impact on a range of health indicators and inequalities. Maori primary care providers improved access not only for Maori but also for their enrolled populations of Pacific and Other ethnicity. Growth and redistribution of the primary care workforce was observed, improving the availability of general practitioners, nurses, and community workers in poorer communities.

  11. A Semiparametric Approach for Composite Functional Mapping of Dynamic Quantitative Traits

    PubMed Central

    Yang, Runqing; Gao, Huijiang; Wang, Xin; Zhang, Ji; Zeng, Zhao-Bang; Wu, Rongling

    2007-01-01

    Functional mapping has emerged as a powerful tool for mapping quantitative trait loci (QTL) that control developmental patterns of complex dynamic traits. Original functional mapping has been constructed within the context of simple interval mapping, without consideration of separate multiple linked QTL for a dynamic trait. In this article, we present a statistical framework for mapping QTL that affect dynamic traits by capitalizing on the strengths of functional mapping and composite interval mapping. Within this so-called composite functional-mapping framework, functional mapping models the time-dependent genetic effects of a QTL tested within a marker interval using a biologically meaningful parametric function, whereas composite interval mapping models the time-dependent genetic effects of the markers outside the test interval to control the genome background using a flexible nonparametric approach based on Legendre polynomials. Such a semiparametric framework was formulated by a maximum-likelihood model and implemented with the EM algorithm, allowing for the estimation and the test of the mathematical parameters that define the QTL effects and the regression coefficients of the Legendre polynomials that describe the marker effects. Simulation studies were performed to investigate the statistical behavior of composite functional mapping and compare its advantage in separating multiple linked QTL as compared to functional mapping. We used the new mapping approach to analyze a genetic mapping example in rice, leading to the identification of multiple QTL, some of which are linked on the same chromosome, that control the developmental trajectory of leaf age. PMID:17947431
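
    The nonparametric ingredient above, time-varying marker effects expanded in a Legendre basis, can be sketched in a few lines. The code below builds the Legendre design matrix with NumPy and recovers regression coefficients for a synthetic effect trajectory; it illustrates the basis expansion only, not the full EM-based likelihood machinery.

```python
# Sketch: model a time-dependent marker effect with Legendre polynomials.
import numpy as np
from numpy.polynomial import legendre

ages = np.linspace(1, 10, 10)                                 # measurement times
t = 2 * (ages - ages.min()) / (ages.max() - ages.min()) - 1   # rescale to [-1, 1]

order = 3
L = legendre.legvander(t, order)          # design matrix, shape (10, order + 1)

# Given per-time-point marker effects y(t), least squares yields the Legendre
# coefficients summarising the genome-background effect curve.
y = 0.5 + 0.3 * t - 0.2 * t**2            # synthetic effect trajectory
coef, *_ = np.linalg.lstsq(L, y, rcond=None)
```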

  12. Cognitive niches: an ecological model of strategy selection.

    PubMed

    Marewski, Julian N; Schooler, Lael J

    2011-07-01

    How do people select among different strategies to accomplish a given task? Across disciplines, the strategy selection problem represents a major challenge. We propose a quantitative model that predicts how selection emerges through the interplay among strategies, cognitive capacities, and the environment. This interplay carves out for each strategy a cognitive niche, that is, a limited number of situations in which the strategy can be applied, simplifying strategy selection. To illustrate our proposal, we consider selection in the context of 2 theories: the simple heuristics framework and the ACT-R (adaptive control of thought-rational) architecture of cognition. From the heuristics framework, we adopt the thesis that people make decisions by selecting from a repertoire of simple decision strategies that exploit regularities in the environment and draw on cognitive capacities, such as memory and time perception. ACT-R provides a quantitative theory of how these capacities adapt to the environment. In 14 simulations and 10 experiments, we consider the choice between strategies that operate on the accessibility of memories and those that depend on elaborate knowledge about the world. Based on Internet statistics, our model quantitatively predicts people's familiarity with and knowledge of real-world objects, the distributional characteristics of the associated speed of memory retrieval, and the cognitive niches of classic decision strategies, including those of the fluency, recognition, integration, lexicographic, and sequential-sampling heuristics. In doing so, the model specifies when people will be able to apply different strategies and how accurate, fast, and effortless people's decisions will be.

  13. Towards a Quantitative Framework for Evaluating Vulnerability of Drinking Water Wells to Contamination from Unconventional Oil & Gas Development

    NASA Astrophysics Data System (ADS)

    Soriano, M., Jr.; Deziel, N. C.; Saiers, J. E.

    2017-12-01

    The rapid expansion of unconventional oil and gas (UO&G) production, made possible by advances in hydraulic fracturing (fracking), has triggered concerns over risks this extraction poses to water resources and public health. Concerns are particularly acute within communities that host UO&G development and rely heavily on shallow aquifers as sources of drinking water. This research aims to develop a quantitative framework to evaluate the vulnerability of drinking water wells to contamination from UO&G activities. The concept of well vulnerability is explored through application of backwards travel time probability modeling to estimate the likelihood that capture zones of drinking water wells circumscribe source locations of UO&G contamination. Sources of UO&G contamination considered in this analysis include gas well pads and documented sites of UO&G wastewater and chemical spills. The modeling approach is illustrated for a portion of Susquehanna County, Pennsylvania, where more than one thousand shale gas wells have been completed since 2005. Data from a network of eight multi-level groundwater monitoring wells installed in the study site in 2015 are used to evaluate the model. The well vulnerability concept is proposed as a physically based quantitative tool for policy-makers dealing with the management of contamination risks of drinking water wells. In particular, the model can be used to identify adequate setback distances of UO&G activities from drinking water wells and other critical receptors.
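
    One way to make the backward travel time probability idea concrete is a reversed random-walk particle tracking exercise, sketched below with invented hydraulic parameters. Particles are released at the well and advected up-gradient with dispersion; the fraction that passes near a source location serves as a crude stand-in for the capture-zone probability described above, not as the authors' model.

```python
# Hedged illustration: backward particle tracking from a well toward candidate
# UO&G contamination sources. All coordinates and parameters are invented.
import numpy as np

rng = np.random.default_rng(2)
well = np.array([0.0, 0.0])
sources = np.array([[500.0, 80.0],      # e.g. a gas well pad (m from the well)
                    [1200.0, -300.0]])  # e.g. a documented spill site

v = np.array([1.0, 0.0])                # reversed (up-gradient) velocity, m/day
D, dt, n_steps, n_particles = 20.0, 1.0, 1500, 2000   # dispersion m^2/day, etc.

x = np.tile(well, (n_particles, 1))
reached = np.zeros((n_particles, len(sources)), dtype=bool)
for _ in range(n_steps):
    x += v * dt + np.sqrt(2 * D * dt) * rng.normal(size=x.shape)
    dist = np.linalg.norm(x[:, None, :] - sources[None, :, :], axis=2)
    reached |= dist < 50.0              # particle passed within 50 m of a source

p_capture = reached.mean(axis=0)        # per-source backward travel probability
print(p_capture)
```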

  14. Fuzzy Hybrid Deliberative/Reactive Paradigm (FHDRP)

    NASA Technical Reports Server (NTRS)

    Sarmadi, Hengameth

    2004-01-01

    This work aims to introduce a new concept for incorporating fuzzy sets into the hybrid deliberative/reactive paradigm. After a brief review of basic issues in the hybrid paradigm, the definition of the agent-based fuzzy hybrid paradigm, which enables agents to derive their behavior from quantitative numerical and qualitative knowledge and to carry out their decision-making procedure via a fuzzy rule bank, is discussed. Next, an example provides a more applied platform for the developed approach, and finally an overview of the corresponding agent architecture outlines the agents' logical framework.

  15. Finite-temperature Gutzwiller approximation from the time-dependent variational principle

    NASA Astrophysics Data System (ADS)

    Lanatà, Nicola; Deng, Xiaoyu; Kotliar, Gabriel

    2015-08-01

    We develop an extension of the Gutzwiller approximation to finite temperatures based on the Dirac-Frenkel variational principle. Our method does not rely on any entropy inequality, and is substantially more accurate than the approaches proposed in previous works. We apply our theory to the single-band Hubbard model at different fillings, and show that our results compare quantitatively well with dynamical mean field theory in the metallic phase. We discuss potential applications of our technique within the framework of first-principle calculations.

  16. A general framework for analysing diversity in science, technology and society.

    PubMed

    Stirling, Andy

    2007-08-22

    This paper addresses the scope for more integrated general analysis of diversity in science, technology and society. It proposes a framework recognizing three necessary but individually insufficient properties of diversity. Based on 10 quality criteria, it suggests a general quantitative non-parametric diversity heuristic. This allows the systematic exploration of diversity under different perspectives, including divergent conceptions of relevant attributes and contrasting weightings on different diversity properties. It is shown how this heuristic may be used to explore different possible trade-offs between diversity and other aspects of interest, including portfolio interactions. The resulting approach offers a way to be more systematic and transparent in the treatment of scientific and technological diversity in a range of fields, including conservation management, research governance, energy policy and sustainable innovation.
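
    A commonly cited form of this heuristic sums pairwise disparities weighted by option proportions, Delta = sum over pairs i != j of d_ij^alpha * (p_i p_j)^beta, with alpha and beta tuning the emphasis on disparity versus balance. The sketch below implements that form on made-up numbers; read it as an illustration of the heuristic's shape, not a reproduction of the paper's computations.

```python
# Sketch of Stirling-style diversity: pairwise disparity d_ij weighted by the
# proportions p_i of each option; alpha/beta weight disparity vs. evenness.
import numpy as np

def stirling_diversity(p, d, alpha=1.0, beta=1.0):
    p = np.asarray(p, dtype=float)
    d = np.asarray(d, dtype=float)
    total = 0.0
    for i in range(len(p)):
        for j in range(len(p)):
            if i != j:
                total += d[i, j]**alpha * (p[i] * p[j])**beta
    return total

# Example: three energy options with equal shares and invented disparities
p = [1/3, 1/3, 1/3]
d = np.array([[0.0, 0.4, 0.9],
              [0.4, 0.0, 0.6],
              [0.9, 0.6, 0.0]])
print(stirling_diversity(p, d))
```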

  17. Sequential Inverse Problems Bayesian Principles and the Logistic Map Example

    NASA Astrophysics Data System (ADS)

    Duan, Lian; Farmer, Chris L.; Moroz, Irene M.

    2010-09-01

    Bayesian statistics provides a general framework for solving inverse problems, but is not without interpretation and implementation problems. This paper discusses difficulties arising from the fact that forward models are always in error to some extent. Using a simple example based on the one-dimensional logistic map, we argue that, when implementation problems are minimal, the Bayesian framework is quite adequate. In this paper the Bayesian Filter is shown to be able to recover excellent state estimates in the perfect model scenario (PMS) and to distinguish the PMS from the imperfect model scenario (IMS). Through a quantitative comparison of the way in which the observations are assimilated in both the PMS and the IMS scenarios, we suggest that one can, sometimes, measure the degree of imperfection.
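
    A bootstrap particle filter on the logistic map makes the perfect-model scenario concrete. The sketch below (illustrative parameters throughout, not the paper's experiments) generates a trajectory of x_{n+1} = r*x_n*(1 - x_n), observes it through Gaussian noise, and assimilates the observations; a small post-resampling jitter keeps the particle cloud from collapsing under the deterministic dynamics.

```python
# Bootstrap particle filter for the logistic map in the perfect-model scenario.
import numpy as np

rng = np.random.default_rng(3)
r, n_steps, n_particles, obs_sigma = 3.9, 50, 1000, 0.05

def step(x):
    return r * x * (1.0 - x)

# Truth and noisy observations (perfect model scenario)
truth = np.empty(n_steps); truth[0] = 0.3
for k in range(1, n_steps):
    truth[k] = step(truth[k-1])
obs = truth + rng.normal(0, obs_sigma, size=n_steps)

particles = rng.uniform(0, 1, n_particles)
estimates = []
for k in range(n_steps):
    if k > 0:
        particles = step(particles)
    w = np.exp(-0.5 * ((obs[k] - particles) / obs_sigma) ** 2) + 1e-12
    w /= w.sum()
    estimates.append(np.sum(w * particles))          # posterior-mean estimate
    idx = rng.choice(n_particles, size=n_particles, p=w)   # resample
    particles = np.clip(particles[idx] + rng.normal(0, 1e-3, n_particles),
                        1e-6, 1 - 1e-6)              # jitter avoids degeneracy
```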

  18. Sound Processing Features for Speaker-Dependent and Phrase-Independent Emotion Recognition in Berlin Database

    NASA Astrophysics Data System (ADS)

    Anagnostopoulos, Christos Nikolaos; Vovoli, Eftichia

    An emotion recognition framework based on sound processing could improve services in human-computer interaction. Various quantitative speech features obtained from sound processing of acted speech were tested as to whether they suffice to discriminate between seven emotions. Multilayered perceptrons were trained to classify gender and emotions on the basis of a 24-input vector, which provides information about the prosody of the speaker over the entire sentence using statistics of sound features. Several experiments were performed and the results are presented analytically. Emotion recognition was successful when speakers and utterances were “known” to the classifier. However, severe misclassifications occurred in the utterance-independent setting. Still, the proposed feature vector achieved promising results for utterance-independent recognition of high- and low-arousal emotions.
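
    The classifier setup is easy to sketch with scikit-learn. The code below trains a multilayer perceptron on a synthetic stand-in for the 24-dimensional prosodic feature vectors; the Berlin database features themselves are not reproduced, so the reported accuracy is near chance by construction.

```python
# Sketch: MLP over 24 prosodic features for seven emotion classes (synthetic).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.normal(size=(535, 24))       # 24 prosodic statistics per utterance
y = rng.integers(0, 7, size=535)     # seven emotion classes (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))  # near chance on random synthetic data
```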

  19. Operationalizing the social-ecological systems framework to assess sustainability.

    PubMed

    Leslie, Heather M; Basurto, Xavier; Nenadovic, Mateja; Sievanen, Leila; Cavanaugh, Kyle C; Cota-Nieto, Juan José; Erisman, Brad E; Finkbeiner, Elena; Hinojosa-Arango, Gustavo; Moreno-Báez, Marcia; Nagavarapu, Sriniketh; Reddy, Sheila M W; Sánchez-Rodríguez, Alexandra; Siegel, Katherine; Ulibarria-Valenzuela, José Juan; Weaver, Amy Hudson; Aburto-Oropeza, Octavio

    2015-05-12

    Environmental governance is more effective when the scales of ecological processes are well matched with the human institutions charged with managing human-environment interactions. The social-ecological systems (SESs) framework provides guidance on how to assess the social and ecological dimensions that contribute to sustainable resource use and management, but rarely if ever has been operationalized for multiple localities in a spatially explicit, quantitative manner. Here, we use the case of small-scale fisheries in Baja California Sur, Mexico, to identify distinct SES regions and test key aspects of coupled SESs theory. Regions that exhibit greater potential for social-ecological sustainability in one dimension do not necessarily exhibit it in others, highlighting the importance of integrative, coupled system analyses when implementing spatial planning and other ecosystem-based strategies.

  20. Quantitative Connection Between Ensemble Thermodynamics and Single-Molecule Kinetics: A Case Study Using Cryo-EM and smFRET Investigations of the Ribosome

    PubMed Central

    Frank, Joachim; Gonzalez, Ruben L.

    2015-01-01

    At equilibrium, thermodynamic and kinetic information can be extracted from biomolecular energy landscapes by many techniques. However, while static, ensemble techniques yield thermodynamic data, often only dynamic, single-molecule techniques can yield the kinetic data that describes transition-state energy barriers. Here we present a generalized framework based upon dwell-time distributions that can be used to connect such static, ensemble techniques with dynamic, single-molecule techniques, and thus characterize energy landscapes to greater resolutions. We demonstrate the utility of this framework by applying it to cryogenic electron microscopy and single-molecule fluorescence resonance energy transfer studies of the bacterial ribosomal pretranslocation complex. Among other benefits, application of this framework to these data explains why two transient, intermediate conformations of the pretranslocation complex, which are observed in a cryogenic electron microscopy study, may not be observed in several single-molecule fluorescence resonance energy transfer studies. PMID:25785884

  1. Architectural Framework for Addressing Legacy Waste from the Cold War - 13611

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Love, Gregory A.; Glazner, Christopher G.; Steckley, Sam

    We present an architectural framework for the use of a hybrid simulation model of enterprise-wide operations used to develop system-level insight into the U.S. Department of Energy's (DOE) environmental cleanup of legacy nuclear waste at the Savannah River Site. We use this framework for quickly exploring policy and architectural options, analyzing plans, addressing management challenges and developing mitigation strategies for the DOE Office of Environmental Management (EM). The socio-technical complexity of EM's mission compels the use of a qualitative approach to complement a more quantitative discrete-event modeling effort. We use this model-based analysis to pinpoint pressure and leverage points and to develop a shared conceptual understanding of the problem space and a platform for communication among stakeholders across the enterprise in a timely manner. This approach affords the opportunity to discuss problems using a unified conceptual perspective and is also general enough that it applies to a broad range of capital investment/production operations problems. (authors)

  2. TrustBuilder2: A Reconfigurable Framework for Trust Negotiation

    NASA Astrophysics Data System (ADS)

    Lee, Adam J.; Winslett, Marianne; Perano, Kenneth J.

    To date, research in trust negotiation has focused mainly on the theoretical aspects of the trust negotiation process, and the development of proof of concept implementations. These theoretical works and proofs of concept have been quite successful from a research perspective, and thus researchers must now begin to address the systems constraints that act as barriers to the deployment of these systems. To this end, we present TrustBuilder2, a fully-configurable and extensible framework for prototyping and evaluating trust negotiation systems. TrustBuilder2 leverages a plug-in based architecture, extensible data type hierarchy, and flexible communication protocol to provide a framework within which numerous trust negotiation protocols and system configurations can be quantitatively analyzed. In this paper, we discuss the design and implementation of TrustBuilder2, study its performance, examine the costs associated with flexible authorization systems, and leverage this knowledge to identify potential topics for future research, as well as a novel method for attacking trust negotiation systems.

  3. A Framework for Establishing Standard Reference Scale of Texture by Multivariate Statistical Analysis Based on Instrumental Measurement and Sensory Evaluation.

    PubMed

    Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye

    2016-01-13

    A framework for establishing a standard reference scale for texture is proposed, based on multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework is verified by establishing a standard reference scale for the texture attribute of hardness using well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the results were analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The results indicate that the regression between the estimated sensory value and the instrumentally measured value is significant (R(2) = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
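
    A hedged sketch of the selection pipeline follows: cluster instrumental hardness measurements, use the relative standard deviation as a stability criterion, and pick one stable representative per cluster as a reference-scale candidate. The data, cluster count, and thresholds are invented; PCA appears only as the overview step the abstract mentions.

```python
# Sketch: cluster TPA hardness data, then select low-RSD references per cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
# rows: food products; columns: replicate TPA hardness measurements (N)
hardness = np.abs(rng.normal(loc=rng.uniform(5, 80, size=(100, 1)),
                             scale=2.0, size=(100, 8)))

mean = hardness.mean(axis=1)
rsd = hardness.std(axis=1, ddof=1) / mean        # relative standard deviation

labels = KMeans(n_clusters=9, n_init=10, random_state=0).fit_predict(
    mean.reshape(-1, 1))
scores = PCA(n_components=2).fit_transform(hardness)  # overview of structure

# One stable (lowest-RSD) representative per cluster -> 9-point hardness scale
refs = [int(np.where(labels == k)[0][np.argmin(rsd[labels == k])])
        for k in range(9)]
```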

  4. Quantitative Connection between Ensemble Thermodynamics and Single-Molecule Kinetics: A Case Study Using Cryogenic Electron Microscopy and Single-Molecule Fluorescence Resonance Energy Transfer Investigations of the Ribosome.

    PubMed

    Thompson, Colin D Kinz; Sharma, Ajeet K; Frank, Joachim; Gonzalez, Ruben L; Chowdhury, Debashish

    2015-08-27

    At equilibrium, thermodynamic and kinetic information can be extracted from biomolecular energy landscapes by many techniques. However, while static, ensemble techniques yield thermodynamic data, often only dynamic, single-molecule techniques can yield the kinetic data that describe transition-state energy barriers. Here we present a generalized framework based upon dwell-time distributions that can be used to connect such static, ensemble techniques with dynamic, single-molecule techniques, and thus characterize energy landscapes to greater resolutions. We demonstrate the utility of this framework by applying it to cryogenic electron microscopy (cryo-EM) and single-molecule fluorescence resonance energy transfer (smFRET) studies of the bacterial ribosomal pre-translocation complex. Among other benefits, application of this framework to these data explains why two transient, intermediate conformations of the pre-translocation complex, which are observed in a cryo-EM study, may not be observed in several smFRET studies.

  5. Addressing public health risks for cyanobacteria in recreational freshwaters: the Oregon and Vermont framework.

    PubMed

    Stone, David; Bress, William

    2007-01-01

    Toxigenic cyanobacteria, commonly known as blue green algae, are an emerging public health issue. The toxins produced by cyanobacteria have been detected across the United States in marine, freshwater and estuarine systems and associated with adverse health outcomes. The intent of this paper is to focus on how to address risk in a recreational freshwater scenario when toxigenic cyanobacteria are present. Several challenges exist for monitoring, assessing and posting water bodies and advising the public when toxigenic cyanobacteria are present. These include addressing different recreational activities that are associated with varying levels of risk, the dynamic temporal and spatial aspects of blooms, data gaps in toxicological information and the lack of training and resources for adequate surveillance. Without uniform federal guidance, numerous states have taken public health action for cyanobacteria with different criteria. Vermont and Oregon independently developed a tiered decision-making framework to reduce risk to recreational users when toxigenic cyanobacteria are present. This framework is based on a combination of qualitative and quantitative information.

  6. Assessing Learning Quality: Reconciling Institutional, Staff and Educational Demands.

    ERIC Educational Resources Information Center

    Biggs, John

    1996-01-01

    Two frameworks for educational assessment are distinguished: one quantitative, adequate for construing some kinds of learning, and one qualitative, more appropriate for most objectives in higher education. The paper argues that institutions implicitly encourage quantitative assessment, thus encouraging a surface approach to learning although…

  7. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS?s Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  8. Approaches to acceptable risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whipple, C

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  9. Dynamic whole body PET parametric imaging: II. Task-oriented statistical estimation

    PubMed Central

    Karakatsanis, Nicolas A.; Lodge, Martin A.; Zhou, Y.; Wahl, Richard L.; Rahmim, Arman

    2013-01-01

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has been previously employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus, enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (~15–20cm) of a single bed position and have not been translated to the whole-body clinical imaging domain. On the contrary, standardized uptake value (SUV) PET imaging, considered as the routine approach in clinical oncology, commonly involves multi-bed acquisitions, but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate Ki and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve superior trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) than provided by OLS for the final Ki parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value. The multi-bed dynamic acquisition protocol, as optimized in the preceding companion study, was employed along with extensive Monte Carlo simulations and an initial clinical FDG patient dataset to validate and demonstrate the potential of the proposed statistical estimation methods. Both simulated and clinical results suggest that hybrid regression in the context of whole-body Patlak Ki imaging considerably reduces MSE without compromising high CNR. Alternatively, for a given CNR, hybrid regression enables larger reductions than OLS in the number of dynamic frames per bed, allowing for even shorter acquisitions of ~30min, thus further contributing to the clinical adoption of the proposed framework. Compared to the SUV approach, whole body parametric imaging can provide better tumor quantification, and can act as a complement to SUV, for the task of tumor detection. PMID:24080994
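
    The OLS baseline that the hybrid estimator extends is the voxel-wise Patlak fit: regressing C(t)/Cp(t) on the running integral of Cp divided by Cp(t) yields the uptake rate Ki as the slope and the distribution volume V as the intercept. The sketch below demonstrates this on a synthetic time-activity curve; in practice only late frames, where the Patlak plot is linear, enter the regression.

```python
# Sketch of voxel-wise Patlak OLS estimation on a synthetic time-activity curve.
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(5, 60, 12)                      # frame mid-times (min)
cp = 10.0 * np.exp(-0.08 * t) + 1.0             # plasma input curve (a.u.)
int_cp = np.concatenate(([0.0],
    np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2)))   # trapezoidal integral

ki_true, v_true = 0.03, 0.6
c_tissue = ki_true * int_cp + v_true * cp + rng.normal(0, 0.05, size=t.size)

x = int_cp / cp                                 # Patlak abscissa
y = c_tissue / cp                               # Patlak ordinate
A = np.vstack([x, np.ones_like(x)]).T
(ki_hat, v_hat), *_ = np.linalg.lstsq(A, y, rcond=None)
print(ki_hat, v_hat)                            # recovers ~0.03 and ~0.6
```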

  10. Dynamic whole-body PET parametric imaging: II. Task-oriented statistical estimation.

    PubMed

    Karakatsanis, Nicolas A; Lodge, Martin A; Zhou, Y; Wahl, Richard L; Rahmim, Arman

    2013-10-21

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has been previously employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus, enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (~15-20 cm) of a single-bed position and have not been translated to the whole-body clinical imaging domain. On the contrary, standardized uptake value (SUV) PET imaging, considered as the routine approach in clinical oncology, commonly involves multi-bed acquisitions, but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole-body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate Ki and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve superior trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) than provided by OLS for the final Ki parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value. The multi-bed dynamic acquisition protocol, as optimized in the preceding companion study, was employed along with extensive Monte Carlo simulations and an initial clinical (18)F-deoxyglucose patient dataset to validate and demonstrate the potential of the proposed statistical estimation methods. Both simulated and clinical results suggest that hybrid regression in the context of whole-body Patlak Ki imaging considerably reduces MSE without compromising high CNR. Alternatively, for a given CNR, hybrid regression enables larger reductions than OLS in the number of dynamic frames per bed, allowing for even shorter acquisitions of ~30 min, thus further contributing to the clinical adoption of the proposed framework. Compared to the SUV approach, whole-body parametric imaging can provide better tumor quantification, and can act as a complement to SUV, for the task of tumor detection.

  11. Citizen surveillance for environmental monitoring: combining the efforts of citizen science and crowdsourcing in a quantitative data framework.

    PubMed

    Welvaert, Marijke; Caley, Peter

    2016-01-01

    Citizen science and crowdsourcing have been emerging as methods to collect data for surveillance and/or monitoring activities. They can be gathered under the overarching term citizen surveillance. The discipline, however, still struggles to be widely accepted in the scientific community, mainly because these activities are not embedded in a quantitative framework. This results in an ongoing discussion on how to analyze and make useful inference from these data. Considering the data collection process, we illustrate how citizen surveillance can be classified according to the nature of the underlying observation process, measured in two dimensions: the degree of observer reporting intention and the control over observer detection effort. By classifying the observation process along these dimensions, we distinguish between crowdsourcing, unstructured citizen science and structured citizen science. This classification helps determine the appropriate data processing and statistical treatment of these data for making inference. Using our framework, it is apparent that published studies are overwhelmingly associated with structured citizen science, for which well-developed statistical methods exist. In contrast, methods for making useful inference from purely crowdsourced data remain under development, with the challenges of accounting for the unknown observation process considerable. Our quantitative framework for citizen surveillance calls for an integration of citizen science and crowdsourcing and provides a way forward for solving the statistical challenges inherent to citizen-sourced data.

  12. Quantitative risk stratification in Markov chains with limiting conditional distributions.

    PubMed

    Chan, David C; Pollett, Philip K; Weinstein, Milton C

    2009-01-01

    Many clinical decisions require patient risk stratification. The authors introduce the concept of limiting conditional distributions, which describe the equilibrium proportion of surviving patients occupying each disease state in a Markov chain with death. Such distributions can quantitatively describe risk stratification. The authors first establish conditions for the existence of a positive limiting conditional distribution in a general Markov chain and describe a framework for risk stratification using the limiting conditional distribution. They then apply their framework to a clinical example of a treatment indicated for high-risk patients, first to infer the risk of patients selected for treatment in clinical trials and then to predict the outcomes of expanding treatment to other populations of risk. For the general chain, a positive limiting conditional distribution exists only if patients in the earliest state have the lowest combined risk of progression or death. The authors show that in their general framework, outcomes and population risk are interchangeable. For the clinical example, they estimate that previous clinical trials have selected the upper quintile of patient risk for this treatment, but they also show that expanded treatment would weakly dominate this degree of targeted treatment, and universal treatment may be cost-effective. Limiting conditional distributions exist in most Markov models of progressive diseases and are well suited to represent risk stratification quantitatively. This framework can characterize patient risk in clinical trials and predict outcomes for other populations of risk.
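
    For a sub-stochastic transition matrix Q over the transient (surviving) disease states, the limiting conditional distribution is the normalised left eigenvector of Q associated with its largest eigenvalue. The sketch below computes it for a toy three-state progression model; the transition probabilities are invented, not taken from the paper's clinical example.

```python
# Sketch: limiting conditional (quasi-stationary) distribution of a Markov
# chain with death, via the dominant left eigenvector of the transient block.
import numpy as np

# Rows/cols: mild, moderate, severe; remaining mass flows to the death state.
Q = np.array([[0.90, 0.07, 0.01],
              [0.05, 0.85, 0.07],
              [0.00, 0.05, 0.80]])

eigvals, eigvecs = np.linalg.eig(Q.T)        # left eigenvectors of Q
k = np.argmax(eigvals.real)
lcd = np.abs(eigvecs[:, k].real)
lcd /= lcd.sum()
print(lcd)   # equilibrium proportions of *surviving* patients in each state
```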

  13. Toward a model-based cognitive neuroscience of mind wandering.

    PubMed

    Hawkins, G E; Mittner, M; Boekel, W; Heathcote, A; Forstmann, B U

    2015-12-03

    People often "mind wander" during everyday tasks, temporarily losing track of time, place, or current task goals. In laboratory-based tasks, mind wandering is often associated with performance decrements in behavioral variables and changes in neural recordings. Such empirical associations provide descriptive accounts of mind wandering - how it affects ongoing task performance - but fail to provide true explanatory accounts - why it affects task performance. In this perspectives paper, we consider mind wandering as a neural state or process that affects the parameters of quantitative cognitive process models, which in turn affect observed behavioral performance. Our approach thus uses cognitive process models to bridge the explanatory divide between neural and behavioral data. We provide an overview of two general frameworks for developing a model-based cognitive neuroscience of mind wandering. The first approach uses neural data to segment observed performance into a discrete mixture of latent task-related and task-unrelated states, and the second regresses single-trial measures of neural activity onto structured trial-by-trial variation in the parameters of cognitive process models. We discuss the relative merits of the two approaches, and the research questions they can answer, and highlight that both approaches allow neural data to provide additional constraint on the parameters of cognitive models, which will lead to a more precise account of the effect of mind wandering on brain and behavior. We conclude by summarizing prospects for mind wandering as conceived within a model-based cognitive neuroscience framework, highlighting the opportunities for its continued study and the benefits that arise from using well-developed quantitative techniques to study abstract theoretical constructs. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.

  14. Using extreme phenotype sampling to identify the rare causal variants of quantitative traits in association studies.

    PubMed

    Li, Dalin; Lewinger, Juan Pablo; Gauderman, William J; Murcray, Cassandra Elizabeth; Conti, David

    2011-12-01

    Variants identified in recent genome-wide association studies based on the common-disease common-variant hypothesis are far from fully explaining the heritability of complex traits. Rare variants may, in part, explain some of the missing heritability. Here, we explored the advantage of extreme phenotype sampling in rare-variant analysis and refined this design framework for future large-scale association studies on quantitative traits. We first proposed a power calculation approach for a likelihood-based analysis method. We then used this approach to demonstrate the potential advantages of extreme phenotype sampling for rare variants. Next, we discussed how this design can influence future sequencing-based association studies from a cost-efficiency (with the phenotyping cost included) perspective. Moreover, we discussed the potential of a two-stage design with the extreme sample as the first stage and the remaining nonextreme subjects as the second stage. We demonstrated that this two-stage design is a cost-efficient alternative to the one-stage cross-sectional design or traditional two-stage design. We then discussed the analysis strategies for this extreme two-stage design and proposed a corresponding design optimization procedure. To address many practical concerns, for example measurement error or phenotypic heterogeneity at the very extremes, we examined an approach in which individuals with very extreme phenotypes are discarded. We demonstrated that even with a substantial proportion of these extreme individuals discarded, extreme-based sampling can still be more efficient. Finally, we expanded the current analysis and design framework to accommodate the CMC approach, where multiple rare variants in the same gene region are analyzed jointly. © 2011 Wiley Periodicals, Inc.
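
    A small simulation conveys the intuition (this is our own illustration, not the authors' likelihood-based power calculation): genotype only the tails of the trait distribution and compare how often a rare causal variant is detected against a random sample of the same size.

```python
# Illustrative power comparison: extreme phenotype sampling vs. random sampling
# for a rare variant affecting a quantitative trait. Parameters are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_pop, n_geno, maf, beta, reps = 20000, 2000, 0.005, 1.0, 200

def detected(idx, g, y):
    gs, ys = g[idx], y[idx]
    if (gs > 0).sum() < 2:               # too few carriers to test
        return False
    return stats.ttest_ind(ys[gs > 0], ys[gs == 0]).pvalue < 0.05

power_ext = power_rnd = 0
for _ in range(reps):
    g = rng.binomial(2, maf, size=n_pop)     # rare variant genotypes
    y = beta * g + rng.normal(size=n_pop)    # quantitative trait
    order = np.argsort(y)
    ext = np.concatenate([order[:n_geno // 2], order[-n_geno // 2:]])
    rnd = rng.choice(n_pop, size=n_geno, replace=False)
    power_ext += detected(ext, g, y)
    power_rnd += detected(rnd, g, y)
print(power_ext / reps, power_rnd / reps)    # extremes typically win
```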

  15. Using Extreme Phenotype Sampling to Identify the Rare Causal Variants of Quantitative Traits in Association Studies

    PubMed Central

    Li, Dalin; Lewinger, Juan Pablo; Gauderman, William J.; Murcray, Cassandra Elizabeth; Conti, David

    2014-01-01

    Variants identified in recent genome-wide association studies based on the common-disease common-variant hypothesis are far from fully explaining the heritability of complex traits. Rare variants may, in part, explain some of the missing heritability. Here, we explored the advantage of extreme phenotype sampling in rare-variant analysis and refined this design framework for future large-scale association studies on quantitative traits. We first proposed a power calculation approach for a likelihood-based analysis method. We then used this approach to demonstrate the potential advantages of extreme phenotype sampling for rare variants. Next, we discussed how this design can influence future sequencing-based association studies from a cost-efficiency (with the phenotyping cost included) perspective. Moreover, we discussed the potential of a two-stage design with the extreme sample as the first stage and the remaining nonextreme subjects as the second stage. We demonstrated that this two-stage design is a cost-efficient alternative to the one-stage cross-sectional design or traditional two-stage design. We then discussed the analysis strategies for this extreme two-stage design and proposed a corresponding design optimization procedure. To address many practical concerns, for example measurement error or phenotypic heterogeneity at the very extremes, we examined an approach in which individuals with very extreme phenotypes are discarded. We demonstrated that even with a substantial proportion of these extreme individuals discarded, extreme-based sampling can still be more efficient. Finally, we expanded the current analysis and design framework to accommodate the CMC approach, where multiple rare variants in the same gene region are analyzed jointly. PMID:21922541

  16. The nexus between climate change, ecosystem services and human health: Towards a conceptual framework.

    PubMed

    Chiabai, Aline; Quiroga, Sonia; Martinez-Juarez, Pablo; Higgins, Sahran; Taylor, Tim

    2018-09-01

    This paper addresses the impact that changes in natural ecosystems can have on health and wellbeing, focusing on the potential co-benefits that green spaces could provide when introduced as climate change adaptation measures. Ignoring such benefits could lead to sub-optimal planning and decision-making. A conceptual framework, building on the ecosystem-enriched Driver, Pressure, State, Exposure, Effect, Action model (eDPSEEA), is presented to aid in clarifying the relational structure between green spaces and human health, taking climate change as the key driver. The study has the double intention of (i) summarising the literature, with a special emphasis on the ecosystem and health perspectives as well as the main theories behind these impacts, and (ii) modelling these findings into a framework that allows for multidisciplinary approaches to the underlying relations between human health and green spaces. The paper shows that while the literature based on the ecosystem perspective presents a well-documented association between climate, health and green spaces, the literature using a health-based perspective presents mixed evidence in some cases. The role of contextual factors and the exposure mechanism are rarely addressed. The proposed framework could serve as a multidisciplinary knowledge platform for multi-perspective analysis and discussion among experts and stakeholders, as well as to support the operationalization of quantitative assessment and modelling exercises. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  17. A novel mechanistic modeling framework for analysis of electrode balancing and degradation modes in commercial lithium-ion cells

    NASA Astrophysics Data System (ADS)

    Schindler, Stefan; Danzer, Michael A.

    2017-03-01

    Aiming at a long-term stable and safe operation of rechargeable lithium-ion cells, elementary design aspects and degradation phenomena have to be considered depending on the specific application. Among the degrees of freedom in cell design, electrode balancing is of particular interest and has a distinct effect on useable capacity and voltage range. Concerning intrinsic degradation modes, understanding the underlying electrochemical processes and tracing the overall degradation history are the most crucial tasks. In this study, a model-based, minimal parameter framework for combined elucidation of electrode balancing and degradation pathways in commercial lithium-ion cells is introduced. The framework rests upon the simulation of full cell voltage profiles from the superposition of equivalent, artificially degraded half-cell profiles and allows to separate aging contributions from loss of available lithium and active materials in both electrodes. A physically meaningful coupling between thermodynamic and kinetic degradation modes based on the correlation between altered impedance features and loss of available lithium as well as loss of active material is proposed and validated by a low temperature degradation profile examined in one of our recent publications. The coupled framework is able to determine the electrode balancing within an error range of < 1% and the projected cell degradation is qualitatively and quantitatively in line with experimental observations.
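
    The superposition idea can be sketched directly: the full-cell voltage is the positive half-cell potential minus the negative one, each evaluated at a stoichiometry fixed by the electrode capacity and the lithium inventory, so loss of active material rescales a half-cell axis while loss of lithium shifts it. The open-circuit-potential functions and all parameters below are invented placeholders, not fitted to any cell.

```python
# Toy sketch of half-cell superposition for electrode balancing and degradation.
import numpy as np

def ocp_pos(y):   # placeholder positive-electrode potential vs. Li (V)
    return 4.2 - 0.7 * y - 0.15 * np.tanh(8 * (y - 0.5))

def ocp_neg(x):   # placeholder negative-electrode potential vs. Li (V)
    return 0.8 * np.exp(-12 * x) + 0.12 - 0.05 * x

def full_cell(q, c_pos=2.0, c_neg=2.2, y0=0.05, x0=0.9):
    """Voltage at charge throughput q (Ah). Degradation modes map naturally:
    loss of active material shrinks c_pos/c_neg; loss of lithium shifts x0/y0."""
    y = y0 + q / c_pos          # positive electrode lithiates on discharge
    x = x0 - q / c_neg          # negative electrode delithiates on discharge
    return ocp_pos(y) - ocp_neg(x)

q = np.linspace(0, 1.8, 200)
v_fresh = full_cell(q)
v_aged = full_cell(q, c_neg=0.9 * 2.2, x0=0.8)   # LAM(neg) plus lithium loss
```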

  18. Conceptual Framework for Developing a Diabetes Information Network.

    PubMed

    Riazi, Hossein; Langarizadeh, Mostafa; Larijani, Bagher; Shahmoradi, Leila

    2016-06-01

    To provide a conceptual framework for managing diabetic patient care, and creating an information network for clinical research. A wide range of information technology (IT) based interventions, such as distance learning, diabetes registries, personal or electronic health record systems, clinical information systems, and clinical decision support systems, have so far been used in supporting diabetes care. Previous studies demonstrated that IT could improve diabetes care in its different aspects. There is, however, no comprehensive conceptual framework that defines how different IT applications can support diverse aspects of this care. Therefore, a conceptual framework that combines different IT solutions into a wide information network for improving care processes and for research purposes is widely lacking. In this study we describe the theoretical underpinning of a large project aimed at building a wide diabetes information network, namely DIANET. A literature review and a survey of national programs and existing regulations for diabetes management were conducted in order to define the different aspects of diabetes care that should be supported by IT solutions. Both qualitative and quantitative research methods were used in this study. In addition to the results of a previous systematic literature review, two brainstorming and three expert panel sessions were conducted to identify the requirements of a comprehensive information technology solution. Based on these inputs, the requirements for creating a diabetes information network were identified and used to create a questionnaire based on a 9-point Likert scale. The questionnaire was finalized after removing some items based on calculated content validity ratio and content validity index coefficients. The Cronbach's alpha reliability coefficient was also calculated (αTotal = 0.98, P < 0.05, CI = 0.95). The final questionnaire contained 45 items. It was sent to 13 clinicians at two diabetes clinics of the endocrine and metabolism research institute in order to assess the necessity level of the requirements for the diabetes information network conceptual framework. The questionnaires were returned by 10 clinicians. Each requirement item was labeled as essential, semi-essential, or non-essential based on the mean of its scores. All requirement items were identified as essential or semi-essential. Thus, all of them were used to build the conceptual framework. The requirements were allocated into 11 groups, each one representing a module in the conceptual framework. Each module was described separately. We propose a conceptual framework for supporting diabetes care and research. Integrating the different and heterogeneous clinical information systems of healthcare facilities and creating a comprehensive diabetes data warehouse for research purposes would be possible by using the DIANET framework.
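
    The two questionnaire statistics named above follow standard definitions, Lawshe's content validity ratio CVR = (n_e - N/2) / (N/2) and Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), and can be sketched in a few lines; the ratings below are random placeholders, not the study's data.

```python
# Sketch of standard CVR (Lawshe) and Cronbach's alpha computations.
import numpy as np

def cvr(n_essential, n_raters):
    """Content validity ratio: fraction of raters calling an item essential."""
    return (n_essential - n_raters / 2) / (n_raters / 2)

def cronbach_alpha(scores):            # scores: (n_respondents, n_items)
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print(cvr(n_essential=9, n_raters=10))              # 0.8
ratings = np.random.default_rng(8).integers(5, 10, size=(10, 45))
print(cronbach_alpha(ratings))   # near 0 (possibly negative) for random data
```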

  19. Breaking barriers: a competency-based framework for promoting the integration of the pediatrician's education.

    PubMed

    Naghettini, Alessandra V; Bollela, Valdes R; Costa, Nilce M S C; Salgado, Luciana M R

    2011-01-01

    To describe the process of integration and revision of a pediatric program curriculum which resulted in the creation of a competency-based framework recommended in the Brazilian National Curricular Guidelines. Quali-quantitative analysis of an intervention evaluating students' and professors' perceptions of the pediatric program curriculum (focus groups and semi-structured interviews). Results were discussed during teaching development workshops. A competency-based framework was suggested for the pediatric program from the 3rd to the 6th year. The new curriculum was approved, implemented, and reevaluated six months later. Twelve students (12%) from the 3rd to the 6th year participated in the focus groups, and 11 professors (78.5%) answered the questionnaire. Most participants reported lack of integration among the courses, lack of knowledge about the learning goals of the internships, few opportunities for practice, and a predominance of theoretical evaluation. In the training workshops, a competency-based curriculum was created after pediatrics and collective health professors reached an agreement. The new curriculum was focused on general competencies, learning goals, the opportunities available to learn these goals, and the evaluation system. After six months, 93% (104/112) of students and 79% (11/14) of professors reported greater integration of the program and highlighted the inclusion of the clinical performance evaluation. The collective creation of a competency-based curriculum promoted higher satisfaction among students and professors. After being implemented, the new curriculum was considered to integrate the teaching practices and contents, improving the quality of the clinical performance evaluation.

  20. Making sense of complexity in context and implementation: the Context and Implementation of Complex Interventions (CICI) framework.

    PubMed

    Pfadenhauer, Lisa M; Gerhardus, Ansgar; Mozygemba, Kati; Lysdahl, Kristin Bakke; Booth, Andrew; Hofmann, Bjørn; Wahlster, Philip; Polus, Stephanie; Burns, Jacob; Brereton, Louise; Rehfuess, Eva

    2017-02-15

    The effectiveness of complex interventions, as well as their success in reaching relevant populations, is critically influenced by their implementation in a given context. Current conceptual frameworks often fail to address context and implementation in an integrated way and, where addressed, they tend to focus on organisational context and are mostly concerned with specific health fields. Our objective was to develop a framework to facilitate the structured and comprehensive conceptualisation and assessment of context and implementation of complex interventions. The Context and Implementation of Complex Interventions (CICI) framework was developed in an iterative manner and underwent extensive application. An initial framework based on a scoping review was tested in rapid assessments, revealing inconsistencies with respect to the underlying concepts. Thus, pragmatic utility concept analysis was undertaken to advance the concepts of context and implementation. Based on these findings, the framework was revised and applied in several systematic reviews, one health technology assessment (HTA) and one applicability assessment of very different complex interventions. Lessons learnt from these applications and from peer review were incorporated, resulting in the CICI framework. The CICI framework comprises three dimensions (context, implementation and setting) which interact with one another and with the intervention dimension. Context comprises seven domains (i.e., geographical, epidemiological, socio-cultural, socio-economic, ethical, legal, political); implementation consists of five domains (i.e., implementation theory, process, strategies, agents and outcomes); setting refers to the specific physical location in which the intervention is put into practice. The intervention, and the way it is implemented in a given setting and context, can operate on a micro, meso and macro level. Tools to operationalise the framework comprise a checklist, data extraction tools for qualitative and quantitative reviews, and a consultation guide for applicability assessments. The CICI framework addresses and graphically presents context, implementation and setting in an integrated way. It aims at simplifying and structuring complexity in order to advance our understanding of whether and how interventions work. The framework can be applied in systematic reviews and HTA as well as primary research, and facilitates communication among teams of researchers and with various stakeholders.

  1. Deterministic and fuzzy-based methods to evaluate community resilience

    NASA Astrophysics Data System (ADS)

    Kammouh, Omar; Noori, Ali Zamani; Taurino, Veronica; Mahin, Stephen A.; Cimellaro, Gian Paolo

    2018-04-01

    Community resilience is becoming a growing concern for authorities and decision makers. This paper introduces two indicator-based methods for evaluating the resilience of communities based on the PEOPLES framework. PEOPLES is a multi-layered framework that defines community resilience using seven dimensions. Each dimension is described through a set of resilience indicators collected from the literature, and each indicator is linked to a measure that allows its performance to be computed analytically. The first method proposed in this paper takes data on previous disasters as input and returns as output a performance function for each indicator and a performance function for the whole community. The second method exploits knowledge-based fuzzy modeling. It allows a quantitative evaluation of the PEOPLES indicators using descriptive knowledge rather than deterministic data, while accounting for the uncertainty involved in the analysis. The output of the fuzzy-based method is a resilience index for each indicator as well as a resilience index for the community. The paper also introduces an open-source online tool in which the first method is implemented; a case study illustrating the application of the first method and the usage of the tool is also provided.
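
    A minimal sketch of the fuzzy idea behind the second method: linguistic judgments about an indicator are combined through membership functions and defuzzified to a [0, 1] index. The term sets, breakpoints, and weights below are illustrative assumptions, not the PEOPLES implementation.

```python
# Fuzzify a descriptive indicator rating and defuzzify to a resilience index.
import numpy as np

x = np.linspace(0.0, 10.0, 1001)                          # indicator scale
low = np.clip((5.0 - x) / 5.0, 0.0, 1.0)                  # left shoulder
med = np.clip(1.0 - np.abs(x - 5.0) / 5.0, 0.0, 1.0)      # triangle at 5
high = np.clip((x - 5.0) / 5.0, 0.0, 1.0)                 # right shoulder

def fuzzy_indicator_index(weights):
    """Centroid defuzzification of a weighted combination of the term sets.
    weights: rule activations from expert judgment, e.g. 'mostly medium'."""
    mu = np.maximum.reduce([weights.get("low", 0.0) * low,
                            weights.get("medium", 0.0) * med,
                            weights.get("high", 0.0) * high])
    return float((x * mu).sum() / mu.sum()) / 10.0        # normalized to [0, 1]

# Expert judgment: the indicator is 'medium, leaning high'.
idx = fuzzy_indicator_index({"medium": 0.7, "high": 0.4})
# A community index could then aggregate (possibly weighted) indicator indices.
community = np.mean([idx, 0.62, 0.48])
print(round(idx, 3), round(community, 3))
```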

  2. Myocardium Segmentation From DE MRI Using Multicomponent Gaussian Mixture Model and Coupled Level Set.

    PubMed

    Liu, Jie; Zhuang, Xiahai; Wu, Lianming; An, Dongaolei; Xu, Jianrong; Peters, Terry; Gu, Lixu

    2017-11-01

    Objective: In this paper, we propose a fully automatic framework for myocardium segmentation of delayed-enhancement (DE) MRI images without relying on prior patient-specific information. Methods: We employ a multicomponent Gaussian mixture model to deal with the intensity heterogeneity of myocardium caused by the infarcts. To differentiate the myocardium from other tissues with similar intensities, while at the same time maintain spatial continuity, we introduce a coupled level set (CLS) to regularize the posterior probability. The CLS, as a spatial regularization, can be adapted to the image characteristics dynamically. We also introduce an image intensity gradient based term into the CLS, adding an extra force to the posterior probability based framework, to improve the accuracy of myocardium boundary delineation. The prebuilt atlases are propagated to the target image to initialize the framework. Results: The proposed method was tested on datasets of 22 clinical cases, and achieved Dice similarity coefficients of 87.43 ± 5.62% (endocardium), 90.53 ± 3.20% (epicardium) and 73.58 ± 5.58% (myocardium), which have outperformed three variants of the classic segmentation methods. Conclusion: The results can provide a benchmark for the myocardial segmentation in the literature. Significance: DE MRI provides an important tool to assess the viability of myocardium. The accurate segmentation of myocardium, which is a prerequisite for further quantitative analysis of myocardial infarction (MI) region, can provide important support for the diagnosis and treatment management for MI patients.
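
    The Dice similarity coefficient used to report these results is easy to reproduce; the sketch below computes it for two toy binary masks (the masks are synthetic stand-ins, not DE MRI labels).

```python
# Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(seg, ref):
    """DSC = 2|A ∩ B| / (|A| + |B|) for binary masks."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    inter = np.logical_and(seg, ref).sum()
    return 2.0 * inter / (seg.sum() + ref.sum())

# Toy 2D masks standing in for a predicted and a manual myocardium label.
a = np.zeros((64, 64), bool); a[20:40, 20:40] = True
b = np.zeros((64, 64), bool); b[24:44, 22:42] = True
print(f"DSC = {dice(a, b):.3f}")
```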

  3. Exploiting mAb structure characteristics for a directed QbD implementation in early process development.

    PubMed

    Karlberg, Micael; von Stosch, Moritz; Glassey, Jarka

    2018-03-07

    In today's biopharmaceutical industry, developing and producing a new monoclonal antibody takes years before it can be launched commercially. The reasons lie in the complexity of monoclonal antibodies and the need for high product quality to ensure clinical safety, both of which significantly lengthen process development. Frameworks such as quality by design (QbD) are becoming widely used in the pharmaceutical industry as they introduce a systematic approach for building quality into the product. However, full implementation of QbD has still not been achieved, mainly because of limited risk assessment of product properties and the large number of process factors affecting product quality that need to be investigated during process development. This has created a need for better methods and tools for early risk assessment and for predicting critical product properties and process factors, in order to enhance process development and reduce costs. In this review, we investigate how the quantitative structure-activity relationships (QSAR) framework can be applied to an existing process development framework such as QbD in order to increase product understanding based on the protein structure of monoclonal antibodies. Compared to QbD, where the effect of process parameters on the drug product is explored, QSAR takes the reverse perspective and investigates how the protein structure can affect performance in different unit operations. This provides valuable information for the early process development of new drug products, where limited process understanding is available. The QSAR methodology is therefore explored and explained in detail, and we investigate means of directly linking the structural properties of monoclonal antibodies to process data. The resulting information, used as a decision tool, can enhance risk assessment, better aid process development, and thereby overcome some of the limitations and challenges present in QbD implementation today.
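
    To make the QSAR-style mapping concrete, the sketch below regresses a hypothetical unit-operation response on made-up structure-derived descriptors using partial least squares; the descriptor set, data, and model choice are assumptions for illustration, not the review's method.

```python
# QSAR-style regression: structure descriptors -> process performance.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# 30 hypothetical mAbs x 5 descriptors (e.g., net charge, hydrophobic
# patch area, pI, ...) -- synthetic placeholders.
X = rng.normal(size=(30, 5))
# Synthetic unit-operation response (e.g., chromatographic yield).
y = X @ np.array([0.8, -0.5, 0.3, 0.0, 0.1]) + rng.normal(scale=0.2, size=30)

model = PLSRegression(n_components=2)
print(cross_val_score(model, X, y, cv=5, scoring="r2").mean())
```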

  4. Laboratory-scale in situ bioremediation in heterogeneous porous media: biokinetics-limited scenario.

    PubMed

    Song, Xin; Hong, Eunyoung; Seagren, Eric A

    2014-03-01

    Subsurface heterogeneities influence interfacial mass-transfer processes and affect the application of in situ bioremediation by impacting the availability of substrates to the microorganisms. However, for difficult-to-degrade compounds, and/or cases with inhibitory biodegradation conditions, slow biokinetics may also limit the overall bioremediation rate, or be as limiting as mass-transfer processes. In this work, a quantitative framework based on a set of dimensionless coefficients was used to capture the effects of the competing interfacial and biokinetic processes and define the overall rate-limiting process. An integrated numerical modeling and experimental approach was used to evaluate the application of the quantitative framework for a scenario in which slow biokinetics limited the overall bioremediation rate of a polycyclic aromatic hydrocarbon (naphthalene). Numerical modeling was conducted to simulate the groundwater flow and naphthalene transport and verify the system parameters, which were used in the quantitative framework application. The experiments examined the movement and biodegradation of naphthalene in a saturated, heterogeneous intermediate-scale flow cell with two layers of contrasting hydraulic conductivities. These experiments were conducted in two phases: Phase I, simulating inhibited, slow biodegradation; and Phase II, simulating an engineered bioremediation, with system perturbations selected to enhance the slow biodegradation rate. In Phase II, two engineered perturbations to the system were selected to examine their ability to enhance in situ biodegradation. In the first perturbation, nitrogen and phosphorus in excess of the required stoichiometric amounts were spiked into the influent solution to mimic a common remedial action taken in the field. The results showed that this perturbation had a moderate positive impact, consistent with slow biokinetics being the overall rate-limiting process. However, the second perturbation, designed to alleviate inhibition and increase the biodegradation rate, enhanced the overall biotransformation rate to a greater degree. Copyright © 2014 Elsevier B.V. All rights reserved.
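
    A generic illustration of the dimensionless-coefficient idea (not the specific coefficients defined in the paper): comparing a biokinetic Damköhler number against a mass-transfer analogue flags which process limits the overall rate. All formulas and values below are illustrative assumptions.

```python
# Compare generic Damköhler numbers to flag the rate-limiting process.
def damkohler_reaction(k_bio, length, velocity):
    """Biodegradation rate vs. advective transport: Da_rxn = k_bio * L / v."""
    return k_bio * length / velocity

def damkohler_mass_transfer(k_mt, length, velocity):
    """Interfacial mass transfer vs. advection: Da_mt = k_mt * L / v."""
    return k_mt * length / velocity

# Illustrative values: first-order rates in 1/d, length in m, velocity in m/d.
da_rxn = damkohler_reaction(k_bio=0.05, length=2.0, velocity=1.0)
da_mt = damkohler_mass_transfer(k_mt=0.8, length=2.0, velocity=1.0)
limiting = "biokinetics" if da_rxn < da_mt else "mass transfer"
print(f"Da_rxn = {da_rxn:.2f}, Da_mt = {da_mt:.2f} -> {limiting}-limited")
```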

  5. New Educational Services Development: Framework for Technology Entrepreneurship Education at Universities in Egypt

    ERIC Educational Resources Information Center

    Abou-Warda, Sherein Hamed

    2016-01-01

    Purpose: The overall objective of the current study is to explore how universities can better develop new educational services. The purpose of this paper is to develop a framework for technology entrepreneurship education (TEPE) within universities. Design/Methodology/Approach: Qualitative and quantitative research approaches were employed. This…

  6. The Conceptual Framework for the Development of a Mathematics Performance Assessment Instrument.

    ERIC Educational Resources Information Center

    Lane, Suzanne

    1993-01-01

    A conceptual framework is presented for the development of the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) Cognitive Assessment Instrument (QCAI), which focuses on the ability of middle-school students to problem solve, reason, and communicate mathematically. The instrument will provide programmatic rather than…

  7. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…
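
    Since this record is truncated, here is only a generic sketch of the input-oriented CCR data envelopment analysis (DEA) model commonly used for such efficiency scoring; the university inputs and outputs are invented, not the study's data or exact formulation.

```python
# Input-oriented CCR DEA via linear programming: for each decision-making
# unit (DMU), minimize the radial input contraction theta subject to the
# envelopment constraints.
import numpy as np
from scipy.optimize import linprog

X = np.array([[30, 5], [40, 7], [25, 6], [50, 9]], float)  # inputs: staff, budget
Y = np.array([[100], [140], [90], [160]], float)           # output: graduates

def ccr_efficiency(o):
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lambda_1..n]
    A_in = np.c_[-X[o], X.T]                    # X.T @ lam - theta * x_o <= 0
    A_out = np.c_[np.zeros(s), -Y.T]            # -Y.T @ lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```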

  8. A Typology of Mixed Methods Sampling Designs in Social Science Research

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Collins, Kathleen M. T.

    2007-01-01

    This paper provides a framework for developing sampling designs in mixed methods research. First, we present sampling schemes that have been associated with quantitative and qualitative research. Second, we discuss sample size considerations and provide sample size recommendations for each of the major research designs for quantitative and…

  9. Dual-Enrollment High-School Graduates' College-Enrollment Considerations

    ERIC Educational Resources Information Center

    Damrow, Roberta J.

    2017-01-01

    This quantitative study examined the college-enrollment considerations of dual-enrollment students enrolling at one Wisconsin credit-granting technical college. A combined college-choice theoretical framework guided the study, which addressed two research questions: To what extent, if any, did the number of dual credits predict likelihood…

  10. Filling the knowledge gap: Integrating quantitative genetics and genomics in graduate education and outreach

    USDA-ARS?s Scientific Manuscript database

    The genomics revolution provides vital tools to address global food security. Yet to be incorporated into livestock breeding, molecular techniques need to be integrated into a quantitative genetics framework. Within the U.S., with shrinking numbers of faculty who have the requisite skills, the capacity to ...

  11. Framework for a Quantitative Systemic Toxicity Model (FutureToxII)

    EPA Science Inventory

    EPA’s ToxCast program profiles the bioactivity of chemicals in a diverse set of ~700 high throughput screening (HTS) assays. In collaboration with L’Oreal, a quantitative model of systemic toxicity was developed using no effect levels (NEL) from ToxRefDB for 633 chemicals with HT...

  12. Expert consensus on best evaluative practices in community-based rehabilitation.

    PubMed

    Grandisson, Marie; Thibeault, Rachel; Hébert, Michèle; Cameron, Debra

    2016-01-01

    The objective of this study was to generate expert consensus on best evaluative practices for community-based rehabilitation (CBR). This consensus includes key features of the evaluation process and methods, and discussion of whether a shared framework should be used to report findings and, if so, which framework should play this role. A Delphi study with two predefined rounds was conducted. Experts in CBR from a wide range of geographical areas and disciplinary backgrounds were recruited to complete the questionnaires. Both quantitative and qualitative analyses were performed to generate the recommendations for best practices in CBR evaluation. A panel of 42 experts reached consensus on 13 recommendations for best evaluative practices in CBR. With regard to the critical qualities of sound CBR evaluation processes, panellists emphasized that these processes should be inclusive, participatory, empowering and respectful of local cultures and languages. The group agreed that evaluators should consider the use of mixed methods and participatory tools, and should combine indicators from a universal list of CBR indicators with locally generated ones. The group also agreed that a common framework should guide CBR evaluations, and that this framework should be a flexible combination of the CBR Matrix and the CBR Principles. An expert panel reached consensus on key features of best evaluative practices in CBR. Knowledge transfer initiatives are now required to develop guidelines, tools and training opportunities to facilitate CBR program evaluations. CBR evaluation processes should strive to be inclusive, participatory, empowering and respectful of local cultures and languages. CBR evaluators should strongly consider using mixed methods, participatory tools, and a combination of locally generated indicators with indicators drawn from a bank of CBR indicators. CBR evaluations should be situated within a shared but flexible framework; this shared framework could combine the CBR Matrix and the CBR Principles.

  13. A framework for scalable parameter estimation of gene circuit models using structural information.

    PubMed

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-07-01

    Systematic and scalable parameter estimation is key to constructing complex gene regulatory models and, ultimately, to facilitating an integrative systems biology approach to quantitatively understanding the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics, based on three synthetic datasets and one time-series microarray dataset. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher-quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied to the modeling of gene circuits, our results suggest that more tailored approaches exploiting domain-specific information may be key to reverse engineering complex biological systems. Availability: http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.
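
    The paper's decomposition algorithm is not reproduced here; as a conventional baseline of the kind it is compared against, the sketch below fits the parameters of a toy Hill-repression gene circuit ODE to synthetic noisy data with off-the-shelf SciPy routines.

```python
# Conventional (non-decomposed) parameter estimation for a toy gene circuit.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def circuit(t, x, beta, K, n, gamma):
    """dx/dt: Hill-repressed production minus first-order degradation."""
    return beta / (1.0 + (x[0] / K) ** n) - gamma * x[0]

t_obs = np.linspace(0, 20, 40)
true_params = (2.0, 1.0, 2.0, 0.3)
sol = solve_ivp(circuit, (0, 20), [0.1], t_eval=t_obs, args=true_params)
rng = np.random.default_rng(2)
data = sol.y[0] + rng.normal(scale=0.05, size=t_obs.size)  # noisy observations

def residuals(p):
    s = solve_ivp(circuit, (0, 20), [0.1], t_eval=t_obs, args=tuple(p))
    return s.y[0] - data

fit = least_squares(residuals, x0=[1.0, 0.5, 1.5, 0.1],
                    bounds=([0.0] * 4, [10.0] * 4))
print(fit.x)   # estimates of (beta, K, n, gamma)
```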

  14. Quantitative imaging biomarker ontology (QIBO) for knowledge representation of biomedical imaging biomarkers.

    PubMed

    Buckler, Andrew J; Liu, Tiffany Ting; Savig, Erica; Suzek, Baris E; Ouellette, M; Danagoulian, J; Wernsing, G; Rubin, Daniel L; Paik, David

    2013-08-01

    A widening array of novel imaging biomarkers is being developed using ever more powerful clinical and preclinical imaging modalities. These biomarkers have demonstrated effectiveness in quantifying biological processes as they occur in vivo and in the early prediction of therapeutic outcomes. However, quantitative imaging biomarker data and knowledge are not standardized, representing a critical barrier to accumulating medical knowledge based on quantitative imaging data. We use an ontology to represent, integrate, and harmonize heterogeneous knowledge across the domain of imaging biomarkers. This advances the goal of developing applications to (1) improve precision and recall of storage and retrieval of quantitative imaging-related data using standardized terminology; (2) streamline the discovery and development of novel imaging biomarkers by normalizing knowledge across heterogeneous resources; (3) effectively annotate imaging experiments thus aiding comprehension, re-use, and reproducibility; and (4) provide validation frameworks through rigorous specification as a basis for testable hypotheses and compliance tests. We have developed the Quantitative Imaging Biomarker Ontology (QIBO), which currently consists of 488 terms spanning the following upper classes: experimental subject, biological intervention, imaging agent, imaging instrument, image post-processing algorithm, biological target, indicated biology, and biomarker application. We have demonstrated that QIBO can be used to annotate imaging experiments with standardized terms in the ontology and to generate hypotheses for novel imaging biomarker-disease associations. Our results established the utility of QIBO in enabling integrated analysis of quantitative imaging data.

  15. A Quantitative ADME-based Tool for Exploring Human ...

    EPA Pesticide Factsheets

    Exposure to a wide range of chemicals through our daily habits and routines is ubiquitous and largely unavoidable within modern society. The potential for human exposure, however, has not been quantified for the vast majority of chemicals with wide commercial use. Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. The U.S. Environmental Protection Agency Office of Research and Development is developing, or collaborating in the development of, scientifically-defensible methods for making quantitative or semi-quantitative exposure predictions. The Exposure Prioritization (Ex Priori) model is a simplified, quantitative visual dashboard that provides a rank-ordered internalized dose metric to simultaneously explore exposures across chemical space (not chemical by chemical). Diverse data streams are integrated within the interface such that different exposure scenarios for “individual,” “population,” or “professional” time-use profiles can be interchanged to tailor exposure and quantitatively explore multi-chemical signatures of exposure, internalized dose (uptake), body burden, and elimination. Ex Priori has been designed as an adaptable systems framework that synthesizes knowledge from various domains and is amenable to new knowledge/information. As such, it algorithmically captures the totality of exposure across pathways. It

  16. Automated compromised right lung segmentation method using a robust atlas-based active volume model with sparse shape composition prior in CT.

    PubMed

    Zhou, Jinghao; Yan, Zhennan; Lasio, Giovanni; Huang, Junzhou; Zhang, Baoshe; Sharma, Navesh; Prado, Karl; D'Souza, Warren

    2015-12-01

    To resolve challenges in image segmentation in oncologic patients with severely compromised lungs, we propose an automated right lung segmentation framework that uses a robust, atlas-based active volume model with a sparse shape composition prior. The robust atlas is achieved by combining the atlas with the output of sparse shape composition. Thoracic computed tomography images (n=38) from patients with lung tumors were collected. The right lung in each scan was manually segmented to build a reference training dataset against which the performance of the automated segmentation method was assessed. The proposed segmentation method with sparse shape composition achieved a mean Dice similarity coefficient (DSC) of (0.72, 0.81) (95% CI), a mean accuracy (ACC) of (0.97, 0.98) (95% CI), and a mean relative error (RE) of (0.46, 0.74) (95% CI). Both qualitative and quantitative comparisons suggest that the proposed method can achieve better segmentation accuracy with less variance than other atlas-based segmentation methods in compromised lung segmentation. Published by Elsevier Ltd.

  17. Image-based metrology of porous tissue engineering scaffolds

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Srinivasan; Robb, Richard A.

    2006-03-01

    Tissue engineering is an interdisciplinary effort aimed at the repair and regeneration of biological tissues through the application and control of cells, porous scaffolds and growth factors. The regeneration of specific tissues guided by tissue-analogous substrates depends on diverse scaffold architectural indices that can be derived quantitatively from microCT and microMR images of the scaffolds. However, the randomness of pore-solid distributions in conventional stochastic scaffolds presents unique computational challenges. As a result, image-based characterization of scaffolds has been predominantly qualitative. In this paper, we discuss quantitative image-based techniques that can be used to compute the metrological indices of porous tissue engineering scaffolds. While bulk averaged quantities such as porosity and surface area are derived directly from the optimal pore-solid delineations, the spatially distributed geometric indices are derived from medial axis representations of the pore network. The computational framework proposed in this paper (to the best of our knowledge, for the first time in tissue engineering) might have profound implications for unraveling the symbiotic structure-function relationship of porous tissue engineering scaffolds.

  18. System-wide organization of actin cytoskeleton determines organelle transport in hypocotyl plant cells

    PubMed Central

    Nowak, Jacqueline; Ivakov, Alexander; Somssich, Marc; Persson, Staffan; Nikoloski, Zoran

    2017-01-01

    The actin cytoskeleton is an essential intracellular filamentous structure that underpins cellular transport and cytoplasmic streaming in plant cells. However, the system-level properties of actin-based cellular trafficking remain tenuous, largely due to the inability to quantify key features of the actin cytoskeleton. Here, we developed an automated image-based, network-driven framework to accurately segment and quantify actin cytoskeletal structures and Golgi transport. We show that the actin cytoskeleton in both growing and elongated hypocotyl cells has structural properties facilitating efficient transport. Our findings suggest that the erratic movement of Golgi is a stable cellular phenomenon that might optimize distribution efficiency of cell material. Moreover, we demonstrate that Golgi transport in hypocotyl cells can be accurately predicted from the actin network topology alone. Thus, our framework provides quantitative evidence for system-wide coordination of cellular transport in plant cells and can be readily applied to investigate cytoskeletal organization and transport in other organisms. PMID:28655850

  19. Predictive models of safety based on audit findings: Part 1: Model development and reliability.

    PubMed

    Hsiao, Yu-Lin; Drury, Colin; Wu, Changxu; Paquet, Victor

    2013-03-01

    This study, reported in two consecutive parts, aimed at the quantitative validation of safety audit tools as predictors of safety performance, as we were unable to find prior studies that tested audit validity against safety outcomes. An aviation maintenance domain was chosen for this work because both audits and safety outcomes are currently prescribed and regulated there. In Part 1, we developed a Human Factors/Ergonomics classification framework, based on the HFACS model (Shappell and Wiegmann, 2001a,b), for the human errors detected by audits, because merely counting audit findings did not predict future safety. The framework was tested for measurement reliability using four participants, two of whom classified errors on 1238 audit reports. Kappa values leveled out after about 200 audits at between 0.5 and 0.8 for the different tiers of error categories. This showed sufficient reliability to proceed with predictive validity testing in Part 2. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
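
    The inter-rater statistic reported above can be computed as follows; the two coders' labels are invented HFACS-style categories, not the study's audit data.

```python
# Cohen's kappa for two coders assigning findings to error categories.
import numpy as np

def cohens_kappa(rater1, rater2):
    cats = sorted(set(rater1) | set(rater2))
    idx = {c: i for i, c in enumerate(cats)}
    m = np.zeros((len(cats), len(cats)))
    for a, b in zip(rater1, rater2):
        m[idx[a], idx[b]] += 1                    # confusion matrix
    n = m.sum()
    p_o = np.trace(m) / n                         # observed agreement
    p_e = (m.sum(0) * m.sum(1)).sum() / n ** 2    # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

coder1 = ["skill", "decision", "skill", "perceptual", "skill", "decision"]
coder2 = ["skill", "decision", "decision", "perceptual", "skill", "skill"]
print(round(cohens_kappa(coder1, coder2), 3))
```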

  20. Biologically Informed Individual-Based Network Model for Rift Valley Fever in the US and Evaluation of Mitigation Strategies

    PubMed Central

    Scoglio, Caterina M.

    2016-01-01

    Rift Valley fever (RVF) is a zoonotic disease endemic in sub-Saharan Africa with periodic outbreaks in human and animal populations. Mosquitoes are the primary disease vectors; however, Rift Valley fever virus (RVFV) can also spread by direct contact with infected tissues. The transmission cycle is complex, involving humans, livestock, and multiple species of mosquitoes. The epidemiology of RVFV in endemic areas is strongly affected by climatic conditions and environmental variables. In this research, we adapt and use a network-based modeling framework to simulate the transmission of RVFV among hypothetical cattle operations in Kansas, US. Our model considers geo-located livestock populations at the individual level while incorporating the role of mosquito populations and the environment at a coarse resolution. Extensive simulations show the flexibility of our modeling framework when applied to specific scenarios to quantitatively evaluate the efficacy of mosquito control and livestock movement regulations in reducing the extent and intensity of RVF outbreaks in the United States. PMID:27662585

  1. Biologically Informed Individual-Based Network Model for Rift Valley Fever in the US and Evaluation of Mitigation Strategies.

    PubMed

    Scoglio, Caterina M; Bosca, Claudio; Riad, Mahbubul H; Sahneh, Faryad D; Britch, Seth C; Cohnstaedt, Lee W; Linthicum, Kenneth J

    Rift Valley fever (RVF) is a zoonotic disease endemic in sub-Saharan Africa with periodic outbreaks in human and animal populations. Mosquitoes are the primary disease vectors; however, Rift Valley fever virus (RVFV) can also spread by direct contact with infected tissues. The transmission cycle is complex, involving humans, livestock, and multiple species of mosquitoes. The epidemiology of RVFV in endemic areas is strongly affected by climatic conditions and environmental variables. In this research, we adapt and use a network-based modeling framework to simulate the transmission of RVFV among hypothetical cattle operations in Kansas, US. Our model considers geo-located livestock populations at the individual level while incorporating the role of mosquito populations and the environment at a coarse resolution. Extensive simulations show the flexibility of our modeling framework when applied to specific scenarios to quantitatively evaluate the efficacy of mosquito control and livestock movement regulations in reducing the extent and intensity of RVF outbreaks in the United States.
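
    The paper's individual-based model is far richer than what fits here; purely as an illustration of network-based outbreak simulation, below is a discrete-time stochastic SIR process on a small synthetic contact network, with a per-edge transmission probability standing in for vector-mediated contact. All parameters and the network are hypothetical.

```python
# Stochastic SIR spread on a synthetic contact network of farm nodes.
import random
import networkx as nx

def simulate_sir(G, beta=0.15, gamma=0.1, seed_node=0, steps=200, rng=None):
    rng = rng or random.Random(42)
    state = {n: "S" for n in G}
    state[seed_node] = "I"
    for _ in range(steps):
        new = dict(state)
        for n, s in state.items():
            if s == "I":
                for nb in G.neighbors(n):          # try to infect neighbors
                    if state[nb] == "S" and rng.random() < beta:
                        new[nb] = "I"
                if rng.random() < gamma:           # recovery
                    new[n] = "R"
        state = new
    return sum(s != "S" for s in state.values())   # final outbreak size

G = nx.watts_strogatz_graph(100, 4, 0.1, seed=1)
print("final outbreak size:", simulate_sir(G))
```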

  2. Metal-organic frameworks as biosensors for luminescence-based detection and imaging

    PubMed Central

    Miller, Sophie E.; Teplensky, Michelle H.; Moghadam, Peyman Z.; Fairen-Jimenez, David

    2016-01-01

    Metal-organic frameworks (MOFs), formed by the self-assembly of metal centres or clusters and organic linkers, possess many key structural and chemical features that have enabled them to be used in sensing platforms for a variety of environmentally, chemically and biomedically relevant compounds. In particular, their high porosity, large surface area, tuneable chemical composition, high degree of crystallinity, and potential for post-synthetic modification for molecular recognition make MOFs promising candidates for biosensing applications. In this review, we separate our discussion of MOF biosensors into two categories: quantitative sensing, focusing specifically on luminescence-based sensors for the direct measurement of a specific analyte, and qualitative sensing, where we describe MOFs used for fluorescence microscopy and as magnetic resonance imaging contrast agents. We highlight several key publications in each of these areas, concluding that MOFs present an exciting, versatile new platform for biosensing applications and imaging, and we expect to see their usage grow as the field progresses. PMID:27499847

  3. Gaseous species as reaction tracers in the solvothermal synthesis of the zinc oxide terephthalate MOF-5.

    PubMed

    Hausdorf, Steffen; Baitalow, Felix; Seidel, Jürgen; Mertens, Florian O R L

    2007-05-24

    Gaseous species emitted during the zinc oxide/zinc hydroxide 1,4-benzenedicarboxylate metal organic framework synthesis (MOF-5, MOF-69c) have been used to investigate the reaction scheme that leads to the framework creation. Changes of the gas-phase composition over time indicate that the decomposition of the solvent diethylformamide occurs at least via two competing reaction pathways that can be linked to the reaction's overall water and pH management. From isotope exchange experiments, we deduce that one of the decomposition pathways leads to the removal of water from the reaction mixture, which sets the conditions when the synthesis of an oxide-based (MOF-5) instead of an hydroxide-based MOF (MOF-69c) occurs. A quantitative account of most reactants and byproducts before and after the MOF-5/MOF-69c synthesis is presented. From the investigation of the reaction intermediates and byproducts, we derive a proposal of a basic reaction scheme for the standard synthesis zinc oxide carboxylate MOFs.

  4. An ice sheet model validation framework for the Greenland ice sheet

    NASA Astrophysics Data System (ADS)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of < 1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.

  5. An ice sheet model validation framework for the Greenland ice sheet

    PubMed Central

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.

    2018-01-01

    We propose a new ice sheet model validation framework – the Cryospheric Model Comparison Tool (CmCt) – that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation. PMID:29697704

  6. An Ice Sheet Model Validation Framework for the Greenland Ice Sheet

    NASA Technical Reports Server (NTRS)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas A.; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey R.; Chambers, Don P.; Evans, Katherine J.; et al.

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of less than 1 meter). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on Greenland over the past few decades. An extensible design will allow for continued use of the CmCt as future altimetry, gravimetry, and other remotely sensed data become available for use in ice sheet model validation.
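
    In the spirit of the CmCt's quantitative scores, the sketch below computes a mean difference and RMSE between modeled and observed elevation grids over valid observation cells; the grids are synthetic and this is not the CmCt code.

```python
# Simple model-vs-observation skill metrics on gridded elevation data.
import numpy as np

rng = np.random.default_rng(3)
obs = rng.normal(0.0, 1.0, size=(50, 50))          # observed elevation change (m)
obs[rng.random(obs.shape) < 0.2] = np.nan          # cells with no observation
model = obs + rng.normal(0.1, 0.5, size=obs.shape) # modeled field with bias+noise

valid = ~np.isnan(obs)                             # score only observed cells
diff = model[valid] - obs[valid]
print(f"mean difference = {diff.mean():.3f} m, "
      f"RMSE = {np.sqrt((diff ** 2).mean()):.3f} m")
```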

  7. AESOP: A Python Library for Investigating Electrostatics in Protein Interactions.

    PubMed

    Harrison, Reed E S; Mohan, Rohith R; Gorham, Ronald D; Kieslich, Chris A; Morikis, Dimitrios

    2017-05-09

    Electric fields often play a role in guiding the association of protein complexes. Such interactions can be further engineered to accelerate complex association, resulting in protein systems with increased productivity. This is especially true for enzymes, where reaction rates are typically diffusion limited. To facilitate quantitative comparisons of electrostatics in protein families and to describe the electrostatic contributions of individual amino acids, we previously developed a computational framework called AESOP. We now implement this computational tool in Python with increased usability and the capability of performing calculations in parallel. AESOP utilizes PDB2PQR and the Adaptive Poisson-Boltzmann Solver (APBS) to generate grid-based electrostatic potential files for protein structures provided by the end user. AESOP includes methods for quantitatively comparing sets of grid-based electrostatic potentials in terms of similarity, and for generating ensembles of electrostatic potential files for a library of mutants to quantify the effects of perturbations in protein structure and protein-protein association. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
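
    AESOP's own API is not reproduced here; as a generic example of quantitatively comparing grid-based potentials, the sketch below computes the Hodgkin similarity index, one common electrostatic similarity measure, between two synthetic potential grids with numpy.

```python
# Hodgkin similarity index between two grid-based electrostatic potentials.
import numpy as np

def hodgkin_index(phi1, phi2):
    """HSI = 2<phi1, phi2> / (<phi1, phi1> + <phi2, phi2>);
    1 = identical potentials, -1 = anticorrelated."""
    num = 2.0 * np.sum(phi1 * phi2)
    return num / (np.sum(phi1 ** 2) + np.sum(phi2 ** 2))

rng = np.random.default_rng(4)
wild_type = rng.normal(size=(40, 40, 40))       # potential on a 3D grid
mutant = wild_type.copy()
mutant[20:25, 20:25, 20:25] += 0.5              # local perturbation near a "mutation"
print(f"HSI = {hodgkin_index(wild_type, mutant):.4f}")
```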

  8. Variation in Research Designs Used to Test the Effectiveness of Dissemination and Implementation Strategies: A Review.

    PubMed

    Mazzucca, Stephanie; Tabak, Rachel G; Pilar, Meagan; Ramsey, Alex T; Baumann, Ana A; Kryzer, Emily; Lewis, Ericka M; Padek, Margaret; Powell, Byron J; Brownson, Ross C

    2018-01-01

    The need for optimal study designs in dissemination and implementation (D&I) research is increasingly recognized. Despite the wide range of study designs available for D&I research, we lack understanding of the types of designs and methodologies that are routinely used in the field. This review assesses the designs and methodologies in recently proposed D&I studies and provides resources to guide design decisions. We reviewed 404 study protocols published in the journal Implementation Science from 2/2006 to 9/2017. Eligible studies tested the efficacy or effectiveness of D&I strategies (i.e., not the effectiveness of the underlying clinical or public health intervention); had a comparison by group and/or time; and used ≥1 quantitative measure. Several design elements were extracted: design category (e.g., randomized); design type [e.g., cluster randomized controlled trial (RCT)]; data type (e.g., quantitative); D&I theoretical framework; levels of treatment assignment, intervention, and measurement; and country in which the research was conducted. Each protocol was double-coded, and discrepancies were resolved through discussion. Of the 404 protocols reviewed, 212 (52%) studies, reported in 208 manuscripts, tested one or more implementation strategies and therefore met the inclusion criteria. Of the included studies, 77% utilized randomized designs, primarily cluster RCTs. The use of alternative designs (e.g., stepped wedge) increased over time. Fewer studies were quasi-experimental (17%) or observational (6%). Many study design categories (e.g., controlled pre-post, matched pair cluster design) were represented by only one or two studies. Most articles proposed quantitative and qualitative methods (61%), with the remaining 39% proposing only quantitative. Half of the protocols (52%) reported using a theoretical framework to guide the study. The four most frequently reported frameworks were the Consolidated Framework for Implementation Research and RE-AIM (n = 16 each), followed by Promoting Action on Research Implementation in Health Services and the Theoretical Domains Framework (n = 12 each). While several novel designs for D&I research have been proposed (e.g., stepped wedge, adaptive designs), the majority of the studies in our sample employed RCT designs. Alternative study designs are increasing in use but may be underutilized for a variety of reasons, including the preferences of funders or lack of awareness of these designs. Promisingly, the prevalent use of quantitative and qualitative methods together reflects methodological innovation in newer D&I research.

  9. Whole-body direct 4D parametric PET imaging employing nested generalized Patlak expectation-maximization reconstruction

    PubMed Central

    Karakatsanis, Nicolas A.; Casey, Michael E.; Lodge, Martin A.; Rahmim, Arman; Zaidi, Habib

    2016-01-01

    Whole-body (WB) dynamic PET has recently demonstrated its potential in translating the quantitative benefits of parametric imaging to the clinic. Post-reconstruction standard Patlak (sPatlak) WB graphical analysis utilizes multi-bed multi-pass PET acquisition to produce quantitative WB images of the tracer influx rate Ki as a complementary metric to the semi-quantitative standardized uptake value (SUV). The resulting Ki images may suffer from high noise due to the need for short acquisition frames. Meanwhile, a generalized Patlak (gPatlak) WB post-reconstruction method had been suggested to limit Ki bias of sPatlak analysis at regions with non-negligible 18F-FDG uptake reversibility; however, gPatlak analysis is non-linear and thus can further amplify noise. In the present study, we implemented, within the open-source Software for Tomographic Image Reconstruction (STIR) platform, a clinically adoptable 4D WB reconstruction framework enabling efficient estimation of sPatlak and gPatlak images directly from dynamic multi-bed PET raw data with substantial noise reduction. Furthermore, we employed the optimization transfer methodology to accelerate 4D expectation-maximization (EM) convergence by nesting the fast image-based estimation of Patlak parameters within each iteration cycle of the slower projection-based estimation of dynamic PET images. The novel gPatlak 4D method was initialized from an optimized set of sPatlak ML-EM iterations to facilitate EM convergence. Initially, realistic simulations were conducted utilizing published 18F-FDG kinetic parameters coupled with the XCAT phantom. Quantitative analyses illustrated enhanced Ki target-to-background ratio (TBR) and especially contrast-to-noise ratio (CNR) performance for the 4D vs. the indirect methods and static SUV. Furthermore, considerable convergence acceleration was observed for the nested algorithms involving 10–20 sub-iterations. Moreover, systematic reduction in Ki % bias and improved TBR were observed for gPatlak vs. sPatlak. Finally, validation on clinical WB dynamic data demonstrated the clinical feasibility and superior Ki CNR performance for the proposed 4D framework compared to indirect Patlak and SUV imaging. PMID:27383991
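
    The standard Patlak model underlying this work linearizes the tissue curve as C_t(t)/C_p(t) = Ki * (integral of C_p from 0 to t)/C_p(t) + V, so Ki is a late-time slope. The sketch below recovers Ki and V from synthetic curves by ordinary linear regression, i.e. an indirect, post-reconstruction fit, not the paper's nested 4D EM algorithm.

```python
# Standard Patlak graphical analysis on synthetic tracer curves.
import numpy as np

t = np.linspace(1, 60, 30)                       # minutes
cp = 10.0 * np.exp(-0.1 * t) + 1.0               # synthetic plasma input
# Cumulative trapezoidal integral of the input function.
integral = np.concatenate([[0.0],
                           np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))])
ki_true, v_true = 0.02, 0.4
ct = ki_true * integral + v_true * cp            # irreversible tissue curve

x = integral / cp                                # "Patlak time"
y = ct / cp
late = t > 20                                    # fit after quasi-equilibrium t*
ki, v = np.polyfit(x[late], y[late], 1)
print(f"Ki = {ki:.4f} /min, V = {v:.3f}")
```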

  10. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration science there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses for project prioritization. The Logic Model analyzes potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform managers and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest-priority locations within the river corridor for restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resource management actions like restoration prioritization is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but a scientific strategy that management must embrace and apply in its decision framework.

  11. Whole-body direct 4D parametric PET imaging employing nested generalized Patlak expectation-maximization reconstruction

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Nicolas A.; Casey, Michael E.; Lodge, Martin A.; Rahmim, Arman; Zaidi, Habib

    2016-08-01

    Whole-body (WB) dynamic PET has recently demonstrated its potential in translating the quantitative benefits of parametric imaging to the clinic. Post-reconstruction standard Patlak (sPatlak) WB graphical analysis utilizes multi-bed multi-pass PET acquisition to produce quantitative WB images of the tracer influx rate K i as a complementary metric to the semi-quantitative standardized uptake value (SUV). The resulting K i images may suffer from high noise due to the need for short acquisition frames. Meanwhile, a generalized Patlak (gPatlak) WB post-reconstruction method had been suggested to limit K i bias of sPatlak analysis at regions with non-negligible 18F-FDG uptake reversibility; however, gPatlak analysis is non-linear and thus can further amplify noise. In the present study, we implemented, within the open-source software for tomographic image reconstruction platform, a clinically adoptable 4D WB reconstruction framework enabling efficient estimation of sPatlak and gPatlak images directly from dynamic multi-bed PET raw data with substantial noise reduction. Furthermore, we employed the optimization transfer methodology to accelerate 4D expectation-maximization (EM) convergence by nesting the fast image-based estimation of Patlak parameters within each iteration cycle of the slower projection-based estimation of dynamic PET images. The novel gPatlak 4D method was initialized from an optimized set of sPatlak ML-EM iterations to facilitate EM convergence. Initially, realistic simulations were conducted utilizing published 18F-FDG kinetic parameters coupled with the XCAT phantom. Quantitative analyses illustrated enhanced K i target-to-background ratio (TBR) and especially contrast-to-noise ratio (CNR) performance for the 4D versus the indirect methods and static SUV. Furthermore, considerable convergence acceleration was observed for the nested algorithms involving 10-20 sub-iterations. Moreover, systematic reduction in K i % bias and improved TBR were observed for gPatlak versus sPatlak. Finally, validation on clinical WB dynamic data demonstrated the clinical feasibility and superior K i CNR performance for the proposed 4D framework compared to indirect Patlak and SUV imaging.

  12. A Mixed-Methods Research Framework for Healthcare Process Improvement.

    PubMed

    Bastian, Nathaniel D; Munoz, David; Ventura, Marta

    2016-01-01

    The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.

  13. Topology Optimization using the Level Set and eXtended Finite Element Methods: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Villanueva Perez, Carlos Hernan

    Computational design optimization provides designers with automated techniques to develop novel and non-intuitive optimal designs. Topology optimization is a design optimization technique that allows for the evolution of a broad variety of geometries in the optimization process. Traditional density-based topology optimization methods often lack sufficient resolution of the geometry and physical response, which prevents direct use of the optimized design in manufacturing and accurate modeling of the physical response at boundary conditions. The goal of this thesis is to introduce a unified topology optimization framework that uses the Level Set Method (LSM) to describe the design geometry and the eXtended Finite Element Method (XFEM) to solve the governing equations and measure the performance of the design. The methodology is presented as an alternative to density-based optimization approaches and is able to accommodate a broad range of engineering design problems. The framework employs state-of-the-art immersed boundary techniques to stabilize the systems of equations and enforce the boundary conditions, and it is studied through applications to 2D and 3D linear elastic structures, incompressible flow, and energy and species transport problems that test the robustness and characteristics of the method. The framework is compared against density-based topology optimization approaches with regard to convergence, performance, and the manufacturability of the resulting designs. Furthermore, the ability to control the shape of the design so that it operates within manufacturing constraints is developed and studied. The analysis capability of the framework is validated quantitatively through comparison against previous benchmark studies, and qualitatively through its application to topology optimization problems; the design optimization problems converged to intuitive designs that closely resembled the results of previous 2D and density-based studies.
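
    A minimal sketch of the level-set mechanics the framework builds on: one first-order upwind (Godunov) update of the Hamilton-Jacobi equation phi_t + V |grad phi| = 0, which advances the zero contour, i.e. the design boundary, at speed V. This is illustrative only and omits the XFEM analysis and the sensitivity-driven speed field.

```python
# One upwind Hamilton-Jacobi step moving a level-set boundary at speed V.
import numpy as np

def hj_step(phi, V, dx, dt):
    """First-order Godunov update for phi_t + V |grad phi| = 0 on a 2D grid."""
    dxm = (phi - np.roll(phi, 1, 0)) / dx    # backward differences
    dxp = (np.roll(phi, -1, 0) - phi) / dx   # forward differences
    dym = (phi - np.roll(phi, 1, 1)) / dx
    dyp = (np.roll(phi, -1, 1) - phi) / dx
    grad_plus = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                        np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
    grad_minus = np.sqrt(np.minimum(dxm, 0)**2 + np.maximum(dxp, 0)**2 +
                         np.minimum(dym, 0)**2 + np.maximum(dyp, 0)**2)
    return phi - dt * (np.maximum(V, 0) * grad_plus +
                       np.minimum(V, 0) * grad_minus)

# Signed distance to a circle; a uniform positive speed expands the phi < 0
# region (the "material" bounded by the zero contour).
n, dx = 128, 1.0 / 128
y, x = np.mgrid[0:n, 0:n] * dx
phi = np.sqrt((x - 0.5)**2 + (y - 0.5)**2) - 0.25
for _ in range(50):
    phi = hj_step(phi, V=1.0, dx=dx, dt=0.5 * dx)   # CFL-limited step
print("zero-level area fraction:", (phi < 0).mean())
```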

  14. Distributed decision-making in electric power system transmission maintenance scheduling using multi-agent systems (MAS)

    NASA Astrophysics Data System (ADS)

    Zhang, Zhong

    In this work, motivated by the need to coordinate transmission maintenance scheduling among a multiplicity of self-interested entities in the restructured power industry, a distributed decision support framework based on multiagent negotiation systems (MANS) is developed. An innovative risk-based transmission maintenance optimization procedure is introduced. Several models for linking condition monitoring information to the equipment's instantaneous failure probability are presented, which enable quantitative evaluation of the effectiveness of maintenance activities in terms of cumulative system risk reduction. Methodologies for statistical processing, equipment deterioration evaluation and time-dependent failure probability calculation are also described. A novel framework capable of facilitating distributed decision-making through multiagent negotiation is developed. A multiagent negotiation model that accounts for uncertainty and enables social rationality is developed and illustrated. Issues of multiagent negotiation convergence and scalability are discussed, and the relationships between agent-based negotiation and auction systems are identified. A four-step MAS design methodology for constructing multiagent systems for power system applications is presented. A generic multiagent negotiation system, capable of inter-agent communication and distributed decision support through inter-agent negotiations, is implemented. A multiagent system framework for facilitating the automated integration of condition monitoring information and maintenance scheduling for power transformers is developed. Simulations of multiagent negotiation-based maintenance scheduling among several independent utilities are provided. The approach is shown to be a viable alternative to the traditional centralized optimization paradigm in today's deregulated environment. This multiagent system framework not only facilitates decision-making among competing power system entities, but also provides a tool for studying a competitive industry relative to a monopolistic one.

  15. Modeling Synergistic Drug Inhibition of Mycobacterium tuberculosis Growth in Murine Macrophages

    DTIC Science & Technology

    2011-01-01

    important application of metabolic network modeling is the ability to quantitatively model metabolic enzyme inhibition and predict bacterial growth...describe the extensions of this framework to model drug-induced growth inhibition of M. tuberculosis in macrophages. Mathematical framework: Fig. 1 shows...starting point, we used the previously developed iNJ661v model to represent the metabolic... (Fig. 1 caption: "Mathematical framework: a set of coupled models used to...")

  16. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods

    PubMed Central

    2010-01-01

    Background Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202

  17. Multiobject relative fuzzy connectedness and its implications in image segmentation

    NASA Astrophysics Data System (ADS)

    Udupa, Jayaram K.; Saha, Punam K.

    2001-07-01

    The notion of fuzzy connectedness captures the idea of hanging-togetherness of image elements in an object by assigning a strength of connectedness to every possible path between every possible pair of image elements. This concept leads to powerful image segmentation algorithms based on dynamic programming whose effectiveness has been demonstrated on thousands of images in a variety of applications. In a previous framework, we introduced the notion of relative fuzzy connectedness for separating a foreground object from a background object. In that framework, an image element c is considered to belong to whichever of the two objects has the reference image element to which c has the higher strength of connectedness. In fuzzy connectedness, a local fuzzy relation called affinity is used on the image domain. For theoretical reasons, this relation was required to be of fixed form in the previous framework. In the present paper, we generalize relative fuzzy connectedness to multiple objects, allowing all objects (of importance) to compete among themselves to grab membership of image elements based on their relative strength of connectedness to reference elements. We also allow affinity to be tailored to the individual objects. We present a theoretical and algorithmic framework and demonstrate that the objects defined are independent of the reference elements chosen as long as they are not in the fuzzy boundary between objects. Examples from medical imaging are presented to illustrate visually the effectiveness of multiple-object relative fuzzy connectedness. A quantitative evaluation based on 160 mathematical phantom images demonstrates objectively the effectiveness of relative fuzzy connectedness with object-tailored affinity relations.
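
    As an illustration of the competition mechanism described above, the following sketch implements multi-object relative fuzzy connectedness on a toy 1D image. It is not the authors' implementation: the Dijkstra-style propagation, the min-affinity path strength, and the simple intensity-difference affinity are standard choices assumed here, and a single shared affinity is used rather than the object-tailored affinities the paper introduces.

```python
import heapq

def relative_fuzzy_connectedness(image, seeds, affinity):
    """Assign each pixel to the object whose reference element (seed)
    reaches it with the highest strength of connectedness, where the
    strength of a path is the minimum affinity along it."""
    n = len(image)
    strength = {k: [0.0] * n for k in seeds}
    heap = []
    for k, s in seeds.items():
        strength[k][s] = 1.0
        heapq.heappush(heap, (-1.0, s, k))
    while heap:
        neg, i, k = heapq.heappop(heap)
        cur = -neg
        if cur < strength[k][i]:
            continue                      # stale heap entry
        for j in (i - 1, i + 1):          # 1D neighbours
            if 0 <= j < n:
                cand = min(cur, affinity(image[i], image[j]))
                if cand > strength[k][j]:
                    strength[k][j] = cand
                    heapq.heappush(heap, (-cand, j, k))
    # competition: each pixel joins the object with higher connectedness
    return [max(seeds, key=lambda k: strength[k][i]) for i in range(n)]

# Toy image: two homogeneous regions; seeds are the reference elements.
img = [10, 11, 10, 12, 55, 54, 56, 55]
aff = lambda a, b: 1.0 / (1.0 + abs(a - b))   # assumed intensity affinity
print(relative_fuzzy_connectedness(img, {"A": 0, "B": 7}, aff))
```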

  18. European consensus on the concepts and measurement of the pathophysiological neuromuscular responses to passive muscle stretch.

    PubMed

    van den Noort, J C; Bar-On, L; Aertbeliën, E; Bonikowski, M; Braendvik, S M; Broström, E W; Buizer, A I; Burridge, J H; van Campenhout, A; Dan, B; Fleuren, J F; Grunt, S; Heinen, F; Horemans, H L; Jansen, C; Kranzl, A; Krautwurst, B K; van der Krogt, M; Lerma Lara, S; Lidbeck, C M; Lin, J-P; Martinez, I; Meskers, C; Metaxiotis, D; Molenaers, G; Patikas, D A; Rémy-Néris, O; Roeleveld, K; Shortland, A P; Sikkens, J; Sloot, L; Vermeulen, R J; Wimmer, C; Schröder, A S; Schless, S; Becher, J G; Desloovere, K; Harlaar, J

    2017-07-01

    To support clinical decision-making in central neurological disorders, a physical examination is used to assess responses to passive muscle stretch. However, what exactly is being assessed is expressed and interpreted in different ways. A clear diagnostic framework is lacking. Therefore, the aim was to arrive at unambiguous terminology for the concepts and measurement of the pathophysiological neuromuscular response to passive muscle stretch. During two consensus meetings, 37 experts from 12 European countries completed online questionnaires based on a Delphi approach, followed by plenary discussion after each round. Consensus was defined as agreement ≥75%. The term hyper-resistance should be used to describe the phenomenon of impaired neuromuscular response during passive stretch, instead of, for example, 'spasticity' or 'hypertonia'. From there, it is essential to distinguish non-neural (tissue-related) from neural (central nervous system-related) contributions to hyper-resistance. Tissue contributions are elasticity, viscosity and muscle shortening. Neural contributions are velocity-dependent stretch hyperreflexia and non-velocity-dependent involuntary background activation. The term 'spasticity' should be used only in reference to stretch hyperreflexia, and 'stiffness' only in reference to passive tissue contributions. When joint angle, moment and electromyography are recorded, components of hyper-resistance within the framework can be quantitatively assessed. A conceptual framework of pathophysiological responses to passive muscle stretch is defined. This framework can be used in clinical assessment of hyper-resistance and will improve communication between clinicians. Components within the framework are defined by objective parameters from instrumented assessment. These parameters need experimental validation in order to develop treatment algorithms based on the aetiology of the clinical phenomena. © 2017 EAN.

  19. Large Ensemble Analytic Framework for Consequence-Driven Discovery of Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Lamontagne, Jonathan R.; Reed, Patrick M.; Link, Robert; Calvin, Katherine V.; Clarke, Leon E.; Edmonds, James A.

    2018-03-01

    An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socioeconomic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of "challenges to mitigation" and "challenges to adaptation" to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial data set of 33,750 scenarios generated using the Global Change Assessment Model. We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most strongly tied to user-specified measures of policy-relevant outcomes of interest, in our example high or low mitigation costs. We show that the current approach for selecting reference scenarios can miss policy-relevant scenario narratives that often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show how agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy-relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support "bottom-up" scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.

  20. A framework to measure the value of public health services.

    PubMed

    Jacobson, Peter D; Neumann, Peter J

    2009-10-01

    To develop a framework that public health practitioners could use to measure the value of public health services. Primary data were collected from August 2006 through March 2007. We interviewed public health practitioners (n=46) in four states, leaders of national public health organizations, and academic researchers. Using a semi-structured interview protocol, we conducted a series of qualitative interviews to define the component parts of value for public health services and identify methodologies used to measure value and data collected. The primary form of analysis is descriptive, synthesizing information across respondents as to how they measure the value of their services. Our interviews did not reveal a consensus on how to measure value or a specific framework for doing so. Nonetheless, the interviews identified some potential strategies, such as cost accounting and performance-based contracting mechanisms. The interviews also noted implementation barriers, including limits to staff capacity and data availability. We developed a framework that considers four component elements to measure value: external factors that must be taken into account (i.e., mandates); key internal actions that a local health department must take (i.e., staff assessment); using appropriate quantitative measures; and communicating value to elected officials and the public.

  1. Analysis of the Image of Scientists Portrayed in the Lebanese National Science Textbooks

    NASA Astrophysics Data System (ADS)

    Yacoubian, Hagop A.; Al-Khatib, Layan; Mardirossian, Taline

    2017-07-01

    This article presents an analysis of how scientists are portrayed in the Lebanese national science textbooks. The purpose of this study was twofold: first, to develop a comprehensive analytical framework that can serve as a tool to analyze the image of scientists portrayed in educational resources; second, to analyze the image of scientists portrayed in the Lebanese national science textbooks used in Basic Education. An analytical framework, based on an extensive review of the relevant literature, was constructed and served as the tool for analyzing the textbooks. Drawing on evidence-based stereotypes, the framework focused on the individual and work-related characteristics of scientists. Fifteen science textbooks were analyzed using both quantitative and qualitative measures. Our analysis of the textbooks showed the presence of a number of stereotypical images. The scientists are predominantly white males of European descent. Non-Western scientists, including Lebanese and/or Arab scientists, are mostly absent from the textbooks. In addition, the scientists are portrayed as rational individuals who work alone, who conduct experiments in their labs by following the scientific method, and who operate within Eurocentric paradigms. External factors do not influence their work. They are engaged in an enterprise which is objective, which aims at discovering the truth out there, and which involves dealing with direct evidence. Implications for science education are discussed.

  2. Evaluating the Sustainability of School-Based Health Centers.

    PubMed

    Navarro, Stephanie; Zirkle, Dorothy L; Barr, Donald A

    2017-01-01

    The United States is facing a surge in the number of school-based health centers (SBHCs) owing to their success in delivering positive health outcomes and increasing access to care. To preserve this success, experts have developed frameworks for creating sustainable SBHCs; however, little research has affirmed or added to these models. This research seeks to analyze elements of sustainability in a case study of three SBHCs in San Diego, California, with the purpose of creating a research-based framework of SBHC sustainability to supplement expertly derived models. Using a mixed methods study design, data were collected from interviews with SBHC stakeholders, observations in SBHCs, and SBHC budgets. A grounded theory qualitative analysis and a quantitative budget analysis were completed to develop a theoretical framework for the sustainability of SBHCs. Forty-one interviews were conducted, 6 hours of observations were completed, and 3 years of SBHC budgets were analyzed to identify care coordination, community buy-in, community awareness, and SBHC partner cooperation as key themes of sustainability promoting patient retention for sustainable billing and reimbursement levels. These findings highlight the unique ways in which SBHCs gain community buy-in and awareness by becoming trusted sources of comprehensive and coordinated care within communities and among vulnerable populations. Findings also support ideas from expert models of SBHC sustainability calling for well-defined and executed community partnerships and quality coordinated care in the procurement of sustainable SBHC funding.

  3. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  4. Using a Positive Psychology and Family Framework to Understand Mexican American Adolescents' College-Going Beliefs

    ERIC Educational Resources Information Center

    Vela, Javier C.; Lenz, A. Stephen; Sparrow, Gregory Scott; Gonzalez, Stacey Lee

    2017-01-01

    Positive psychology is a useful framework to understand Mexican American adolescents' academic experiences. We used a quantitative, predictive design to explore how presence of meaning in life, search for meaning in life, subjective happiness, hope, and family importance influenced 131 Mexican American adolescents' college-going beliefs. We used…

  5. The Fathering Indicators Framework: A Tool for Quantitative and Qualitative Analysis.

    ERIC Educational Resources Information Center

    Gadsden, Vivian, Ed.; Fagan, Jay, Ed.; Ray, Aisha, Ed.; Davis, James Earl, Ed.

    The Fathering Indicators Framework (FIF) is an evaluation tool designed to help researchers, practitioners, and policymakers conceptualize, examine, and measure change in fathering behaviors in relation to child and family well-being. This report provides a detailed overview of the research and theory informing the development of the FIF. The FIF…

  6. Teaching Experimental Methods: A Framework for Hands-On Modules

    ERIC Educational Resources Information Center

    Doherty, David

    2011-01-01

    Experiments provide a simple and engaging framework for familiarizing students with the process of quantitative social research. In this article, I illustrate how experiments can be used in the classroom environment by describing a module that was implemented in four high school classrooms. The module familiarized students with how the scientific…

  7. The Teaching Excellence Framework: Would You Tell Me, Please, Which Way I Ought to Go from Here

    ERIC Educational Resources Information Center

    Berger, Dan; Wild, Charles

    2016-01-01

    The UK government's Green Paper, "Fulfilling Our Potential: Teaching Excellence, Social Mobility and Student Choice", presents both significant challenges and opportunities for universities. Whilst the quantitative element of the proposed Teaching Excellence Framework (TEF), underpinned by Big Data, offers the tantalizing opportunity to…

  8. High School Students' Informal Reasoning on a Socio-Scientific Issue: Qualitative and Quantitative Analyses

    ERIC Educational Resources Information Center

    Wu, Ying-Tien; Tsai, Chin-Chung

    2007-01-01

    Recently, the significance of learners' informal reasoning on socio-scientific issues has received increasing attention among science educators. To gain deeper insights into this important issue, an integrated analytic framework was developed in this study. With this framework, 71 Grade 10 students' informal reasoning about nuclear energy usage…

  9. Generational Attitudes toward Workplace Fun and Their Relationship to Job Satisfaction

    ERIC Educational Resources Information Center

    Attebery, Esther

    2017-01-01

    Purpose: The purpose of this quantitative study was to examine attitudes toward workplace fun and overall job satisfaction of baby boomer, Generation X, and millennial staff employees at a Christian university in California, and determine if there is a predictive relationship between them. Conceptual Framework: The framework was developed from…

  10. Does the Community of Inquiry Framework Predict Outcomes in Online MBA Courses?

    ERIC Educational Resources Information Center

    Arbaugh, J. B.

    2008-01-01

    While Garrison and colleagues' (2000) Community of Inquiry (CoI) framework has generated substantial interest among online learning researchers, it has yet to be subjected to extensive quantitative verification or tested for external validity. Using a sample of students from 55 online MBA courses, the findings of this study suggest strong…

  11. Benefits of e-Learning Benchmarks: Australian Case Studies

    ERIC Educational Resources Information Center

    Choy, Sarojni

    2007-01-01

    In 2004 the Australian Flexible Learning Framework developed a suite of quantitative and qualitative indicators on the uptake, use and impact of e-learning in the Vocational Education and Training (VET) sector. These indicators were used to design items for a survey to gather quantitative data for benchmarking. A series of four surveys gathered…

  12. A hierarchical-multiobjective framework for risk management

    NASA Technical Reports Server (NTRS)

    Haimes, Yacov Y.; Li, Duan

    1991-01-01

    A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. Unified within the framework are the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components essentially consist of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.

  13. Subject-Specific Sparse Dictionary Learning for Atlas-Based Brain MRI Segmentation.

    PubMed

    Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S; Prince, Jerry L; Pham, Dzung L

    2015-09-01

    Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method from MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consists of an atlas MR image, prior information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods that require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. Then the subject patches are modeled as sparse combinations of learned atlas patches leading to tissue memberships at each voxel. The combination of prior information in an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on the application of whole-brain tissue segmentation in subjects with healthy anatomy and normal pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available state-of-the-art approaches.
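
    The patch-coding step described above can be sketched compactly. Below is a minimal, hypothetical illustration (random data in place of real MR patches and a learned dictionary) of representing a subject patch as a sparse combination of labeled atlas patches with orthogonal matching pursuit, then pooling the coefficients by tissue label to obtain memberships. It follows the spirit of the approach rather than the authors' actual pipeline.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)

# Hypothetical atlas dictionary: columns are flattened atlas patches,
# each with a known tissue label from the atlas hard segmentation.
n_pix, n_atoms = 27, 40                    # e.g. 3x3x3 patches
D = rng.normal(size=(n_pix, n_atoms))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
atom_labels = rng.integers(0, 3, n_atoms)  # 3 tissue classes

def tissue_memberships(patch, n_classes=3, n_nonzero=5):
    """Code a subject patch as a sparse combination of atlas patches and
    pool the absolute coefficients by atlas tissue label."""
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero)
    omp.fit(D, patch)
    w = np.abs(omp.coef_)
    m = np.array([w[atom_labels == c].sum() for c in range(n_classes)])
    return m / m.sum() if m.sum() > 0 else m   # normalized memberships

subject_patch = rng.normal(size=n_pix)
print(tissue_memberships(subject_patch))
```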

  14. Quantitative and qualitative analysis of student textbook summary writing

    NASA Astrophysics Data System (ADS)

    Demaree, Dedra; Allie, Saalih; Low, Michael; Taylor, Julian

    2008-10-01

    The majority of "special access" students at the University of Cape Town are second language English speakers for whom reading the physics textbook is daunting. As a strategy to encourage meaningful engagement with the text, students wrote textbook summaries due the day material was covered in class. The summaries were returned, and they could bring them or re-write them for use during their examinations. A framework was developed to analyze the summaries based on Waywood, defining three cognitive levels seen in mathematics journaling: recounting, summarizing, and dialoging. This framework was refined, expanded, and tested. Interviews with students were conducted for their views on summary writing and survey questions were included on their final exams. The study was carried out in the 2007 spring semester of the "Foundation Physics Course," a component of the special access program.

  15. One of These Things Is Not Like the Others: The Idea of Precedence in Health Technology Assessment and Coverage Decisions

    PubMed Central

    Giacomini, Mita

    2005-01-01

    Health plans often deliberate covering technologies with challenging purposes, effects, or costs. They must integrate quantitative evidence (e.g., how well a technology works) with qualitative, normative assessments (e.g., whether it works well enough for a worthwhile purpose). Arguments from analogy and precedent help integrate these criteria and establish standards for their policy application. Examples of arguments are described for three technologies (ICSI, genetic tests, and Viagra). Drawing lessons from law, ethics, philosophy, and the social sciences, a framework is developed for case-based evaluation of new technologies. The decision-making cycle includes (1) taking stock of past decisions and formulating precedents, (2) deciding new cases, and (3) assimilating decisions into the case history and evaluation framework. Each stage requires distinctive decision maker roles, information, and methods. PMID:15960769

  16. Using Mixed Methods to Evaluate a Community Intervention for Sexual Assault Survivors: A Methodological Tale.

    PubMed

    Campbell, Rebecca; Patterson, Debra; Bybee, Deborah

    2011-03-01

    This article reviews current epistemological and design issues in the mixed methods literature and then examines the application of one specific design, a sequential explanatory mixed methods design, in an evaluation of a community-based intervention to improve postassault care for sexual assault survivors. Guided by a pragmatist epistemological framework, this study collected quantitative and qualitative data to understand how the implementation of a Sexual Assault Nurse Examiner (SANE) program affected prosecution rates of adult sexual assault cases in a large midwestern community. Quantitative results indicated that the program was successful in affecting legal systems change and the qualitative data revealed the mediating mechanisms of the intervention's effectiveness. Challenges of implementing this design are discussed, including epistemological and practical difficulties that developed from blending methodologies into a single project. © The Author(s) 2011.

  17. Quantitative assessment of submicron scale anisotropy in tissue multifractality by scattering Mueller matrix in the framework of Born approximation

    NASA Astrophysics Data System (ADS)

    Das, Nandan Kumar; Dey, Rajib; Chakraborty, Semanti; Panigrahi, Prasanta K.; Meglinski, Igor; Ghosh, Nirmalya

    2018-04-01

    A number of tissue-like disordered media exhibit local anisotropy of scattering in their scaling behavior. Scaling behavior contains a wealth of fractal or multifractal properties. We demonstrate that the spatial dielectric fluctuations in a sample of biological tissue exhibit multifractal anisotropy. The multifractal anisotropy is encoded in the wavelength variation of the light-scattering Mueller matrix and manifests as an intriguing spectral diattenuation effect. We developed an inverse method for the quantitative assessment of the multifractal anisotropy. The method is based on processing the relevant Mueller matrix elements in the Fourier domain using the Born approximation, followed by multifractal analysis. The approach holds promise for probing subtle micro-structural changes in biological tissues associated with cancer and precancer, as well as for the non-destructive characterization of a wide range of scattering materials.

  18. Empirical evolution of a framework that supports the development of nursing competence.

    PubMed

    Lima, Sally; Jordan, Helen L; Kinney, Sharon; Hamilton, Bridget; Newall, Fiona

    2016-04-01

    The aim of this study was to refine a framework for developing competence for graduate nurses new to paediatric nursing in a transition programme. A competent healthcare workforce is essential to ensuring quality care. There are strong professional and societal expectations that nurses will be competent. Despite the importance of the topic, the most effective means through which competence develops remains elusive. A qualitative explanatory method was applied as part of a mixed methods design. Twenty-one graduate nurses taking part in a 12-month transition programme participated in semi-structured interviews between October and November 2013. Interviews were informed by data analysed during a preceding quantitative phase. Participants were provided with their quantitative results and a preliminary model for the development of competence, and asked to explain why their competence had developed as it had. The findings from the interviews, considered in combination with the preliminary model and quantitative results, enabled conceptualization of a Framework for Developing Competence. Key elements include: the individual in the team, identification and interpretation of standards, asking questions, guidance, and engaging in endeavours, all taking place in a particular context. Much time and many resources are directed at supporting the development of nursing competence, with little evidence as to the most effective means. This study led to conceptualization of a theory thought to underpin the development of nursing competence, particularly in a paediatric setting for graduate nurses. Future research should be directed at investigating the framework in other settings. © 2015 John Wiley & Sons Ltd.

  19. lazar: a modular predictive toxicology framework

    PubMed Central

    Maunz, Andreas; Gütlein, Martin; Rautenberg, Micha; Vorgrimmler, David; Gebele, Denis; Helma, Christoph

    2013-01-01

    lazar (lazy structure–activity relationships) is a modular framework for predictive toxicology. Similar to the read across procedure in toxicological risk assessment, lazar creates local QSAR (quantitative structure–activity relationship) models for each compound to be predicted. Model developers can choose between a large variety of algorithms for descriptor calculation and selection, chemical similarity indices, and model building. This paper presents a high level description of the lazar framework and discusses the performance of example classification and regression models. PMID:23761761

  20. Rapid Delivery of Cyber Capabilities: Evaluation of the Requirement for a Rapid Cyber Acquisition Process

    DTIC Science & Technology

    2012-06-01

    record (PoR) to give both a quantitative and qualitative perspective on the rapid cyber acquisitions framework. It also investigates if cyber operations...acquisition is a complex topic that does not yet have a solidified framework. To scope this research, a comprehensive review of past, present and...for AT&L is working with the DoD cyberspace community to develop a common framework for Services and Agencies to acquire capabilities for cyberspace

  1. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparse amount of recorded clinical observations, the high dimensionality of movement, and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ_i w_i·φ_i(x_i), and estimating each attribute normalization function φ_i(·) by integrating distributions of idealized movement and deviated movement. The weights w_i are derived from a therapist's pairwise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results: the evaluation results are highly correlated with the therapist's observations.
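
    To make the composite measure concrete, here is a minimal sketch of the evaluation function y = Σ_i w_i·φ_i(x_i). The attribute names, the logistic stand-in for each normalization function φ_i, and the weight values are invented for illustration; in the paper the φ_i are estimated from distributions of idealized and deviated movement, and the weights come from a modified RankSVM fit to a therapist's pairwise comparisons, neither of which is reproduced here.

```python
import math

# Hypothetical normalization: phi_i maps a raw kinematic attribute to [0, 1].
# A logistic curve with assumed per-attribute centers/scales stands in for
# the distribution-based normalizers of the paper.
def make_phi(center, scale):
    return lambda x: 1.0 / (1.0 + math.exp((x - center) / scale))

attributes = {
    # name: (normalizer, weight) -- weights would come from RankSVM
    "trajectory_error_cm": (make_phi(center=4.0, scale=1.5), 0.5),
    "jerkiness":           (make_phi(center=2.0, scale=0.8), 0.3),
    "completion_time_s":   (make_phi(center=6.0, scale=2.0), 0.2),
}

def evaluate(x):
    """Composite movement quality y = sum_i w_i * phi_i(x_i), in [0, 1]."""
    return sum(w * phi(x[name]) for name, (phi, w) in attributes.items())

# One reach trial: lower raw values = closer to idealized movement here.
trial = {"trajectory_error_cm": 3.1, "jerkiness": 1.6, "completion_time_s": 5.0}
print(round(evaluate(trial), 3))
```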

  2. Predicting the effects of polychlorinated biphenyls on cetacean populations through impacts on immunity and calf survival.

    PubMed

    Hall, Ailsa J; McConnell, Bernie J; Schwacke, Lori H; Ylitalo, Gina M; Williams, Rob; Rowles, Teri K

    2018-02-01

    The potential impact of exposure to polychlorinated biphenyls (PCBs) on the health and survival of cetaceans continues to be an issue for conservation and management, yet few quantitative approaches for estimating population-level effects have been developed. An individual-based model (IBM) for assessing effects on both calf survival and immunity was developed and tested. Three case-study species (bottlenose dolphin, humpback whale and killer whale) in four populations were taken as examples, and the impact of varying levels of PCB uptake on achievable population growth was assessed. The unique aspect of the model is its ability to evaluate the likely effects of immunosuppression in addition to calf survival, enabling the consequences of PCB exposure on immune function to be explored across all age classes. By incorporating quantitative tissue concentration-response functions from laboratory animal model species into an IBM framework, population trajectories were generated. Model outputs included estimated concentrations of PCBs in the blubber of females by age, which were then compared to published empirical data. Achievable population growth rates were more affected by the inclusion of effects of PCBs on immunity than on calf survival, but the magnitude depended on the virulence of any subsequent encounter with a pathogen and the proportion of the population exposed. Since the starting population parameters were from historic studies, which may already be impacted by PCBs, the results should be interpreted on a relative rather than an absolute basis. The framework will assist in providing quantitative risk assessments for populations of concern. Copyright © 2017 Elsevier Ltd. All rights reserved.
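
    The structure of such an individual-based model can be sketched in miniature. The loop below tracks only females, uses invented placeholder dose-response curves for calf survival and infection mortality (in the study these derive from laboratory animal model species), and arbitrary demographic rates; it shows how PCB uptake, maternal offloading to calves, and immunity-mediated pathogen mortality interact in one simulation, not the authors' parameterization.

```python
import random

rng = random.Random(4)

def calf_survival(mother_pcb):
    # Placeholder concentration-response: survival declines with burden.
    return max(0.2, 0.9 - 0.004 * mother_pcb)

def infection_mortality(pcb, virulence=0.3):
    # Placeholder immunosuppression: encounter lethality rises with burden.
    suppression = min(1.0, pcb / 100.0)
    return virulence * (0.5 + 0.5 * suppression)

def simulate(years=50, uptake=2.0, pathogen_rate=0.1, n0=200):
    """Minimal female-only individual-based loop: animals age, accumulate
    PCBs, offload part of the burden to calves, and face pathogen
    encounters whose lethality rises with their PCB load."""
    females = [{"age": rng.randint(0, 30), "pcb": 0.0} for _ in range(n0)]
    for _ in range(years):
        newborns = []
        for f in females:
            f["age"] += 1
            f["pcb"] += uptake                                # yearly uptake
            if 10 <= f["age"] <= 40 and rng.random() < 0.25:  # breeding
                if rng.random() < calf_survival(f["pcb"]):
                    newborns.append({"age": 0, "pcb": 0.3 * f["pcb"]})
                f["pcb"] *= 0.4              # offloading via lactation
        survivors = []
        for f in females + newborns:
            dead = rng.random() < 0.02       # background mortality
            if not dead and rng.random() < pathogen_rate:
                dead = rng.random() < infection_mortality(f["pcb"])
            if not dead and f["age"] < 60:
                survivors.append(f)
        females = survivors
    return len(females)

print("final population, no PCB uptake:", simulate(uptake=0.0))
print("final population, PCB uptake   :", simulate(uptake=2.0))
```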

  3. Integration of biotic ligand models (BLM) and bioaccumulation kinetics into a mechanistic framework for metal uptake in aquatic organisms.

    PubMed

    Veltman, Karin; Huijbregts, Mark A J; Hendriks, A Jan

    2010-07-01

    Both biotic ligand models (BLM) and bioaccumulation models aim to quantify metal exposure based on mechanistic knowledge, but key factors included in the description of metal uptake differ between the two approaches. Here, we present a quantitative comparison of both approaches and show that BLM and bioaccumulation kinetics can be merged into a common mechanistic framework for metal uptake in aquatic organisms. Our results show that metal-specific absorption efficiencies calculated from BLM parameters for freshwater fish are highly comparable, i.e., within a factor of 2.4 for silver, cadmium, copper, and zinc, to bioaccumulation absorption efficiencies for predominantly marine fish. Conditional affinity constants are significantly related to the metal-specific covalent index. Additionally, the affinity constants of calcium, cadmium, copper, sodium, and zinc are statistically comparable across aquatic species, including molluscs, daphnids, and fish. This suggests that affinity constants can be estimated from the covalent index and extrapolated across species. A new model is proposed that integrates the combined effects on metal uptake by aquatic organisms of metal chemodynamics, such as speciation, competition, and ligand affinity, and of species characteristics, such as size. An important direction for further research is the quantitative comparison of the proposed model with acute toxicity values for organisms belonging to different size classes.

  4. A Synthesis of 20 Years of Research on Sexual Risk Taking Among Asian/Pacific Islander Men Who Have Sex With Men in Western Countries.

    PubMed

    Shi Shiu, Chen; Voisin, Dexter R; Chen, Wet-Ti; Lo, Yi-An; Hardestry, Melissa; Nguyen, Huong

    2016-05-01

    Over the past two decades, there has emerged a body of literature documenting a number of risk factors associated with Asian/Pacific Islander men who have sex with men's unsafe sexual behaviors. This study aims to systematically review existing empirical studies and synthesize research results into a social-ecological framework using a mixed research synthesis. Empirical research articles published in peer-reviewed journals between January 1990 and June 2013 were identified in six databases, including PubMed, Ovid MEDLINE, PsycINFO, Social Work Abstract, CINAL, and Web of Knowledge. Both quantitative and qualitative studies were included. Two analysts independently reviewed the articles, and findings were organized on a social-ecological framework. Twenty-two articles were included in the analysis; among these 13 were quantitative, 8 were qualitative, and 1 was mixed-methods research. Results indicated that demographic characteristics, psychological resources, behavioral patterns, relationships with family and friends, dynamics with romantic or sexual partners, community involvement, culture, discrimination, and institutional factors were related to unprotected anal intercourse. This article presents a critique of this literature and discusses implications for future research with this population. It concludes with prevention/intervention initiatives based on review findings. © The Author(s) 2015.

  5. Information measures for terrain visualization

    NASA Astrophysics Data System (ADS)

    Bonaventura, Xavier; Sima, Aleksandra A.; Feixas, Miquel; Buckley, Simon J.; Sbert, Mateu; Howell, John A.

    2017-02-01

    Many quantitative and qualitative studies in geoscience research are based on digital elevation models (DEMs) and 3D surfaces to aid understanding of natural and anthropogenically-influenced topography. As well as their quantitative uses, the visual representation of DEMs can add valuable information for identifying and interpreting topographic features. However, choice of viewpoints and rendering styles may not always be intuitive, especially when terrain data are augmented with digital image texture. In this paper, an information-theoretic framework for object understanding is applied to terrain visualization and terrain view selection. From a visibility channel between a set of viewpoints and the component polygons of a 3D terrain model, we obtain three polygonal information measures. These measures are used to visualize the information associated with each polygon of the terrain model. In order to enhance the perception of the terrain's shape, we explore the effect of combining the calculated information measures with the supplementary digital image texture. From polygonal information, we also introduce a method to select a set of representative views of the terrain model. Finally, we evaluate the behaviour of the proposed techniques using example datasets. A publicly available framework for both the visualization and the view selection of a terrain has been created in order to provide the possibility to analyse any terrain model.
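
    A minimal numeric sketch of the visibility-channel idea follows. The toy matrix of projected polygon areas per viewpoint is invented, and the per-polygon measure computed here (a polygon-wise decomposition of the channel's mutual information) is just one plausible instance of the kind of polygonal information measure described, not necessarily any of the three used in the paper.

```python
import numpy as np

# Hypothetical visibility channel: vis[v, p] = projected area of polygon p
# as seen from viewpoint v (zero if hidden). Values here are made up.
vis = np.array([
    [4.0, 0.0, 1.0, 2.0],
    [1.0, 3.0, 0.0, 2.0],
    [0.5, 2.5, 2.0, 1.0],
])

p_joint = vis / vis.sum()          # joint distribution p(v, p)
p_v = p_joint.sum(axis=1)          # viewpoint marginal
p_p = p_joint.sum(axis=0)          # polygon marginal

# Per-polygon information: how far each polygon's visibility profile over
# viewpoints departs from what the marginals alone would predict (a
# pointwise mutual information decomposed by polygon).
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.where(p_joint > 0,
                   p_joint * np.log2(p_joint / np.outer(p_v, p_p)), 0.0)
polygon_info = pmi.sum(axis=0)
print("per-polygon information (bits):", polygon_info.round(4))

# A representative viewpoint could be the one contributing most to the
# channel's mutual information.
print("best viewpoint:", int(pmi.sum(axis=1).argmax()))
```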

  6. Discussion of skill improvement in marine ecosystem dynamic models based on parameter optimization and skill assessment

    NASA Astrophysics Data System (ADS)

    Shen, Chengcheng; Shi, Honghua; Liu, Yongzhi; Li, Fen; Ding, Dewen

    2016-07-01

    Marine ecosystem dynamic models (MEDMs) are important tools for the simulation and prediction of marine ecosystems. This article summarizes the methods and strategies used for the improvement and assessment of MEDM skill, and it attempts to establish a technical framework to inspire further ideas concerning MEDM skill improvement. The skill of MEDMs can be improved by parameter optimization (PO), which is an important step in model calibration. An efficient approach to solve the problem of PO constrained by MEDMs is the global treatment of both sensitivity analysis and PO. Model validation is an essential step following PO, which validates the efficiency of model calibration by analyzing and estimating the goodness-of-fit of the optimized model. Additionally, by focusing on the degree of impact of various factors on model skill, model uncertainty analysis can supply model users with a quantitative assessment of model confidence. Research on MEDMs is ongoing; however, improvement in model skill still lacks global treatments and its assessment is not integrated. Thus, the predictive performance of MEDMs is not strong and model uncertainties lack quantitative descriptions, limiting their application. Therefore, a large number of case studies concerning model skill should be performed to promote the development of a scientific and normative technical framework for the improvement of MEDM skill.

  7. Hazard Screening Methods for Nanomaterials: A Comparative Study

    PubMed Central

    Murphy, Finbarr; Mullins, Martin; Furxhi, Irini; Costa, Anna L.; Simeone, Felice C.

    2018-01-01

    Hazard identification is the key step in risk assessment and management of manufactured nanomaterials (NM). However, the rapid commercialisation of nano-enabled products continues to out-pace the development of a prudent risk management mechanism that is widely accepted by the scientific community and enforced by regulators. Nevertheless, a growing body of academic literature is developing promising quantitative methods. Two approaches have gained significant currency. Bayesian networks (BN) are a probabilistic, machine learning approach, while the weight of evidence (WoE) statistical framework is based on expert elicitation. This comparative study investigates the efficacy of quantitative WoE and Bayesian methodologies in ranking the potential hazard of metal and metal-oxide NMs: TiO2, Ag, and ZnO. This research finds that hazard ranking is consistent for both risk assessment approaches. The BN and WoE models both utilize physico-chemical, toxicological, and study type data to infer the hazard potential. The BN exhibits more stability when the models are perturbed with new data. The BN has the significant advantage of self-learning with new data; however, this assumes all input data are equally valid. This research finds that a combination in which WoE ranks the input data feeding the BN is the optimal hazard assessment framework. PMID:29495342

  8. A Synthesis of 20 Years of Research on Sexual Risk Taking Among Asian/Pacific Islander Men Who Have Sex With Men in Western Countries

    PubMed Central

    Shiu, Chen Shi; Voisin, Dexter R.; Chen, Wet-Ti; Lo, Yi-An; Hardestry, Melissa; Nguyen, Huong

    2017-01-01

    Over the past two decades, there has emerged a body of literature documenting a number of risk factors associated with Asian/Pacific Islander men who have sex with men’s unsafe sexual behaviors. This study aims to systematically review existing empirical studies and synthesize research results into a social–ecological framework using a mixed research synthesis. Empirical research articles published in peer-reviewed journals between January 1990 and June 2013 were identified in six databases, including PubMed, Ovid MEDLINE, PsycINFO, Social Work Abstract, CINAL, and Web of Knowledge. Both quantitative and qualitative studies were included. Two analysts independently reviewed the articles, and findings were organized on a social–ecological framework. Twenty-two articles were included in the analysis; among these 13 were quantitative, 8 were qualitative, and 1 was mixed-methods research. Results indicated that demographic characteristics, psychological resources, behavioral patterns, relationships with family and friends, dynamics with romantic or sexual partners, community involvement, culture, discrimination, and institutional factors were related to unprotected anal intercourse. This article presents a critique of this literature and discusses implications for future research with this population. It concludes with prevention/intervention initiatives based on review findings. PMID:25563383

  9. The Species versus Subspecies Conundrum: Quantitative Delimitation from Integrating Multiple Data Types within a Single Bayesian Approach in Hercules Beetles.

    PubMed

    Huang, Jen-Pan; Knowles, L Lacey

    2016-07-01

    With the recent attention and focus on quantitative methods for species delimitation, an overlooked but equally important issue regards what has actually been delimited. This study investigates the apparent arbitrariness of some taxonomic distinctions, and in particular how species and subspecies are assigned. Specifically, we use a recently developed Bayesian model-based approach to show that in the Hercules beetles (genus Dynastes) there is no statistical difference in the probability that putative taxa represent different species, irrespective of whether they were given species or subspecies designations. By considering multiple data types, as opposed to relying exclusively on genetic data alone, we also show that both previously recognized species and subspecies represent a variety of points along the speciation spectrum (i.e., previously recognized species are not systematically further along the continuum than subspecies). For example, based on evolutionary models of divergence, some taxa are statistically distinguishable on more than one axis of differentiation (e.g., along both phenotypic and genetic dimensions), whereas other taxa can only be delimited statistically from a single data type. Because both phenotypic and genetic data are analyzed in a common Bayesian framework, our study provides a framework for investigating whether disagreements in species boundaries among data types reflect (i) actual discordance with the actual history of lineage splitting, or instead (ii) differences among data types in the amount of time required for differentiation to become apparent among the delimited taxa. We discuss what the answers to these questions imply about what characters are used to delimit species, as well as the diverse processes involved in the origin and maintenance of species boundaries. With this in mind, we then reflect more generally on how quantitative methods for species delimitation are used to assign taxonomic status. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Toward Monitoring Parkinson's Through Analysis of Static Handwriting Samples: A Quantitative Analytical Framework.

    PubMed

    Zhi, Naiqian; Jaeger, Beverly Kris; Gouldstone, Andrew; Sipahi, Rifat; Frank, Samuel

    2017-03-01

    Detection of changes in micrographia as a manifestation of symptomatic progression or therapeutic response in Parkinson's disease (PD) is challenging, as such changes can be subtle. A computerized toolkit based on quantitative analysis of handwriting samples would be valuable, as it could supplement and support clinical assessments, help monitor micrographia, and link it to PD. Such a toolkit would be especially useful if it could detect subtle yet relevant changes in handwriting morphology, thus enhancing the resolution of the detection procedure. This is made possible by developing a set of metrics sensitive enough to detect and discern micrographia with specificity. Several metrics that are sensitive to the characteristics of micrographia were developed, with minimal sensitivity to confounding handwriting artifacts. These metrics capture character size reduction, ink utilization, and pixel density within a writing sample from left to right. They are used here to "score" handwritten signatures of 12 different individuals corresponding to healthy and symptomatic PD conditions, as well as sample control signatures that had been artificially reduced in size for comparison purposes. Moreover, metric analyses of samples from ten of the 12 individuals for whom the clinical diagnosis time is known show considerable informative variation when applied to static signature samples obtained before and after diagnosis. In particular, a measure called pixel density variation showed statistically significant differences between two comparison groups of remote signature recordings, earlier versus recent, based on independent and paired t-test analyses of a total of 40 signature samples. The quantitative framework developed here has the potential to be used in future controlled experiments to study micrographia and its links to PD from various aspects, including monitoring and assessment of applied interventions and treatments. The inherent value of this methodology is further enhanced by its reliance solely on static signatures, not requiring dynamic sampling with specialized equipment.
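
    To illustrate the flavor of such metrics, the sketch below computes a left-to-right ink-density profile from a binarized signature image together with a simple variation score (the slope of a fitted line). The windowing, the slope-based score, and the synthetic tapering "signature" are assumptions for illustration; the paper's pixel density variation metric is not reproduced exactly.

```python
import numpy as np

def pixel_density_profile(binary_img, n_windows=8):
    """Ink density in successive left-to-right windows of a binarized
    signature (True/1 = ink). Progressive size reduction in micrographia
    should show up as a drift in this profile."""
    h, w = binary_img.shape
    edges = np.linspace(0, w, n_windows + 1, dtype=int)
    return np.array([binary_img[:, a:b].mean()
                     for a, b in zip(edges[:-1], edges[1:])])

def density_variation(profile):
    """One simple variation score: slope of a least-squares line through
    the profile (negative slope = ink thinning toward the end)."""
    x = np.arange(len(profile))
    return np.polyfit(x, profile, 1)[0]

rng = np.random.default_rng(1)
# Synthetic "signature": ink probability tapering from left to right,
# mimicking progressive micrographia.
sig = rng.random((60, 400)) < np.linspace(0.25, 0.10, 400)
prof = pixel_density_profile(sig)
print(prof.round(3), "slope:", round(density_variation(prof), 5))
```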

  11. Making Research Delicious: An Evaluation of Nurses' Knowledge, Attitudes, and Practice Using the Great American Cookie Experiment With Mobile Device Gaming.

    PubMed

    Hayes Lane, Susan; Serafica, Reimund; Huffman, Carolyn; Cuddy, Alyssa

    2016-01-01

    In the current healthcare environment, nurses must have a basic understanding of research to lead change and implement evidence-based practice. The purpose of this study was to evaluate the effectiveness of an educational intervention formulated on the framework of the Great American Cookie Experiment measuring nurses' research knowledge, attitudes, and practice using mobile device gaming. This multisite quantitative study provides insight into promotion of research and information about best practices on innovative teaching strategies for nurses.

  12. Limit of a nonpreferential attachment multitype network model

    NASA Astrophysics Data System (ADS)

    Shang, Yilun

    2017-02-01

    Here, we deal with a multitype network model with nonpreferential attachment growth. The connection between two nodes depends asymmetrically on their types, reflecting the implication of time order in temporal networks. Based upon graph limit theory, we analytically determine the limit of the network model, characterized by a kernel, in the sense that the number of copies of any fixed subgraph converges as the network size tends to infinity. The results are confirmed by extensive simulations. Our work thus provides a theoretical framework for quantitatively understanding grown temporal complex networks as a whole.
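
    A small simulation sketch of such a model is given below, with an invented 2-type kernel: each arriving node samples a few existing nodes uniformly at random (no preferential attachment) and links to each with a probability that depends asymmetrically on the (new, old) type pair. Printing the fraction of one edge type at increasing sizes hints at the convergence of subgraph (here, single-edge) densities that the graph-limit result formalizes.

```python
import random

# Invented asymmetric kernel K[a][b]: probability that a NEW node of type a
# links to a sampled EXISTING node of type b (time order matters).
K = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.6, 1: 0.4}}
P_TYPE0 = 0.5   # probability an arriving node has type 0

def grow(n, m=3, seed=0):
    """Nonpreferential multitype growth: uniform target sampling,
    kernel-weighted link formation."""
    rng = random.Random(seed)
    types, edges = [0], []          # start from a single type-0 node
    for _ in range(1, n):
        t = 0 if rng.random() < P_TYPE0 else 1
        targets = rng.sample(range(len(types)), min(m, len(types)))
        for j in targets:           # uniform choice = no preference
            if rng.random() < K[t][types[j]]:
                edges.append((t, types[j]))
        types.append(t)
    return types, edges

# Edge-type frequencies stabilize as the network grows.
for n in (200, 2000, 20000):
    _, edges = grow(n)
    frac = sum(1 for e in edges if e == (0, 0)) / len(edges)
    print(f"n={n:6d}  fraction of (0,0) edges: {frac:.3f}")
```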

  13. Astrocytes, Synapses and Brain Function: A Computational Approach

    NASA Astrophysics Data System (ADS)

    Nadkarni, Suhita

    2006-03-01

    Modulation of synaptic reliability is one of the leading mechanisms involved in long-term potentiation (LTP) and long-term depression (LTD) and therefore has implications for information processing in the brain. A recently discovered mechanism for modulating synaptic reliability critically involves the recruitment of astrocytes, star-shaped cells that outnumber the neurons in most parts of the central nervous system. Astrocytes were until recently thought to be subordinate cells merely supporting neuronal functions. New evidence, however, made available by advances in imaging technology, has changed the way we envision the role of these cells in synaptic transmission and as modulators of neuronal excitability. We put forward a novel mathematical framework based on the biophysics of the bidirectional neuron-astrocyte interactions that quantitatively accounts for two distinct experimental manifestations of the recruitment of astrocytes in synaptic transmission: a) the transformation of a low-fidelity synapse into a high-fidelity synapse, and b) enhanced postsynaptic spontaneous currents when astrocytes are activated. Such a framework is not only useful for modeling neuronal dynamics in a realistic environment but also provides a conceptual basis for interpreting experiments. Based on this modeling framework, we explore the role of astrocytes in neuronal network behaviors such as synchrony and correlations, and compare the results with experimental data from cultured networks.

  14. Young drivers' engagement with social interactive technology on their smartphone: Critical beliefs to target in public education messages.

    PubMed

    Gauld, Cassandra S; Lewis, Ioni M; White, Katherine M; Watson, Barry

    2016-11-01

    The current study forms part of a larger study based on the Step Approach to Message Design and Testing (SatMDT), a new and innovative framework designed to guide the development and evaluation of health communication messages, including road safety messages. This four-step framework is based on several theories, including the Theory of Planned Behaviour. The current study followed steps one and two of the SatMDT framework and utilised a quantitative survey to validate salient beliefs (behavioural, normative, and control) about initiating, monitoring/reading, and responding to social interactive technology on smartphones among N=114 (88F, 26M) young drivers aged 17-25 years. These beliefs had been elicited in a prior in-depth qualitative study. A subsequent critical beliefs analysis identified seven beliefs as potential targets for public education messages, including 'slow-moving traffic' (control belief, facilitator) for both monitoring/reading and responding behaviours; 'feeling at ease that you had received an expected communication' (behavioural belief, advantage) for monitoring/reading behaviour; and 'friends/peers more likely to approve' (normative belief) for responding behaviour. Potential message content targeting these seven critical beliefs is discussed in accordance with the SatMDT. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Scholarometer: a social framework for analyzing impact across disciplines.

    PubMed

    Kaur, Jasleen; Hoang, Diep Thi; Sun, Xiaoling; Possamai, Lino; Jafariasbagh, Mohsen; Patil, Snehal; Menczer, Filippo

    2012-01-01

    The use of quantitative metrics to gauge the impact of scholarly publications, authors, and disciplines is predicated on the availability of reliable usage and annotation data. Citation and download counts are widely available from digital libraries. However, current annotation systems rely on proprietary labels, refer to journals but not articles or authors, and are manually curated. To address these limitations, we propose a social framework based on crowdsourced annotations of scholars, designed to keep up with the rapidly evolving disciplinary and interdisciplinary landscape. We describe a system called Scholarometer, which provides a service to scholars by computing citation-based impact measures. This creates an incentive for users to provide disciplinary annotations of authors, which in turn can be used to compute disciplinary metrics. We first present the system architecture and several heuristics to deal with noisy bibliographic and annotation data. We report on data sharing and interactive visualization services enabled by Scholarometer. Usage statistics, illustrating the data collected and shared through the framework, suggest that the proposed crowdsourcing approach can be successful. Secondly, we illustrate how the disciplinary bibliometric indicators elicited by Scholarometer allow us to implement for the first time a universal impact measure proposed in the literature. Our evaluation suggests that this metric provides an effective means for comparing scholarly impact across disciplinary boundaries.

  16. Mobile-Cloud Assisted Video Summarization Framework for Efficient Management of Remote Sensing Data Generated by Wireless Capsule Sensors

    PubMed Central

    Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook

    2014-01-01

    Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially for remote health-monitoring services. However, during the WCE process, the large amount of captured video data demands a significant amount of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing tasks, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, a mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data. PMID:25225874
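
    A minimal sketch of the redundancy check described above, assuming Jeffrey divergence is taken as the symmetrized Kullback-Leibler divergence between normalized color histograms; the threshold value is hypothetical:

```python
import numpy as np

def jeffrey_divergence(h1, h2, eps=1e-12):
    """Symmetrized KL divergence between two normalized histograms."""
    p = h1 / (h1.sum() + eps)
    q = h2 / (h2.sum() + eps)
    p, q = p + eps, q + eps  # avoid log(0)
    return float(np.sum((p - q) * np.log(p / q)))

# Two frames are treated as redundant when their color histograms are close.
frame_a = np.histogram(np.random.rand(10000), bins=64)[0].astype(float)
frame_b = frame_a + np.random.randint(0, 5, size=64)
REDUNDANCY_THRESHOLD = 0.05  # hypothetical cutoff
print(jeffrey_divergence(frame_a, frame_b) < REDUNDANCY_THRESHOLD)
```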

  17. Watershed Planning within a Quantitative Scenario Analysis Framework.

    PubMed

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
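
    A sketch of the model-building step, assuming (hypothetically) that a biological condition score is regressed on landscape-scale stressor metrics; the column names and data are illustrative only:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_sites = 40
# Hypothetical landscape stressors measured at each assessment site.
X = np.column_stack([
    rng.uniform(0, 60, n_sites),   # % mined land in catchment
    rng.uniform(0, 40, n_sites),   # % residential development
])
# Hypothetical biological condition index declining with both stressors.
y = 80 - 0.6 * X[:, 0] - 0.4 * X[:, 1] + rng.normal(0, 5, n_sites)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "R^2:", round(model.score(X, y), 2))

# Scenario analysis: predict condition under a proposed land use scenario.
print("predicted condition:", model.predict([[30.0, 10.0]]))
```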

  18. Defining resilience within a risk-informed assessment framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, Garill A.; Unwin, Stephen D.; Holter, Gregory M.

    2011-08-01

    The concept of resilience is the subject of considerable discussion in academic, business, and governmental circles. The United States Department of Homeland Security, for one, has emphasised the need to consider resilience in safeguarding critical infrastructure and key resources. The concept of resilience is complex, multidimensional, and defined differently by different stakeholders. The authors contend that there is a benefit in moving from discussing resilience as an abstraction to defining resilience as a measurable characteristic of a system. This paper proposes defining resilience measures using elements of a traditional risk assessment framework to help clarify the concept of resilience and as a way to provide non-traditional risk information. The authors show that various, diverse dimensions of resilience can be quantitatively defined in a common risk assessment framework based on the concept of loss of service. This allows the comparison of options for improving the resilience of infrastructure and presents a means to perform cost-benefit analysis. This paper discusses definitions and key aspects of resilience, presents equations for the risk of loss of infrastructure function that incorporate four key aspects of resilience that could prevent or mitigate that loss, describes proposed resilience factor definitions based on those risk impacts, and provides an example that illustrates how resilience factors would be calculated using a hypothetical scenario.

  19. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429
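
    A sketch of the time-series step, assuming statsmodels' ARIMA with d=0 as the ARMA fit; the model order and the conversion of predicted response time to a firing rate are assumptions for illustration:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
# Hypothetical historical response times (seconds) of one service component.
response_times = 0.8 + 0.1 * np.sin(np.arange(100) / 10) + rng.normal(0, 0.02, 100)

# ARMA(2, 1) == ARIMA with no differencing (d = 0); the order is illustrative.
fit = ARIMA(response_times, order=(2, 0, 1)).fit()
predicted = fit.forecast(steps=5)   # future response times
firing_rates = 1.0 / predicted      # assumed conversion: rate = 1 / mean service time
print(firing_rates)
```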

  20. Mobile-cloud assisted video summarization framework for efficient management of remote sensing data generated by wireless capsule sensors.

    PubMed

    Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook

    2014-09-15

    Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially for remote health-monitoring services. However, during the WCE process, the large amount of captured video data demands a significant amount of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing tasks, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, a mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data.

  1. Scholarometer: A Social Framework for Analyzing Impact across Disciplines

    PubMed Central

    Sun, Xiaoling; Possamai, Lino; JafariAsbagh, Mohsen; Patil, Snehal; Menczer, Filippo

    2012-01-01

    The use of quantitative metrics to gauge the impact of scholarly publications, authors, and disciplines is predicated on the availability of reliable usage and annotation data. Citation and download counts are widely available from digital libraries. However, current annotation systems rely on proprietary labels, refer to journals but not articles or authors, and are manually curated. To address these limitations, we propose a social framework based on crowdsourced annotations of scholars, designed to keep up with the rapidly evolving disciplinary and interdisciplinary landscape. We describe a system called Scholarometer, which provides a service to scholars by computing citation-based impact measures. This creates an incentive for users to provide disciplinary annotations of authors, which in turn can be used to compute disciplinary metrics. We first present the system architecture and several heuristics to deal with noisy bibliographic and annotation data. We report on data sharing and interactive visualization services enabled by Scholarometer. Usage statistics, illustrating the data collected and shared through the framework, suggest that the proposed crowdsourcing approach can be successful. Secondly, we illustrate how the disciplinary bibliometric indicators elicited by Scholarometer allow us to implement for the first time a universal impact measure proposed in the literature. Our evaluation suggests that this metric provides an effective means for comparing scholarly impact across disciplinary boundaries. PMID:22984414

  2. [Quantitative assessment of urban ecosystem services flow based on entropy theory: A case study of Beijing, China].

    PubMed

    Li, Jing Xin; Yang, Li; Yang, Lei; Zhang, Chao; Huo, Zhao Min; Chen, Min Hao; Luan, Xiao Feng

    2018-03-01

    Quantitative evaluation of ecosystem services is a primary premise for rational resource exploitation and sustainable development. Examining ecosystem services flow provides a scientific method to quantify ecosystem services. We built an assessment indicator system based on land cover/land use under the framework of four types of ecosystem services. The types of ecosystem services flow were reclassified. Using entropy theory, the degree of disorder and the development trend of the indicators and of the urban ecosystem were quantitatively assessed. Beijing was chosen as the study area, and twenty-four indicators were selected for evaluation. The results showed that the entropy value of the Beijing urban ecosystem during 2004 to 2015 was 0.794 and the entropy flow was -0.024, indicating a high degree of disorder and a system on the verge of becoming unhealthy. The system reached maximum values three times, while the mean annual variation of the system entropy value increased gradually across three periods, indicating that human activities had negative effects on the urban ecosystem. Entropy flow reached its minimum value in 2007, implying that environmental quality was best in 2007. The determination coefficient for the fitting function of the total permanent population of Beijing and urban ecosystem entropy flow was 0.921, indicating that urban ecosystem health was highly correlated with the total permanent population.
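
    A sketch of an entropy-based indicator assessment in the spirit described above, using the standard entropy-weight calculation; the abstract does not give the authors' exact formulation, so the indicator matrix and the method shown here are illustrative:

```python
import numpy as np

def entropy_weights(data):
    """Entropy-weight method: indicators whose values vary more get more weight."""
    n, m = data.shape                      # n years (samples) x m indicators
    p = data / data.sum(axis=0)            # column-wise proportions
    p = np.where(p == 0, 1e-12, p)         # guard against log(0)
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(n)
    diversity = 1.0 - entropy              # degree of divergence per indicator
    return diversity / diversity.sum()

# Hypothetical 12 years x 4 indicators (positive-valued, pre-normalized).
rng = np.random.default_rng(3)
data = rng.uniform(0.1, 1.0, size=(12, 4))
print(entropy_weights(data).round(3))
```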

  3. Equivalent formulations of “the equation of life”

    NASA Astrophysics Data System (ADS)

    Ao, Ping

    2014-07-01

    Motivated by progress in theoretical biology a recent proposal on a general and quantitative dynamical framework for nonequilibrium processes and dynamics of complex systems is briefly reviewed. It is nothing but the evolutionary process discovered by Charles Darwin and Alfred Wallace. Such general and structured dynamics may be tentatively named “the equation of life”. Three equivalent formulations are discussed, and it is also pointed out that such a quantitative dynamical framework leads naturally to the powerful Boltzmann-Gibbs distribution and the second law in physics. In this way, the equation of life provides a logically consistent foundation for thermodynamics. This view clarifies a particular outstanding problem and further suggests a unifying principle for physics and biology.
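
    For orientation, one of the equivalent formulations reviewed here can be sketched as follows (notation reconstructed from Ao's earlier papers on the decomposition of stochastic dynamics, so the details should be checked against the original): S(x) is a symmetric, friction-like matrix, A(x) is antisymmetric, φ is the constructed potential, ξ is the noise, and ε plays the role of temperature. The steady-state distribution then takes the Boltzmann-Gibbs form.

```latex
\left[S(\mathbf{x}) + A(\mathbf{x})\right]\dot{\mathbf{x}}
  = -\nabla\phi(\mathbf{x}) + \boldsymbol{\xi}(\mathbf{x},t),
\qquad
\rho_{\mathrm{ss}}(\mathbf{x}) \;\propto\; \exp\!\left(-\,\phi(\mathbf{x})/\varepsilon\right)
```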

  4. Economic evaluation of occupational health and safety programmes in health care.

    PubMed

    Guzman, J; Tompa, E; Koehoorn, M; de Boer, H; Macdonald, S; Alamgir, H

    2015-10-01

    Evidence-based resource allocation in the public health care sector requires reliable economic evaluations that are different from those needed in the commercial sector. To describe a framework for conducting economic evaluations of occupational health and safety (OHS) programmes in health care developed with sector stakeholders. To define key resources and outcomes to be considered in economic evaluations of OHS programmes and to integrate these into a comprehensive framework. Participatory action research supported by mixed qualitative and quantitative methods, including a multi-stakeholder working group, 25 key informant interviews, a 41-member Delphi panel and structured nominal group discussions. We found three resources had top priority: OHS staff time, training the workers and programme planning, promotion and evaluation. Similarly, five outcomes had top priority: number of injuries, safety climate, job satisfaction, quality of care and work days lost. The resulting framework was built around seven principles of good practice that stakeholders can use to assist them in conducting economic evaluations of OHS programmes. Use of a framework resulting from this participatory action research approach may increase the quality of economic evaluations of OHS programmes and facilitate programme comparisons for evidence-based resource allocation decisions. The principles may be applicable to other service sectors funded from general taxes and more broadly to economic evaluations of OHS programmes in general. © The Author 2015. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Microfluidic paper-based device for colorimetric determination of glucose based on a metal-organic framework acting as peroxidase mimetic.

    PubMed

    Ortiz-Gómez, Inmaculada; Salinas-Castillo, Alfonso; García, Amalia García; Álvarez-Bermejo, José Antonio; de Orbe-Payá, Ignacio; Rodríguez-Diéguez, Antonio; Capitán-Vallvey, Luis Fermín

    2017-12-13

    This work presents a microfluidic paper-based analytical device (μPAD) for glucose determination using a supported metal-organic framework (MOF) acting as a peroxidase mimic. The catalytic action of glucose oxidase (GOx) on glucose causes the formation of H2O2, and the MOF causes the oxidation of 3,3',5,5'-tetramethylbenzidine (TMB) by H2O2 to form a blue-green product with an absorption peak at 650 nm in the detection zone. A digital camera and a smartphone iOS app are used for the quantitation of glucose, with the S coordinate of the HSV color space as the analytical parameter. Different factors such as the concentrations of TMB, GOx and MOF, pH and buffer, sample volume, reaction time and reagent position in the μPAD were optimized. Under optimal conditions, the value of the S coordinate increases linearly with glucose concentration up to 150 μmol·L⁻¹, with a 2.5 μmol·L⁻¹ detection limit. The μPAD remains stable for 21 days under conventional storage conditions. Such an enzyme-mimetic-based assay for glucose determination using the Fe-MIL-101 MOF implemented in a microfluidic paper-based device possesses advantages over enzyme-based assays in terms of cost, durability and stability. The procedure was applied to the determination of glucose in (spiked) serum and urine. Graphical abstract: Schematic representation of a microfluidic paper-based analytical device using a metal-organic framework as a peroxidase mimic for colorimetric glucose detection with digital camera or smartphone iOS app readout.
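
    A sketch of the color readout described above: average the RGB pixels of the detection zone, convert to HSV, and map the S coordinate to concentration through a linear calibration. The calibration constants below are hypothetical.

```python
import colorsys
import numpy as np

def s_coordinate(rgb_pixels):
    """Mean saturation (HSV S channel) of the detection-zone pixels, RGB in [0, 255]."""
    mean_rgb = np.asarray(rgb_pixels, dtype=float).reshape(-1, 3).mean(axis=0) / 255.0
    _, s, _ = colorsys.rgb_to_hsv(*mean_rgb)
    return s

# Hypothetical linear calibration: S = slope * [glucose] + intercept (valid to ~150 umol/L).
SLOPE, INTERCEPT = 0.004, 0.05

zone = np.random.randint(0, 256, size=(50, 50, 3))  # stand-in for a cropped photo
glucose_umol_per_L = (s_coordinate(zone) - INTERCEPT) / SLOPE
print(round(glucose_umol_per_L, 1))
```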

  6. Towards evidence-based practice in medical training: making evaluations more meaningful.

    PubMed

    Drescher, Uta; Warren, Fiona; Norton, Kingsley

    2004-12-01

    The evaluation of training is problematic and the evidence base inconclusive. This situation may arise for 2 main reasons: training is not understood as a complex intervention and, related to this, the evaluation methods applied are often overly simplistic. This paper makes the case for construing training, especially in the field of specialist medical education, as a complex intervention. It also selectively reviews the available literature in order to match evaluative techniques with the demonstrated complexity. Construing training as a complex intervention can provide a framework for selecting the most appropriate methodology to evaluate a given training intervention and to appraise the evidence base for training fairly, choosing from among both quantitative and qualitative approaches and applying measurement at multiple levels of training impact.

  7. Quantitative framework for prospective motion correction evaluation.

    PubMed

    Pannetier, Nicolas A; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert

    2016-02-01

    To establish a framework for evaluating the performance of prospective motion correction (PMC) in MRI while accounting for motion variability between scans. A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was considered by replaying, in a phantom experiment, the motion trajectories recorded from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge fixations) were used to test the framework. Two metrics were investigated to quantify the improvement of image quality with PMC. Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance in comparing marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC. The mouth guard achieved better PMC performance. Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. © 2015 Wiley Periodicals, Inc.

  8. The Evolution of the Social Roletaking and Guided Reflection Framework in Teacher Education: Recent Theory and Quantitative Synthesis of Research.

    ERIC Educational Resources Information Center

    Reiman, Alan J.

    1999-01-01

    Addresses the lack of theory and directing constructs for reflective practice in teacher education, reviewing Vygotskyian and Piagetian theoretical tenets, relating them to a developmental action/reflection framework for adult learners, and summarizing a taxonomy for differentiating reflection according to adult learners' needs. Summarizes the…

  9. A Framework for General Education Assessment: Assessing Information Literacy and Quantitative Literacy with ePortfolios

    ERIC Educational Resources Information Center

    Hubert, David A.; Lewis, Kati J.

    2014-01-01

    This essay presents the findings of an authentic and holistic assessment, using a random sample of one hundred student General Education ePortfolios, of two of Salt Lake Community College's (SLCC) college-wide learning outcomes: quantitative literacy (QL) and information literacy (IL). Performed by four faculty from biology, humanities, and…

  10. A Framework for Quantitative Evaluation of Care Coordination Effectiveness

    ERIC Educational Resources Information Center

    Liu, Wei

    2017-01-01

    The U.S. healthcare system lacks incentives and quantitative evaluation tools to assess coordination in a patient's care transition process. This is needed because poor care coordination has been identified by many studies as one of the major root causes for the U.S. health system's inefficiency, for poor outcomes, and for high cost. Despite…

  11. Evaluation metrics for bone segmentation in ultrasound

    NASA Astrophysics Data System (ADS)

    Lougheed, Matthew; Fichtinger, Gabor; Ungi, Tamas

    2015-03-01

    Tracked ultrasound is a safe alternative to X-ray for imaging bones. The interpretation of bony structures is challenging as ultrasound has no specific intensity characteristic of bones. Several image segmentation algorithms have been devised to identify bony structures. We propose an open-source framework that would aid in the development and comparison of such algorithms by quantitatively measuring segmentation performance in the ultrasound images. True-positive and false-negative metrics used in the framework quantify algorithm performance based on correctly segmented bone and correctly segmented boneless regions. Ground-truth for these metrics are defined manually and along with the corresponding automatically segmented image are used for the performance analysis. Manually created ground truth tests were generated to verify the accuracy of the analysis. Further evaluation metrics for determining average performance per slide and standard deviation are considered. The metrics provide a means of evaluating accuracy of frames along the length of a volume. This would aid in assessing the accuracy of the volume itself and the approach to image acquisition (positioning and frequency of frame). The framework was implemented as an open-source module of the 3D Slicer platform. The ground truth tests verified that the framework correctly calculates the implemented metrics. The developed framework provides a convenient way to evaluate bone segmentation algorithms. The implementation fits in a widely used application for segmentation algorithm prototyping. Future algorithm development will benefit by monitoring the effects of adjustments to an algorithm in a standard evaluation framework.
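
    A minimal sketch of the overlap metrics named above, computed per frame on binary masks (the exact definitions used in the 3D Slicer module may differ):

```python
import numpy as np

def segmentation_metrics(ground_truth, predicted):
    """True-positive and false-negative rates for binary bone masks."""
    gt = ground_truth.astype(bool)
    pred = predicted.astype(bool)
    tp = np.logical_and(gt, pred).sum()
    fn = np.logical_and(gt, ~pred).sum()
    denom = max(gt.sum(), 1)
    return tp / denom, fn / denom

gt = np.zeros((64, 64), dtype=bool); gt[20:40, 20:40] = True
pred = np.zeros_like(gt); pred[22:40, 20:38] = True
print(segmentation_metrics(gt, pred))  # per-frame rates; averaged over a volume in practice
```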

  12. How to evaluate population management? Transforming the Care Continuum Alliance population health guide toward a broadly applicable analytical framework.

    PubMed

    Struijs, Jeroen N; Drewes, Hanneke W; Heijink, Richard; Baan, Caroline A

    2015-04-01

    Many countries face the persistent twin challenge of providing high-quality care while keeping health systems affordable and accessible. As a result, interest in more efficient strategies to improve population health is increasing. A possibly successful strategy is population management (PM). PM strives to address the health needs of the population at risk and the chronically ill at all points along the health continuum by integrating services across health care, prevention, social care and welfare. The Care Continuum Alliance (CCA) population health guide (the CCA recently changed its name to Population Health Alliance, PHA) provides a useful instrument for implementing and evaluating such innovative approaches. This framework was developed specifically for PM and describes the core elements of the PM concept on the basis of six consecutive, interrelated steps. The aim of this article is to transform the CCA framework into an analytical framework. Quantitative methods were refined, and we operationalized a set of indicators to measure the impact of PM in terms of the Triple Aim (population health, quality of care and cost per capita). Additionally, we added a qualitative part to gain insight into the implementation process of PM. This resulted in a broadly applicable analytical framework based on a mixed-methods approach. In the coming years, the analytical framework will be applied within the Dutch Monitor Population Management to derive transferable 'lessons learned' and to methodologically underpin the concept of PM. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. PERCH: A Unified Framework for Disease Gene Prioritization.

    PubMed

    Feng, Bing-Jian

    2017-03-01

    To interpret genetic variants discovered from next-generation sequencing, integration of heterogeneous information is vital for success. This article describes a framework named PERCH (Polymorphism Evaluation, Ranking, and Classification for a Heritable trait), available at http://BJFengLab.org/. It can prioritize disease genes by quantitatively unifying a new deleteriousness measure called BayesDel, an improved assessment of the biological relevance of genes to the disease, a modified linkage analysis, a novel rare-variant association test, and a converted variant call quality score. It supports data that contain various combinations of extended pedigrees, trios, and case-controls, and allows for a reduced penetrance, an elevated phenocopy rate, liability classes, and covariates. BayesDel is more accurate than PolyPhen2, SIFT, FATHMM, LRT, Mutation Taster, Mutation Assessor, PhyloP, GERP++, SiPhy, CADD, MetaLR, and MetaSVM. The overall approach is faster and more powerful than the existing quantitative method pVAAST, as shown by the simulations of challenging situations in finding the missing heritability of a complex disease. This framework can also classify variants of uncertain significance (VUS) by quantitatively integrating allele frequencies, deleteriousness, association, and co-segregation. PERCH is a versatile tool for gene prioritization in gene discovery research and variant classification in clinical genetic testing. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.

  14. A generalized quantitative interpretation of dark-field contrast for highly concentrated microsphere suspensions

    PubMed Central

    Gkoumas, Spyridon; Villanueva-Perez, Pablo; Wang, Zhentian; Romano, Lucia; Abis, Matteo; Stampanoni, Marco

    2016-01-01

    In X-ray grating interferometry, dark-field contrast arises due to partial extinction of the detected interference fringes. This is also called visibility reduction and is attributed to small-angle scattering from unresolved structures in the imaged object. In recent years, analytical quantitative frameworks of dark-field contrast have been developed for highly diluted monodisperse microsphere suspensions with maximum 6% volume fraction. These frameworks assume that scattering particles are separated by large enough distances, which make any interparticle scattering interference negligible. In this paper, we start from the small-angle scattering intensity equation and, by linking Fourier and real-space, we introduce the structure factor and thus extend the analytical and experimental quantitative interpretation of dark-field contrast, for a range of suspensions with volume fractions reaching 40%. The structure factor accounts for interparticle scattering interference. Without introducing any additional fitting parameters, we successfully predict the experimental values measured at the TOMCAT beamline, Swiss Light Source. Finally, we apply this theoretical framework to an experiment probing a range of system correlation lengths by acquiring dark-field images at different energies. This proposed method has the potential to be applied in single-shot-mode using a polychromatic X-ray tube setup and a single-photon-counting energy-resolving detector. PMID:27734931
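
    The starting point mentioned in the abstract is the standard small-angle scattering relation, sketched below: N is the number of scatterers, P(q) the form factor of a single microsphere, and S(q) the structure factor encoding interparticle interference. In the dilute limit S(q) tends to 1, which is the regime where the earlier dark-field frameworks apply; this is the standard relation, not the authors' full derivation.

```latex
I(q) \;\propto\; N\,P(q)\,S(q),
\qquad
S(q) \to 1 \;\; \text{(dilute limit)}
```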

  15. A comparative framework for maneuverability and gust tolerance of aerial microsystems

    NASA Astrophysics Data System (ADS)

    Campbell, Renee

    Aerial microsystems have the potential of navigating low-altitude, cluttered environments such as urban corridors and building interiors. Reliable systems require both agility and tolerance to gusts. While many platform designs are under development, no framework currently exists to quantitatively assess these inherent bare airframe characteristics which are independent of closed loop controllers. This research develops a method to quantify the maneuverability and gust tolerance of vehicles using reachability and disturbance sensitivity sets. The method is applied to a stable flybar helicopter and an unstable flybarless helicopter, whose state space models were formed through system identification. Model-based static H∞ controllers were also implemented on the vehicles and tested in the lab using fan-generated gusts. It is shown that the flybar restricts the bare airframe's ability to maneuver in translational velocity directions. As such, the flybarless helicopter proved more maneuverable and gust tolerant than the flybar helicopter. This approach was specifically applied here to compare stable and unstable helicopter platforms; however, the framework may be used to assess a broad range of aerial microsystems.

  16. GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments.

    PubMed

    Monroy, Javier; Hernandez-Bennets, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier

    2017-06-23

    This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment.
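
    Filament dispersion theory, in which the simulator is rooted, represents a gas release as a set of drifting Gaussian filaments; a toy concentration model under that assumption is sketched below. The positions, widths, and release rate are made up, and this is not GADEN's API.

```python
import numpy as np

def filament_concentration(x, centers, sigma, q=1.0):
    """Concentration at point x from a set of Gaussian filaments."""
    x = np.asarray(x, dtype=float)
    centers = np.asarray(centers, dtype=float)
    norm = q / ((2.0 * np.pi) ** 1.5 * sigma ** 3)
    d2 = ((centers - x) ** 2).sum(axis=1)
    return float(norm * np.exp(-d2 / (2.0 * sigma ** 2)).sum())

# Three filaments drifting downwind of a source at the origin (made-up positions).
filaments = [[0.5, 0.0, 1.0], [1.0, 0.1, 1.0], [1.5, -0.1, 1.1]]
print(filament_concentration([1.0, 0.0, 1.0], filaments, sigma=0.3))
```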

  17. GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments

    PubMed Central

    Hernandez-Bennetts, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier

    2017-01-01

    This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment. PMID:28644375

  18. Application of preconditioned alternating direction method of multipliers in depth from focal stack

    NASA Astrophysics Data System (ADS)

    Javidnia, Hossein; Corcoran, Peter

    2018-03-01

    The postcapture refocusing effect in smartphone cameras is achievable using focal stacks. However, the accuracy of this effect is totally dependent on the combination of the depth layers in the stack. The accuracy of the extended depth-of-field effect in this application can be improved significantly by computing an accurate depth map, which has been an open issue for decades. To tackle this issue, a framework is proposed based on a preconditioned alternating direction method of multipliers (ADMM) for depth from the focal stack and synthetic defocus application. In addition to its ability to provide high structural accuracy, the optimization function of the proposed framework converges faster and to better solutions than state-of-the-art methods. The qualitative evaluation has been done on 21 sets of focal stacks, and the optimization function has been compared against five other methods. Later, 10 light field image sets were transformed into focal stacks for quantitative evaluation purposes. Preliminary results indicate that the proposed framework has better performance in terms of structural accuracy and optimization in comparison to the current state-of-the-art methods.
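
    For reference, the standard (unpreconditioned) scaled-form ADMM iterations for a problem min f(x) + g(z) subject to Ax + Bz = c are sketched below; the preconditioned variant changes the metric of the subproblems, and the specific splitting used for the focal-stack energy is not given in the abstract.

```latex
\begin{aligned}
x^{k+1} &= \arg\min_x \; f(x) + \tfrac{\rho}{2}\,\lVert Ax + Bz^{k} - c + u^{k} \rVert_2^2, \\
z^{k+1} &= \arg\min_z \; g(z) + \tfrac{\rho}{2}\,\lVert Ax^{k+1} + Bz - c + u^{k} \rVert_2^2, \\
u^{k+1} &= u^{k} + Ax^{k+1} + Bz^{k+1} - c.
\end{aligned}
```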

  19. Integrated presentation of ecological risk from multiple stressors

    NASA Astrophysics Data System (ADS)

    Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman

    2016-10-01

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  20. Integrated presentation of ecological risk from multiple stressors.

    PubMed

    Goussen, Benoit; Price, Oliver R; Rendal, Cecilie; Ashauer, Roman

    2016-10-26

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  1. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of a visual check, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
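
    A sketch of the statistical-analysis idea: quantify a load model's accuracy against field measurements with summary statistics and a confidence interval rather than a visual check. The metric choices and synthetic data here are illustrative, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
measured = 100 + 10 * np.sin(np.linspace(0, 6, 200)) + rng.normal(0, 1.5, 200)  # field data (MW)
simulated = measured + rng.normal(0.5, 2.0, 200)                                # load model output

errors = simulated - measured
rmse = float(np.sqrt(np.mean(errors ** 2)))
# 95% confidence interval on the mean error (bias) of the load model.
ci = stats.t.interval(0.95, len(errors) - 1, loc=errors.mean(), scale=stats.sem(errors))
print(f"RMSE = {rmse:.2f} MW, bias 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}) MW")
```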

  2. Who Owns the Content and Who Runs the Risk? Dynamics of Teacher Change in Teacher-Researcher Collaboration

    NASA Astrophysics Data System (ADS)

    Hamza, Karim; Piqueras, Jesús; Wickman, Per-Olof; Angelin, Marcus

    2017-06-01

    We present analyses of teacher professional growth during collaboration between science teachers and science education researchers, with special focus on how the differential assumption of responsibility between teachers and researchers affected the growth processes. The collaboration centered on a new conceptual framework introduced by the researchers, which aimed at empowering teachers to plan teaching in accordance with perceived purposes. Seven joint planning meetings between teachers and researchers were analyzed, both quantitatively concerning the extent to which the introduced framework became part of the discussions and qualitatively through the interconnected model of teacher professional growth. The collaboration went through three distinct phases characterized by how and the extent to which the teachers made use of the new framework. The change sequences identified in relation to each phase show that teacher recognition of salient outcomes from the framework was important for professional growth to occur. Moreover, our data suggest that this recognition may have been facilitated because the researchers, in initial phases of the collaboration, took increased responsibility for the implementation of the new framework. We conclude that although this differential assumption of responsibility may result in unequal distribution of power between teachers and researchers, it may at the same time mean more equal distribution of concrete work required as well as the inevitable risks associated with pedagogical innovation and introduction of research-based knowledge into science teachers' practice.

  3. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
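
    The simplest special case of such a framework is the small-amplitude limit, in which the frequency shift is proportional to the force gradient; the globally valid formulae discussed in the paper generalize this relation to arbitrary oscillation amplitude.

```latex
\frac{\Delta f}{f_0} \;=\; -\,\frac{1}{2k}\,\frac{\partial F}{\partial z}
```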

  4. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  5. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data

    PubMed Central

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi

    2015-01-01

    Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366

  6. Simulation of the M13 life cycle I: Assembly of a genetically-structured deterministic chemical kinetic simulation.

    PubMed

    Smeal, Steven W; Schmitt, Margaret A; Pereira, Ronnie Rodrigues; Prasad, Ashok; Fisk, John D

    2017-01-01

    To expand the quantitative, systems level understanding and foster the expansion of the biotechnological applications of the filamentous bacteriophage M13, we have unified the accumulated quantitative information on M13 biology into a genetically-structured, experimentally-based computational simulation of the entire phage life cycle. The deterministic chemical kinetic simulation explicitly includes the molecular details of DNA replication, mRNA transcription, protein translation and particle assembly, as well as the competing protein-protein and protein-nucleic acid interactions that control the timing and extent of phage production. The simulation reproduces the holistic behavior of M13, closely matching experimentally reported values of the intracellular levels of phage species and the timing of events in the M13 life cycle. The computational model provides a quantitative description of phage biology, highlights gaps in the present understanding of M13, and offers a framework for exploring alternative mechanisms of regulation in the context of the complete M13 life cycle. Copyright © 2016 Elsevier Inc. All rights reserved.
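
    A deterministic chemical-kinetic simulation of this kind reduces to integrating mass-action ODEs; the two-species sketch below (one mRNA, one coat protein, made-up rate constants) shows the general form only, not the actual genetically-structured M13 model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical mass-action rates: transcription, mRNA decay, translation, assembly use.
K_TX, K_DEG, K_TL, K_ASM = 0.5, 0.1, 2.0, 0.01

def rhs(t, y):
    mrna, protein = y
    d_mrna = K_TX - K_DEG * mrna
    d_protein = K_TL * mrna - K_ASM * protein  # protein consumed by phage assembly
    return [d_mrna, d_protein]

sol = solve_ivp(rhs, (0.0, 60.0), y0=[0.0, 0.0], t_eval=np.linspace(0, 60, 7))
for t, m, p in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"t={t:4.0f} min  mRNA={m:6.2f}  protein={p:8.1f}")
```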

  7. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data.

    PubMed

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H L; Onami, Shuichi

    2015-04-01

    Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  8. Development and application of the adverse outcome pathway framework for understanding and predicting chronic toxicity: I. Challenges and research needs in ecotoxicology.

    PubMed

    Groh, Ksenia J; Carvalho, Raquel N; Chipman, James K; Denslow, Nancy D; Halder, Marlies; Murphy, Cheryl A; Roelofs, Dick; Rolaki, Alexandra; Schirmer, Kristin; Watanabe, Karen H

    2015-02-01

    To elucidate the effects of chemicals on populations of different species in the environment, efficient testing and modeling approaches are needed that consider multiple stressors and allow reliable extrapolation of responses across species. An adverse outcome pathway (AOP) is a concept that provides a framework for organizing knowledge about the progression of toxicity events across scales of biological organization that lead to adverse outcomes relevant for risk assessment. In this paper, we focus on exploring how the AOP concept can be used to guide research aimed at improving both our understanding of chronic toxicity, including delayed toxicity as well as epigenetic and transgenerational effects of chemicals, and our ability to predict adverse outcomes. A better understanding of the influence of subtle toxicity on individual and population fitness would support a broader integration of sublethal endpoints into risk assessment frameworks. Detailed mechanistic knowledge would facilitate the development of alternative testing methods as well as help prioritize higher tier toxicity testing. We argue that targeted development of AOPs supports both of these aspects by promoting the elucidation of molecular mechanisms and their contribution to relevant toxicity outcomes across biological scales. We further discuss information requirements and challenges in application of AOPs for chemical- and site-specific risk assessment and for extrapolation across species. We provide recommendations for potential extension of the AOP framework to incorporate information on exposure, toxicokinetics and situation-specific ecological contexts, and discuss common interfaces that can be employed to couple AOPs with computational modeling approaches and with evolutionary life history theory. The extended AOP framework can serve as a venue for integration of knowledge derived from various sources, including empirical data as well as molecular, quantitative and evolutionary-based models describing species responses to toxicants. This will allow a more efficient application of AOP knowledge for quantitative chemical- and site-specific risk assessment as well as for extrapolation across species in the future. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.

  9. Patient and Healthcare Provider Barriers to Hypertension Awareness, Treatment and Follow Up: A Systematic Review and Meta-Analysis of Qualitative and Quantitative Studies

    PubMed Central

    Khatib, Rasha; Schwalm, Jon-David; Yusuf, Salim; Haynes, R. Brian; McKee, Martin; Khan, Maheer; Nieuwlaat, Robby

    2014-01-01

    Background: Although the importance of detecting, treating, and controlling hypertension has been recognized for decades, the majority of patients with hypertension remain uncontrolled. The path from evidence to practice contains many potential barriers, but their role has not been reviewed systematically. This review aimed to synthesize and identify important barriers to hypertension control as reported by patients and healthcare providers. Methods: Electronic databases MEDLINE, EMBASE and Global Health were searched systematically up to February 2013. Two reviewers independently selected eligible studies. Two reviewers categorized barriers based on a theoretical framework of behavior change. The theoretical framework suggests that a change in behavior requires a strong commitment to change [intention], the necessary skills and abilities to adopt the behavior [capability], and an absence of health system and support constraints. Findings: Twenty-five qualitative studies and 44 quantitative studies met the inclusion criteria. In qualitative studies, health system barriers were most commonly discussed in studies of patients and health care providers. Quantitative studies identified disagreement with clinical recommendations as the most common barrier among health care providers. Quantitative studies of patients yielded different results: lack of knowledge was the most common barrier to hypertension awareness. Stress, anxiety and depression were most commonly reported as barriers that hindered or delayed adoption of a healthier lifestyle. In terms of hypertension treatment adherence, patients mostly reported forgetting to take their medication. Finally, priority setting barriers were most commonly reported by patients in terms of following up with their health care providers. Conclusions: This review identified a wide range of barriers facing patients and health care providers pursuing hypertension control, indicating the need for targeted multi-faceted interventions. More methodologically rigorous studies that encompass the range of barriers and that include low- and middle-income countries are required in order to inform policies to improve hypertension control. PMID:24454721

  10. Blue intensity matters for cell cycle profiling in fluorescence DAPI-stained images.

    PubMed

    Ferro, Anabela; Mestre, Tânia; Carneiro, Patrícia; Sahumbaiev, Ivan; Seruca, Raquel; Sanches, João M

    2017-05-01

    In the past decades, there has been amazing progress in the understanding of the molecular mechanisms of the cell cycle. This has been possible largely due to a better conceptualization of the cycle itself, but also as a consequence of technological advances. Herein, we propose a new fluorescence image-based framework targeted at the identification and segmentation of stained nuclei with the purpose of determining DNA content in distinct cell cycle stages. The method is based on discriminative features, such as total intensity and area, retrieved from in situ stained nuclei by fluorescence microscopy, allowing the determination of the cell cycle phase of both single cells and sub-populations of cells. The analysis framework was built on a modified k-means clustering strategy and refined with a Gaussian mixture model classifier, which enabled the definition of highly accurate classification clusters corresponding to the G1, S and G2 phases. Using the information retrieved from area and total fluorescence intensity, the modified k-means (k=3) clustering imaging framework classified 64.7% of the imaged nuclei as being in G1 phase, 12.0% in G2 phase and 23.2% in S phase. Performance of the imaging framework was ascertained with normal murine mammary gland cells constitutively expressing the Fucci2 technology, exhibiting an overall sensitivity of 94.0%. Further, the results indicate that the imaging framework has a robust capacity both to assign a given DAPI-stained nucleus to its correct cell cycle phase and to determine, with very high probability, true negatives. Importantly, this novel imaging approach is a non-disruptive method that allows an integrative and simultaneous quantitative analysis of molecular and morphological parameters, thus affording the possibility of cell cycle profiling in cytological and histological samples.
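
    A sketch of the classification strategy described above, assuming per-nucleus area and integrated DAPI intensity as the two features; k=3 matches the G1/S/G2 phases, but the synthetic data and the refinement details are illustrative only:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# Synthetic (area, total intensity) features: G2 nuclei carry ~2x the DNA of G1.
g1 = rng.normal([100, 1.0], [10, 0.05], size=(300, 2))
s  = rng.normal([115, 1.5], [10, 0.15], size=(100, 2))
g2 = rng.normal([130, 2.0], [10, 0.05], size=(80, 2))
features = np.vstack([g1, s, g2])

# Initial partition with k-means (k = 3), then refinement with a Gaussian mixture.
init = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)
gmm = GaussianMixture(n_components=3, means_init=init.cluster_centers_,
                      random_state=0).fit(features)
labels = gmm.predict(features)
print(np.bincount(labels) / len(labels))  # fraction of nuclei assigned to each phase
```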

  11. A Framework for Image-Based Modeling of Acute Myocardial Ischemia Using Intramurally Recorded Extracellular Potentials.

    PubMed

    Burton, Brett M; Aras, Kedar K; Good, Wilson W; Tate, Jess D; Zenger, Brian; MacLeod, Rob S

    2018-05-21

    The biophysical basis for electrocardiographic evaluation of myocardial ischemia stems from the notion that ischemic tissues develop, with relative uniformity, along the endocardial aspects of the heart. These injured regions of subendocardial tissue give rise to intramural currents that lead to ST segment deflections within electrocardiogram (ECG) recordings. The concept of subendocardial ischemic regions is often used in clinical practice, providing a simple and intuitive description of ischemic injury; however, such a model grossly oversimplifies the presentation of ischemic disease, inadvertently leading to errors in ECG-based diagnoses. Furthermore, recent experimental studies have brought into question the subendocardial ischemia paradigm, suggesting instead a more distributed pattern of tissue injury. These findings come from experiments and so have both the impact and the limitations of measurements from living organisms. Computer models have often been employed to overcome the constraints of experimental approaches and have a robust history in cardiac simulation. To this end, we have developed a computational simulation framework aimed at elucidating the effects of ischemia on measurable cardiac potentials. To validate our framework, we simulated, visualized, and analyzed 226 experimentally derived acute myocardial ischemic events. Simulation outcomes agreed both qualitatively (feature comparison) and quantitatively (correlation, average error, and significance) with experimentally obtained epicardial measurements, particularly under conditions of elevated ischemic stress. Our simulation framework introduces a novel approach to incorporating subject-specific geometric models and experimental results that are highly resolved in space and time into computational models. We propose this framework as a means to advance the understanding of the underlying mechanisms of ischemic disease while simultaneously putting in place the computational infrastructure necessary to study and improve ischemia models aimed at reducing diagnostic errors in the clinic.

  12. A conceptual framework for hydropeaking mitigation.

    PubMed

    Bruder, Andreas; Tonolla, Diego; Schweizer, Steffen P; Vollenweider, Stefan; Langhans, Simone D; Wüest, Alfred

    2016-10-15

    Hydropower plants are an important source of renewable energy. In the near future, high-head storage hydropower plants will gain further importance as a key element of large-scale electricity production systems. However, these power plants can cause hydropeaking which is characterized by intense unnatural discharge fluctuations in downstream river reaches. Consequences on environmental conditions in these sections are diverse and include changes to the hydrology, hydraulics and sediment regime on very short time scales. These altered conditions affect river ecosystems and biota, for instance due to drift and stranding of fishes and invertebrates. Several structural and operational measures exist to mitigate hydropeaking and the adverse effects on ecosystems, but estimating and predicting their ecological benefit remains challenging. We developed a conceptual framework to support the ecological evaluation of hydropeaking mitigation measures based on current mitigation projects in Switzerland and the scientific literature. We refined this framework with an international panel of hydropeaking experts. The framework is based on a set of indicators, which covers all hydrological phases of hydropeaking and the most important affected abiotic and biotic processes. Effects of mitigation measures on these indicators can be predicted quantitatively using prediction tools such as discharge scenarios and numerical habitat models. Our framework allows a comparison of hydropeaking effects among alternative mitigation measures, to the pre-mitigation situation, and to reference river sections. We further identified key issues that should be addressed to increase the efficiency of current and future projects. They include the spatial and temporal context of mitigation projects, the interactions of river morphology with hydropeaking effects, and the role of appropriate monitoring to evaluate the success of mitigation projects. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vernon, Christopher R.; Arntzen, Evan V.; Richmond, Marshall C.

    Assessing the environmental benefits of proposed flow modification to large rivers provides invaluable insight into future hydropower project operations and relicensing activities. Providing a means to quantitatively define flow-ecology relationships is integral in establishing flow regimes that are mutually beneficial to power production and ecological needs. To complement this effort, an opportunity to create versatile tools that can be applied to broad geographic areas has been presented. In particular, integration with efforts standardized within the ecological limits of hydrologic alteration (ELOHA) is highly advantageous (Poff et al. 2010). This paper presents a geographic information system (GIS) framework for large river classification that houses a base geomorphic classification that is both flexible and accurate, allowing for full integration with other hydrologic models focused on addressing ELOHA efforts. A case study is also provided that integrates publicly available National Hydrography Dataset Plus Version 2 (NHDPlusV2) data, Modular Aquatic Simulation System two-dimensional (MASS2) hydraulic data, and field-collected data into the framework to produce a suite of flow-ecology related outputs. The case study objective was to establish areas of optimal juvenile salmonid rearing habitat under varying flow regimes throughout an impounded portion of the lower Snake River, USA (Figure 1) as an indicator to determine sites where the potential exists to create additional shallow water habitat. Additionally, an alternative hydrologic classification usable throughout the contiguous United States, which can be coupled with the geomorphic aspect of this framework, is also presented. This framework provides the user with the ability to integrate hydrologic and ecologic data into its base geomorphic classification within a GIS to output spatiotemporally variable flow-ecology relationship scenarios.

  14. Spatially explicit land-use and land-cover scenarios for the Great Plains of the United States

    USGS Publications Warehouse

    Sohl, Terry L.; Sleeter, Benjamin M.; Sayler, Kristi L.; Bouchard, Michelle A.; Reker, Ryan R.; Bennett, Stacie L.; Sleeter, Rachel R.; Kanengieter, Ronald L.; Zhu, Zhi-Liang

    2012-01-01

    The Great Plains of the United States has undergone extensive land-use and land-cover change in the past 150 years, with much of the once vast native grasslands and wetlands converted to agricultural crops, and much of the unbroken prairie now heavily grazed. Future land-use change in the region could have dramatic impacts on ecological resources and processes. A scenario-based modeling framework is needed to support the analysis of potential land-use change in an uncertain future, and to mitigate potentially negative future impacts on ecosystem processes. We developed a scenario-based modeling framework to analyze potential future land-use change in the Great Plains. A unique scenario construction process, using an integrated modeling framework, historical data, workshops, and expert knowledge, was used to develop quantitative demand for future land-use change for four IPCC scenarios at the ecoregion level. The FORE-SCE model ingested the scenario information and produced spatially explicit land-use maps for the region at relatively fine spatial and thematic resolutions. Spatial modeling of the four scenarios provided spatial patterns of land-use change consistent with underlying assumptions and processes associated with each scenario. Economically oriented scenarios were characterized by significant loss of natural land covers and expansion of agricultural and urban land uses. Environmentally oriented scenarios ranged from modest declines in natural land covers to slight increases. Model results were assessed for quantity and allocation disagreement between each scenario pair. In conjunction with the U.S. Geological Survey's Biological Carbon Sequestration project, the scenario-based modeling framework used for the Great Plains is now being applied to the entire United States.

  15. Deep neural network-based bandwidth enhancement of photoacoustic data.

    PubMed

    Gutta, Sreedevi; Kadimesetty, Venkata Suryanarayana; Kalva, Sandeep Kumar; Pramanik, Manojit; Ganapathy, Sriram; Yalavarthy, Phaneendra K

    2017-11-01

    Photoacoustic (PA) signals collected at the boundary of tissue are always band-limited. A deep neural network was proposed to enhance the bandwidth (BW) of the detected PA signal, thereby improving the quantitative accuracy of the reconstructed PA images. A least square-based deconvolution method that utilizes the Tikhonov regularization framework was used for comparison with the proposed network. The proposed method was evaluated using both numerical and experimental data. The results indicate that the proposed method was capable of enhancing the BW of the detected PA signal, which in turn improves the contrast recovery and quality of reconstructed PA images without adding any significant computational burden. © 2017 Society of Photo-Optical Instrumentation Engineers (SPIE).
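
    The Tikhonov-regularized least-squares deconvolution used here as the comparison baseline can be sketched as follows; the impulse response h and the regularization weight are illustrative assumptions:

```python
# Minimal sketch of Tikhonov-regularized deconvolution as a BW-enhancement
# baseline: recover a broadband signal x from band-limited data y = A x + n,
# where A applies an (assumed known) transducer impulse response h.
import numpy as np
from scipy.linalg import toeplitz

def tikhonov_deconvolve(y, h, lam=1e-2):
    """Solve min_x ||A x - y||^2 + lam ||x||^2, A being the convolution matrix."""
    n = len(y)
    col = np.r_[h, np.zeros(n - len(h))]
    A = toeplitz(col, np.r_[col[0], np.zeros(n - 1)])  # causal convolution
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
```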

  16. Fundamentals and Recent Developments in Approximate Bayesian Computation

    PubMed Central

    Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka

    2017-01-01

    Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.] PMID:28175922
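
    The classical ABC rejection algorithm reviewed here can be stated in a few lines; the Gaussian toy model, prior and tolerance below are illustrative assumptions:

```python
# Compact ABC rejection sampler: keep parameter draws whose simulated data
# lie within a tolerance of the observed summary statistic.
import numpy as np

rng = np.random.default_rng(0)
y_obs = rng.normal(3.0, 1.0, size=100)       # "observed" data
s_obs = y_obs.mean()                          # summary statistic

def simulate(theta, n=100):
    return rng.normal(theta, 1.0, size=n)

accepted = []
for _ in range(100_000):
    theta = rng.uniform(-10, 10)              # draw from the prior
    if abs(simulate(theta).mean() - s_obs) < 0.1:   # distance < tolerance
        accepted.append(theta)

print(f"posterior mean ~ {np.mean(accepted):.2f} from {len(accepted)} draws")
```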

  17. Quantitative analysis of low-density SNP data for parentage assignment and estimation of family contributions to pooled samples.

    PubMed

    Henshall, John M; Dierens, Leanne; Sellars, Melony J

    2014-09-02

    While much attention has focused on the development of high-density single nucleotide polymorphism (SNP) assays, the costs of developing and running low-density assays have fallen dramatically. This makes it feasible to develop and apply SNP assays for agricultural species beyond the major livestock species. Although low-cost low-density assays may not have the accuracy of the high-density assays widely used in human and livestock species, we show that when combined with statistical analysis approaches that use quantitative instead of discrete genotypes, their utility may be improved. The data used in this study are from a 63-SNP marker Sequenom® iPLEX Platinum panel for the Black Tiger shrimp, for which high-density SNP assays are not currently available. For quantitative genotypes that could be estimated, in 5% of cases the most likely genotype for an individual at a SNP had a probability of less than 0.99. Matrix formulations of maximum likelihood equations for parentage assignment were developed for the quantitative genotypes and also for discrete genotypes perturbed by an assumed error term. Assignment rates that were based on maximum likelihood with quantitative genotypes were similar to those based on maximum likelihood with perturbed genotypes but, for more than 50% of cases, the two methods resulted in individuals being assigned to different families. Treating genotypes as quantitative values allows the same analysis framework to be used for pooled samples of DNA from multiple individuals. Resulting correlations between allele frequency estimates from pooled DNA and individual samples were consistently greater than 0.90, and as high as 0.97 for some pools. Estimates of family contributions to the pools based on quantitative genotypes in pooled DNA had a correlation of 0.85 with estimates of contributions from DNA-derived pedigree. Even with low numbers of SNPs of variable quality, parentage testing and family assignment from pooled samples are sufficiently accurate to provide useful information for a breeding program. Treating genotypes as quantitative values is an alternative to perturbing genotypes using an assumed error distribution, but can produce very different results. An understanding of the distribution of the error is required for SNP genotyping platforms.
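
    A toy sketch of the quantitative-genotype idea: genotypes enter as expected allele dosages in [0, 2] rather than hard calls, so pool allele frequencies become simple means of dosages. The data below are simulated for illustration and do not reproduce the shrimp panel:

```python
# Quantitative genotypes as expected B-allele dosages; pool allele
# frequencies are means of dosages. All data here are simulated.
import numpy as np

rng = np.random.default_rng(1)
n_ind, n_snp = 200, 63
p_true = rng.uniform(0.1, 0.9, n_snp)                  # per-SNP frequencies
hard = rng.binomial(2, p_true, size=(n_ind, n_snp))    # true genotypes
dosage = np.clip(hard + rng.normal(0, 0.2, hard.shape), 0, 2)  # noisy dosages

freq_individuals = hard.mean(axis=0) / 2     # from individual samples
freq_pool = dosage.mean(axis=0) / 2          # from a pooled-style estimate
r = np.corrcoef(freq_individuals, freq_pool)[0, 1]
print(f"correlation between estimates: {r:.3f}")   # typically > 0.9
```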

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de; Schreiber, Falk; Martin-Luther-University Halle-Wittenberg, Halle

    The characterization of biological systems with respect to their behavior and functionality based on versatile biochemical interactions is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow the exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied for the integrative analysis of the crop plant potato.

  19. A novel hybrid metal-organic framework-polymeric monolith for solid-phase microextraction.

    PubMed

    Lin, Chen-Lan; Lirio, Stephen; Chen, Ya-Ting; Lin, Chia-Her; Huang, Hsi-Ya

    2014-03-17

    This study describes the fabrication of a novel hybrid metal-organic framework-organic polymer (MOF-polymer) for use as a stationary phase in fritless solid-phase microextraction (SPME) for validating analytical methods. The MOF-polymer was prepared by using ethylene dimethacrylate (EDMA), butyl methacrylate (BMA), and an imidazolium-based ionic liquid as porogenic solvent, followed by microwave-assisted polymerization with the addition of 25% MOF. This novel hybrid MOF-polymer was used to extract penicillins (penicillin G, penicillin V, oxacillin, cloxacillin, nafcillin, dicloxacillin) under different conditions. Quantitative analysis of the extracted penicillin samples using the MOF-organic polymer for SPME was conducted by using capillary electrochromatography (CEC) coupled with UV analysis. The penicillin recovery was 63-96.2% with high reproducibility, sensitivity, and reusability. The extraction time with the proposed fabricated SPME was only 34 min. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Organizational cultural competence consultation to a mental health institution.

    PubMed

    Fung, Kenneth; Lo, Hung-Tat Ted; Srivastava, Rani; Andermann, Lisa

    2012-04-01

    Cultural competence is increasingly recognized as an essential component of effective mental health care delivery to address diversity and equity issues. Drawing from the literature and our experience in providing cultural competence consultation and training, the paper will discuss our perspective on the foundational concepts of cultural competence and how it applies to a health care organization, including its programs and services. Based on a recent consultation project, we present a methodology for assessing cultural competence in health care organizations, involving mixed quantitative and qualitative methods. Key findings and recommendations from the resulting cultural competence plan are discussed, including core principles, change strategies, and an Organizational Cultural Competence Framework, which may be applicable to other health care institutions seeking such changes. This framework, consisting of eight domains, can be used for organizational assessment and cultural competence planning, ultimately aiming at enhancing mental health care service to the diverse patients, families, and communities.

  1. Estimation of brain network ictogenicity predicts outcome from epilepsy surgery

    NASA Astrophysics Data System (ADS)

    Goodfellow, M.; Rummel, C.; Abela, E.; Richardson, M. P.; Schindler, K.; Terry, J. R.

    2016-07-01

    Surgery is a valuable option for pharmacologically intractable epilepsy. However, significant post-operative improvements are not always attained. This is due in part to our incomplete understanding of the seizure generating (ictogenic) capabilities of brain networks. Here we introduce an in silico, model-based framework to study the effects of surgery within ictogenic brain networks. We find that factors conventionally determining the region of tissue to resect, such as the location of focal brain lesions or the presence of epileptiform rhythms, do not necessarily predict the best resection strategy. We validate our framework by analysing electrocorticogram (ECoG) recordings from patients who have undergone epilepsy surgery. We find that when post-operative outcome is good, model predictions for optimal strategies align better with the actual surgery undertaken than when post-operative outcome is poor. Crucially, this allows the prediction of optimal surgical strategies and the provision of quantitative prognoses for patients undergoing epilepsy surgery.

  2. Ag/AgO Nanoparticles Grown via Time Dependent Double Mechanism in a 2D Layered Ni-PCP and Their Antibacterial Efficacy

    NASA Astrophysics Data System (ADS)

    Agarwal, Rashmi A.; Gupta, Neeraj K.; Singh, Rajan; Nigam, Shivansh; Ateeq, Bushra

    2017-03-01

    A simple synthesis route for the growth of Ag/AgO nanoparticles (NPs) in large quantitative yields with narrow size distribution from a functional, non-activated, Ni(II)-based highly flexible porous coordination polymer (PCP) as a template has been demonstrated. This template is a stable storage medium for NPs larger than the pore diameters of the PCP. From an EPR study it was concluded that the NPs were synthesized via two mechanisms, i.e., acid formation and the redox activity of the framework. The size range of the Ag/AgO NPs is sensitive to the choice of solvent and reaction time. Direct use of Ag/AgO@Ni-PCP shows pronounced growth inhibition of Escherichia coli and the pathogen Salmonella typhimurium at extremely low concentrations. The pristine template shows no cytotoxic activity, even though it contains Ni nodes in the framework.

  3. Measuring coherence with entanglement concurrence

    NASA Astrophysics Data System (ADS)

    Qi, Xianfei; Gao, Ting; Yan, Fengli

    2017-07-01

    Quantum coherence is a fundamental manifestation of the quantum superposition principle. Recently, Baumgratz et al (2014 Phys. Rev. Lett. 113 140401) presented a rigorous framework to quantify coherence from the view of the theory of physical resources. Here we propose a new valid quantum coherence measure, which is a convex roof measure, for a quantum system of arbitrary dimension, essentially using the generalized Gell-Mann matrices. Rigorous proof shows that the proposed coherence measure, coherence concurrence, fulfills all the requirements dictated by the resource theory of quantum coherence measures. Moreover, strong links between the resource frameworks of coherence concurrence and entanglement concurrence are derived, which shows that any degree of coherence with respect to some reference basis can be converted to entanglement via incoherent operations. Our work provides a clear quantitative and operational connection between coherence and entanglement based on two kinds of concurrence. This new coherence measure, coherence concurrence, may also be beneficial to the study of quantum coherence.
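
    For orientation, the convex-roof construction named in the abstract has the generic form below, shown alongside the l1-norm measure of the Baumgratz et al. framework for comparison; the pure-state quantifier is left abstract:

```latex
% Convex-roof form of a coherence measure: minimise the average pure-state
% coherence over all ensemble decompositions of the state.
\[
  C\bigl(\rho\bigr) \;=\; \min_{\{p_i,\,|\psi_i\rangle\}}
  \sum_i p_i\, C\bigl(|\psi_i\rangle\bigr),
  \qquad \rho = \sum_i p_i\, |\psi_i\rangle\langle\psi_i| .
\]
% For comparison, the l1-norm measure of the Baumgratz et al. framework:
\[
  C_{\ell_1}(\rho) \;=\; \sum_{i \neq j} \bigl|\rho_{ij}\bigr| .
\]
```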

  4. Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.

    PubMed

    Montalvo-Acosta, Joel José; Cecchini, Marco

    2016-12-01

    The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
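
    A relation central to any such framework, stated here for orientation (C° is the standard concentration of 1 M):

```latex
% Standard relation between the binding constant and the standard binding
% free energy.
\[
  \Delta G^{\circ}_{\mathrm{bind}} \;=\; -k_{B}T \,\ln\!\bigl(K_{b}\,C^{\circ}\bigr),
  \qquad
  K_{b} \;=\; \frac{[\mathrm{PL}]}{[\mathrm{P}]\,[\mathrm{L}]} .
\]
```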

  5. An empirical generative framework for computational modeling of language acquisition.

    PubMed

    Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-06-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.

  6. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
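
    The notional product referenced above is commonly written as follows; this is the generic all-hazards form, not CAPRA's exact parameterization:

```latex
% Generic risk product: threat likelihood x vulnerability x consequence.
\[
  R \;=\; \underbrace{P(T)}_{\text{threat likelihood}} \times
          \underbrace{P(V \mid T)}_{\text{vulnerability}} \times
          \underbrace{C}_{\text{consequence}} .
\]
```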

  7. Lattice-free prediction of three-dimensional structure of programmed DNA assemblies

    PubMed Central

    Pan, Keyao; Kim, Do-Nyun; Zhang, Fei; Adendorff, Matthew R.; Yan, Hao; Bathe, Mark

    2014-01-01

    DNA can be programmed to self-assemble into high molecular weight 3D assemblies with precise nanometer-scale structural features. Although numerous sequence design strategies exist to realize these assemblies in solution, there is currently no computational framework to predict their 3D structures on the basis of programmed underlying multi-way junction topologies constrained by DNA duplexes. Here, we introduce such an approach and apply it to assemblies designed using the canonical immobile four-way junction. The procedure is used to predict the 3D structure of high molecular weight planar and spherical ring-like origami objects, a tile-based sheet-like ribbon, and a 3D crystalline tensegrity motif, in quantitative agreement with experiments. Our framework provides a new approach to predict programmed nucleic acid 3D structure on the basis of prescribed secondary structure motifs, with possible application to the design of such assemblies for use in biomolecular and materials science. PMID:25470497

  8. Antiviral Information Management System (AIMS): a prototype for operational innovation in drug development.

    PubMed

    Jadhav, Pravin R; Neal, Lauren; Florian, Jeff; Chen, Ying; Naeger, Lisa; Robertson, Sarah; Soon, Guoxing; Birnkrant, Debra

    2010-09-01

    This article presents a prototype for an operational innovation in knowledge management (KM). These operational innovations are geared toward managing knowledge efficiently and accessing all available information by embracing advances in bioinformatics and allied fields. The specific components of the proposed KM system are (1) a database to archive hepatitis C virus (HCV) treatment data in a structured format and retrieve information in a query-capable manner and (2) an automated analysis tool to inform trial design elements for HCV drug development. The proposed framework is intended to benefit drug development by increasing efficiency of dose selection and improving the consistency of advice from US Food and Drug Administration (FDA). It is also hoped that the framework will encourage collaboration among FDA, industry, and academic scientists to guide the HCV drug development process using model-based quantitative analysis techniques.

  9. Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes

    PubMed Central

    Zhang, Hong; Pei, Yun

    2016-01-01

    Quantitative prediction of construction noise is crucial to evaluate construction plans to help make decisions to address noise levels. Considering limitations of existing methods for measuring or predicting the construction noise and particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes to provision of a simulation methodology to quantitatively predict the equivalent continuous noise of construction by considering the relevant uncertainties, dynamics and interactions. PMID:27529266
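
    The equivalent continuous level being predicted is the energy average of time-varying levels; a minimal sketch with illustrative numbers:

```python
# Equivalent continuous sound level over a period:
# Leq = 10 log10( (1/T) * sum(t_i * 10^(L_i/10)) ) for piecewise levels L_i
# held for durations t_i. The sample values are illustrative.
import numpy as np

def leq(levels_db, durations):
    """Energy-average sound levels (dB) over their durations."""
    levels_db, durations = np.asarray(levels_db), np.asarray(durations)
    energy = np.sum(durations * 10.0 ** (levels_db / 10.0)) / durations.sum()
    return 10.0 * np.log10(energy)

# e.g. an excavator phase at 85 dB for 2 h, hauling at 78 dB for 6 h:
print(f"Leq = {leq([85, 78], [2, 6]):.1f} dB")
```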

  10. The memory remains: Understanding collective memory in the digital age

    PubMed Central

    García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha

    2017-01-01

    Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects. PMID:28435881

  11. On iterative algorithms for quantitative photoacoustic tomography in the radiative transport regime

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Zhou, Tie

    2017-11-01

    In this paper, we present a numerical reconstruction method for quantitative photoacoustic tomography (QPAT), based on the radiative transfer equation (RTE), which models light propagation more accurately than the diffusion approximation (DA). We investigate the reconstruction of the absorption and scattering coefficients of biological tissues. An improved fixed-point iterative method to retrieve the absorption coefficient, given the scattering coefficient, is proposed for its cheap computational cost; the convergence of this method is also proved. The Barzilai-Borwein (BB) method is applied to retrieve the two coefficients simultaneously. Since the reconstruction of optical coefficients involves the solutions of the original and adjoint RTEs in the framework of optimization, an efficient solver with high accuracy is developed from Gao and Zhao (2009 Transp. Theory Stat. Phys. 38 149-92). Simulation experiments illustrate that the improved fixed-point iterative method and the BB method are competitive methods for QPAT in the relevant cases.
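
    A toy 1-D illustration of the fixed-point idea for absorption recovery, with a Beer-Lambert fluence standing in for the RTE solve used in the paper; the profile and iteration count are assumptions:

```python
# Toy fixed-point sketch for QPAT absorption recovery: with absorbed energy
# H = mu_a * Phi(mu_a) (Grueneisen factor set to 1), iterate mu <- H / Phi(mu).
# A Beer-Lambert fluence stands in for the RTE solve; illustration only.
import numpy as np

x = np.linspace(0, 1, 200)
dx = x[1] - x[0]
mu_true = 1.0 + 0.5 * np.sin(4 * np.pi * x)      # assumed absorption profile

def fluence(mu):
    """Beer-Lambert fluence for unit illumination at x = 0."""
    return np.exp(-np.cumsum(mu) * dx)

H = mu_true * fluence(mu_true)                    # simulated data

mu = np.ones_like(x)                              # initial guess
for _ in range(50):
    mu = H / fluence(mu)                          # fixed-point update

print(f"max error: {np.abs(mu - mu_true).max():.2e}")
```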

  12. Fuzzy object modeling

    NASA Astrophysics Data System (ADS)

    Udupa, Jayaram K.; Odhner, Dewey; Falcao, Alexandre X.; Ciesielski, Krzysztof C.; Miranda, Paulo A. V.; Vaideeswaran, Pavithra; Mishra, Shipra; Grevera, George J.; Saboury, Babak; Torigian, Drew A.

    2011-03-01

    To make Quantitative Radiology (QR) a reality in routine clinical practice, computerized automatic anatomy recognition (AAR) becomes essential. As part of this larger goal, we present in this paper a novel fuzzy strategy for building bodywide group-wise anatomic models. They have the potential to handle uncertainties and variability in anatomy naturally and to be integrated with the fuzzy connectedness framework for image segmentation. Our approach is to build a family of models, called the Virtual Quantitative Human, representing normal adult subjects at a chosen resolution of the population variables (gender, age). Models are represented hierarchically, the descendants representing organs contained in parent organs. Based on an index of fuzziness of the models, 32 thorax data sets, and 10 organs defined in them, we found that the hierarchical approach to modeling can effectively handle the non-linear relationships in position, scale, and orientation that exist among organs in different patients.

  13. The memory remains: Understanding collective memory in the digital age.

    PubMed

    García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha

    2017-04-01

    Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects.

  14. Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes.

    PubMed

    Zhang, Hong; Pei, Yun

    2016-08-12

    Quantitative prediction of construction noise is crucial to evaluate construction plans to help make decisions to address noise levels. Considering limitations of existing methods for measuring or predicting the construction noise and particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes to provision of a simulation methodology to quantitatively predict the equivalent continuous noise of construction by considering the relevant uncertainties, dynamics and interactions.

  15. Enabling Interactive Measurements from Large Coverage Microscopy

    PubMed Central

    Bajcsy, Peter; Vandecreme, Antoine; Amelot, Julien; Chalfoun, Joe; Majurski, Michael; Brady, Mary

    2017-01-01

    Microscopy could be an important tool for characterizing stem cell products if quantitative measurements could be collected over multiple spatial and temporal scales. With the cells changing states over time and being several orders of magnitude smaller than cell products, modern microscopes are already capable of imaging large spatial areas, repeat imaging over time, and acquiring images over several spectra. However, characterizing stem cell products from such large image collections is challenging because of data size, required computations, and lack of interactive quantitative measurements needed to determine release criteria. We present a measurement web system consisting of available algorithms, extensions to a client-server framework using Deep Zoom, and the configuration know-how to provide the information needed for inspecting the quality of a cell product. The cell and other data sets are accessible via the prototype web-based system at http://isg.nist.gov/deepzoomweb. PMID:28663600

  16. Using the DSAP Framework to Guide Instructional Design and Technology Integration in BYOD Classrooms

    ERIC Educational Resources Information Center

    Wasko, Christopher W.

    2016-01-01

    The purpose of this study was to determine the suitability of the DSAP Framework to guide instructional design and technology integration for teachers piloting a BYOD (Bring Your Own Device) initiative and to measure the impact the initiative had on the amount and type of technology used in pilot classrooms. Quantitative and qualitative data were…

  17. Teacher Education Preparation and Implementation for Multicultural and Diverse School Environments in the 21st Century: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Cole, Patricia Ann

    2013-01-01

    This sequential explanatory mixed methods study investigated 24 college and university syllabi for content consisting of multicultural education that used the framework for multicultural education devised by James A. Banks (2006). This framework was used to analyze data collected using descriptive statistics for quantitative phase one. The four…

  18. Teaching Games for Understanding in American High-School Soccer: A Quantitative Data Analysis Using the Game Performance Assessment Instrument

    ERIC Educational Resources Information Center

    Harvey, Stephen; Cushion, Christopher J.; Wegis, Heidi M.; Massa-Gonzalez, Ada N.

    2010-01-01

    Background: Previous research examining the effectiveness of the Teaching Games for Understanding (TGfU) approach has been equivocal. This has been hampered by a dependence on a comparative (i.e., "which method is best?") theoretical framework. An alternative "practice-referenced" framework has the potential to examine the effectiveness of TGfU…

  19. Characterizing the concentration of Cryptosporidium in Australian surface waters for setting health-based targets for drinking water treatment.

    PubMed

    Petterson, S; Roser, D; Deere, D

    2015-09-01

    It is proposed that the next revision of the Australian Drinking Water Guidelines will include 'health-based targets', where the required level of potable water treatment quantitatively relates to the magnitude of source water pathogen concentrations. To quantify likely Cryptosporidium concentrations in southern Australian surface source waters, the databases for 25 metropolitan water supplies with good historical records, representing a range of catchment sizes, land use and climatic regions were mined. The distributions and uncertainty intervals for Cryptosporidium concentrations were characterized for each site. Then, treatment targets were quantified applying the framework recommended in the World Health Organization Guidelines for Drinking-Water Quality 2011. Based on total oocyst concentrations, and not factoring in genotype or physiological state information as it relates to infectivity for humans, the best estimates of the required level of treatment, expressed as log10 reduction values, ranged among the study sites from 1.4 to 6.1 log10. Challenges associated with relying on historical monitoring data for defining drinking water treatment requirements were identified. In addition, the importance of quantitative microbial risk assessment input assumptions on the quantified treatment targets was investigated, highlighting the need for selection of locally appropriate values.
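
    The log10 reduction value quantifying the treatment target is, in essence, the gap between source and acceptable finished-water concentrations; a hedged sketch with an illustrative target (the WHO 2011 framework derives the acceptable concentration from a health outcome):

```python
# Required log10 reduction value (LRV) as the difference between source and
# acceptable finished-water concentrations. The acceptable concentration
# below is purely illustrative, not the WHO-derived value.
import numpy as np

def required_lrv(source_conc, acceptable_conc):
    """Required log10 reduction value for a given source water."""
    return np.log10(source_conc) - np.log10(acceptable_conc)

# e.g. 2 oocysts/L in the source vs an illustrative 6.3e-5 oocysts/L target:
print(f"LRV = {required_lrv(2.0, 6.3e-5):.1f} log10")
```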

  20. National Service Frameworks and UK general practitioners: street-level bureaucrats at work?

    PubMed

    Checkland, Kath

    2004-11-01

    This paper argues that the past decade has seen significant changes in the nature of medical work in general practice in the UK. Increasing pressure to use normative clinical guidelines and the move towards explicit quantitative measures of performance together have the potential to alter the way in which health care is delivered to patients. Whilst it is possible to view these developments from the well-established sociological perspectives of deprofessionalisation and proletarianisation, this paper takes a view of general practice as work, and uses the ideas of Lipsky to analyse practice-level responses to some of these changes. In addition to evidence-based clinical guidelines, National Service Frameworks, introduced by the UK government in 1997, also specify detailed models of service provision that health care providers are expected to follow. As part of a larger study examining the impact of National Service Frameworks in general practice, the responses of three practices to the first four NSFs were explored. The failure of NSFs to make a significant impact is compared to the practices' positive responses to purely clinical guidelines such as those developed by the British Hypertension Society. Lipsky's concept of public service workers as 'street-level bureaucrats' is discussed and used as a framework within which to view these findings.

  1. Quantifying uncertainty in health impact assessment: a case-study example on indoor housing ventilation.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2014-01-01

    Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis.
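
    A minimal sketch of the fuzzy (alpha-cut) propagation idea, using triangular fuzzy numbers and interval arithmetic through a monotone impact function; all numbers are illustrative, not the paper's:

```python
# Alpha-cut propagation: represent a parameter as a triangular fuzzy number,
# take interval cuts at each alpha level, and propagate by interval
# arithmetic through a monotone impact function. Numbers are illustrative.
import numpy as np

def alpha_cut(low, mode, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def impact(pop_affected, risk_per_person):
    return pop_affected * risk_per_person

for alpha in (0.0, 0.5, 1.0):
    p_lo, p_hi = alpha_cut(8_000, 10_000, 13_000, alpha)   # exposed population
    r_lo, r_hi = alpha_cut(0.01, 0.02, 0.04, alpha)        # excess morbidity risk
    # Both inputs enter monotonically, so endpoints map to endpoints.
    print(f"alpha={alpha}: burden in [{impact(p_lo, r_lo):.0f}, "
          f"{impact(p_hi, r_hi):.0f}] cases")
```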

  2. Action understanding as inverse planning.

    PubMed

    Baker, Chris L; Saxe, Rebecca; Tenenbaum, Joshua B

    2009-12-01

    Humans are adept at inferring the mental states underlying other agents' actions, such as goals, beliefs, desires, emotions and other thoughts. We propose a computational framework based on Bayesian inverse planning for modeling human action understanding. The framework represents an intuitive theory of intentional agents' behavior based on the principle of rationality: the expectation that agents will plan approximately rationally to achieve their goals, given their beliefs about the world. The mental states that caused an agent's behavior are inferred by inverting this model of rational planning using Bayesian inference, integrating the likelihood of the observed actions with the prior over mental states. This approach formalizes in precise probabilistic terms the essence of previous qualitative approaches to action understanding based on an "intentional stance" [Dennett, D. C. (1987). The intentional stance. Cambridge, MA: MIT Press] or a "teleological stance" [Gergely, G., Nádasdy, Z., Csibra, G., & Biró, S. (1995). Taking the intentional stance at 12 months of age. Cognition, 56, 165-193]. In three psychophysical experiments using animated stimuli of agents moving in simple mazes, we assess how well different inverse planning models based on different goal priors can predict human goal inferences. The results provide quantitative evidence for an approximately rational inference mechanism in human goal inference within our simplified stimulus paradigm, and for the flexible nature of goal representations that human observers can adopt. We discuss the implications of our experimental results for human action understanding in real-world contexts, and suggest how our framework might be extended to capture other kinds of mental state inferences, such as inferences about beliefs, or inferring whether an entity is an intentional agent.
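
    A toy version of the inverse-planning computation: the posterior over goals is proportional to the likelihood of the observed movements under noisily rational (softmax shortest-path) planning times the goal prior. The grid world, path and rationality parameter are illustrative assumptions:

```python
# Bayesian inverse planning, toy version: P(goal | actions) is proportional
# to P(actions | goal) * P(goal), with actions modelled as softmax-rational
# steps toward the goal. World and parameters are illustrative.
import numpy as np

goals = {"A": (4, 0), "B": (4, 4)}
prior = {"A": 0.5, "B": 0.5}
path = [(0, 2), (1, 2), (2, 2), (2, 3)]   # agent's observed positions
beta = 2.0                                 # rationality (inverse temperature)

def step_loglik(s, s_next, goal):
    """Log-probability of one step under softmax-rational movement."""
    moves = [(s[0] + dx, s[1] + dy) for dx, dy in ((1,0), (-1,0), (0,1), (0,-1))]
    dist = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
    logits = np.array([-beta * dist(m, goal) for m in moves])
    logp = logits - np.log(np.exp(logits).sum())
    return logp[moves.index(s_next)]

post = {}
for g, xy in goals.items():
    ll = sum(step_loglik(path[i], path[i+1], xy) for i in range(len(path) - 1))
    post[g] = np.exp(ll) * prior[g]
Z = sum(post.values())
print({g: round(p / Z, 3) for g, p in post.items()})   # favours goal B
```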

  3. Highly sensitive photoelectrochemical biosensor for kinase activity detection and inhibition based on the surface defect recognition and multiple signal amplification of metal-organic frameworks.

    PubMed

    Wang, Zonghua; Yan, Zhiyong; Wang, Feng; Cai, Jibao; Guo, Lei; Su, Jiakun; Liu, Yang

    2017-11-15

    A turn-on photoelectrochemical (PEC) biosensor based on the surface defect recognition and multiple signal amplification of metal-organic frameworks (MOFs) was proposed for highly sensitive protein kinase activity analysis and inhibitor evaluation. In this strategy, based on the phosphorylation reaction in the presence of protein kinase A (PKA), the Zr-based metal-organic frameworks (UiO-66) accommodated with [Ru(bpy)₃]²⁺ photoactive dyes in the pores were linked to the phosphorylated kemptide-modified TiO₂/ITO electrode through the chelation between the Zr⁴⁺ defects on the surface of UiO-66 and the phosphate groups in kemptide. Under visible light irradiation, the excited electrons from [Ru(bpy)₃]²⁺ adsorbed in the pores of UiO-66 injected into the TiO₂ conduction band to generate photocurrent, which could be utilized for protein kinase activity detection. The large surface area and high porosity of UiO-66 accommodated a large number of [Ru(bpy)₃]²⁺ ions, which increased the photocurrent significantly and afforded a highly sensitive PEC analysis of kinase activity. The detection limit of the as-proposed PEC biosensor was 0.0049 U mL⁻¹ (S/N = 3). The biosensor was also applied for quantitative kinase inhibitor evaluation and PKA activity detection in MCF-7 cell lysates. The developed visible-light PEC biosensor provides a simple detection procedure and a cost-effective manner for PKA activity assays, and shows great potential in clinical diagnosis and drug discoveries. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Qualitative and Quantitative Analysis for Facial Complexion in Traditional Chinese Medicine

    PubMed Central

    Zhao, Changbo; Li, Guo-zheng; Li, Fufeng; Wang, Zhi; Liu, Chang

    2014-01-01

    Facial diagnosis is an important and very intuitive diagnostic method in Traditional Chinese Medicine (TCM). However, due to its qualitative and experience-based subjective property, traditional facial diagnosis has certain limitations in clinical medicine. Computerized inspection methods provide classification models to recognize facial complexion (including color and gloss). However, previous works only study the classification problem of facial complexion, which we consider qualitative analysis. For quantitative analysis, the severity or degree of facial complexion has not yet been reported. This paper aims to make both qualitative and quantitative analyses of facial complexion. We propose a novel feature representation of facial complexion from the whole face of patients. The features are established with four chromaticity bases split up by luminance distribution on the CIELAB color space. The chromaticity bases are constructed from the facial dominant color using two-level clustering; the optimal luminance distribution is simply implemented with experimental comparisons. The features prove to be more distinctive than previous facial complexion feature representations. Complexion recognition proceeds by training an SVM classifier with the optimal model parameters. In addition, the features are further improved by the weighted fusion of five local regions. Extensive experimental results show that the proposed features achieve the highest facial color recognition performance with a total accuracy of 86.89%. Furthermore, the proposed recognition framework can analyze both the color and gloss degrees of facial complexion by learning a ranking function. PMID:24967342

  5. Power Analysis of Artificial Selection Experiments Using Efficient Whole Genome Simulation of Quantitative Traits

    PubMed Central

    Kessner, Darren; Novembre, John

    2015-01-01

    Evolve and resequence studies combine artificial selection experiments with massively parallel sequencing technology to study the genetic basis for complex traits. In these experiments, individuals are selected for extreme values of a trait, causing alleles at quantitative trait loci (QTL) to increase or decrease in frequency in the experimental population. We present a new analysis of the power of artificial selection experiments to detect and localize quantitative trait loci. This analysis uses a simulation framework that explicitly models whole genomes of individuals, quantitative traits, and selection based on individual trait values. We find that explicitly modeling QTL provides qualitatively different insights than considering independent loci with constant selection coefficients. Specifically, we observe how interference between QTL under selection affects the trajectories and lengthens the fixation times of selected alleles. We also show that a substantial portion of the genetic variance of the trait (50–100%) can be explained by detected QTL in as little as 20 generations of selection, depending on the trait architecture and experimental design. Furthermore, we show that power depends crucially on the opportunity for recombination during the experiment. Finally, we show that an increase in power is obtained by leveraging founder haplotype information to obtain allele frequency estimates. PMID:25672748

  6. Coding Early Naturalists' Accounts into Long-Term Fish Community Changes in the Adriatic Sea (1800–2000)

    PubMed Central

    Fortibuoni, Tomaso; Libralato, Simone; Raicevich, Saša; Giovanardi, Otello; Solidoro, Cosimo

    2010-01-01

    The understanding of fish communities' changes over the past centuries has important implications for conservation policy and marine resource management. However, reconstructing these changes is difficult because information on marine communities before the second half of the 20th century is, in most cases, anecdotal and merely qualitative. Therefore, historical qualitative records and modern quantitative data are not directly comparable, and their integration for long-term analyses is not straightforward. We developed a methodology that allows the coding of qualitative information provided by early naturalists into semi-quantitative information through an intercalibration with landing proportions. This approach allowed us to reconstruct and quantitatively analyze a 200-year-long time series of fish community structure indicators in the Northern Adriatic Sea (Mediterranean Sea). Our analysis provides evidence of long-term changes in fish community structure, including the decline of Chondrichthyes, large-sized and late-maturing species. This work highlights the importance of broadening the time-frame through which we look at marine ecosystem changes and provides a methodology to exploit, in a quantitative framework, historical qualitative sources. To the purpose, naturalists' eyewitness accounts proved to be useful for extending the analysis on fish community back in the past, well before the onset of field-based monitoring programs. PMID:21103349

  7. Real-time probabilistic covariance tracking with efficient model update.

    PubMed

    Wu, Yi; Cheng, Jian; Wang, Jinqiao; Lu, Hanqing; Wang, Jun; Ling, Haibin; Blasch, Erik; Bai, Li

    2012-05-01

    The recently proposed covariance region descriptor has been proven robust and versatile for a modest computational cost. The covariance matrix enables efficient fusion of different types of features, where the spatial and statistical properties, as well as their correlation, are characterized. The similarity between two covariance descriptors is measured on Riemannian manifolds. Based on the same metric but with a probabilistic framework, we propose a novel tracking approach on Riemannian manifolds with a novel incremental covariance tensor learning (ICTL). To address the appearance variations, ICTL incrementally learns a low-dimensional covariance tensor representation and efficiently adapts online to appearance changes of the target with only O(1) computational complexity, resulting in a real-time performance. The covariance-based representation and the ICTL are then combined with the particle filter framework to allow better handling of background clutter, as well as the temporary occlusions. We test the proposed probabilistic ICTL tracker on numerous benchmark sequences involving different types of challenges including occlusions and variations in illumination, scale, and pose. The proposed approach demonstrates excellent real-time performance, both qualitatively and quantitatively, in comparison with several previously proposed trackers.
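
    The Riemannian similarity referred to above is typically the affine-invariant distance between covariance matrices; a sketch using generalized eigenvalues (the random descriptors below are placeholders for real feature covariances):

```python
# Affine-invariant Riemannian distance between two covariance descriptors:
# d(A, B) = sqrt( sum_i ln^2 lambda_i ), with lambda_i the generalized
# eigenvalues of the pair (A, B).
import numpy as np
from scipy.linalg import eigvalsh

def covariance_distance(A, B):
    lam = eigvalsh(A, B)               # solves A v = lam B v
    return np.sqrt(np.sum(np.log(lam) ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)); Y = rng.normal(size=(100, 5))
A = np.cov(X, rowvar=False); B = np.cov(Y, rowvar=False)
print(f"d(A, B) = {covariance_distance(A, B):.3f}")
print(f"d(A, A) = {covariance_distance(A, A):.3f}")   # ~ 0
```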

  8. Seismic and Restoration Assessment of Monumental Masonry Structures

    PubMed Central

    Asteris, Panagiotis G.; Douvika, Maria G.; Apostolopoulou, Maria; Moropoulou, Antonia

    2017-01-01

    Masonry structures are complex systems that require detailed knowledge and information regarding their response under seismic excitations. Appropriate modelling of a masonry structure is a prerequisite for a reliable earthquake-resistant design and/or assessment. However, modelling a real structure with a robust quantitative (mathematical) representation is a very difficult, complex and computationally-demanding task. The paper herein presents a new stochastic computational framework for earthquake-resistant design of masonry structural systems. The proposed framework is based on the probabilistic behavior of crucial parameters, such as material strength and seismic characteristics, and utilizes fragility analysis based on different failure criteria for the masonry material. The application of the proposed methodology is illustrated in the case of a historical and monumental masonry structure, namely the assessment of the seismic vulnerability of the Kaisariani Monastery, a byzantine church that was built in Athens, Greece, at the end of the 11th to the beginning of the 12th century. Useful conclusions are drawn regarding the effectiveness of the intervention techniques used for the reduction of the vulnerability of the case-study structure, by means of comparison of the results obtained. PMID:28767073

  9. Seismic and Restoration Assessment of Monumental Masonry Structures.

    PubMed

    Asteris, Panagiotis G; Douvika, Maria G; Apostolopoulou, Maria; Moropoulou, Antonia

    2017-08-02

    Masonry structures are complex systems that require detailed knowledge and information regarding their response under seismic excitations. Appropriate modelling of a masonry structure is a prerequisite for a reliable earthquake-resistant design and/or assessment. However, modelling a real structure with a robust quantitative (mathematical) representation is a very difficult, complex and computationally-demanding task. The paper herein presents a new stochastic computational framework for earthquake-resistant design of masonry structural systems. The proposed framework is based on the probabilistic behavior of crucial parameters, such as material strength and seismic characteristics, and utilizes fragility analysis based on different failure criteria for the masonry material. The application of the proposed methodology is illustrated in the case of a historical and monumental masonry structure, namely the assessment of the seismic vulnerability of the Kaisariani Monastery, a byzantine church that was built in Athens, Greece, at the end of the 11th to the beginning of the 12th century. Useful conclusions are drawn regarding the effectiveness of the intervention techniques used for the reduction of the vulnerability of the case-study structure, by means of comparison of the results obtained.
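
    Fragility analysis of the kind used in this study is commonly built on lognormal fragility functions of the generic form below (θ is the median capacity and β the lognormal dispersion; this is the standard form, not necessarily the authors' exact parameterization):

```latex
% Generic lognormal fragility function: probability of reaching a damage
% state DS given an intensity measure IM, with \Phi the standard normal CDF.
\[
  P\bigl(\mathrm{DS} \ge ds \mid \mathrm{IM} = x\bigr)
  \;=\; \Phi\!\left( \frac{\ln (x/\theta)}{\beta} \right).
\]
```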

  10. Combining Bayesian Networks and Agent Based Modeling to develop a decision-support model in Vietnam

    NASA Astrophysics Data System (ADS)

    Nong, Bao Anh; Ertsen, Maurits; Schoups, Gerrit

    2016-04-01

    Complexity and uncertainty in natural resources management have been focus themes in recent years. Within these debates, with the aim of defining an approach feasible for water management practice, we are developing an integrated conceptual modeling framework for simulating the decision-making processes of citizens, in our case in the Day river area, Vietnam. The model combines Bayesian Networks (BNs) and Agent-Based Modeling (ABM). BNs are able to combine both qualitative data from consultants / experts / stakeholders, and quantitative data from observations of different phenomena or outcomes from other models. Further strengths of BNs are that the relationships between variables in the system are presented in a graphical interface, and that components of uncertainty are explicitly related to their probabilistic dependencies. A disadvantage is that BNs cannot easily identify the feedback of agents in the system once changes appear. Hence, ABM was adopted to represent the reactions of stakeholders to changes. The modeling framework is developed as an attempt to gain a better understanding of citizens' behavior and the factors influencing their decisions, in order to reduce uncertainty in the implementation of water management policy.

  11. A spectral approach for the quantitative description of cardiac collagen network from nonlinear optical imaging.

    PubMed

    Masè, Michela; Cristoforetti, Alessandro; Avogaro, Laura; Tessarolo, Francesco; Piccoli, Federico; Caola, Iole; Pederzolli, Carlo; Graffigna, Angelo; Ravelli, Flavia

    2015-01-01

    The assessment of collagen structure in cardiac pathology, such as atrial fibrillation (AF), is essential for a complete understanding of the disease. This paper introduces a novel methodology for the quantitative description of collagen network properties, based on the combination of nonlinear optical microscopy with a spectral approach of image processing and analysis. Second-harmonic generation (SHG) microscopy was applied to atrial tissue samples from cardiac surgery patients, providing label-free, selective visualization of the collagen structure. The spectral analysis framework, based on 2D-FFT, was applied to the SHG images, yielding a multiparametric description of collagen fiber orientation (angle and anisotropy indexes) and texture scale (dominant wavelength and peak dispersion indexes). The proof-of-concept application of the methodology showed the capability of our approach to detect and quantify differences in the structural properties of the collagen network in AF versus sinus rhythm patients. These results suggest the potential of our approach in the assessment of collagen properties in cardiac pathologies related to a fibrotic structural component.
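
    The spectral (2D-FFT) orientation analysis can be sketched as follows: the power spectrum of an oriented texture concentrates along a direction perpendicular to the fibers, so an energy-weighted spectral angle recovers the dominant orientation. The synthetic stripe image below is an illustrative stand-in for an SHG acquisition:

```python
# Spectral orientation analysis via the 2D FFT power spectrum of a
# synthetic oriented texture (stripes at 30 degrees).
import numpy as np

yy, xx = np.mgrid[0:256, 0:256]
theta = np.deg2rad(30)
img = np.sin(2 * np.pi * (xx * np.cos(theta) + yy * np.sin(theta)) / 8.0)

F = np.fft.fftshift(np.abs(np.fft.fft2(img)) ** 2)
F[128, 128] = 0                                   # drop the DC term
ky, kx = np.mgrid[-128:128, -128:128]
angles = np.arctan2(ky, kx)

# Energy-weighted mean orientation via the doubled-angle (axial) trick,
# which makes the two symmetric spectral peaks reinforce each other.
c = np.sum(F * np.cos(2 * angles)); s = np.sum(F * np.sin(2 * angles))
spectral_angle = 0.5 * np.degrees(np.arctan2(s, c)) % 180
print(f"dominant spectral orientation ~ {spectral_angle:.1f} degrees")
```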

  12. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling, and similarity evaluation. The performance of the SCCS technique was tested using worked solutions for solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation, and a high degree of agreement with manual scores, with small average absolute and mixed errors.
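
    The multiset flavor of such a comparison might look like the following sketch, in which each response is tokenized and compared as a multiset of tokens; the tokenizer and scoring rule here are simplified assumptions, not the published SCCS technique.

    ```python
    # Hedged sketch of a multiset-based structural similarity check: each
    # equation is tokenized and compared as a multiset (Counter) of tokens.
    from collections import Counter
    import re

    def tokenize(expr: str) -> Counter:
        # Split into numbers, variables, and operators, e.g. "2x+3" -> 2, x, +, 3
        return Counter(re.findall(r"\d+|[a-zA-Z]+|[+\-*/=()]", expr.replace(" ", "")))

    def multiset_similarity(a: str, b: str) -> float:
        ta, tb = tokenize(a), tokenize(b)
        overlap = sum((ta & tb).values())             # multiset intersection size
        total = max(sum(ta.values()), sum(tb.values()))
        return overlap / total if total else 1.0

    print(multiset_similarity("2x + 3 = 7", "2x = 7 - 3"))   # shared tokens, reordered step
    print(multiset_similarity("2x = 4", "x = 2"))            # fewer shared tokens
    ```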

  13. Evaluating Landscape Options for Corridor Restoration between Giant Panda Reserves

    PubMed Central

    Wang, Fang; McShea, William J.; Wang, Dajun; Li, Sheng; Zhao, Qing; Wang, Hao; Lu, Zhi

    2014-01-01

    The establishment of corridors can offset the negative effects of habitat fragmentation by connecting isolated habitat patches. However, the practical value of corridor planning is minimal if corridor identification is not based on reliable quantitative information about species-environment relationships. An example of this need for quantitative information is planning for giant panda conservation. Although the species has been the focus of intense conservation efforts for decades, most corridor projects remain hypothetical due to the lack of reliable quantitative research at an appropriate spatial scale. In this paper, we evaluated a framework for giant panda forest corridor planning. We linked our field survey data with satellite imagery, and conducted species occupancy modelling to examine the habitat use of giant pandas within the potential corridor area. We then applied least-cost and circuit models to identify potential dispersal paths across the landscape, and compared the predicted cost under current conditions and under alternative conservation management options considered during corridor planning. We found that, due to the giant panda's association with areas of low elevation and flat terrain, human infrastructure in the same areas has resulted in corridor fragmentation. We then identified areas with high potential to function as movement corridors, and our analysis of alternative conservation scenarios showed that both forest/bamboo restoration and automobile tunnel construction would significantly improve corridor effectiveness, while residence relocation would not significantly improve corridor effectiveness in comparison with the current condition. The framework has general value for any conservation activity that aims to improve habitat connectivity in human-modified landscapes. Specifically, our study suggested that, in this landscape, automobile tunnels are the best means to remove current barriers to giant panda movement caused by anthropogenic interference. PMID:25133757
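
    The least-cost component of such corridor analyses can be illustrated with a small sketch over a synthetic resistance raster; the raster values, barrier, and tunnel below are hypothetical, and the study's actual GIS workflow is not reproduced.

    ```python
    # Illustrative least-cost corridor sketch: find the cheapest dispersal
    # path between two habitat patches across a landscape resistance raster.
    import numpy as np
    from skimage.graph import route_through_array

    # Hypothetical resistance surface: 1 = easy terrain, high values = barriers
    resistance = np.ones((50, 50))
    resistance[:, 25] = 100.0        # a road acting as a barrier
    resistance[10:15, 25] = 5.0      # a tunnel/underpass lowering crossing cost

    start, end = (25, 0), (25, 49)   # two habitat patches to connect
    path, cost = route_through_array(resistance, start, end,
                                     fully_connected=True, geometric=True)
    print(f"least-cost path length: {len(path)} cells, accumulated cost: {cost:.1f}")
    ```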

  14. Quantitative Live Imaging of Human Embryonic Stem Cell Derived Neural Rosettes Reveals Structure-Function Dynamics Coupled to Cortical Development.

    PubMed

    Ziv, Omer; Zaritsky, Assaf; Yaffe, Yakey; Mutukula, Naresh; Edri, Reuven; Elkabetz, Yechiel

    2015-10-01

    Neural stem cells (NSCs) are progenitor cells for brain development, where cellular spatial composition (cytoarchitecture) and dynamics are hypothesized to be linked to critical NSC capabilities. However, understanding the cytoarchitectural dynamics of this process has been limited by the difficulty of quantitatively imaging brain development in vivo. Here, we study NSC dynamics within neural rosettes--highly organized multicellular structures derived from human pluripotent stem cells. Neural rosettes contain NSCs with strong epithelial polarity and are expected to perform apical-basal interkinetic nuclear migration (INM)--a hallmark of cortical radial glial cell development. We developed a quantitative live imaging framework to characterize INM dynamics within rosettes. We first show that the tendency of cells to follow the INM orientation--a phenomenon we refer to as radial organization--is associated with rosette size, presumably via mechanical constraints of the confining structure. Second, early-forming rosettes, which are rich in founder NSCs and correspond to the early proliferative developing cortex, show fast motions and enhanced radial organization. In contrast, later-derived rosettes, which are characterized by reduced NSC capacity and elevated numbers of differentiated neurons, and thus correspond to the neurogenic mode of the developing cortex, exhibit slower motions and decreased radial organization. Third, later-derived rosettes are characterized by temporal instability in INM measures, in agreement with a progressive loss of rosette integrity at later developmental stages. Finally, molecular perturbation of INM by inhibition of actin or non-muscle myosin-II (NMII) reduced INM measures. Our framework enables quantification of NSC cytoarchitectural dynamics and may have implications for functional molecular studies, drug screening, and iPS cell-based platforms for disease modeling.

  15. Evaluating landscape options for corridor restoration between giant panda reserves.

    PubMed

    Wang, Fang; McShea, William J; Wang, Dajun; Li, Sheng; Zhao, Qing; Wang, Hao; Lu, Zhi

    2014-01-01

    The establishment of corridors can offset the negative effects of habitat fragmentation by connecting isolated habitat patches. However, the practical value of corridor planning is minimal if corridor identification is not based on reliable quantitative information about species-environment relationships. An example of this need for quantitative information is planning for giant panda conservation. Although the species has been the focus of intense conservation efforts for decades, most corridor projects remain hypothetical due to the lack of reliable quantitative research at an appropriate spatial scale. In this paper, we evaluated a framework for giant panda forest corridor planning. We linked our field survey data with satellite imagery, and conducted species occupancy modelling to examine the habitat use of giant pandas within the potential corridor area. We then applied least-cost and circuit models to identify potential dispersal paths across the landscape, and compared the predicted cost under current conditions and under alternative conservation management options considered during corridor planning. We found that, due to the giant panda's association with areas of low elevation and flat terrain, human infrastructure in the same areas has resulted in corridor fragmentation. We then identified areas with high potential to function as movement corridors, and our analysis of alternative conservation scenarios showed that both forest/bamboo restoration and automobile tunnel construction would significantly improve corridor effectiveness, while residence relocation would not significantly improve corridor effectiveness in comparison with the current condition. The framework has general value for any conservation activity that aims to improve habitat connectivity in human-modified landscapes. Specifically, our study suggested that, in this landscape, automobile tunnels are the best means to remove current barriers to giant panda movement caused by anthropogenic interference.

  16. Quantitative Image Feature Engine (QIFE): an Open-Source, Modular Engine for 3D Quantitative Feature Extraction from Volumetric Medical Images.

    PubMed

    Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy

    2017-10-06

    The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code, posted to GitHub, and (2) a compiled version loaded in a Docker container, posted to Docker Hub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h:mm) using one core, and in 1:04 (h:mm) using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.
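
    The swappable-component design the abstract describes can be sketched as follows; the stage names and interfaces are hypothetical illustrations of the pattern, not the actual QIFE API.

    ```python
    # Minimal swappable-stage pipeline sketch: each stage is a pluggable
    # callable, so readers or feature extractors can be replaced at run time
    # without touching the managing framework.
    from typing import Callable, List

    Stage = Callable[[dict], dict]

    def run_pipeline(record: dict, stages: List[Stage]) -> dict:
        for stage in stages:          # managing framework: run stages in order
            record = stage(record)
        return record

    # Swappable components (toy input, pre-processing, feature, output stages)
    def load_volume(rec):   rec["volume"] = [[1, 2], [3, 4]]; return rec
    def normalize(rec):     rec["volume"] = [[v / 4 for v in row] for row in rec["volume"]]; return rec
    def mean_feature(rec):  vals = [v for row in rec["volume"] for v in row]; rec["features"] = {"mean": sum(vals) / len(vals)}; return rec
    def print_output(rec):  print(rec["features"]); return rec

    run_pipeline({}, [load_volume, normalize, mean_feature, print_output])
    ```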

  17. Sex Differences in Animal Models: Focus on Addiction

    PubMed Central

    Becker, Jill B.

    2016-01-01

    The purpose of this review is to discuss ways to think about and study sex differences in preclinical animal models. We use the framework of addiction, in which animal models have excellent face and construct validity, to illustrate the importance of considering sex differences. There are four types of sex differences: qualitative, quantitative, population, and mechanistic. A better understanding of the ways males and females can differ will help scientists design experiments to better characterize the presence or absence of sex differences in new phenomena that they are investigating. We have outlined major quantitative, population, and mechanistic sex differences in the addiction domain using a heuristic framework of the three established stages of the addiction cycle: binge/intoxication, withdrawal/negative affect, and preoccupation/anticipation. Female rats, in general, acquire the self-administration of drugs and alcohol more rapidly, escalate their drug taking with extended access more rapidly, show more motivational withdrawal, and (where tested in animal models of “craving”) show greater reinstatement. The one exception is that female rats show less motivational withdrawal to alcohol. The bases for these quantitative sex differences appear to be both organizational, in that estradiol-treated neonatal animals show the male phenotype, and activational, in that the female phenotype depends on the effects of gonadal hormones. In animals, differences within the estrous cycle can be observed but are relatively minor. Such hormonal effects seem to be most prevalent during the acquisition of drug taking and less influential once compulsive drug taking is established, and are linked largely to progesterone and estradiol. This review emphasizes not only significant differences in the phenotypes of females and males in the domain of addiction but also the paucity of data to date in our understanding of those differences. PMID:26772794

  18. Toward a Mixed-Methods Research Approach to Content Analysis in The Digital Age: The Combined Content-Analysis Model and its Applications to Health Care Twitter Feeds.

    PubMed

    Hamad, Eradah O; Savundranayagam, Marie Y; Holmes, Jeffrey D; Kinsella, Elizabeth Anne; Johnson, Andrew M

    2016-03-08

    Twitter's 140-character microblog posts are increasingly used to access information and facilitate discussions among health care professionals and between patients with chronic conditions and their caregivers. Recently, efforts have emerged to investigate the content of health care-related posts on Twitter. This marks a new area for researchers to investigate and apply content analysis (CA). In current infodemiology, infoveillance and digital disease detection research initiatives, quantitative and qualitative Twitter data are often combined, and there are no clear guidelines for researchers to follow when collecting and evaluating Twitter-driven content. The aim of this study was to identify studies on health care and social media that used Twitter feeds as a primary data source and CA as an analysis technique. We evaluated the resulting 18 studies based on a narrative review of previous methodological studies and textbooks to determine the criteria and main features of quantitative and qualitative CA. We then used the key features of CA and mixed-methods research designs to propose the combined content-analysis (CCA) model as a solid research framework for designing, conducting, and evaluating investigations of Twitter-driven content. We conducted a PubMed search to collect studies published between 2010 and 2014 that used CA to analyze health care-related tweets. The PubMed search and reference list checks of selected papers identified 21 papers. We excluded 3 papers and further analyzed 18. Results suggest that the methods used in these studies were not purely quantitative or qualitative, and the mixed-methods design was not explicitly chosen for data collection and analysis. A solid research framework is needed for researchers who intend to analyze Twitter data through the use of CA. We propose the CCA model as a useful framework that provides a straightforward approach to guide Twitter-driven studies and that adds rigor to health care social media investigations. We provide suggestions for the use of the CCA model in elder care-related contexts.

  19. Toward a Mixed-Methods Research Approach to Content Analysis in The Digital Age: The Combined Content-Analysis Model and its Applications to Health Care Twitter Feeds

    PubMed Central

    Hamad, Eradah O; Savundranayagam, Marie Y; Holmes, Jeffrey D; Kinsella, Elizabeth Anne

    2016-01-01

    Background: Twitter’s 140-character microblog posts are increasingly used to access information and facilitate discussions among health care professionals and between patients with chronic conditions and their caregivers. Recently, efforts have emerged to investigate the content of health care-related posts on Twitter. This marks a new area for researchers to investigate and apply content analysis (CA). In current infodemiology, infoveillance, and digital disease detection research initiatives, quantitative and qualitative Twitter data are often combined, and there are no clear guidelines for researchers to follow when collecting and evaluating Twitter-driven content. Objective: The aim of this study was to identify studies on health care and social media that used Twitter feeds as a primary data source and CA as an analysis technique. We evaluated the resulting 18 studies based on a narrative review of previous methodological studies and textbooks to determine the criteria and main features of quantitative and qualitative CA. We then used the key features of CA and mixed-methods research designs to propose the combined content-analysis (CCA) model as a solid research framework for designing, conducting, and evaluating investigations of Twitter-driven content. Methods: We conducted a PubMed search to collect studies published between 2010 and 2014 that used CA to analyze health care-related tweets. The PubMed search and reference list checks of selected papers identified 21 papers. We excluded 3 papers and further analyzed 18. Results: The results suggest that the methods used in these studies were not purely quantitative or qualitative, and that the mixed-methods design was not explicitly chosen for data collection and analysis. A solid research framework is needed for researchers who intend to analyze Twitter data through the use of CA. Conclusions: We propose the CCA model as a useful framework that provides a straightforward approach to guide Twitter-driven studies and that adds rigor to health care social media investigations. We provide suggestions for the use of the CCA model in elder care-related contexts. PMID:26957477

  20. Control of Cattle Ticks and Tick-Borne Diseases by Acaricide in Southern Province of Zambia: A Retrospective Evaluation of Animal Health Measures According to Current One Health Concepts.

    PubMed

    Laing, Gabrielle; Aragrande, Maurizio; Canali, Massimo; Savic, Sara; De Meneghi, Daniele

    2018-01-01

    One Health thinking for health interventions is increasingly being used to capture previously unseen stakeholders and impacts across people, animals, and the environment. The Network for One Health Evaluation (NEOH) proposes a systems-based framework to quantitatively assess integration and highlight the added value (theory of change) that this approach brings to a project. This case study retrospectively evaluates the pioneering use of a One Health (OH) approach during an international collaboration (a satellite project to tackle production losses due to tick-borne disease in cattle in Southern Zambia in the late 1980s). The objective of the evaluation is twofold: a retrospective evaluation of the OH-ness of the satellite project, and an identification of its costs and benefits. Data for the evaluation were recovered from publications, project documents, and witness interviews. A mixed qualitative and quantitative evaluation was undertaken. In this case study, a transdisciplinary approach allowed for the identification of a serious public health risk arising from the unexpected reuse of chemical containers by the local public against advice. Had this pioneering project not been completed, it is assumed that this behavior could have had a large impact on public wellbeing and ultimately reduced regional productivity and compromised welfare. From the economic evaluation, the costs of implementing this OH approach, helping to avoid harm, were small in comparison to overall project costs. The overall OH Index was 0.34. The satellite project demonstrated good OH operations by managing to incorporate input across multiple dimensions but was slightly weaker on OH infrastructures (OH Ratio = 1.20). These quantitative results can be used in the initial validation and benchmarking of this novel framework. The limitations of the evaluation were mainly a lack of data, due to the length of time since project completion, and a lack of formal monitoring of program impact. In future health strategy development and execution, routine monitoring and evaluation from an OH perspective (utilizing the framework proposed by NEOH) could prove valuable, and the framework could also serve as a tool for retrospective evaluation of existing policies.

  1. Quantitative Analysis of Representations of Nature of Science in Nordic Upper Secondary School Textbooks Using Framework of Analysis Based on Philosophy of Chemistry

    NASA Astrophysics Data System (ADS)

    Vesterinen, Veli-Matti; Aksela, Maija; Lavonen, Jari

    2013-07-01

    The aim of this study was to assess how the different aspects of nature of science (NOS) were represented in Finnish and Swedish upper secondary school chemistry textbooks. The dimensions of NOS were analyzed from five popular chemistry textbook series. The study provides a quantitative method for the analysis of representations of NOS in chemistry textbooks, informed by domain-specific research on the philosophy of chemistry and chemical education. The selection of sections analyzed was based on the four themes of scientific literacy: knowledge of science, investigative nature of science, science as a way of thinking, and interaction of science, technology, and society. For the second round of analysis, the theme of science as a way of thinking was chosen for closer inspection. The units of analysis in this theme were analyzed using seven domain-specific dimensions of NOS: tentative, empirical, model-based, inferential, technological products, instrumentation, and social and societal dimensions. Based on the inter-rater agreement, the procedure and frameworks of analysis presented in this study were a reliable way of assessing the emphasis given to the domain-specific aspects of NOS. All textbooks place little emphasis on the theme of science as a way of thinking as a whole. In line with the differences between the curricula, Swedish textbooks emphasize the tentative dimension of NOS more than Finnish textbooks do. To provide teachers with a sufficiently wide variety of examples for discussing the different dimensions of NOS, changes to the national core curricula are needed. Although changing the emphasis of the curricula would be the most obvious way to affect the emphasis of the textbooks, other efforts, such as pre- and in-service courses for developing teachers' understanding of NOS and of pedagogic approaches for bringing NOS instruction into their classroom practice, might also be needed.

  2. Software Engineering Research/Developer Collaborations (C104)

    NASA Technical Reports Server (NTRS)

    Shell, Elaine; Shull, Forrest

    2005-01-01

    The goal of this collaboration was to produce Flight Software Branch (FSB) process standards for software inspections which could be used across three new missions within the FSB. The standard was developed by Dr. Forrest Shull (Fraunhofer Center for Experimental Software Engineering, Maryland) using the Perspective-Based Inspection (PBI) approach (PBI research has been funded by SARP), and then tested on a pilot Branch project. Because the short time scale of the collaboration ruled out a quantitative evaluation, the suitability of the standard for roll-out to other Branch projects would be decided based on a qualitative measure: whether the standard received high ratings from Branch personnel as to usability and overall satisfaction. The project used for piloting the Perspective-Based Inspection approach was a multi-mission framework designed for reuse. This was a good choice because key representatives from the three new missions would be involved in the inspections. The perspective-based approach was applied to produce inspection procedures tailored to the specific quality needs of the Branch. The technical information to do so was largely drawn from a series of interviews with Branch personnel. The framework team used the procedures to review requirements. The inspections were useful for indicating that a restructuring of the requirements document was needed, which led to changes in the development project plan. The standard was sent out to other Branch personnel for review, and Branch personnel were very positive. However, important changes were identified because the perspective of Attitude Control System (ACS) developers had not been adequately represented, a result of the specific personnel interviewed. The net result is that, with some further work to incorporate the ACS perspective, and in synchrony with the roll-out of independent Branch standards, the PBI approach will be implemented in the FSB. The project also intends to continue its collaboration with the technology provider (Dr. Forrest Shull) past the end of the grant, to allow a more rigorous quantitative evaluation.

  3. Evaluating Academic Scientists Collaborating in Team-Based Research: A Proposed Framework.

    PubMed

    Mazumdar, Madhu; Messinger, Shari; Finkelstein, Dianne M; Goldberg, Judith D; Lindsell, Christopher J; Morton, Sally C; Pollock, Brad H; Rahbar, Mohammad H; Welty, Leah J; Parker, Robert A

    2015-10-01

    Criteria for evaluating faculty are traditionally based on a triad of scholarship, teaching, and service. Research scholarship is often measured by first or senior authorship on peer-reviewed scientific publications and being principal investigator on extramural grants. Yet scientific innovation increasingly requires collective rather than individual creativity, which traditional measures of achievement were not designed to capture and, thus, devalue. The authors propose a simple, flexible framework for evaluating team scientists that includes both quantitative and qualitative assessments. An approach for documenting contributions of team scientists in team-based scholarship, nontraditional education, and specialized service activities is also outlined. Although biostatisticians are used for illustration, the approach is generalizable to team scientists in other disciplines. The authors offer three key recommendations to members of institutional promotion committees, department chairs, and others evaluating team scientists. First, contributions to team-based scholarship and specialized contributions to education and service need to be assessed and given appropriate and substantial weight. Second, evaluations must be founded on well-articulated criteria for assessing the stature and accomplishments of team scientists. Finally, mechanisms for collecting evaluative data must be developed and implemented at the institutional level. Without these three essentials, contributions of team scientists will continue to be undervalued in the academic environment.

  4. Automatic segmentation of right ventricle on ultrasound images using sparse matrix transform and level set

    NASA Astrophysics Data System (ADS)

    Qin, Xulei; Cong, Zhibin; Halig, Luma V.; Fei, Baowei

    2013-03-01

    An automatic framework is proposed to segment the right ventricle on ultrasound images. This method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining a sparse matrix transform (SMT), a training model, and a localized region-based level set. First, the sparse matrix transform extracts the main motion regions of the myocardium as eigenimages by analyzing the statistical information of these images. Second, a training model of the right ventricle is registered to the extracted eigenimages in order to automatically detect the main location of the right ventricle and the corresponding transform relationship between the training model and the SMT-extracted results in the series. Third, the training model is then adjusted as an adapted initialization for the segmentation of each image in the series. Finally, based on the adapted initializations, a localized region-based level set algorithm is applied to segment both epicardial and endocardial boundaries of the right ventricle from the whole series. Experimental results from real subject data validated the performance of the proposed framework in segmenting the right ventricle from echocardiography. The mean Dice scores for the epicardial and endocardial boundaries are 89.1%+/-2.3% and 83.6%+/-7.3%, respectively. The automatic segmentation method based on the sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging.
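
    For a flavor of the region-based level-set step, here is a hedged sketch using scikit-image's morphological Chan-Vese variant on a synthetic image; the paper's localized formulation and its SMT-derived initialization are not reproduced here.

    ```python
    # Region-based level-set sketch (morphological Chan-Vese, not the paper's
    # localized variant) applied to a synthetic "ultrasound-like" image.
    import numpy as np
    from skimage.segmentation import morphological_chan_vese

    # Synthetic image: a bright ring (stand-in for myocardium) plus noise
    y, x = np.mgrid[0:128, 0:128]
    r = np.hypot(y - 64, x - 64)
    img = np.exp(-((r - 30) ** 2) / 50) + 0.3 * np.random.rand(128, 128)

    # 35 iterations from a checkerboard initialization; in the paper's
    # pipeline the initialization would come from the registered training model
    mask = morphological_chan_vese(img, 35, init_level_set="checkerboard", smoothing=2)
    print(f"segmented region: {mask.sum()} of {mask.size} pixels")
    ```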

  5. Securing Real-Time Sessions in an IMS-Based Architecture

    NASA Astrophysics Data System (ADS)

    Cennamo, Paolo; Fresa, Antonio; Longo, Maurizio; Postiglione, Fabio; Robustelli, Anton Luca; Toro, Francesco

    The emerging all-IP mobile network infrastructures based on the 3rd Generation IP Multimedia Subsystem philosophy are characterised by radio access technology independence and ubiquitous connectivity for mobile users. Currently, great focus is being devoted to security issues, since most of the security threats presently affecting the public Internet domain, as well as emerging ones, will also be suffered by mobile users in the years to come. While a great deal of research activity, together with standardisation efforts and experimentation, is devoted to mechanisms for signalling protection, very few integrated frameworks for real-time multimedia data protection have been proposed in the context of the IP Multimedia Subsystem, and even fewer experimental results based on testbeds are available. In this paper, after a general overview of the security issues arising in an advanced IP Multimedia Subsystem scenario, a comprehensive infrastructure for real-time multimedia data protection, based on the adoption of the Secure Real-time Transport Protocol (SRTP), is proposed; then, the development of a testbed incorporating such functionalities, including mechanisms for key management and cryptographic context transfer, and allowing the setup of SRTP sessions, is presented; finally, experimental results are provided together with quantitative assessments and comparisons of system performance for audio sessions with and without the adoption of the SRTP framework.

  6. Texture analysis of ultrahigh field T2*-weighted MR images of the brain: application to Huntington's disease.

    PubMed

    Doan, Nhat Trung; van den Bogaard, Simon J A; Dumas, Eve M; Webb, Andrew G; van Buchem, Mark A; Roos, Raymund A C; van der Grond, Jeroen; Reiber, Johan H C; Milles, Julien

    2014-03-01

    To develop a framework for quantitative detection of between-group textural differences in ultrahigh field T2*-weighted MR images of the brain. MR images were acquired using a three-dimensional (3D) T2*-weighted gradient echo sequence on a 7 Tesla MRI system. The phase images were high-pass filtered to remove phase wraps. Thirteen textural features were computed for both the magnitude and phase images of a region of interest based on a 3D Gray-Level Co-occurrence Matrix (GLCM), and subsequently evaluated to detect between-group differences using a Mann-Whitney U-test. We applied the framework to study textural differences in subcortical structures between premanifest Huntington's disease (HD) patients, manifest HD patients, and controls. In premanifest HD, four phase-based features showed a difference in the caudate nucleus. In manifest HD, 7 magnitude-based features showed a difference in the pallidum, 6 phase-based features in the caudate nucleus, and 10 phase-based features in the putamen. After multiple-comparison correction, significant differences were shown in the putamen in manifest HD by two phase-based features (both adjusted P values = 0.04). This study provides the first evidence of textural heterogeneity of subcortical structures in HD. Texture analysis of ultrahigh field T2*-weighted MR images can be useful for noninvasive monitoring of neurodegenerative diseases. Copyright © 2013 Wiley Periodicals, Inc.
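
    A 2D analogue of the GLCM features used above can be sketched with scikit-image; the study computed a 3D GLCM on 7T data, so this simplified 2D version only illustrates the feature family on a placeholder region of interest.

    ```python
    # 2D GLCM texture sketch: co-occurrence matrix over 4 directions at
    # distance 1, followed by Haralick-style properties averaged over angles.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(0)
    roi = rng.integers(0, 32, size=(64, 64), dtype=np.uint8)   # placeholder ROI

    glcm = graycomatrix(roi, distances=[1],
                        angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                        levels=32, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        print(prop, graycoprops(glcm, prop).mean())   # average over directions
    ```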

  7. Characterization and prediction of chemical functions and weight fractions in consumer products.

    PubMed

    Isaacs, Kristin K; Goldsmith, Michael-Rock; Egeghy, Peter; Phillips, Katherine; Brooks, Raina; Hong, Tao; Wambaugh, John F

    2016-01-01

    Assessing exposures from the thousands of chemicals in commerce requires quantitative information on the chemical constituents of consumer products. Unfortunately, gaps in available composition data prevent assessment of exposure to chemicals in many products. Here we propose filling these gaps via consideration of chemical functional role. We obtained function information for thousands of chemicals from public sources and used a clustering algorithm to assign chemicals into 35 harmonized function categories (e.g., plasticizers, antimicrobials, solvents). We combined these functions with weight fraction data for 4115 personal care products (PCPs) to characterize the composition of 66 different product categories (e.g., shampoos). We analyzed the combined weight fraction/function dataset using machine learning techniques to develop quantitative structure property relationship (QSPR) classifier models for 22 functions and for weight fraction, based on chemical-specific descriptors (including chemical properties). We applied these classifier models to a library of 10196 data-poor chemicals. Our predictions of chemical function and composition will inform exposure-based screening of chemicals in PCPs for combination with hazard data in risk-based evaluation frameworks. As new information becomes available, this approach can be applied to other classes of products and the chemicals they contain in order to provide essential consumer product data for use in exposure-based chemical prioritization.
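
    A QSPR-style classifier of the kind described can be sketched as follows; the descriptor matrix and function labels are random placeholders, not the study's curated dataset.

    ```python
    # Illustrative QSPR-style classifier: predict a chemical's functional role
    # from numeric structure/property descriptors (all data synthetic).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(1)
    # Placeholder descriptor matrix: e.g. logP, molecular weight, TPSA, charge
    X = rng.normal(size=(500, 4))
    # Placeholder function labels: 0 = solvent, 1 = plasticizer, 2 = antimicrobial
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int) + (X[:, 2] > 1).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    ```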

  8. Automated Video Based Facial Expression Analysis of Neuropsychiatric Disorders

    PubMed Central

    Wang, Peng; Barrett, Frederick; Martin, Elizabeth; Milanova, Marina; Gur, Raquel E.; Gur, Ruben C.; Kohler, Christian; Verma, Ragini

    2008-01-01

    Deficits in emotional expression are prominent in several neuropsychiatric disorders, including schizophrenia. Available clinical facial expression evaluations provide subjective and qualitative measurements, which are based on static 2D images that do not capture the temporal dynamics and subtleties of expression changes. Therefore, there is a need for automated, objective and quantitative measurements of facial expressions captured using videos. This paper presents a computational framework that creates probabilistic expression profiles for video data and can potentially help to automatically quantify emotional expression differences between patients with neuropsychiatric disorders and healthy controls. Our method automatically detects and tracks facial landmarks in videos, and then extracts geometric features to characterize facial expression changes. To analyze temporal facial expression changes, we employ probabilistic classifiers that analyze facial expressions in individual frames, and then propagate the probabilities throughout the video to capture the temporal characteristics of facial expressions. The applications of our method to healthy controls and case studies of patients with schizophrenia and Asperger’s syndrome demonstrate the capability of the video-based expression analysis method in capturing subtleties of facial expression. Such results can pave the way for a video-based method for quantitative analysis of facial expressions in clinical research of disorders that cause affective deficits. PMID:18045693

  9. When is enough evidence enough? - Using systematic decision analysis and value-of-information analysis to determine the need for further evidence.

    PubMed

    Siebert, Uwe; Rochau, Ursula; Claxton, Karl

    2013-01-01

    Decision analysis (DA) and value-of-information (VOI) analysis provide a systematic, quantitative methodological framework that explicitly considers the uncertainty surrounding the currently available evidence to guide healthcare decisions. In medical decision making under uncertainty, there are two fundamental questions: 1) What decision should be made now, given the best available evidence (and its uncertainty)? 2) Subsequent to the current decision, and given the magnitude of the remaining uncertainty, should we gather further evidence (i.e., perform additional studies), and if yes, which studies should be undertaken (e.g., efficacy, side effects, quality of life, costs), and what sample sizes are needed? Using the best currently available evidence, VOI analysis focuses on the likelihood of making a wrong decision if the new intervention is adopted. The value of performing further studies and gathering additional evidence is based on the extent to which the additional information will reduce this uncertainty. A quantitative framework allows for the valuation of the additional information that is generated by further research, and considers the decision maker's objectives and resource constraints. Claxton summarises: "Value of information analysis can be used to inform a range of policy questions including whether a new technology should be approved based on existing evidence, whether it should be approved but additional research conducted or whether approval should be withheld until the additional evidence becomes available." [Claxton K. Value of information entry in Encyclopaedia of Health Economics, Elsevier, forthcoming 2014.] The purpose of this tutorial is to introduce the framework of systematic VOI analysis to guide further research. In our tutorial article, we explain the theoretical foundations and practical methods of decision analysis and value-of-information analysis. To illustrate, we use a simple case example of a foot ulcer (e.g., with diabetes) as well as key references from the literature, including examples of the use of the decision-analytic VOI framework by health technology assessment agencies to guide further research. These concepts may guide stakeholders involved or interested in determining whether and, if so, which additional evidence is needed to make decisions. Copyright © 2013. Published by Elsevier GmbH.
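
    The core VOI quantity, the expected value of perfect information (EVPI), is easy to illustrate numerically: it is the expected net benefit under perfect information minus the expected net benefit of the best decision made now. The numbers below are invented for illustration and are not from the foot-ulcer example.

    ```python
    # Minimal EVPI sketch: EVPI = E[max over actions] - max over actions of E[.]
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000
    # Uncertain incremental effectiveness of a new treatment vs standard care
    theta = rng.normal(loc=0.02, scale=0.05, size=n)     # QALYs gained (uncertain)
    wtp = 30_000                                          # willingness to pay per QALY
    cost_new = 1_000                                      # extra cost of new treatment

    nb_standard = np.zeros(n)                             # net benefit, standard care
    nb_new = wtp * theta - cost_new                       # net benefit, new treatment

    ev_current_decision = max(nb_standard.mean(), nb_new.mean())
    ev_perfect_info = np.maximum(nb_standard, nb_new).mean()
    print(f"EVPI per patient = {ev_perfect_info - ev_current_decision:.0f} monetary units")
    ```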

  10. A serious game for improving the decision making skills and knowledge levels of Turkish football referees according to the laws of the game.

    PubMed

    Gulec, Ulas; Yilmaz, Murat

    2016-01-01

    Digital game-based learning environments provide emerging opportunities to overcome learning barriers by combining newly developed technologies with traditional game design. This study proposes a quantitative research approach, supported by expert validation interviews, to designing a game-based learning framework. The goal is to improve the learning experience and decision-making skills of soccer referees in Turkey. A serious game was developed and tested on a group of referees (N = 54). The assessment results of these referees were compared using two-sample t-tests and the Wilcoxon signed-rank test for both the experimental group and the control group. The findings of the current study confirmed that a game-based learning environment has greater merit than paper-based alternatives.

  11. Incremental Structured Dictionary Learning for Video Sensor-Based Object Tracking

    PubMed Central

    Xue, Ming; Yang, Hua; Zheng, Shibao; Zhou, Yi; Yu, Zhenghua

    2014-01-01

    To tackle robust object tracking for video sensor-based applications, an online discriminative algorithm based on incremental discriminative structured dictionary learning (IDSDL-VT) is presented. In our framework, a discriminative dictionary combining positive, negative, and trivial patches is designed to sparsely represent the overlapped target patches. Then, a local update (LU) strategy is proposed for sparse coefficient learning. To formulate the training and classification process, a multiple linear classifier group based on a K-combined voting (KCV) function is proposed. As the dictionary evolves, the models are also retrained to adapt to target appearance variation in a timely manner. Qualitative and quantitative evaluations on challenging image sequences, compared with state-of-the-art algorithms, demonstrate that the proposed tracking algorithm achieves more favorable performance. We also illustrate its relay application in visual sensor networks. PMID:24549252

  12. Capturing farm diversity with hypothesis-based typologies: An innovative methodological framework for farming system typology development

    PubMed Central

    Alvarez, Stéphanie; Timler, Carl J.; Michalscheck, Mirja; Paas, Wim; Descheemaeker, Katrien; Tittonell, Pablo; Andersson, Jens A.; Groot, Jeroen C. J.

    2018-01-01

    Creating typologies is a way to summarize the large heterogeneity of smallholder farming systems into a few farm types. Various methods exist, commonly using statistical analysis, to create these typologies. We demonstrate that methodological decisions on data collection, variable selection, data reduction, and clustering techniques can have a large impact on the typology results. We illustrate the effects of analysing the diversity from different angles, using different typology objectives and different hypotheses, on typology creation, using an example from Zambia’s Eastern Province. Five separate typologies were created with principal component analysis (PCA) and hierarchical clustering analysis (HCA), based on three different expert-informed hypotheses. The greatest overlap between typologies was observed for the larger, wealthier farm types, but for the remainder of the farms there were no clear overlaps between typologies. Based on these results, we argue that typology development should be guided by a hypothesis on the local agricultural features and on the drivers and mechanisms of differentiation among farming systems, such as biophysical and socio-economic conditions. That hypothesis is based both on the typology objective and on prior expert knowledge and theories of the farm diversity in the study area. We present a methodological framework that aims to integrate participatory and statistical methods for hypothesis-based typology construction. This is an iterative process whereby the results of the statistical analysis are compared with the reality of the target population as hypothesized by the local experts. A well-defined hypothesis, combined with the presented methodological framework, which consolidates the hypothesis through local expert knowledge, supports the development of less subjective and more contextualized quantitative farm typologies. PMID:29763422
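
    The PCA-plus-HCA step of this workflow can be sketched briefly, with random placeholder data rather than the Zambian survey variables.

    ```python
    # Typology sketch: standardize survey variables, reduce with PCA, then
    # cluster the component scores hierarchically into a few farm types.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(3)
    # Placeholder farm survey: herd size, cultivated area, off-farm income, labor
    farms = rng.normal(size=(120, 4)) * [5, 2, 1, 3] + [10, 4, 2, 6]

    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(farms))
    tree = linkage(scores, method="ward")               # HCA on the PC scores
    types = fcluster(tree, t=4, criterion="maxclust")   # cut into 4 farm types
    print("farms per type:", np.bincount(types)[1:])
    ```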

  13. Capturing farm diversity with hypothesis-based typologies: An innovative methodological framework for farming system typology development.

    PubMed

    Alvarez, Stéphanie; Timler, Carl J; Michalscheck, Mirja; Paas, Wim; Descheemaeker, Katrien; Tittonell, Pablo; Andersson, Jens A; Groot, Jeroen C J

    2018-01-01

    Creating typologies is a way to summarize the large heterogeneity of smallholder farming systems into a few farm types. Various methods exist, commonly using statistical analysis, to create these typologies. We demonstrate that methodological decisions on data collection, variable selection, data reduction, and clustering techniques can have a large impact on the typology results. We illustrate the effects of analysing the diversity from different angles, using different typology objectives and different hypotheses, on typology creation, using an example from Zambia's Eastern Province. Five separate typologies were created with principal component analysis (PCA) and hierarchical clustering analysis (HCA), based on three different expert-informed hypotheses. The greatest overlap between typologies was observed for the larger, wealthier farm types, but for the remainder of the farms there were no clear overlaps between typologies. Based on these results, we argue that typology development should be guided by a hypothesis on the local agricultural features and on the drivers and mechanisms of differentiation among farming systems, such as biophysical and socio-economic conditions. That hypothesis is based both on the typology objective and on prior expert knowledge and theories of the farm diversity in the study area. We present a methodological framework that aims to integrate participatory and statistical methods for hypothesis-based typology construction. This is an iterative process whereby the results of the statistical analysis are compared with the reality of the target population as hypothesized by the local experts. A well-defined hypothesis, combined with the presented methodological framework, which consolidates the hypothesis through local expert knowledge, supports the development of less subjective and more contextualized quantitative farm typologies.

  14. Enhancing value of clinical pharmacodynamics in oncology drug development: An alliance between quantitative pharmacology and translational science.

    PubMed

    Venkatakrishnan, K; Ecsedy, J A

    2017-01-01

    Clinical pharmacodynamic evaluation is a key component of the "pharmacologic audit trail" in oncology drug development. We posit that its value can and should be greatly enhanced via application of a robust quantitative pharmacology framework informed by biologically mechanistic considerations. Herein, we illustrate examples of intersectional blind spots across the disciplines of quantitative pharmacology and translational science and offer a roadmap aimed at enhancing the caliber of clinical pharmacodynamic research in the development of oncology therapeutics. © 2016 American Society for Clinical Pharmacology and Therapeutics.

  15. How to improve patient retention in an antiretroviral treatment program in Ethiopia: a mixed-methods study

    PubMed Central

    2014-01-01

    Background: Patient retention, defined as the continuous engagement of patients in care, is one of the crucial indicators for monitoring and evaluating the performance of antiretroviral treatment (ART) programs. Suboptimal patient retention in care has been identified as one of the challenges of ART programs in many settings. ART programs have therefore been striving to identify and implement interventions that improve their suboptimal levels of retention. The objective of this study was to develop a framework for improving patient retention in care, based on interventions implemented in health facilities that have achieved higher levels of retention in care. Methods: A mixed-methods study, based on the positive deviance approach, was conducted in Ethiopia in 2011/12. Quantitative data were collected to estimate and compare the levels of retention in care in nine health facilities. Key informant interviews and focus group discussions were conducted to identify a package of interventions implemented in the health facilities with relatively higher or improving levels of retention. Results: Retention in care in the Ethiopian ART program was found to be variable across health facilities. Among hospitals, the poorest performer had a retention ratio of 0.46 (0.35, 0.60) relative to the reference; among health centers, the poorest performers had a retention ratio of 0.44 (0.28, 0.70) relative to the reference. Health facilities with higher and improving patient retention were found to implement a comprehensive package of interventions: (1) retention-promoting activities by health facilities, (2) retention-promoting activities by community-based organizations, (3) coordination of these activities by case manager(s), and (4) patient information systems run by data clerk(s). In contrast, such interventions were either poorly implemented or absent in health facilities with lower retention in care. A framework to improve retention in care was developed based on the evidence found by applying the positive deviance approach. Conclusion: A framework for improving the retention in care of patients on ART was developed. We recommend that health facilities implement the framework, monitor and evaluate their levels of retention in care, and, if necessary, adapt the framework to their own contexts. PMID:24475889

  16. Mars Observer: Mission toward a basic understanding of Mars

    NASA Technical Reports Server (NTRS)

    Albee, Arden L.

    1992-01-01

    The Mars Observer Mission will provide a spacecraft platform about Mars from which the entire Martian surface and atmosphere will be observed and mapped by remote sensing instruments for at least 1 Martian year. The scientific objectives for the Mission emphasize qualitative and quantitative determination of the elemental and mineralogical composition of the surface; measurement of the global surface topography, gravity field, and magnetic field; and the development of a synoptic data base of climatological conditions. The Mission will provide basic global understanding of Mars as it exists today and will provide a framework for understanding its past.

  17. How measurement science can improve confidence in research results.

    PubMed

    Plant, Anne L; Becker, Chandler A; Hanisch, Robert J; Boisvert, Ronald F; Possolo, Antonio M; Elliott, John T

    2018-04-01

    The current push for rigor and reproducibility is driven by a desire for confidence in research results. Here, we suggest a framework for a systematic process, based on consensus principles of measurement science, to guide researchers and reviewers in assessing, documenting, and mitigating the sources of uncertainty in a study. All study results have associated ambiguities that are not always clarified by simply establishing reproducibility. By explicitly considering sources of uncertainty, noting aspects of the experimental system that are difficult to characterize quantitatively, and proposing alternative interpretations, the researcher provides information that enhances comparability and reproducibility.

  18. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra

    PubMed Central

    2011-01-01

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234

  19. Positivists, Post-Positivists, Post-Structuralists, and Post-Modernists: Why Can't We All Get Along? Towards a Framework for Unifying Research Paradigms.

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.

    Since the latter part of the 19th century, a fervent debate has ensued about quantitative and qualitative research paradigms. From these disputes, purists have emerged on both sides. Quantitative purists express assumptions that are consistent with a positivist philosophy, whereas qualitative purists (i.e., post-positivists, post-structuralists,…

  20. [Medical habilitation in German-speaking countries : Quantitative assessment of content and elaboration of habilitation guidelines].

    PubMed

    Weineck, S B; Koelblinger, D; Kiesslich, T

    2015-04-01

    Habilitation defines the qualification to conduct self-contained university teaching and is the key for access to a professorship at German, Austrian, and Swiss universities. Despite all the changes implemented in the European higher education systems during the Bologna process, it is the highest qualification level issued through the process of a university examination and remains the core concept of scientific careers in these countries. In the field of medicine, this applies not only to scientific staff at the universities but also to those medical doctors aiming at a clinical career track. The aim of this study was to provide a quantitative analysis of the scientific, didactic, and procedural criteria for medical habilitation in German-speaking countries. Based on the guidelines of all 43 medical academic institutions, the criteria that candidates are required to fulfil prior to habilitation, as well as the formal requirements related to the habilitation procedure itself, were collected and quantitatively analyzed. The evaluation of all habilitation guidelines against 87 items revealed significant differences in the number, kind, and scale of the criteria stated therein. Most habilitation guidelines scarcely define the capabilities applicants have to prove: concerning scientific qualifications, for instance, no item on types of publications was mentioned in more than half of all habilitation guidelines. Based on this data analysis, the authors discuss the related literature and describe five main areas in which habilitation guidelines differ, in terms of the formal and procedural framework as well as the pre- and post-qualification criteria imposed on habilitation candidates. There are, therefore, substantial differences in how medical habilitation is organized.

  1. Holocene Temperature Reconstructions from Arctic Lakes based on Alkenone Paleothermometry and Non-Destructive Scanning Techniques

    NASA Astrophysics Data System (ADS)

    D'Andrea, W. J.; Balascio, N. L.; Bradley, R. S.; Bakke, J.; Gjerde, M.; Kaufman, D. S.; Briner, J. P.; von Gunten, L.

    2014-12-01

    Generating continuous, accurate and quantitative Holocene temperature estimates from the Arctic is an ongoing challenge. In many Arctic regions, tree ring-based approaches cannot be used and lake sediments provide the most valuable repositories for extracting paleotemperature information. Advances in lacustrine alkenone paleothermometry now allow for quantitative reconstruction of lake-water temperature based on the UK37 values of sedimentary alkenones. In addition, a recent study demonstrated the efficacy of non-destructive scanning reflectance spectroscopy in the visible range (VIS-RS) for high-resolution quantitative temperature reconstruction from arctic lake sediments1. In this presentation, I will report a new UK37-based temperature reconstruction and a scanning VIS-RS record (using the RABD660;670 index as a measure of sedimentary chlorin content) from Kulusuk Lake in southeastern Greenland (65.6°N, 37.1°W). The UK37 record reveals a ~3°C increase in summer lake water temperatures between ~10ka and ~7ka followed by sustained warmth until ~4ka and a gradual (~3°C) cooling until ~400 yr BP. The strong correlation between UK37 and RABD660;670 measured in the same sediment core provides further evidence that in arctic lakes where temperature regulates primary productivity, and thereby sedimentary chlorin content, these proxies can be combined to develop high-resolution quantitative temperature records. The Holocene temperature history of Kulusuk Lake determined using this approach corresponds to changes in the size of the glaciers adjacent to the lake, as inferred from sediment minerogenic properties measured with scanning XRF. Glaciers retreated during early Holocene warming, likely disappeared during the period of mid-Holocene warmth, and advanced after 4ka. I will also discuss new UK37 and RABD660;670 reconstructions from northwestern Svalbard and the central Brooks Range of Alaska within the framework of published regional temperature reconstructions and model simulations of Holocene temperature around the Arctic. 1. von Gunten, L., D'Andrea, W.J., Bradley, R.S. and Huang, Y., 2012, Proxy-to-proxy calibration: Increasing the temporal resolution of quantitative climate reconstructions. Scientific Reports, v. 2, 609. doi:10.1038/srep00609.

  2. Experiences and expectations of women with urogenital prolapse: a quantitative and qualitative exploration.

    PubMed

    Srikrishna, S; Robinson, D; Cardozo, L; Cartwright, R

    2008-10-01

    To explore the expectations and goals of women undergoing surgery for urogenital prolapse, using both a quantitative quality-of-life approach exploring symptom bother and a qualitative interview-based approach exploring patient goals and expectations. Prospective observational study. Tertiary referral centre for urogynaecology. Forty-three women with symptomatic pelvic organ prolapse were recruited from the waiting list for pelvic floor reconstructive surgery. All women were assessed with a structured clinical interview on an individual basis. The data obtained were transcribed verbatim and then analysed thematically based on grounded theory. Individual codes and subcodes were identified to develop a coding framework. The prolapse quality-of-life (pQoL) questionnaire was used to determine the impact of pelvic organ prolapse on each woman's daily life. We arbitrarily classified 'bother' as minimal, mild, moderate, and marked if scores ranged from 0-25, 25-50, 50-75, and 75-100, respectively. The degree of prolapse was objectively quantified using the pelvic organ prolapse quantification (POP-Q) system. Quantitative data were analysed using SPSS. Ethical approval was obtained from the Kings College Hospital Ethics Committee. Outcome measures comprised quantitative data from the POP-Q, subjective data from the pQoL, and qualitative data from the structured clinical interview. Forty-three women were recruited over the first year of the study. Their mean age was 56 years (range 36-78) and mean parity was 2 (range 0-6). The mean ordinal stage of prolapse was 2 (range stages 1-4). Quantitative analysis of the pQoL data suggested that the main domains affected were prolapse impact on life (mean score 74.71) and personal relationships (mean score 46.66). Qualitative analysis based on the clinical interview suggested that these women were most affected by the actual physical symptoms of prolapse (bulge, pain, and bowel problems), as well as by the impact prolapse has on their sexual function. While disease-specific QoL questionnaires allow broad comparisons to be made in assessing patient bother, they may lack the sensitivity to assess individual symptoms. A qualitative approach may individualize patient care and ultimately improve patient satisfaction and overall outcome when treating women complaining of urogenital prolapse.

  3. Spatiotemporal Segmentation and Modeling of the Mitral Valve in Real-Time 3D Echocardiographic Images.

    PubMed

    Pouch, Alison M; Aly, Ahmed H; Lai, Eric K; Yushkevich, Natalie; Stoffers, Rutger H; Gorman, Joseph H; Cheung, Albert T; Gorman, Robert C; Yushkevich, Paul A

    2017-09-01

    Transesophageal echocardiography is the primary imaging modality for preoperative assessment of mitral valves with ischemic mitral regurgitation (IMR). While there are well known echocardiographic insights into the 3D morphology of mitral valves with IMR, such as annular dilation and leaflet tethering, less is understood about how quantification of valve dynamics can inform surgical treatment of IMR or predict short-term recurrence of the disease. As a step towards filling this knowledge gap, we present a novel framework for 4D segmentation and geometric modeling of the mitral valve in real-time 3D echocardiography (rt-3DE). The framework integrates multi-atlas label fusion and template-based medial modeling to generate quantitatively descriptive models of valve dynamics. The novelty of this work is that temporal consistency in the rt-3DE segmentations is enforced during both the segmentation and modeling stages with the use of groupwise label fusion and Kalman filtering. The algorithm is evaluated on rt-3DE data series from 10 patients: five with normal mitral valve morphology and five with severe IMR. In these 10 data series that total 207 individual 3DE images, each 3DE segmentation is validated against manual tracing and temporal consistency between segmentations is demonstrated. The ultimate goal is to generate accurate and consistent representations of valve dynamics that can both visually and quantitatively provide insight into normal and pathological valve function.
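    The paper enforces temporal consistency with groupwise label fusion plus Kalman filtering. Purely to illustrate the Kalman-smoothing idea (not the authors' implementation, which operates on model parameters rather than a single scalar), here is a minimal constant-velocity filter over a per-frame valve measurement:

```python
import numpy as np

def kalman_smooth_1d(z, dt=1.0, q=1e-3, r=1e-2):
    """Constant-velocity Kalman filter over a per-frame scalar
    measurement (e.g., an annular diameter estimated independently in
    each 3DE frame). q and r are process/measurement noise levels."""
    F = np.array([[1.0, dt], [0.0, 1.0]])          # state transition
    H = np.array([[1.0, 0.0]])                     # observe position only
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                      [dt**3 / 2, dt**2]])         # process noise
    R = np.array([[r]])                            # measurement noise
    x, P = np.array([z[0], 0.0]), np.eye(2)
    filtered = []
    for zk in z:
        x = F @ x                                  # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                        # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        x = x + K @ (zk - H @ x)                   # update
        P = (np.eye(2) - K @ H) @ P
        filtered.append(x[0])
    return np.array(filtered)

# e.g.: smoothed = kalman_smooth_1d(per_frame_annular_diameters)
```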

  4. Strategy to Conduct Quantitative Ecohydrologic Analysis of a UNESCO World Heritage Site: the Peace-Athabasca Delta, Canada

    NASA Astrophysics Data System (ADS)

    Ward, E. M.; Gorelick, S.; Hadly, E. A.

    2016-12-01

    The 6000 km2 Peace-Athabasca Delta ("Delta") in northeastern Alberta, Canada, is a Ramsar Convention Wetland and UNESCO World Heritage Site ("in Danger" status pending) where hydropower development and climate change are creating ecological impacts through desiccation and reduction in Delta shoreline habitat. We focus on ecohydrologic changes and on mitigation and adaptation options, advancing the field of ecohydrology by combining, for the first time, satellite remote sensing and hydrologic simulation with individual-based population modeling of muskrat (Ondatra zibethicus), a species native to the Delta whose population dynamics are strongly controlled by the hydrology of floodplain lakes. We are building a conceptual and quantitative modeling framework linking climate change, upstream water demand, and hydrologic change in the floodplain to muskrat population dynamics, with the objective of exploring the impacts of these stressors on this ecosystem. We explicitly account for cultural and humanistic influences and are committed to effective communication with the regional subsistence community that depends on muskrat for food and income. Our modeling framework can ultimately serve as the basis for improved stewardship and sustainable development upstream of stressed freshwater deltaic, coastal and lake systems worldwide affected by climate change, providing a predictive tool to quantify population changes of animals relevant to regional subsistence food security and commercial trapping.

  5. Quantitative structure-activity relationship analysis of substituted arylazo pyridone dyes in photocatalytic system: Experimental and theoretical study.

    PubMed

    Dostanić, J; Lončarević, D; Zlatar, M; Vlahović, F; Jovanović, D M

    2016-10-05

    A series of arylazo pyridone dyes was synthesized by changing the type of the substituent group in the diazo moiety, ranging from strong electron-donating to strong electron-withdrawing groups. The structural and electronic properties of the investigated dyes were calculated at the M062X/6-31+G(d,p) level of theory. The good linear correlations observed between atomic charges and Hammett σp constants provided a basis to discuss the transmission of electronic substituent effects through a dye framework. The reactivity of the synthesized dyes was tested through their decolorization efficiency in a TiO2 photocatalytic system (Degussa P-25). Quantitative structure-activity relationship analysis revealed a strong correlation between the reactivity of the investigated dyes and Hammett substituent constants. The reaction was facilitated by electron-withdrawing groups, and retarded by electron-donating ones. Quantum mechanical calculations were used in order to describe the mechanism of the photocatalytic oxidation reactions of the investigated dyes and interpret their reactivities within the framework of density functional theory (DFT). According to DFT-based reactivity descriptors, i.e. Fukui functions and local softness, the active site moves from the azo nitrogen atom linked to the benzene ring to the pyridone carbon atom linked to the azo bond, going from dyes with electron-donating groups to dyes with electron-withdrawing groups. Copyright © 2016 Elsevier B.V. All rights reserved.
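    The QSAR analysis rests on the Hammett linear free-energy relationship, log10(k/k0) = ρσp, so the core fit is a one-line regression. A minimal sketch with placeholder rate constants (not the paper's data); a positive ρ reproduces the reported facilitation by electron-withdrawing groups:

```python
import numpy as np

# Hammett sigma_p constants (e.g., from NH2 through NO2) and invented
# pseudo-first-order decolorization rate constants k (min^-1).
sigma_p = np.array([-0.66, -0.27, 0.0, 0.23, 0.54, 0.78])
k = np.array([0.021, 0.034, 0.048, 0.066, 0.11, 0.16])
k0 = k[sigma_p == 0.0][0]                 # unsubstituted reference dye

# Hammett relation: log10(k/k0) = rho * sigma_p
rho, intercept = np.polyfit(sigma_p, np.log10(k / k0), 1)
r = np.corrcoef(sigma_p, np.log10(k / k0))[0, 1]
print(f"rho = {rho:.2f}, r = {r:.3f}")    # rho > 0: reaction aided by
                                          # electron-withdrawing groups
```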

  6. Use and misuse of mixed methods in population oral health research: A scoping review.

    PubMed

    Gupta, A; Keuskamp, D

    2018-05-30

    Despite the known benefits of a mixed methods approach in health research, little is known of its use in the field of population oral health. The aim was to map the extent of literature using a mixed methods approach to examine population oral health outcomes. For a comprehensive search of all the available literature published in the English language, databases including PubMed, Dentistry and Oral Sciences Source (DOSS), CINAHL, Web of Science and EMBASE (including Medline) were searched using a range of keywords from inception to October 2017. Only peer-reviewed, population-based studies of oral health outcomes conducted among non-institutionalised participants and using mixed methods were considered eligible for inclusion. Only nine studies met the inclusion criteria and were included in the review. The most frequent oral health outcome investigated was caries experience. However, most studies lacked a theoretical rationale or framework for using mixed methods, or for supporting the use of qualitative data. Concurrent triangulation with a convergent design was the most commonly used mixed methods typology for integrating quantitative and qualitative data. The tools used to collect quantitative and qualitative data were mostly limited to surveys and interviews. With growing complexity recognised in the determinants of oral disease, future studies addressing population oral health outcomes are likely to benefit from the use of mixed methods. Explicit consideration of theoretical framework and methodology will strengthen those investigations. Copyright © 2018 Dennis Barber Ltd.

  7. Querying quantitative logic models (Q2LM) to study intracellular signaling networks and cell-cytokine interactions.

    PubMed

    Morris, Melody K; Shriver, Zachary; Sasisekharan, Ram; Lauffenburger, Douglas A

    2012-03-01

    Mathematical models have substantially improved our ability to predict the response of a complex biological system to perturbation, but their use is typically limited by difficulties in specifying model topology and parameter values. Additionally, incorporating entities across different biological scales ranging from molecular to organismal in the same model is not trivial. Here, we present a framework called "querying quantitative logic models" (Q2LM) for building and asking questions of constrained fuzzy logic (cFL) models. cFL is a recently developed modeling formalism that uses logic gates to describe influences among entities, with transfer functions to describe quantitative dependencies. Q2LM does not rely on dedicated data to train the parameters of the transfer functions, and it permits straightforward incorporation of entities at multiple biological scales. The Q2LM framework can be employed to ask questions such as: Which therapeutic perturbations accomplish a designated goal, and under what environmental conditions will these perturbations be effective? We demonstrate the utility of this framework for generating testable hypotheses in two examples: (i) an intracellular signaling network model; and (ii) a model for pharmacokinetics and pharmacodynamics of cell-cytokine interactions; in the latter, we validate hypotheses concerning molecular design of granulocyte colony stimulating factor. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
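    To make the cFL formalism concrete: each edge applies a transfer function (commonly a normalized Hill curve) and gates combine the transformed inputs with fuzzy logic operators. The sketch below illustrates that idea under the usual min/max convention; it is not the Q2LM code itself, and all parameter values are invented.

```python
import numpy as np

def hill(x, ec50=0.5, n=2.0):
    """Normalized Hill transfer function: maps a normalized input
    activity in [0, 1] to [0, 1] with f(0) = 0 and f(1) = 1."""
    return x**n * (ec50**n + 1.0) / (ec50**n + x**n)

def AND(*inputs):   # fuzzy AND as the minimum of the inputs
    return np.minimum.reduce(inputs)

def OR(*inputs):    # fuzzy OR as the maximum of the inputs
    return np.maximum.reduce(inputs)

# Node C activated by (A AND B), each input shaped by its own
# transfer function; parameters are placeholders.
A, B = 0.8, 0.3
C = AND(hill(A, ec50=0.4, n=3.0), hill(B, ec50=0.6, n=2.0))
print(round(float(C), 3))   # ~0.272: limited by the weaker input
```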

  8. Computer-aided pulmonary image analysis in small animal models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J.

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of the airway tree, for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using the publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute to advances in preclinical research in pulmonary diseases.
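    The first stage of the pipeline, estimating an expected lung volume from rib cage volume and flagging large deviations, can be sketched as a simple regression plus threshold. All numbers below, including the deviation tolerance, are placeholders rather than values from the paper:

```python
import numpy as np

# Hypothetical training pairs (arbitrary units): approximated rib cage
# volumes and total lung capacities from scans without severe pathology.
rib_cage_vol = np.array([38.0, 42.5, 47.1, 51.3, 55.8])
lung_capacity = np.array([11.2, 13.0, 14.9, 16.4, 18.1])

# Regression function between total lung capacity and rib cage volume.
a, b = np.polyfit(rib_cage_vol, lung_capacity, 1)

def severe_pathology_suspected(rib_vol, segmented_vol, tol=0.25):
    """Flag a scan when the initial lung segmentation falls short of the
    regression-expected volume by more than tol (placeholder threshold);
    this is what would trigger the pattern-detection stage."""
    expected = a * rib_vol + b
    return (expected - segmented_vol) / expected > tol

print(severe_pathology_suspected(50.0, 10.0))   # True
```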

  9. Student concepts of Natural Selection from a resource-based perspective

    NASA Astrophysics Data System (ADS)

    Benjamin, Scott Shawn

    The past two decades have produced a substantial amount of research about the teaching and learning of evolution; however, recent research often lacks a theoretical foundation. Application of a new theoretical framework could help fill the void and improve research about student concepts of evolution. This study seeks to show that a resource-based framework (Hammer et al., 2005) can improve research into student concepts of natural selection. Concepts of natural selection from urban community college students were assessed via qualitative (interviews, written open-response questions, and write/think-aloud procedures) and quantitative methods (coded open-response analysis; the Concept Inventory for Natural Selection (CINS) (Anderson, Fisher, & Norman, 2002)). Results showed that students demonstrate four important aspects of a resource-based framework: the multi-faceted construction of concepts, context sensitivity/concept flexibility, at-the-moment activation of resources, and perceptual frames. In open-response assessment, evolutionary-gain questions produced significantly different responses than evolutionary-loss questions, with: 1) significantly more correct answers for the gain than the loss question (Wilcoxon signed rank test, z = -3.68, p=0.0002); 2) more Lamarckian responses to the loss than the gain question (Fisher's exact, p=0.0039); and 3) significantly different distributions of expanded-need vs basic-need answers (Fisher's exact, p = 0.02). Results from CINS scores showed significant differences in post-activity scores between students who held different naive concepts associated with origin of variation, origin of species, differential reproduction, and limited survival, suggesting that some naive ideas facilitate learning. Outcomes also suggest that an everyday or self-experience typological perceptual frame is an underlying source of many incorrect ideas about evolution. Interview and write/think-aloud assessments revealed four process resources applied by students as they explain evolutionary change: list what I know, why story, compare past to present, and mapping self-experience. The study concludes that a resource-based framework is a valuable tool to advance the study of student concepts of natural selection.

  10. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.
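    The conditional logic and probabilistic formulation mentioned above amounts, at its simplest, to multiplying conditional branch probabilities along an event-tree path. A toy illustration follows (the events and probabilities are invented; the CSRM itself also brings in fault trees and DFM):

```python
# Toy quantification of a context-dependent software risk scenario:
# the scenario probability is the product of the conditional branch
# probabilities along its event-tree path. Numbers are placeholders.
P_DEMAND = 0.05        # P(triggering context occurs during the mission)
P_SW_FAILS = 0.01      # P(software responds incorrectly | demand)
P_NO_RECOVERY = 0.2    # P(fault management fails | software failure)

p_scenario = P_DEMAND * P_SW_FAILS * P_NO_RECOVERY
print(f"P(mission-loss scenario) = {p_scenario:.2e}")   # 1.00e-04
```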

  11. We can do that! Collaborative assessment of school environments to promote healthy adolescent nutrition and physical activity behaviors.

    PubMed

    Williams, Susan L; Mummery, W Kerry

    2015-04-01

    Evidence for effectiveness of school-based studies for prevention of adolescent obesity is equivocal. Tailoring interventions to specific settings is considered necessary for effectiveness and sustainability. The PRECEDE framework provides a formative research approach for comprehensive understanding of school environments and identification of key issues/areas on which to focus resources and energies. No reported studies have tested applicability of the PRECEDE framework in schools in relation to obesity. Adolescents (n = 362), parents (n = 349) and teachers (n = 146) from six secondary schools participated in two quantitative studies and two qualitative studies. Data collected from these studies permitted confirmation of adolescent overweight/obesity as a health issue for schools; the need for secondary schools to focus health promotion efforts on healthy nutrition, with inclusion of parents/homes; and appreciation for gender differences in developing interventions. Community buy-in and commitment to school-based obesity prevention programs may be dependent on initially addressing what may be perceived as minor issues, and on developing policies to guide practices within schools in relation to supply of and access to healthy foods, use of sporting equipment and participation in physical activities. The PRECEDE framework allows systematic assessment of school environments and provided the opportunity to identify realistic and relevant interventions for promoting healthy adolescent physical activity and nutrition behaviors. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  12. Cortical Surface Registration for Image-Guided Neurosurgery Using Laser-Range Scanning

    PubMed Central

    Sinha, Tuhin K.; Cash, David M.; Galloway, Robert L.; Weil, Robert J.

    2013-01-01

    In this paper, a method of acquiring intraoperative data using a laser range scanner (LRS) is presented within the context of model-updated image-guided surgery. Registering textured point clouds generated by the LRS to tomographic data is explored using established point-based and surface techniques as well as a novel method that incorporates geometry and intensity information via mutual information (SurfaceMI). Phantom registration studies were performed to examine accuracy and robustness for each framework. In addition, an in vivo registration is performed to demonstrate feasibility of the data acquisition system in the operating room. Results indicate that SurfaceMI performed better in many cases than point-based registration (PBR) and iterative closest point (ICP) methods for registration of textured point clouds. Mean target registration error (TRE) for simulated deep tissue targets in a phantom was 1.0 ± 0.2, 2.0 ± 0.3, and 1.2 ± 0.3 mm for PBR, ICP, and SurfaceMI, respectively. With regard to in vivo registration, the mean TRE of vessel contour points for each framework was 1.9 ± 1.0, 0.9 ± 0.6, and 1.3 ± 0.5 mm for PBR, ICP, and SurfaceMI, respectively. The methods discussed in this paper, in conjunction with the quantitative data, provide impetus for using LRS technology within the model-updated image-guided surgery framework. PMID:12906252
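    Of the three registration frameworks compared, ICP is the most self-contained to sketch. The following is a minimal rigid ICP under the standard nearest-neighbour/Kabsch formulation, not the authors' implementation, and it omits the texture/intensity channel that distinguishes SurfaceMI:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iters=50):
    """Minimal rigid ICP: match each source point to its nearest target
    point, then solve for the best rigid transform via the SVD (Kabsch)
    solution. source and target are (N, 3) and (M, 3) arrays."""
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    tree = cKDTree(target)
    for _ in range(iters):
        _, idx = tree.query(src)               # closest-point matches
        matched = target[idx]
        mu_s, mu_t = src.mean(0), matched.mean(0)
        H = (src - mu_s).T @ (matched - mu_t)  # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # no reflection
        R = Vt.T @ D @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t                    # apply incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```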

  13. A framework and case studies for evaluation of enzyme ontogeny in children's health risk evaluation.

    PubMed

    Ginsberg, Gary; Vulimiri, Suryanarayana V; Lin, Yu-Sheng; Kancherla, Jayaram; Foos, Brenda; Sonawane, Babasaheb

    2017-01-01

    Knowledge of the ontogeny of Phase I and Phase II metabolizing enzymes may be used to inform children's vulnerability based upon likely differences in internal dose from xenobiotic exposure. This might provide a qualitative assessment of toxicokinetic (TK) variability and uncertainty pertinent to early lifestages and help scope a more quantitative physiologically based toxicokinetic (PBTK) assessment. Although much is known regarding the ontogeny of metabolizing systems, this is not commonly utilized in scoping and problem formulation stage of human health risk evaluation. A framework is proposed for introducing this information into problem formulation which combines data on enzyme ontogeny and chemical-specific TK to explore potential child/adult differences in internal dose and whether such metabolic differences may be important factors in risk evaluation. The framework is illustrated with five case study chemicals, including some which are data rich and provide proof of concept, while others are data poor. Case studies for toluene and chlorpyrifos indicate potentially important child/adult TK differences while scoping for acetaminophen suggests enzyme ontogeny is unlikely to increase early-life risks. Scoping for trichloroethylene and aromatic amines indicates numerous ways that enzyme ontogeny may affect internal dose which necessitates further evaluation. PBTK modeling is a critical and feasible next step to further evaluate child-adult differences in internal dose for a number of these chemicals.

  14. Symptom outcomes important to women with anal incontinence: a conceptual framework.

    PubMed

    Sung, Vivian W; Rogers, Rebecca G; Bann, Carla M; Arya, Lily; Barber, Matthew D; Lowder, Jerry; Lukacz, Emily S; Markland, Alayne; Siddiqui, Nazema; Wilmot, Amanda; Meikle, Susan F

    2014-05-01

    To develop a framework that describes the most important symptom outcomes for anal incontinence treatment from the patient perspective. A conceptual framework was developed by the Pelvic Floor Disorders Network based on four semistructured focus groups and confirmed in two sets of 10 cognitive interviews including women with anal incontinence. We explored: 1) patient-preferred terminology for describing anal incontinence symptoms; 2) patient definitions of treatment "success"; 3) importance of symptoms and outcomes in the framework; and 4) conceptual gaps (defined as outcomes not previously identified as important). Sessions were conducted according to grounded theory, then transcribed, coded, and qualitatively and quantitatively analyzed to identify relevant themes. Content and face validity of the framework were further assessed using cognitive interviews. Thirty-four women participated in focus groups and 20 in cognitive interviews. Overall, 29 (54%) were aged 60 years or older, 42 (78%) were white, and 10 (19%) had a high school degree or less. Two overarching outcome themes were identified: "primary bowel leakage symptoms" and "ancillary bowel symptoms." Subdomains important in primary bowel leakage symptoms included leakage characteristics (symptom frequency, amount of leakage, symptom bother) and conditions under which bowel leakage occurs (predictability, awareness, urgency). Subdomains important under ancillary bowel symptoms included emptying disorders (constipation, obstructed defecation, and wiping issues) and discomfort (pain, burning). New outcomes identified included predictability, awareness, wiping issues, and discomfort. Women with anal incontinence desire a wide range of symptom outcomes after treatment. These are captured in our conceptual framework, which can aid clinicians and researchers in assessing anal incontinence. LEVEL OF EVIDENCE: II.

  15. Developing a monitoring and evaluation framework to integrate and formalize the informal waste and recycling sector: the case of the Philippine National Framework Plan.

    PubMed

    Serrona, Kevin Roy B; Yu, Jeongsoo; Aguinaldo, Emelita; Florece, Leonardo M

    2014-09-01

    The Philippines has been making inroads in solid waste management with the enactment and implementation of the Republic Act 9003 or the Ecological Waste Management Act of 2000. Said legislation has had tremendous influence in terms of how the national and local government units confront the challenges of waste management in urban and rural areas using the reduce, reuse, recycle and recovery framework or 4Rs. One of the sectors needing assistance is the informal waste sector whose aspiration is legal recognition of their rank and integration of their waste recovery activities in mainstream waste management. To realize this, the Philippine National Solid Waste Management Commission initiated the formulation of the National Framework Plan for the Informal Waste Sector, which stipulates approaches, strategies and methodologies to concretely involve the said sector in different spheres of local waste management, such as collection, recycling and disposal. What needs to be fleshed out is the monitoring and evaluation component in order to gauge qualitative and quantitative achievements vis-a-vis the Framework Plan. In the process of providing an enabling environment for the informal waste sector, progress has to be monitored and verified qualitatively and quantitatively and measured against activities, outputs, objectives and goals. Using the Framework Plan as the reference, this article developed monitoring and evaluation indicators using the logical framework approach in project management. The primary objective is to institutionalize monitoring and evaluation, not just in informal waste sector plans, but in any waste management initiatives to ensure that envisaged goals are achieved. © The Author(s) 2014.

  16. Testing for biases in selection on avian reproductive traits and partitioning direct and indirect selection using quantitative genetic models.

    PubMed

    Reed, Thomas E; Gienapp, Phillip; Visser, Marcel E

    2016-10-01

    Key life history traits such as breeding time and clutch size are frequently both heritable and under directional selection, yet many studies fail to document microevolutionary responses. One general explanation is that selection estimates are biased by the omission of correlated traits that have causal effects on fitness, but few valid tests of this exist. Here, we show, using a quantitative genetic framework and six decades of life-history data on two free-living populations of great tits Parus major, that selection estimates for egg-laying date and clutch size are relatively unbiased. Predicted responses to selection based on the Robertson-Price Identity were similar to those based on the multivariate breeder's equation (MVBE), indicating that unmeasured covarying traits were not missing from the analysis. Changing patterns of phenotypic selection on these traits (for laying date, linked to climate change) therefore reflect changing selection on breeding values, and genetic constraints appear not to limit their independent evolution. Quantitative genetic analysis of correlational data from pedigreed populations can be a valuable complement to experimental approaches to help identify whether apparent associations between traits and fitness are biased by missing traits, and to parse the roles of direct versus indirect selection across a range of environments. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
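    As a toy numerical illustration of the two predictors being compared (the matrices are invented, not the great tit estimates): the multivariate breeder's equation predicts the response as Δz̄ = Gβ with β = P⁻¹s, while the Robertson-Price identity takes the additive genetic covariance between each trait and relative fitness directly.

```python
import numpy as np

# Toy two-trait system: trait 1 = laying date, trait 2 = clutch size.
G = np.array([[1.2, -0.3],        # additive genetic (co)variances
              [-0.3, 0.5]])
P = np.array([[3.0, -0.6],        # phenotypic (co)variances
              [-0.6, 1.4]])
s = np.array([-0.4, 0.1])         # selection differentials, cov(z, w)

beta = np.linalg.solve(P, s)      # selection gradients
dz_mvbe = G @ beta                # MVBE prediction: G @ beta

# Robertson-Price identity: response = additive genetic covariance of
# each trait with relative fitness (values invented to mirror the MVBE).
dz_rpi = np.array([-0.16, 0.05])

print("MVBE:", np.round(dz_mvbe, 3), " RPI:", np.round(dz_rpi, 3))
# Close agreement between the two predictions is the paper's signature
# of little bias from unmeasured correlated traits.
```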

  17. Assessing treatment response in triple-negative breast cancer from quantitative image analysis in perfusion magnetic resonance imaging.

    PubMed

    Banerjee, Imon; Malladi, Sadhika; Lee, Daniela; Depeursinge, Adrien; Telli, Melinda; Lipson, Jafi; Golden, Daniel; Rubin, Daniel L

    2018-01-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is sensitive but not specific for determining treatment response in early-stage triple-negative breast cancer (TNBC) patients. We propose an efficient computerized technique for assessing treatment response, specifically the residual tumor (RT) status and pathological complete response (pCR), in response to neoadjuvant chemotherapy. The proposed approach is based on Riesz wavelet analysis of pharmacokinetic maps derived from noninvasive DCE-MRI scans, obtained before and after treatment. We compared the performance of Riesz features with the traditional gray level co-occurrence matrices and a comprehensive characterization of the lesion that includes a wide range of quantitative features (e.g., shape and boundary). We investigated a set of predictive models ([Formula: see text]) incorporating distinct combinations of quantitative characterizations and statistical models at different time points of the treatment, several of which achieved area under the receiver operating characteristic curve (AUC) values above 0.8. The most efficient models are based on first-order statistics and Riesz wavelets, which predicted RT with an AUC value of 0.85 and pCR with an AUC value of 0.83, improving results reported in a previous study by [Formula: see text]. Our findings suggest that Riesz texture analysis of TNBC lesions can be considered a potential framework for optimizing TNBC patient care.

  18. Respiratory trace feature analysis for the prediction of respiratory-gated PET quantification.

    PubMed

    Wang, Shouyi; Bowen, Stephen R; Chaovalitwongse, W Art; Sandison, George A; Grabowski, Thomas J; Kinahan, Paul E

    2014-02-21

    The benefits of respiratory gating in quantitative PET/CT vary tremendously between individual patients. Respiratory pattern is among many patient-specific characteristics that are thought to play an important role in gating-induced imaging improvements. However, the quantitative relationship between patient-specific characteristics of respiratory pattern and improvements in quantitative accuracy from respiratory-gated PET/CT has not been well established. If such a relationship could be estimated, then patient-specific respiratory patterns could be used to prospectively select appropriate motion compensation during image acquisition on a per-patient basis. This study was undertaken to develop a novel statistical model that predicts quantitative changes in PET/CT imaging due to respiratory gating. Free-breathing static FDG-PET images without gating and respiratory-gated FDG-PET images were collected from 22 lung and liver cancer patients on a PET/CT scanner. PET imaging quality was quantified with peak standardized uptake value (SUV(peak)) over lesions of interest. Relative differences in SUV(peak) between static and gated PET images were calculated to indicate quantitative imaging changes due to gating. A comprehensive multidimensional extraction of the morphological and statistical characteristics of respiratory patterns was conducted, resulting in 16 features that characterize representative patterns of a single respiratory trace. The six most informative features were subsequently extracted using a stepwise feature selection approach. The multiple-regression model was trained and tested based on a leave-one-subject-out cross-validation. The predicted quantitative improvements in PET imaging achieved an accuracy higher than 90% using a criterion with a dynamic error-tolerance range for SUV(peak) values. The results of this study suggest that our prediction framework could be applied to determine which patients would likely benefit from respiratory motion compensation when clinicians quantitatively assess PET/CT for therapy target definition and response assessment.
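    The prediction machinery described here is a feature-selection plus multiple-regression loop under leave-one-subject-out cross-validation. A compressed sketch with synthetic stand-in data (the fixed error tolerance and the correlation-based selection are simplifications of the paper's dynamic tolerance and stepwise selection):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Placeholder data standing in for the paper's 22 patients x 16
# respiratory-trace features; y is the relative SUVpeak change
# between static and gated PET.
rng = np.random.default_rng(0)
X = rng.normal(size=(22, 16))
y = 0.5 * X[:, 0] - 0.3 * X[:, 3] + rng.normal(scale=0.1, size=22)

# Crude stand-in for stepwise selection: keep the 6 features most
# correlated with y. (A faithful replication would nest the selection
# inside the cross-validation loop to avoid leakage.)
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(16)])
top6 = np.argsort(corr)[-6:]

# Each subject contributes one row here, so LeaveOneOut is equivalent
# to leave-one-subject-out cross-validation.
pred = cross_val_predict(LinearRegression(), X[:, top6], y, cv=LeaveOneOut())
accurate = np.abs(pred - y) < 0.15        # placeholder error tolerance
print(f"prediction accuracy: {accurate.mean():.0%}")
```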

  19. Respiratory trace feature analysis for the prediction of respiratory-gated PET quantification

    NASA Astrophysics Data System (ADS)

    Wang, Shouyi; Bowen, Stephen R.; Chaovalitwongse, W. Art; Sandison, George A.; Grabowski, Thomas J.; Kinahan, Paul E.

    2014-02-01

    The benefits of respiratory gating in quantitative PET/CT vary tremendously between individual patients. Respiratory pattern is among many patient-specific characteristics that are thought to play an important role in gating-induced imaging improvements. However, the quantitative relationship between patient-specific characteristics of respiratory pattern and improvements in quantitative accuracy from respiratory-gated PET/CT has not been well established. If such a relationship could be estimated, then patient-specific respiratory patterns could be used to prospectively select appropriate motion compensation during image acquisition on a per-patient basis. This study was undertaken to develop a novel statistical model that predicts quantitative changes in PET/CT imaging due to respiratory gating. Free-breathing static FDG-PET images without gating and respiratory-gated FDG-PET images were collected from 22 lung and liver cancer patients on a PET/CT scanner. PET imaging quality was quantified with peak standardized uptake value (SUVpeak) over lesions of interest. Relative differences in SUVpeak between static and gated PET images were calculated to indicate quantitative imaging changes due to gating. A comprehensive multidimensional extraction of the morphological and statistical characteristics of respiratory patterns was conducted, resulting in 16 features that characterize representative patterns of a single respiratory trace. The six most informative features were subsequently extracted using a stepwise feature selection approach. The multiple-regression model was trained and tested based on a leave-one-subject-out cross-validation. The predicted quantitative improvements in PET imaging achieved an accuracy higher than 90% using a criterion with a dynamic error-tolerance range for SUVpeak values. The results of this study suggest that our prediction framework could be applied to determine which patients would likely benefit from respiratory motion compensation when clinicians quantitatively assess PET/CT for therapy target definition and response assessment.

  20. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
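    The "traditional calibration" that the paper argues against is essentially a single standard-curve regression inverted to recover density. A minimal sketch with placeholder standards (the Bayesian mixed-model alternative, which pools replicate assays and lets precision vary with density, is not reproduced here):

```python
import numpy as np

# Standards: known gametocyte densities (per uL) and a measured
# QT-NASBA signal such as time-to-positivity. Values are placeholders.
log10_density = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
signal = np.array([42.1, 37.8, 33.6, 29.9, 25.7])

# Traditional calibration: straight-line fit signal = a*log10(N) + b,
# inverted to estimate the density of an unknown sample.
a, b = np.polyfit(log10_density, signal, 1)

def estimate_density(measured_signal):
    return 10 ** ((measured_signal - b) / a)

print(f"{estimate_density(31.0):.0f} gametocytes/uL")
```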

  1. A systematic review of the health and well-being impacts of school gardening: synthesis of quantitative and qualitative evidence.

    PubMed

    Ohly, Heather; Gentry, Sarah; Wigglesworth, Rachel; Bethel, Alison; Lovell, Rebecca; Garside, Ruth

    2016-03-25

    School gardening programmes are increasingly popular, with suggested benefits including healthier eating and increased physical activity. Our objectives were to understand the health and well-being impacts of school gardens and the factors that help or hinder their success. We conducted a systematic review of quantitative and qualitative evidence (PROSPERO CRD42014007181). We searched multiple databases and used a range of supplementary approaches. Studies about school gardens were included if they reported on physical or mental health or well-being. Quantitative studies had to include a comparison group. Studies were quality appraised using appropriate tools. Findings were narratively synthesised and the qualitative evidence used to produce a conceptual framework to illustrate how benefits might be accrued. Evidence from 40 articles (21 quantitative studies; 16 qualitative studies; 3 mixed methods studies) was included. Generally, the quantitative research was poor. Evidence for changes in fruit and vegetable intake was limited and based on self-report. The qualitative research was of better quality and ascribed a range of health and well-being impacts to school gardens, with some idealistic expectations for their impact in the long term. Groups of pupils who do not excel in classroom activities were thought to particularly benefit. Lack of funding and over-reliance on volunteers were thought to threaten success, while involvement with local communities and integration of gardening activities into the school curriculum were thought to support success. More robust quantitative research is needed to convincingly support the qualitative evidence suggesting wide-ranging benefits from school gardens.

  2. The Consolidated Framework for Implementation Research (CFIR): a useful theoretical framework for guiding and evaluating a guideline implementation process in a hospital-based nursing practice.

    PubMed

    Breimaier, Helga E; Heckemann, Birgit; Halfens, Ruud J G; Lohrmann, Christa

    2015-01-01

    Implementing clinical practice guidelines (CPGs) in healthcare settings is a complex intervention involving both independent and interdependent components. Although the Consolidated Framework for Implementation Research (CFIR) has never been evaluated in a practical context, it appeared to be a suitable theoretical framework to guide an implementation process. The aim of this study was to evaluate the comprehensiveness, applicability and usefulness of the CFIR in the implementation of a fall-prevention CPG in nursing practice to improve patient care in an Austrian university teaching hospital setting. The evaluation of the CFIR was based on (1) team-meeting minutes, (2) the main investigator's research diary, containing a record of a before-and-after, mixed-methods study design embedded in a participatory action research (PAR) approach for guideline implementation, and (3) an analysis of qualitative and quantitative data collected from graduate and assistant nurses in two Austrian university teaching hospital departments. The CFIR was used to organise data at and across time points and to assess their influence on the implementation process, resulting in implementation and service outcomes. Overall, the CFIR could be demonstrated to be a comprehensive framework for the implementation of a guideline into hospital-based nursing practice. However, the CFIR did not account for some crucial factors during the planning phase of an implementation process, such as consideration of stakeholder aims and wishes/needs when implementing an innovation, pre-established measures related to the intended innovation and pre-established strategies for implementing an innovation. For the CFIR constructs reflecting & evaluating and engaging, a more specific definition is recommended. The framework and its supplements could easily be used by researchers, and their scope was appropriate for the complexity of a prospective CPG-implementation project. The CFIR facilitated qualitative data analysis and provided a structure that allowed project results to be organised and viewed in a broader context to explain the main findings. The CFIR was a valuable and helpful framework for (1) the assessment of the baseline, process and final state of the implementation process and influential factors, (2) the content analysis of qualitative data collected throughout the implementation process, and (3) explaining the main findings.

  3. Marginal fit and photoelastic stress analysis of CAD-CAM and overcast 3-unit implant-supported frameworks.

    PubMed

    Presotto, Anna Gabriella Camacho; Bhering, Cláudia Lopes Brilhante; Mesquita, Marcelo Ferraz; Barão, Valentim Adelino Ricardo

    2017-03-01

    Several studies have shown the superiority of computer-assisted design and computer-assisted manufacturing (CAD-CAM) technology compared with conventional casting. However, an advanced technology exists for casting procedures (the overcasting technique), which may serve as an acceptable and affordable alternative to CAD-CAM technology for fabricating 3-unit implant-supported fixed dental prostheses (FDPs). The purpose of this in vitro study was to evaluate, using quantitative photoelastic analysis, the effect of the prosthetic framework fabrication method (CAD-CAM and overcasting) on the marginal fit and stress transmitted to implants. The correlation between marginal fit and stress was also investigated. Three-unit implant-supported FDP frameworks were made using the CAD-CAM (n=10) and overcasting (n=10) methods. The frameworks were waxed to simulate a mandibular first premolar (PM region) to first molar (M region) FDP using overcast mini-abutment cylinders. The wax patterns were overcast (overcast experimental group) or scanned to obtain the frameworks (CAD-CAM control group). All frameworks were fabricated from cobalt-chromium (CoCr) alloy. The marginal fit was analyzed according to the single-screw test protocol, obtaining an average value for each region (M and PM) and each framework. The frameworks were tightened onto the photoelastic model with a standardized 10-Ncm torque. Stress was measured by quantitative photoelastic analysis. The results were submitted to the Student t test, 2-way ANOVA, and Pearson correlation test (α=.05). The framework fabrication method (FM) and evaluation site (ES; M and PM regions) did not affect the marginal fit values (P=.559 for FM and P=.065 for ES) or stress (P=.685 for FM and P=.468 for ES) in the implant-supported system. Positive correlations between marginal misfit and stress were observed (CAD-CAM: r=0.922; P<.001; overcast: r=0.908; P<.001). CAD-CAM and overcasting methods present similar marginal fit and stress values for 3-unit FDP frameworks. A poorer marginal fit of frameworks induces greater stress in the implant-supported system. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  4. Multiscale multimodal fusion of histological and MRI volumes for characterization of lung inflammation

    NASA Astrophysics Data System (ADS)

    Rusu, Mirabela; Wang, Haibo; Golden, Thea; Gow, Andrew; Madabhushi, Anant

    2013-03-01

    Mouse lung models facilitate the investigation of conditions such as chronic inflammation which are associated with common lung diseases. The multi-scale manifestation of lung inflammation prompted us to use multi-scale imaging - both in vivo and ex vivo MRI, along with ex vivo histology - to study it in a new quantitative way. Some imaging modalities, such as MRI, are non-invasive and capture macroscopic features of the pathology, while others, e.g. ex vivo histology, depict detailed structures. Registering such multi-modal data to the same spatial coordinates allows the construction of a comprehensive 3D model to enable the multi-scale study of diseases. Moreover, it may facilitate the identification and definition of quantitative in vivo imaging signatures for diseases and pathologic processes. We introduce a quantitative, image-analytic framework to integrate in vivo MR images of the entire mouse with ex vivo histology of the lung alone, using lung ex vivo MRI as a conduit to facilitate their co-registration. In our framework, we first align the MR images by registering the in vivo and ex vivo MRI of the lung using an interactive rigid registration approach. Then we reconstruct the 3D volume of the ex vivo histological specimen by efficient groupwise registration of the 2D slices. The resulting 3D histologic volume is subsequently registered to the MRI volumes by interactive rigid registration - directly to the ex vivo MRI, and implicitly to the in vivo MRI. Qualitative evaluation of the registration framework was performed by comparing airway tree structures in ex vivo MRI and ex vivo histology, where airways are visible and may be annotated. We present a use case for evaluation of our co-registration framework in the context of studying chronic inflammation in a diseased mouse.

  5. Toward a multi-objective decision support framework to support regulations of unconventional oil and gas development

    NASA Astrophysics Data System (ADS)

    Alongi, M.; Howard, C.; Kasprzyk, J. R.; Ryan, J. N.

    2015-12-01

    Unconventional oil and gas development (UOGD) using hydraulic fracturing and horizontal drilling has recently fostered an unprecedented acceleration in energy development. Regulations seek to protect the environmental quality of areas surrounding UOGD while maintaining economic benefits. One such regulation is a setback distance, which dictates the minimum proximity between an oil and gas well and an object such as a residential or commercial building, property line, or water source. In general, most setback regulations have been strongly politically motivated, without a clear scientific basis for understanding the relationship between the setback distance and various performance outcomes. This presentation discusses a new decision support framework for setback regulations, as part of a large NSF-funded sustainability research network (SRN) on UOGD. The goal of the decision support framework is to integrate a wide array of scientific information from the SRN into a coherent framework that can help inform policy regarding UOGD. The decision support framework employs multiobjective evolutionary algorithm (MOEA) optimization coupled with simulation models of air quality and other performance-based outcomes of UOGD. The results of the MOEA optimization runs are quantitative tradeoff curves among different objectives. For example, one such curve could demonstrate air pollution concentrations versus estimates of energy development profits for different levels of setback distance. Our results will also inform policy-relevant discussions surrounding UOGD, such as comparing single- and multi-well pads, as well as regulations on the density of well development over a spatial area.
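    The output of such a framework is a set of non-dominated solutions. As a toy illustration of what a tradeoff curve over setback distances looks like (the two objective models are invented placeholders, and a real MOEA coupled to the SRN's simulations would replace the exhaustive sweep):

```python
import numpy as np

# Candidate setback distances (m) scored against two objectives to be
# minimized: an exposure proxy and a lost-profit proxy.
setbacks = np.linspace(150, 800, 14)
objs = np.column_stack([
    1.0 / setbacks,             # exposure falls with distance
    (setbacks / 800.0) ** 2,    # forgone drillable area rises with distance
])

def non_dominated(points):
    """Indices of Pareto-optimal rows (all objectives minimized)."""
    keep = []
    for i, p in enumerate(points):
        dominated = np.any(np.all(points <= p, axis=1) &
                           np.any(points < p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

for i in non_dominated(objs):
    print(f"setback {setbacks[i]:5.0f} m -> "
          f"exposure {objs[i, 0]:.4f}, lost profit {objs[i, 1]:.3f}")
```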

  6. Objectivity and reliability in qualitative analysis: realist, contextualist and radical constructionist epistemologies.

    PubMed

    Madill, A; Jordan, A; Shirley, C

    2000-02-01

    The effect of the individual analyst on research findings can create a credibility problem for qualitative approaches from the perspective of evaluative criteria utilized in quantitative psychology. This paper explicates the ways in which objectivity and reliability are understood in qualitative analysis conducted from within three distinct epistemological frameworks: realism, contextual constructionism, and radical constructionism. It is argued that quality criteria utilized in quantitative psychology are appropriate to the evaluation of qualitative analysis only to the extent that it is conducted within a naive or scientific realist framework. The discussion is illustrated with reference to the comparison of two independent grounded theory analyses of identical material. An implication of this illustration is to identify the potential to develop a radical constructionist strand of grounded theory.

  7. Synthesis of quantitative and qualitative research: an example using Critical Interpretive Synthesis.

    PubMed

    Flemming, Kate

    2010-01-01

    This paper is a report of a Critical Interpretive Synthesis to synthesize quantitative research, in the form of an effectiveness review and a guideline, with qualitative research to examine the use of morphine to treat cancer-related pain. Critical Interpretive Synthesis is a new method of reviewing, developed from meta-ethnography, which integrates systematic review methodology with a qualitative tradition of enquiry. It has not previously been used specifically to synthesize effectiveness and qualitative literature. Data sources. An existing systematic review of quantitative research and a guideline examining the effectiveness of oral morphine to treat cancer pain were identified. Electronic searches of Medline, CINAHL, Embase, PsychINFO, Health Management Information Consortium database and the Social Science Citation Index to identify qualitative research were carried out in May 2008. Qualitative research papers reporting on the use of morphine to treat cancer pain were identified. The findings of the effectiveness research were used as a framework to guide the translation of findings from qualitative research using an integrative grid. A secondary translation of findings from the qualitative research, not specifically mapped to the effectiveness literature, was guided by the framework. Nineteen qualitative papers were synthesized with the quantitative effectiveness literature, producing 14 synthetic constructs. These were developed into four synthesizing arguments which drew on patients', carers' and healthcare professionals' interpretations of the meaning and context of the use of morphine to treat cancer pain. Critical Interpretive Synthesis can be adapted to synthesize reviews of quantitative research into effectiveness with qualitative research and fits into an existing typology of approaches to synthesizing qualitative and quantitative research.

  8. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an ongoing effort toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  9. Tracking and Motion Analysis of Crack Propagations in Crystals for Molecular Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsap, L V; Duchaineau, M; Goldgof, D B

    2001-05-14

    This paper presents a quantitative analysis for a discovery in molecular dynamics. Recent simulations have shown that velocities of crack propagations in crystals under certain conditions can become supersonic, which is contrary to classical physics. In this research, the authors present a framework for tracking and motion analysis of crack propagations in crystals. It includes line segment extraction based on Canny edge maps, feature selection based on physical properties, and subsequent tracking of primary and secondary wavefronts. This tracking is completely automated; it runs in real time on three 834-image sequences using forty 250 MHz processors. Results supporting physical observations are presented in terms of both feature tracking and velocity analysis.
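    A Canny-plus-line-segment front extractor of the kind described is straightforward with standard tools. The sketch below uses OpenCV; the filename, thresholds, and the crude tip-based velocity estimate are illustrative assumptions, not the paper's parameters:

```python
import cv2
import numpy as np

# Hypothetical frame from the rendered MD sequence.
frame = cv2.imread("crack_frame_0001.png", cv2.IMREAD_GRAYSCALE)

edges = cv2.Canny(frame, threshold1=50, threshold2=150)
segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                           threshold=40, minLineLength=30, maxLineGap=5)

def leading_tip_x(segs):
    """Right-most x-coordinate among detected segments (assumes segs is
    not None), a crude proxy for the primary wavefront position."""
    return max(max(x1, x2) for x1, y1, x2, y2 in segs[:, 0])

# Velocity estimate between two frames dt apart (simulation time units):
# v = (leading_tip_x(segs_t1) - leading_tip_x(segs_t0)) / dt
```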

  10. Sparse reconstruction of liver cirrhosis from monocular mini-laparoscopic sequences

    NASA Astrophysics Data System (ADS)

    Marcinczak, Jan Marek; Painer, Sven; Grigat, Rolf-Rainer

    2015-03-01

    Mini-laparoscopy is a technique used by clinicians to inspect the liver surface with ultra-thin laparoscopes. However, so far no quantitative measures based on mini-laparoscopic sequences are possible. This paper presents a Structure from Motion (SfM) based methodology for 3D reconstruction of liver cirrhosis from mini-laparoscopic videos. The approach combines state-of-the-art tracking, pose estimation, outlier rejection and global optimization to obtain a sparse reconstruction of the cirrhotic liver surface. Specular reflection segmentation is included in the reconstruction framework to increase the robustness of the reconstruction. The presented approach is evaluated on 15 endoscopic sequences using three cirrhotic liver phantoms. The median reconstruction accuracy ranges from 0.3 mm to 1 mm.
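    A common way to realize the specular reflection segmentation step is to mask bright, desaturated pixels before feature tracking; whether the authors used this exact rule is not stated, so treat the following as a generic sketch with placeholder thresholds:

```python
import cv2
import numpy as np

def specular_mask(bgr, v_thresh=230, s_thresh=40):
    """Mask likely specular highlights in an endoscopic frame: bright
    (high V) and desaturated (low S) pixels in HSV space."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    mask = ((v > v_thresh) & (s < s_thresh)).astype(np.uint8) * 255
    # Dilate so feature tracks near highlight borders are also rejected.
    return cv2.dilate(mask, np.ones((5, 5), np.uint8), iterations=1)

# Tracked keypoints falling inside the mask would be discarded before
# pose estimation and global optimization.
```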

  11. Designing a mixed methods study in primary care.

    PubMed

    Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V

    2004-01-01

    Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.

  12. A cognitive perspective on health systems integration: results of a Canadian Delphi study.

    PubMed

    Evans, Jenna M; Baker, G Ross; Berta, Whitney; Barnsley, Jan

    2014-05-19

    Ongoing challenges to healthcare integration point toward the need to move beyond structural and process issues. While we know what needs to be done to achieve integrated care, there is little that informs us as to how. We need to understand how diverse organizations and professionals develop shared knowledge and beliefs - that is, we need to generate knowledge about normative integration. We present a cognitive perspective on integration, based on shared mental model theory, that may enhance our understanding and ability to measure and influence normative integration. The aim of this paper is to validate and improve the Mental Models of Integrated Care (MMIC) Framework, which outlines important knowledge and beliefs whose convergence or divergence across stakeholder groups may influence inter-professional and inter-organizational relations. We used a two-stage web-based modified Delphi process to test the MMIC Framework against expert opinion using a random sample of participants from Canada's National Symposium on Integrated Care. Respondents were asked to rate the framework's clarity, comprehensiveness, usefulness, and importance using seven-point ordinal scales. Spaces for open comments were provided. Descriptive statistics were used to describe the structured responses, while open comments were coded and categorized using thematic analysis. The Kruskal-Wallis test was used to examine cross-group agreement by level of integration experience, current workplace, and current role. In the first round, 90 individuals responded (52% response rate), representing a wide range of professional roles and organization types from across the continuum of care. In the second round, 68 individuals responded (75.6% response rate). The quantitative and qualitative feedback from experts was used to revise the framework. The re-named "Integration Mindsets Framework" consists of a Strategy Mental Model and a Relationships Mental Model, comprising a total of nineteen content areas. The Integration Mindsets Framework draws the attention of researchers and practitioners to how various stakeholders think about and conceptualize integration. A cognitive approach to understanding and measuring normative integration complements dominant cultural approaches and allows for more fine-grained analyses. The framework can be used by managers and leaders to facilitate the interpretation, planning, implementation, management and evaluation of integration initiatives.

  13. A Framework to Determine New System Requirements Under Design Parameter and Demand Uncertainties

    DTIC Science & Technology

    2015-04-30

    relegates quantitative complexities of decision-making to the method and designates trade-space exploration to the practitioner. We demonstrate the approach...play a critical role in determining new system requirements. Scope and Method of Approach: The early stages of the design process have substantial

  14. Quantitative Field Testing Rotylenchulus reniformis DNA from Metagenomic Samples Isolated Directly from Soil

    PubMed Central

    Showmaker, Kurt; Lawrence, Gary W.; Lu, Shien; Balbalian, Clarissa; Klink, Vincent P.

    2011-01-01

    A quantitative PCR procedure targeting the β-tubulin gene determined the number of Rotylenchulus reniformis Linford & Oliveira 1940 in metagenomic DNA samples isolated from soil. Of note, this outcome was in the presence of other soil-dwelling plant parasitic nematodes including its sister genus Helicotylenchus Steiner, 1945. The methodology provides a framework for molecular diagnostics of nematodes from metagenomic DNA isolated directly from soil. PMID:22194958
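
    The abstract does not detail the calibration, but a standard-curve approach is a common way to turn qPCR Ct values into target counts: fit Ct against log10 of known template quantities, then invert the fit for unknown soil samples. The sketch below assumes such a curve; the dilution series, Ct values, and function name are hypothetical.

      # Illustrative qPCR standard-curve quantification (invented numbers,
      # not the paper's calibration). Assumes Ct = m*log10(N) + b.
      import numpy as np

      known_n  = np.array([1e1, 1e2, 1e3, 1e4, 1e5])     # targets/reaction
      known_ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

      m, b = np.polyfit(np.log10(known_n), known_ct, 1)  # linear fit
      efficiency = 10 ** (-1 / m) - 1                    # amplification efficiency

      def quantify(ct):
          """Invert the standard curve: Ct -> estimated targets/reaction."""
          return 10 ** ((ct - b) / m)

      print(f"slope = {m:.2f}, efficiency = {efficiency:.1%}")
      print(f"Ct 27.5 -> ~{quantify(27.5):.0f} targets/reaction")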

  15. Determinants of fruit and vegetable consumption among children and adolescents: a review of the literature. Part II: qualitative studies.

    PubMed

    Krølner, Rikke; Rasmussen, Mette; Brug, Johannes; Klepp, Knut-Inge; Wind, Marianne; Due, Pernille

    2011-10-14

    Large proportions of children do not fulfil the World Health Organization recommendation of eating at least 400 grams of fruit and vegetables (FV) per day. To promote an increased FV intake among children, it is important to identify factors which influence their consumption. Both qualitative and quantitative studies are needed. Earlier reviews have analysed evidence from quantitative studies. The aim of this paper is to present a systematic review of qualitative studies of determinants of children's FV intake. Relevant studies were identified by searching Anthropology Plus, Cinahl, CSA illumine, Embase, International Bibliography of the Social Sciences, Medline, PsycINFO, and Web of Science using combinations of synonyms for FV intake, children/adolescents and qualitative methods as search terms. The literature search was completed by December 1st 2010. Papers were included if they applied qualitative methods to investigate 6-18-year-olds' perceptions of factors influencing their FV consumption. Quantitative studies, review studies, studies reported in languages other than English, and non-peer reviewed or unpublished manuscripts were excluded. The papers were reviewed systematically using standardised templates for summary of papers, quality assessment, and synthesis of findings across papers. The review included 31 studies, mostly based on US populations and focus group discussions. The synthesis identified the following potential determinants of FV intake which supplement the quantitative knowledge base: time costs; lack of taste guarantee; satiety value; appropriate times/occasions/settings for eating FV; sensory and physical aspects; variety, visibility, and methods of preparation; access to unhealthy food; the symbolic value of food for image, gender identity and social interaction with peers; and short-term outcome expectancies. The review highlights numerous potential determinants which have not been investigated thoroughly in quantitative studies. Future large-scale quantitative studies should attempt to quantify the importance of these factors. Further, mechanisms behind gender, age and socioeconomic differences in FV consumption are proposed which should be tested quantitatively in order to better tailor interventions to vulnerable groups. Finally, the review provides input to the conceptualisation and measurement of concepts (i.e. peer influence, availability in schools) which may refine survey instruments and theoretical frameworks concerning eating behaviours.

  16. A system dynamics optimization framework to achieve population desired average weight target

    NASA Astrophysics Data System (ADS)

    Abidin, Norhaslinda Zainal; Zulkepli, Jafri Haji; Zaibidi, Nerda Zura

    2017-11-01

    Obesity is becoming a serious problem in Malaysia, which has been rated as having the highest obesity prevalence among Asian countries. The aim of this paper is to propose a system dynamics (SD) optimization framework to achieve a population's desired average weight target, based on changes in physical activity behavior and their association with weight and obesity. A system dynamics stocks-and-flows diagram was used to quantitatively model the impact of these behaviors on the population's weight and obesity trends. This work brings these ideas together, highlighting the interdependence of the various aspects of eating and physical activity behavior within the complex human weight regulation system. The model was used as an experimentation vehicle to investigate the impacts of changes in physical activity on weight and on the prevalence of obesity. This paper provides evidence of the usefulness of SD optimization as a strategic decision-making approach to assist in decisions related to obesity prevention. SD as applied in this research is relatively new in Malaysia and has high potential to apply to any feedback model that addresses the behavioral causes of obesity.
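
    A minimal stock-and-flow sketch in the spirit of the model described above is given below: body weight is the stock, daily energy intake and expenditure are the flows, and a physical activity parameter scales expenditure. The structure and every parameter value are illustrative assumptions, not the authors' published model.

      # Minimal stock-and-flow weight-regulation sketch (all parameters
      # are assumptions for illustration, not the published SD model).
      KCAL_PER_KG = 7700.0   # approx. energy content of 1 kg body mass

      def simulate(weight0, intake_kcal, activity_level, days):
          """Euler-integrate the weight stock one day at a time."""
          weight = weight0
          for _ in range(days):
              expenditure = 22.0 * weight * activity_level  # rough kcal/day
              net_energy = intake_kcal - expenditure        # net flow
              weight += net_energy / KCAL_PER_KG            # update stock
          return weight

      # Raising physical activity shifts the population toward a lower
      # equilibrium weight at the same energy intake.
      print(simulate(85.0, intake_kcal=2500, activity_level=1.3, days=365))
      print(simulate(85.0, intake_kcal=2500, activity_level=1.5, days=365))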

  17. Ten questions concerning future buildings beyond zero energy and carbon neutrality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Phelan, Patrick E.; Gonzalez, Jorge

    2017-07-01

    Architects, planners, and building scientists have been at the forefront of envisioning a future built environment for centuries. However, fragmented views that emphasize one facet of the built environment, such as energy, environment, or groundbreaking technologies, often do not achieve expected outcomes. Buildings are responsible for approximately one-third of worldwide carbon emissions and account for over 40% of primary energy consumption in the U.S. In addition to achieving the ambitious goal of reducing building greenhouse gas emissions by 75% by 2050, buildings must improve their functionality and performance to meet current and future human, societal, and environmental needs in a changing world. In this article, we introduce a new framework to guide potential evolution of the building stock in the next century, based on greenhouse gas emissions as the common thread to investigate the potential implications of new design paradigms, innovative operational strategies, and disruptive technologies. This framework emphasizes integration of multidisciplinary knowledge, scalability for mainstream buildings, and proactive approaches considering constraints and unknowns. The framework integrates the interrelated aspects of the built environment through a series of quantitative metrics that aim to improve environmental outcomes while optimizing building performance to achieve healthy, adaptive, and productive buildings.

  18. Integrated presentation of ecological risk from multiple stressors

    PubMed Central

    Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman

    2016-01-01

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed, as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic. PMID:27782171
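
    To make the idea of a prevalence plot concrete, the toy Monte Carlo below samples environmental scenarios (temperature, food availability, chemical exposure), applies a simple effect model, and reports how often an adverse endpoint occurs. The effect model, distributions, and threshold are invented for illustration; they are not the framework's actual models.

      # Toy prevalence-of-risk calculation: sample scenarios, apply an
      # invented effect model, count adverse outcomes.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      temperature = rng.normal(15.0, 3.0, n)    # deg C, regional scenario
      food        = rng.uniform(0.4, 1.0, n)    # food availability, 0-1
      exposure    = rng.lognormal(0.0, 0.8, n)  # chemical conc., ug/L

      # Hypothetical effect model: growth drops with exposure; low food
      # and high temperature worsen the same exposure.
      growth = food * (1.0 - exposure / (exposure + 5.0)) \
               - 0.01 * (temperature - 15.0)

      adverse = growth < 0.5                    # assumed endpoint threshold
      print(f"prevalence of adverse effect: {adverse.mean():.1%}")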

  19. High Resolution, Large Deformation 3D Traction Force Microscopy

    PubMed Central

    López-Fagundo, Cristina; Reichner, Jonathan; Hoffman-Kim, Diane; Franck, Christian

    2014-01-01

    Traction Force Microscopy (TFM) is a powerful approach for quantifying cell-material interactions that over the last two decades has contributed significantly to our understanding of cellular mechanosensing and mechanotransduction. In addition, recent advances in three-dimensional (3D) imaging and traction force analysis (3D TFM) have highlighted the significance of the third dimension in influencing various cellular processes. Yet irrespective of dimensionality, almost all TFM approaches have relied on a linear elastic theory framework to calculate cell surface tractions. Here we present a new high resolution 3D TFM algorithm which utilizes a large deformation formulation to quantify cellular displacement fields with unprecedented resolution. The results feature some of the first experimental evidence that cells are indeed capable of exerting large material deformations, which require the formulation of a new theoretical TFM framework to accurately calculate the traction forces. Based on our previous 3D TFM technique, we reformulate our approach to accurately account for large material deformation and quantitatively contrast and compare both linear and large deformation frameworks as a function of the applied cell deformation. Particular attention is paid to estimating the accuracy penalty associated with utilizing a traditional linear elastic approach in the presence of large deformation gradients. PMID:24740435
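
    The penalty for assuming linear elasticity can be seen in a worked comparison of the infinitesimal strain tensor against the Green-Lagrange strain for a growing simple-shear displacement gradient; the sketch below is a generic continuum-mechanics illustration, not the paper's algorithm.

      # Compare infinitesimal strain with Green-Lagrange strain for
      # simple shear of increasing magnitude (generic illustration).
      import numpy as np

      def strains(grad_u):
          """Return (infinitesimal, Green-Lagrange) strain tensors."""
          eps_lin = 0.5 * (grad_u + grad_u.T)      # small-strain tensor
          F = np.eye(3) + grad_u                   # deformation gradient
          E = 0.5 * (F.T @ F - np.eye(3))          # Green-Lagrange tensor
          return eps_lin, E

      for gamma in (0.01, 0.1, 0.4):
          grad_u = np.zeros((3, 3))
          grad_u[0, 1] = gamma                     # simple shear of size gamma
          eps, E = strains(grad_u)
          # Linear theory predicts zero normal strain E_22; the exact
          # value gamma**2 / 2 grows quadratically with the shear.
          print(f"shear {gamma}: exact E_22 = {E[1, 1]:.4f}, linear = {eps[1, 1]:.1f}")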

  20. Youth social withdrawal behavior (hikikomori): A systematic review of qualitative and quantitative studies.

    PubMed

    Li, Tim M H; Wong, Paul W C

    2015-07-01

    Acute and/or severe social withdrawal behavior among youth was seen as a culture-bound psychiatric syndrome in Japan, but cases of youth social withdrawal have recently been identified in other countries as well. However, due to the lack of a formal definition and diagnostic tool for youth social withdrawal, cross-cultural observational and intervention studies are limited. We aimed to consolidate existing knowledge in order to understand youth social withdrawal from diverse perspectives and suggest different interventions for different trajectories of youth social withdrawal. This review examined the current available scientific information on youth social withdrawal in the academic databases: ProQuest, ScienceDirect, Web of Science and PubMed. We included quantitative and qualitative studies of socially withdrawn youths published in English and academic peer-reviewed journals. We synthesized the information into the following categories: (1) definitions of youth social withdrawal, (2) developmental theories, (3) factors associated with youth social withdrawal and (4) interventions for socially withdrawn youths. Accordingly, there are diverse and controversial definitions for youth social withdrawal. Studies of youth social withdrawal are based on models that lead to quite different conclusions. Researchers with an attachment perspective view youth social withdrawal as a negative phenomenon, whereas those who adopt Erikson's developmental theory view it more positively as a process of seeking self-knowledge. Different interventions for socially withdrawn youths have been developed, mainly in Japan, but evidence-based practice is almost non-existent. We propose a theoretical framework that views youth social withdrawal as resulting from the interplay between psychological, social and behavioral factors. Future validation of the framework will help drive forward advances in theory and interventions for youth social withdrawal as an emerging issue in developed countries. © The Royal Australian and New Zealand College of Psychiatrists 2015.
