Sample records for rigorous quantitative analysis

  1. Why Open-Ended Survey Questions Are Unlikely to Support Rigorous Qualitative Insights.

    PubMed

    LaDonna, Kori A; Taylor, Taryn; Lingard, Lorelei

    2018-03-01

    Health professions education researchers are increasingly relying on a combination of quantitative and qualitative research methods to explore complex questions in the field. This important and necessary development, however, creates new methodological challenges that can affect both the rigor of the research process and the quality of the findings. One example is "qualitatively" analyzing free-text responses to survey or assessment instrument questions. In this Invited Commentary, the authors explain why analysis of such responses rarely meets the bar for rigorous qualitative research. While the authors do not discount the potential for free-text responses to enhance quantitative findings or to inspire new research questions, they caution that these responses rarely produce data rich enough to generate robust, stand-alone insights. The authors consider exemplars from health professions education research and propose strategies for treating free-text responses appropriately.

  2. To Your Health: NLM update transcript - Improving medical research rigor?

    MedlinePlus

    ... be a well-tailored solution to enhance the quantitative rigor of medical research, suggests a viewpoint recently published in the Journal ... about 96 percent of medical and public health research articles (that report ... more quantitative rigor would attract widespread attention — if not high ...

  3. Quality and rigor of the concept mapping methodology: a pooled study analysis.

    PubMed

    Rosas, Scott R; Kane, Mary

    2012-05-01

    The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative characteristics and estimates of quality and rigor that may guide future studies are lacking. To address this gap, we conducted a pooled analysis of 69 concept mapping studies to describe characteristics across study phases, generate specific indicators of validity and reliability, and examine the relationship between select study characteristics and quality indicators. Individual study characteristics and estimates were pooled and quantitatively summarized, describing the distribution, variation, and parameters for each. In addition, variation in the concept mapping data collection in relation to characteristics and estimates was examined. Overall, results suggest concept mapping yields strong internal representational validity and very strong sorting and rating reliability estimates. Validity and reliability were consistently high despite variation in participation and task completion percentages across data collection modes. The implications of these findings as a practical reference to assess the quality and rigor of future concept mapping studies are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Evaluating Computer-Related Incidents on Campus

    ERIC Educational Resources Information Center

    Rothschild, Daniel; Rezmierski, Virginia

    2004-01-01

    The Computer Incident Factor Analysis and Categorization (CIFAC) Project at the University of Michigan began in September 2003 with grants from EDUCAUSE and the National Science Foundation (NSF). The project's primary goal is to create a best-practices security framework for colleges and universities based on rigorous quantitative analysis of…

  5. Conceptualization of Light Refraction

    ERIC Educational Resources Information Center

    Sokolowski, Andrzej

    2013-01-01

    There have been a number of papers dealing quantitatively with light refraction. Yet the conceptualization of the phenomenon that sets the foundation for a more rigorous math analysis is minimized. The purpose of this paper is to fill that gap. (Contains 3 figures.)

  6. The Relationship between Student's Quantitative Skills, Application of Math, Science Courses, and Science Marks at Single-Sex Independent High Schools

    ERIC Educational Resources Information Center

    Cambridge, David

    2012-01-01

    For independent secondary schools that offer a rigorous curriculum to attract students, the integration of quantitative skills into science courses has become an important definition of rigor. However, there is little research examining students' quantitative skills in relation to high school science performance within the single-sex independent school…

  7. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
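
    By way of illustration only (this is not the QTest software, and the data are invented), the sketch below applies the simplest version of the idea: under a fixed-preference-plus-error account, the probability of choosing the non-preferred option in any gamble pair cannot exceed an assumed error ceiling, so pairs whose error rate is credibly above that ceiling count as evidence against a fixed preference.

      import numpy as np
      from scipy.stats import beta

      ERROR_CEILING = 0.25  # assumed upper bound on the response error rate (placeholder)

      # choices of option A out of n repetitions per gamble pair (toy data)
      chose_a = np.array([18, 3, 11, 20])
      n_reps = np.array([20, 20, 20, 20])

      for i, (k, n) in enumerate(zip(chose_a, n_reps)):
          errors = min(k, n - k)  # choices against the modal option
          # Clopper-Pearson lower 95% bound on the error probability
          lower = beta.ppf(0.025, errors, n - errors + 1) if errors > 0 else 0.0
          verdict = "inconsistent" if lower > ERROR_CEILING else "consistent"
          print(f"pair {i}: {errors}/{n} minority choices -> {verdict} with a fixed preference")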

  8. An International Meta-Analysis of Reading Recovery

    ERIC Educational Resources Information Center

    D'Agostino, Jerome V.; Harmey, Sinéad J.

    2016-01-01

    Reading Recovery is one of the most researched literacy programs worldwide. Although there have been at least 4 quantitative reviews of its effectiveness, none have considered all rigorous group-comparison studies from all implementing nations from the late 1970s to 2015. Using a hierarchical linear modeling (HLM) v-known analysis, we examined if…

  9. Analysis of Perfluorinated Chemicals in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A fast, rigorous method was developed to maximize the extraction efficacy for ten perfluorocarboxylic acids and perfluorooctanesulfonate from wastewater-treatment sludge and to quantitate them using liquid chromatography-tandem mass spectrometry (LC/MS/MS). First, organic solvents w...

  10. Near infrared spectroscopy as an on-line method to quantitatively determine glycogen and predict ultimate pH in pre rigor bovine M. longissimus dorsi.

    PubMed

    Lomiwes, D; Reis, M M; Wiklund, E; Young, O A; North, M

    2010-12-01

    The potential of near infrared (NIR) spectroscopy as an on-line method to quantify glycogen and predict ultimate pH (pH(u)) of pre rigor beef M. longissimus dorsi (LD) was assessed. NIR spectra (538 to 1677 nm) of pre rigor LD from steers, cows and bulls were collected early post mortem and measurements were made for pre rigor glycogen concentration and pH(u). Spectral and measured data were combined to develop models to quantify glycogen and predict the pH(u) of pre rigor LD. NIR spectra and pre rigor predicted values obtained from quantitative models were shown to be poorly correlated with glycogen and pH(u) (r(2)=0.23 and 0.20, respectively). Qualitative models developed to categorize each muscle according to their pH(u) were able to correctly categorize 42% of high pH(u) samples. Optimum qualitative and quantitative models derived from NIR spectra found low correlation between predicted values and reference measurements. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.
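
    As a sketch of how such a calibration is commonly built and validated (partial least squares regression is assumed here as the modeling choice; the paper does not necessarily use it, and the data below are synthetic), one splits samples into calibration and validation sets and reports r^2 on the held-out set:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.metrics import r2_score
      from sklearn.model_selection import train_test_split

      # X: NIR absorbance spectra (samples x wavelengths), y: reference glycogen (synthetic)
      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 300))
      y = 0.4 * X[:, 50] + rng.normal(scale=1.0, size=120)

      X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

      pls = PLSRegression(n_components=8)  # number of latent variables is a tuning choice
      pls.fit(X_cal, y_cal)
      y_pred = pls.predict(X_val).ravel()
      print(f"validation r^2 = {r2_score(y_val, y_pred):.2f}")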

  11. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  12. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.
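
    The core computation behind such trend analyses is simple: normalize yearly n-gram counts by the total yearly volume of the corpus and track the relative frequency over time. The sketch below assumes a small, made-up counts table; the real Google Books n-gram releases use a different file layout.

      import pandas as pd

      # one row per (ngram, year) with a raw match count (assumed toy layout)
      counts = pd.DataFrame({
          "ngram":   ["influenza"] * 3 + ["slavery"] * 3,
          "year":    [1918, 1919, 1920] * 2,
          "matches": [9500, 7200, 3100, 4100, 3900, 3800],
      })

      # total 1-gram volume per year; in practice this comes from the whole corpus
      totals = pd.Series({1918: 2.1e9, 1919: 2.3e9, 1920: 2.4e9}, name="year_total")

      freq = counts.join(totals, on="year")
      freq["rel_freq"] = freq["matches"] / freq["year_total"]
      print(freq[freq["ngram"] == "influenza"][["year", "rel_freq"]])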

  13. Anticipatory Understanding of Adversary Intent: A Signature-Based Knowledge System

    DTIC Science & Technology

    2009-06-01

    concept of logical positivism has been applied more recently to all human knowledge and reflected in current data fusion research, information mining...this work has been successfully translated into useful analytical tools that can provide a rigorous and quantitative basis for predictive analysis

  14. Retrospective Analysis of a Classical Biological Control Programme

    USDA-ARS?s Scientific Manuscript database

    1. Classical biological control has been a key technology in the management of invasive arthropod pests globally for over 120 years, yet rigorous quantitative evaluations of programme success or failure are rare. Here, I used life table and matrix model analyses, and life table response experiments ...

  15. QTest: Quantitative Testing of Theories of Binary Choice

    PubMed Central

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  16. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.

  17. An Examination of the (Un)Intended Consequences of Performance Funding in Higher Education

    ERIC Educational Resources Information Center

    Umbricht, Mark R.; Fernandez, Frank; Ortagus, Justin C.

    2017-01-01

    Previous studies have shown that state performance funding policies do not increase baccalaureate degree production, but higher education scholarship lacks a rigorous, quantitative analysis of the unintended consequences of performance funding. In this article, we use difference-in-differences estimation with fixed effects to evaluate performance…

  18. The influence of biological and technical factors on quantitative analysis of amyloid PET: Points to consider and recommendations for controlling variability in longitudinal data.

    PubMed

    Schmidt, Mark E; Chiao, Ping; Klein, Gregory; Matthews, Dawn; Thurfjell, Lennart; Cole, Patricia E; Margolin, Richard; Landau, Susan; Foster, Norman L; Mason, N Scott; De Santi, Susan; Suhy, Joyce; Koeppe, Robert A; Jagust, William

    2015-09-01

    In vivo imaging of amyloid burden with positron emission tomography (PET) provides a means for studying the pathophysiology of Alzheimer's and related diseases. Measurement of subtle changes in amyloid burden requires quantitative analysis of image data. Reliable quantitative analysis of amyloid PET scans acquired at multiple sites and over time requires rigorous standardization of acquisition protocols, subject management, tracer administration, image quality control, and image processing and analysis methods. We review critical points in the acquisition and analysis of amyloid PET, identify ways in which technical factors can contribute to measurement variability, and suggest methods for mitigating these sources of noise. Improved quantitative accuracy could reduce the sample size necessary to detect intervention effects when amyloid PET is used as a treatment end point and allow more reliable interpretation of change in amyloid burden and its relationship to clinical course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Quantitative phosphoproteomic analysis of caprine muscle with high and low meat quality.

    PubMed

    Liu, Manshun; Wei, Yanchao; Li, Xin; Quek, Siew Young; Zhao, Jing; Zhong, Huazhen; Zhang, Dequan; Liu, Yongfeng

    2018-07-01

    During the conversion of muscle to meat, protein phosphorylation can regulate various biological processes that have important effects on meat quality. To investigate the protein phosphorylation pattern at rigor mortis, goat longissimus thoracis and external intercostal muscles were classified into two groups (high quality and low quality), and meat quality was evaluated according to meat quality attributes (Warner-Bratzler shear force, color, pH, and drip loss). A quantitative mass spectrometry-based phosphoproteomic study was conducted to analyze the caprine muscle at 12 h postmortem, applying the TiO2-SIMAC-HILIC (TiSH) phosphopeptide enrichment strategy. A total of 2125 phosphopeptides were identified from 750 phosphoproteins. Among them, 96 proteins differed in phosphorylation level. The majority of these proteins are involved in glucose metabolism and muscle contraction. The differential phosphorylation levels of proteins (PFK, MYL2, and HSP27) between the two groups may be crucial factors in regulating muscle rigor mortis. This study provides a comprehensive view of the phosphorylation status of caprine muscle at rigor mortis and gives a better understanding of how protein phosphorylation regulates the biological processes that affect final meat quality attributes. Copyright © 2018. Published by Elsevier Ltd.

  20. Not so Fast My Friend: The Rush to R and the Need for Rigorous Evaluation of Data Analysis and Software in Education

    ERIC Educational Resources Information Center

    Harwell, Michael

    2014-01-01

    Commercial data analysis software has been a fixture of quantitative analyses in education for more than three decades. Despite its apparent widespread use there is no formal evidence cataloging what software is used in educational research and educational statistics classes, by whom and for what purpose, and whether some programs should be…

  1. Returning to Work after Cancer: Quantitative Studies and Prototypical Narratives

    PubMed Central

    Steiner, John F.; Nowels, Carolyn T.; Main, Deborah S.

    2009-01-01

    Objective A combination of quantitative data and illustrative narratives may allow cancer survivorship researchers to disseminate their research findings more broadly. We identified recent, methodologically rigorous quantitative studies on return to work after cancer, summarized the themes from these studies, and illustrated those themes with narratives of individual cancer survivors. Methods We reviewed English-language studies of return to work for adult cancer survivors through June, 2008, and identified 13 general themes from papers that met methodological criteria (population-based sampling, prospective and longitudinal assessment, detailed assessment of work, evaluation of economic impact, assessment of moderators of work return, and large sample size). We drew survivorship narratives from a prior qualitative research study to illustrate these themes. Results Nine quantitative studies met 4 or more of our 6 methodological criteria. These studies suggested that most cancer survivors could return to work without residual disabilities. Cancer site, clinical prognosis, treatment modalities, socioeconomic status, and attributes of the job itself influenced the likelihood of work return. Three narratives - a typical survivor who returned to work after treatment, an individual unable to return to work, and an inspiring survivor who returned to work despite substantial barriers - illustrated many of the themes from the quantitative literature while providing additional contextual details. Conclusion Illustrative narratives can complement the findings of cancer survivorship research if researchers are rigorous and transparent in the selection, analysis, and retelling of those stories. PMID:19507264

  2. Departmentalized Classroom Environments versus Traditional Classroom Environments in Second through Fourth Grades: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Ray, Staci Janelle

    2017-01-01

    Since No Child Left Behind was introduced, kindergarten through 12th-grade educators have seen a dramatic increase in accountability, rigor of standards, and responsibilities in the classroom (New America Foundation, 2015). In order to meet the increased demands of federal education regulations in second through fourth grades, many administrators…

  3. Qualitative Methods in Mental Health Services Research

    PubMed Central

    Palinkas, Lawrence A.

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This paper reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the papers included in this special series along with representative examples from the literature. Qualitative methods are used to provide a “thick description” or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods, but often differ with respect to study design, data collection and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semi-structured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research. PMID:25350675

  4. Qualitative and mixed methods in mental health services and implementation research.

    PubMed

    Palinkas, Lawrence A

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This article reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the articles included in this special series along with representative examples from the literature. Qualitative methods are used to provide a "thick description" or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods but often differ with respect to study design, data collection, and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semistructured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed-method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research.

  5. Building quantitative, three-dimensional atlases of gene expression and morphology at cellular resolution.

    PubMed

    Knowles, David W; Biggin, Mark D

    2013-01-01

    Animals comprise dynamic three-dimensional arrays of cells that express gene products in intricate spatial and temporal patterns that determine cellular differentiation and morphogenesis. A rigorous understanding of these developmental processes requires automated methods that quantitatively record and analyze complex morphologies and their associated patterns of gene expression at cellular resolution. Here we summarize light microscopy-based approaches to establish permanent, quantitative datasets (atlases) that record this information. We focus on experiments that capture data for whole embryos or large areas of tissue in three dimensions, often at multiple time points. We compare and contrast the advantages and limitations of different methods and highlight some of the discoveries made. We emphasize the need for interdisciplinary collaborations and integrated experimental pipelines that link sample preparation, image acquisition, image analysis, database design, visualization, and quantitative analysis. Copyright © 2013 Wiley Periodicals, Inc.

  6. Academic Rigor and Economic Value: GED[R] and High School Students' Perceptions and Misperceptions of the GED[R] vs. the High School Diploma

    ERIC Educational Resources Information Center

    Horne, Lela M.; Rachal, John R.; Shelley, Kyna

    2012-01-01

    A mixed methods framework utilized quantitative and qualitative data to determine whether statistically significant differences existed between high school and GED[R] student perceptions of credential value. An exploratory factor analysis (n=326) extracted four factors and then a MANOVA procedure was performed with a stratified quota sample…

  7. Designing a mixed methods study in primary care.

    PubMed

    Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V

    2004-01-01

    Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.

  8. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  9. A rigorous approach to investigating common assumptions about disease transmission: Process algebra as an emerging modelling methodology for epidemiology.

    PubMed

    McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron

    2011-03-01

    Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra allows us to describe a system in terms of the stochastic behaviour of individuals, and is a technique from computer science. We review the use of process algebra in biological systems, and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing scale problem: from the individual behaviour we can rigorously derive equations to describe the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.

  10. Model-based assessment of estuary ecosystem health using the latent health factor index, with application to the richibucto estuary.

    PubMed

    Chiu, Grace S; Wu, Margaret A; Lu, Lin

    2013-01-01

    The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general.

  11. Defence Reporter. Summer/Autumn 2011

    DTIC Science & Technology

    2011-01-01

    of protection afforded by this material was determined against a liquid challenge. R000259AA Cermet Materials for Armour Applications Dstl Porton...initial research, however areas of focus have been identified and a comprehensive plan of study has been developed. R000261FE Force Multipliers in... study was not intended to be a rigorous quantitative analysis of COIN. It has identified six factors that have had an overall positive effect upon the

  12. Designing A Mixed Methods Study In Primary Care

    PubMed Central

    Creswell, John W.; Fetters, Michael D.; Ivankova, Nataliya V.

    2004-01-01

    BACKGROUND Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. METHODS We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. RESULTS Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. DISCUSSION We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research. PMID:15053277

  13. Parallel labeling experiments for pathway elucidation and (13)C metabolic flux analysis.

    PubMed

    Antoniewicz, Maciek R

    2015-12-01

    Metabolic pathway models provide the foundation for quantitative studies of cellular physiology through the measurement of intracellular metabolic fluxes. For model organisms metabolic models are well established, with many manually curated genome-scale model reconstructions, gene knockout studies and stable-isotope tracing studies. However, for non-model organisms a similar level of knowledge is often lacking. Compartmentation of cellular metabolism in eukaryotic systems also presents significant challenges for quantitative (13)C-metabolic flux analysis ((13)C-MFA). Recently, innovative (13)C-MFA approaches have been developed based on parallel labeling experiments, the use of multiple isotopic tracers and integrated data analysis, that allow more rigorous validation of pathway models and improved quantification of metabolic fluxes. Applications of these approaches open new research directions in metabolic engineering, biotechnology and medicine. Copyright © 2015 Elsevier Ltd. All rights reserved.
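
    As a toy illustration of the flux-estimation step at the heart of (13)C-MFA (and not of the parallel-labeling workflow itself), the sketch below recovers the split between two converging pathways from a measured mass isotopomer distribution (MID), assuming each pathway imprints a known MID and the measurement is their flux-weighted mixture:

      import numpy as np
      from scipy.optimize import least_squares

      # MIDs (fractions of M+0, M+1, M+2) imprinted by each pathway (assumed known)
      mid_pathway_a = np.array([0.70, 0.25, 0.05])
      mid_pathway_b = np.array([0.10, 0.30, 0.60])

      # measured MID of the common product (toy data)
      mid_measured = np.array([0.32, 0.28, 0.40])

      def residuals(params):
          f = params[0]  # fraction of flux carried by pathway A
          return f * mid_pathway_a + (1.0 - f) * mid_pathway_b - mid_measured

      fit = least_squares(residuals, x0=[0.5], bounds=(0.0, 1.0))
      print(f"estimated flux fraction through pathway A: {fit.x[0]:.2f}")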

  14. The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research

    ERIC Educational Resources Information Center

    Mertens, Steven B.

    2006-01-01

    This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…

  15. Quantitative Approaches to Group Research: Suggestions for Best Practices

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal

    2017-01-01

    Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…

  16. Respirator Qualitative/Quantitative Fit Test Method Analysis

    DTIC Science & Technology

    1980-08-01

    carefully fitted to sample the atmosphere in the oral/nasal or visual cavity. Leakage is expressed as a ratio of the ambient challenge atmosphere...8, 10, 40, 205, 274). While certain commercial polymer films have reportedly caused fibrosarcomas and other tumors in rats, this has been attributed... nasal passages (Portland cement dust), or cause injury to the skin or mucous membranes by chemical or mechanical action per se or by the rigorous skin

  17. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance with technical, radiological, and statistical experts developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
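
    For concreteness, two of the metrology quantities the document standardizes, bias and the repeatability coefficient, can be computed from test-retest data as sketched below; the data are invented, and 2.77 is the conventional 1.96 * sqrt(2) multiplier that converts the within-subject standard deviation into a repeatability coefficient.

      import numpy as np

      # test-retest measurements of a quantitative imaging biomarker (toy data)
      scan1 = np.array([10.2, 8.9, 12.4, 11.1, 9.7])
      scan2 = np.array([10.6, 8.5, 12.9, 10.8, 9.9])
      truth = np.array([10.0, 9.0, 12.5, 11.0, 10.0])  # reference values, when available

      bias = np.mean((scan1 + scan2) / 2 - truth)

      # within-subject SD from paired replicates, then the repeatability coefficient
      diffs = scan1 - scan2
      within_subject_sd = np.sqrt(np.mean(diffs ** 2) / 2)
      repeatability_coefficient = 2.77 * within_subject_sd

      print(f"bias = {bias:.2f}, repeatability coefficient = {repeatability_coefficient:.2f}")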

  18. Cognitive task analysis: Techniques applied to airborne weapons training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  19. Useful, Used, and Peer Approved: The Importance of Rigor and Accessibility in Postsecondary Research and Evaluation. WISCAPE Viewpoints

    ERIC Educational Resources Information Center

    Vaade, Elizabeth; McCready, Bo

    2012-01-01

    Traditionally, researchers, policymakers, and practitioners have perceived a tension between rigor and accessibility in quantitative research and evaluation in postsecondary education. However, this study indicates that both producers and consumers of these studies value high-quality work and clear findings that can reach multiple audiences. The…

  20. A philosophy of science perspective on the quantitative analysis of behavior.

    PubMed

    Smith, Terry L

    2015-05-01

    B.F. Skinner argued that the science of behavior would progress more rapidly without appealing to theories of learning. He also suggested that theories in a quite different sense were possible, but that the science of behavior as of 1950 was not ready for them. The following analysis distinguishes between Skinner's two concepts of theory. It argues that theory in the second sense has arisen in the quantitative analysis of behavior. The attempt to give a dynamic account of the static regularities of this theory, however, has produced a theory in the first sense. Within its limited domain, this theory offers a rigorous alternative to cognitive accounts of behavior. Rather than distracting attention from actual behavior, it has now led to novel predictions about it. This article is part of a Special Issue entitled 'SQAB 2014'. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    NASA Technical Reports Server (NTRS)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
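
    A minimal sketch of that conversion, assuming the backscatter DN encodes decibels relative to the Muhleman law at the pixel's incidence angle; the Muhleman coefficients below are the values commonly quoted for Magellan products, while the DN-to-dB scale and offset are placeholders that should be taken from the documentation of the specific image product.

      import numpy as np

      K1, K2 = 0.0118, 0.111                   # Muhleman scattering-law coefficients for Venus
      DN_SCALE_DB, DN_OFFSET_DB = 0.2, -20.0   # placeholder DN-to-dB scaling (check product docs)

      def muhleman_sigma0(incidence_deg):
          """Muhleman-law backscatter coefficient at a given incidence angle."""
          theta = np.radians(incidence_deg)
          return K1 * np.cos(theta) / (np.sin(theta) + K2 * np.cos(theta)) ** 3

      def dn_to_sigma0(dn, incidence_deg):
          """Convert an image DN back to a backscatter coefficient."""
          db_relative_to_muhleman = DN_SCALE_DB * dn + DN_OFFSET_DB
          return muhleman_sigma0(incidence_deg) * 10.0 ** (db_relative_to_muhleman / 10.0)

      print(dn_to_sigma0(dn=120, incidence_deg=35.0))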

  2. Imprecise Probability Methods for Weapons UQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Vander Wiel, Scott Alan

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
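
    One elementary device from the imprecise-probability toolbox is prior robustness: carry a whole set of priors through Bayes' rule and report the resulting interval of posterior probabilities rather than a single number. The sketch below does this for Beta priors on a success probability; the prior set, data, and threshold are illustrative assumptions, not the methods of the report.

      from scipy.stats import beta

      successes, trials = 17, 20                      # observed test outcomes (toy data)
      prior_set = [(a, b) for a in (1, 2, 4, 8) for b in (1, 2, 4, 8)]  # imprecise prior set
      threshold = 0.9                                 # reliability requirement of interest

      # posterior probability that reliability exceeds the threshold, under each prior
      posteriors = [
          1.0 - beta.cdf(threshold, a + successes, b + trials - successes)
          for a, b in prior_set
      ]
      print(f"P(reliability > {threshold}) lies in [{min(posteriors):.3f}, {max(posteriors):.3f}]")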

  3. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.

  4. Ambiguity, logic, simplicity, and dynamics: Wittgensteinian evaluative criteria in peer review of quantitative research on categorization.

    PubMed

    Shimp, Charles P

    2004-06-30

    Research on categorization has changed over time, and some of these changes resemble how Wittgenstein's views changed from his Tractatus Logico-Philosophicus to his Philosophical Investigations. Wittgenstein initially focused on unambiguous, abstract, parsimonious, logical propositions and rules, and on independent, static, "atomic facts." This approach subsequently influenced the development of logical positivism and thereby may have indirectly influenced method and theory in research on categorization: much animal research on categorization has focused on learning simple, static, logical rules unambiguously interrelating small numbers of independent features. He later rejected logical simplicity and rigor and focused instead on Gestalt ideas about figure-ground reversals and context, the ambiguity of family resemblance, and the function of details of everyday language. Contemporary contextualism has been influenced by this latter position, some features of which appear in contemporary empirical research on categorization. These developmental changes are illustrated by research on avian local and global levels of visual perceptual analysis, categorization of rectangles and moving objects, and artificial grammar learning. Implications are described for peer review of quantitative theory in which ambiguity, logical rigor, simplicity, or dynamics are designed to play important roles.

  5. Quantitative atomic-scale structure characterization of ordered mesoporous carbon materials by solid state NMR

    DOE PAGES

    Wang, Zhuoran; Opembe, Naftali; Kobayashi, Takeshi; ...

    2018-02-03

    In this study, solid-state (SS)NMR techniques were applied to characterize the atomic-scale structures of ordered mesoporous carbon (OMC) materials prepared using Pluronic F127 as template with resorcinol and formaldehyde as polymerizing precursors. A rigorous quantitative analysis was developed using a combination of 13C SSNMR spectra acquired with direct polarization and cross polarization on natural-abundance and selectively 13C-enriched series of samples pyrolyzed at various temperatures. These experiments identified and counted the key functional groups present in the OMCs at various stages of preparation and thermal treatment. Lastly, the chemical evolution of molecular networks, the average sizes of aromatic clusters, and the extended molecular structures of OMCs were then inferred by coupling this information with the elemental analysis results.

  6. Quantitative atomic-scale structure characterization of ordered mesoporous carbon materials by solid state NMR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhuoran; Opembe, Naftali; Kobayashi, Takeshi

    In this study, solid-state (SS)NMR techniques were applied to characterize the atomic-scale structures of ordered mesoporous carbon (OMC) materials prepared using Pluronic F127 as template with resorcinol and formaldehyde as polymerizing precursors. A rigorous quantitative analysis was developed using a combination of 13C SSNMR spectra acquired with direct polarization and cross polarization on natural-abundance and selectively 13C-enriched series of samples pyrolyzed at various temperatures. These experiments identified and counted the key functional groups present in the OMCs at various stages of preparation and thermal treatment. Lastly, the chemical evolution of molecular networks, the average sizes of aromatic clusters, and the extended molecular structures of OMCs were then inferred by coupling this information with the elemental analysis results.

  7. Reducible or irreducible? Mathematical reasoning and the ontological method.

    PubMed

    Fisher, William P

    2010-01-01

    Science is often described as nothing but the practice of measurement. This perspective follows from longstanding respect for the roles mathematics and quantification have played as media through which alternative hypotheses are evaluated and experience becomes better managed. Many figures in the history of science and psychology have contributed to what has been called the "quantitative imperative," the demand that fields of study employ number and mathematics even when they do not constitute the language in which investigators think together. But what makes an area of study scientific is, of course, not the mere use of number, but communities of investigators who share common mathematical languages for exchanging quantitative value. Such languages require rigorous theoretical underpinning, a basis in data sufficient to the task, and instruments traceable to reference standard quantitative metrics. The values shared and exchanged by such communities typically involve the application of mathematical models that specify the sufficient and invariant relationships necessary for rigorous theorizing and instrument equating. The mathematical metaphysics of science are explored with the aim of connecting principles of quantitative measurement with the structures of sufficient reason.

  8. [Reconsidering evaluation criteria regarding health care research: toward an integrative framework of quantitative and qualitative criteria].

    PubMed

    Miyata, Hiroaki; Kai, Ichiro

    2006-05-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled, and the clutter of terms and arguments has made the concepts obscure and unrecognizable. It is therefore important to reconsider evaluation criteria regarding rigor in social science. As Lincoln & Guba have already compared quantitative paradigms (validity, reliability, neutrality, generalizability) with qualitative paradigms (credibility, dependability, confirmability, transferability), we discuss the use of evaluation criteria from a pragmatic perspective. Validity/credibility concerns the observational framework, reliability/dependability refers to the range of stability in observations, neutrality/confirmability reflects influences between observers and subjects, and generalizability/transferability differ epistemologically in the way findings are applied. Qualitative studies, however, do not always choose the qualitative paradigms. If some degree of stability can be assumed, it is better to use the quantitative paradigm (reliability). Conversely, because a quantitative study cannot always guarantee a perfect observational framework with stability in all phases of observation, it is useful to draw on qualitative paradigms to enhance the rigor of the study.

  9. Model-Based Assessment of Estuary Ecosystem Health Using the Latent Health Factor Index, with Application to the Richibucto Estuary

    PubMed Central

    Chiu, Grace S.; Wu, Margaret A.; Lu, Lin

    2013-01-01

    The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt–clay content–all regarded a priori as qualitatively important abiotic drivers–towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general. PMID:23785443

  10. Student’s rigorous mathematical thinking based on cognitive style

    NASA Astrophysics Data System (ADS)

    Fitriyani, H.; Khasanah, U.

    2017-12-01

    The purpose of this research was to determine the rigorous mathematical thinking (RMT) of mathematics education students in solving math problems in terms of reflective and impulsive cognitive styles. The research used a descriptive qualitative approach. Subjects were four students, one male and one female for each of the reflective and impulsive cognitive styles. Data were collected through a problem-solving test and interviews, and analyzed using the Miles and Huberman model: data reduction, data presentation, and conclusion drawing. The results showed that the impulsive male subject used all three levels of cognitive function required for RMT (qualitative thinking, quantitative thinking with precision, and relational thinking), while the other three subjects were only able to use cognitive function at the qualitative thinking level of RMT. Therefore, the impulsive male subject showed better RMT ability than the other three research subjects.

  11. Quantum key distribution with an unknown and untrusted source

    NASA Astrophysics Data System (ADS)

    Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong

    2008-05-01

    The security of a standard bidirectional “plug-and-play” quantum key distribution (QKD) system has been an open question for a long time. This is mainly because its source is equivalently controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussion of this subject has appeared previously. In this paper, we address the question directly by presenting a quantitative security analysis of a general class of QKD protocols whose sources are unknown and untrusted. The security of the standard Bennett-Brassard 1984 (BB84) protocol, the weak+vacuum decoy-state protocol, and the one-decoy-state protocol with unknown and untrusted sources is rigorously proved. We derive rigorous lower bounds on the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source.
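
    As a point of reference for what a key-rate lower bound looks like, the sketch below evaluates only the textbook Shor-Preskill result for ideal BB84 with a trusted single-photon source, R >= 1 - 2*H2(e), where H2 is the binary Shannon entropy and e the quantum bit error rate. This is an assumed baseline for illustration, not the untrusted-source bounds derived in the paper.

        import math

        def binary_entropy(p: float) -> float:
            """Binary Shannon entropy H2(p) in bits."""
            if p <= 0.0 or p >= 1.0:
                return 0.0
            return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

        def shor_preskill_rate(qber: float) -> float:
            """Illustrative asymptotic secure-key rate per sifted bit for ideal BB84.

            R >= 1 - 2*H2(e), clipped at zero. This is the standard trusted-source
            baseline, not the untrusted-source bound of the paper.
            """
            return max(0.0, 1.0 - 2.0 * binary_entropy(qber))

        if __name__ == "__main__":
            for e in (0.01, 0.05, 0.11):
                print(f"QBER = {e:.2f} -> key rate >= {shor_preskill_rate(e):.3f} bits/sifted bit")

    The printed rates fall to essentially zero near an 11% error rate, reproducing the well-known BB84 threshold under these assumptions.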

  12. A Transformative Model for Undergraduate Quantitative Biology Education

    ERIC Educational Resources Information Center

    Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The "BIO2010" report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3)…

  13. β-Glucuronidase is a suitable internal control gene for mRNA quantitation in pathophysiological and non-pathological livers.

    PubMed

    Yamaguchi, Hiromi; Matsumoto, Sawako; Ishibashi, Mariko; Hasegawa, Kiyoshi; Sugitani, Masahiko; Takayama, Tadatoshi; Esumi, Mariko

    2013-10-01

    The level of expression of housekeeping genes is in general considered stable, and a representative gene such as glyceraldehyde-3-phosphate dehydrogenase is commonly used as an internal control for quantitating mRNA. However, expression of housekeeping genes is not always constant under pathological conditions. To determine which genes would be most suitable as internal controls for quantitative gene expression studies in human liver diseases, we quantified 12 representative housekeeping genes in 27 non-cancerous liver tissues (normal, chronic hepatitis C with and without liver cirrhosis). We identified β-glucuronidase as the most suitable gene for studies on liver by rigorous statistical analysis of inter- and intra-group comparisons. We conclude that it is important to determine the most appropriate control gene for the particular condition to be analyzed. © 2013 Elsevier Inc. All rights reserved.
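
    As a hedged illustration of one way such a comparison can be set up (not the authors' exact statistical procedure), the sketch below ranks hypothetical candidate reference genes by their coefficient of variation across samples; lower variation suggests a more suitable internal control.

        import numpy as np

        # Hypothetical expression values (arbitrary units) for candidate reference
        # genes measured across the same set of liver samples.
        expression = {
            "GAPDH": [105.0, 240.0, 88.0, 310.0, 150.0],
            "GUSB":  [52.0, 57.0, 49.0, 61.0, 55.0],   # beta-glucuronidase
            "ACTB":  [200.0, 180.0, 260.0, 150.0, 320.0],
        }

        def coefficient_of_variation(values):
            values = np.asarray(values, dtype=float)
            return values.std(ddof=1) / values.mean()

        # Lower CV = more stable expression = better internal-control candidate.
        ranking = sorted(expression, key=lambda g: coefficient_of_variation(expression[g]))
        for gene in ranking:
            print(f"{gene}: CV = {coefficient_of_variation(expression[gene]):.2%}")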

  14. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Technical Performance Assessment

    PubMed Central

    2017-01-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and the metrics used to assess a QIB for clinical use. It is therefore difficult, or even impossible, to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA), together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted, or combined. PMID:24919831

  15. Rigorous analysis of an electric-field-driven liquid crystal lens for 3D displays

    NASA Astrophysics Data System (ADS)

    Kim, Bong-Sik; Lee, Seung-Chul; Park, Woo-Sang

    2014-08-01

    We numerically analyzed the optical performance of an electric-field-driven liquid crystal (ELC) lens adopted for 3-dimensional liquid crystal displays (3D-LCDs) through rigorous ray tracing. For the calculation, we first obtain the director distribution profile of the liquid crystals by using the Ericksen-Leslie equation of motion; then, we calculate the transmission of light through the ELC lens by using the extended Jones matrix method. The simulation was carried out for a 9-view 3D-LCD with a diagonal of 17.1 inches, where the ELC lens was slanted to achieve natural stereoscopic images. The results show that each view exists separately according to the viewing position at an optimum viewing distance of 80 cm. In addition, our simulation results provide a quantitative explanation for the ghost or blurred images between views observed from a 3D-LCD with an ELC lens. The numerical simulations are also shown to be in good agreement with the experimental results. The present simulation method is expected to provide optimum design conditions for obtaining natural 3D images by rigorously analyzing the optical functionalities of an ELC lens.
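
    As a simplified sketch of the kind of polarization calculation involved (a plain Jones-matrix stack at normal incidence, not the full extended Jones matrix method with oblique incidence used in the paper; all layer parameters are hypothetical), light is propagated through a series of thin birefringent slabs whose optic-axis angles follow an assumed director profile:

        import numpy as np

        def retarder(delta: float, theta: float) -> np.ndarray:
            """Jones matrix of a linear retarder with retardation delta (rad)
            and fast axis at angle theta (rad) to the x axis."""
            c, s = np.cos(theta), np.sin(theta)
            rot = np.array([[c, s], [-s, c]])           # rotate lab frame into the fast-axis frame
            core = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
            return rot.T @ core @ rot

        def stack_matrix(deltas, thetas) -> np.ndarray:
            """Multiply per-layer Jones matrices (first layer applied first)."""
            J = np.eye(2, dtype=complex)
            for d, t in zip(deltas, thetas):
                J = retarder(d, t) @ J
            return J

        # Hypothetical director profile: optic axis twisting through 90 degrees
        # across 20 thin slabs, each with a small retardation.
        n_layers = 20
        thetas = np.linspace(0.0, np.pi / 2, n_layers)
        deltas = np.full(n_layers, 0.15)                 # rad per slab (assumed)

        e_out = stack_matrix(deltas, thetas) @ np.array([1.0, 0.0])   # x-polarized input
        print(f"Transmission through crossed analyzer: {abs(e_out[1])**2:.3f}")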

  16. Precise quantitation of 136 urinary proteins by LC/MRM-MS using stable isotope labeled peptides as internal standards for biomarker discovery and/or verification studies.

    PubMed

    Percy, Andrew J; Yang, Juncong; Hardie, Darryl B; Chambers, Andrew G; Tamura-Wells, Jessica; Borchers, Christoph H

    2015-06-15

    Spurred on by the growing demand for panels of validated disease biomarkers, increasing efforts have focused on advancing qualitative and quantitative tools for more highly multiplexed and sensitive analyses of a multitude of analytes in various human biofluids. In quantitative proteomics, evolving strategies involve the use of the targeted multiple reaction monitoring (MRM) mode of mass spectrometry (MS) with stable isotope-labeled standards (SIS) used for internal normalization. Using that preferred approach with non-invasive urine samples, we have systematically advanced and rigorously assessed the methodology toward the precise quantitation of the largest, multiplexed panel of candidate protein biomarkers in human urine to date. The concentrations of the 136 proteins span >5 orders of magnitude (from 8.6 μg/mL to 25 pg/mL), with average CVs of 8.6% over process triplicate. Detailed here is our quantitative method, the analysis strategy, a feasibility application to prostate cancer samples, and a discussion of the utility of this method in translational studies. Copyright © 2015 Elsevier Inc. All rights reserved.
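
    To make the internal-standard arithmetic concrete (a generic single-point MRM/SIS calculation with hypothetical numbers, not values from the paper), the endogenous concentration is taken as the light-to-heavy peak-area ratio multiplied by the known spiked SIS concentration:

        def endogenous_concentration(light_area: float,
                                     heavy_area: float,
                                     sis_conc_ng_per_ml: float) -> float:
            """Single-point internal-standard quantitation:
            conc(endogenous) = (light/heavy peak-area ratio) * spiked SIS concentration.
            Assumes equal response factors for the light and heavy forms.
            """
            return (light_area / heavy_area) * sis_conc_ng_per_ml

        # Hypothetical MRM peak areas and SIS spike level for one peptide.
        light, heavy, sis_spike = 4.2e5, 2.1e5, 50.0   # areas (a.u.), ng/mL
        print(f"Endogenous protein ~ {endogenous_concentration(light, heavy, sis_spike):.1f} ng/mL")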

  17. High-throughput SISCAPA quantitation of peptides from human plasma digests by ultrafast, liquid chromatography-free mass spectrometry.

    PubMed

    Razavi, Morteza; Frick, Lauren E; LaMarr, William A; Pope, Matthew E; Miller, Christine A; Anderson, N Leigh; Pearson, Terry W

    2012-12-07

    We investigated the utility of an SPE-MS/MS platform in combination with a modified SISCAPA workflow for chromatography-free MRM analysis of proteotypic peptides in digested human plasma. This combination of SISCAPA and SPE-MS/MS technology allows sensitive, MRM-based quantification of peptides from plasma digests with a sample cycle time of ∼7 s, a 300-fold improvement over typical MRM analyses with analysis times of 30-40 min that use liquid chromatography upstream of MS. The optimized system includes capture and enrichment to near purity of target proteotypic peptides using rigorously selected, high affinity, antipeptide monoclonal antibodies and reduction of background peptides using a novel treatment of magnetic bead immunoadsorbents. Using this method, we have successfully quantitated LPS-binding protein and mesothelin (concentrations of ∼5000 ng/mL and ∼10 ng/mL, respectively) in human plasma. The method eliminates the need for upstream liquid-chromatography and can be multiplexed, thus facilitating quantitative analysis of proteins, including biomarkers, in large sample sets. The method is ideal for high-throughput biomarker validation after affinity enrichment and has the potential for applications in clinical laboratories.

  18. Quantitative Approach to Collaborative Learning: Performance Prediction, Individual Assessment, and Group Composition

    ERIC Educational Resources Information Center

    Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason

    2016-01-01

    The benefits of collaborative learning, although widely reported, lack the quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impacts on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…

  19. Shaping Social Work Science: What Should Quantitative Researchers Do?

    ERIC Educational Resources Information Center

    Guo, Shenyang

    2015-01-01

    Based on a review of economists' debates on mathematical economics, this article discusses a key issue for shaping the science of social work--research methodology. The article describes three important tasks quantitative researchers need to fulfill in order to enhance the scientific rigor of social work research. First, to test theories using…

  20. A Methodological Self-Study of Quantitizing: Negotiating Meaning and Revealing Multiplicity

    ERIC Educational Resources Information Center

    Seltzer-Kelly, Deborah; Westwood, Sean J.; Pena-Guzman, David M.

    2012-01-01

    This inquiry developed during the process of "quantitizing" qualitative data the authors had gathered for a mixed methods curriculum efficacy study. Rather than providing the intended rigor to their data coding process, their use of an intercoder reliability metric prompted their investigation of the multiplicity and messiness that, as they…

  1. Cost analysis of advanced turbine blade manufacturing processes

    NASA Technical Reports Server (NTRS)

    Barth, C. F.; Blake, D. E.; Stelson, T. S.

    1977-01-01

    A rigorous analysis was conducted to estimate relative manufacturing costs for high technology gas turbine blades prepared by three candidate materials process systems. The manufacturing costs for the same turbine blade configuration of directionally solidified eutectic alloy, an oxide dispersion strengthened superalloy, and a fiber reinforced superalloy were compared on a relative basis to the costs of the same blade currently in production utilizing the directional solidification process. An analytical process cost model was developed to quantitatively perform the cost comparisons. The impact of individual process yield factors on costs was also assessed as well as effects of process parameters, raw materials, labor rates and consumable items.

  2. Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Daniel S.; Tandon, Lav

    The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.
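
    A minimal sketch of one way such measurement uncertainty can be summarized (repeated measurements of the same quantity reduced to a mean and a 95% confidence interval; the values are hypothetical, and this is not the LANL protocol itself):

        import numpy as np
        from scipy import stats

        # Hypothetical repeat measurements of the same particle's equivalent
        # circular diameter (micrometres) from independent MAMA-style analyses.
        diameters = np.array([1.92, 2.05, 1.98, 2.11, 1.95, 2.02])

        n = diameters.size
        mean = diameters.mean()
        sem = diameters.std(ddof=1) / np.sqrt(n)
        ci_half_width = stats.t.ppf(0.975, df=n - 1) * sem

        print(f"diameter = {mean:.2f} +/- {ci_half_width:.2f} um (95% CI, n={n})")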

  3. Photons Revisited

    NASA Astrophysics Data System (ADS)

    Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg

    2014-06-01

    A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.
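
    As a hedged sketch of the kind of statistical compatibility test such a validation can employ (a simple chi-square comparison of simulated values against measurements with known uncertainties; the numbers are invented, and the paper's analysis may use different or additional tests):

        import numpy as np
        from scipy import stats

        # Hypothetical photon cross sections: simulated values vs. experimental
        # measurements with one-sigma uncertainties (arbitrary units).
        simulated = np.array([10.2, 8.7, 6.1, 4.4, 3.0])
        measured  = np.array([10.0, 8.9, 6.4, 4.1, 3.1])
        sigma     = np.array([0.3, 0.3, 0.25, 0.2, 0.15])

        chi2 = np.sum(((simulated - measured) / sigma) ** 2)
        dof = simulated.size
        p_value = stats.chi2.sf(chi2, dof)

        print(f"chi2/dof = {chi2:.2f}/{dof}, p = {p_value:.3f}")
        # A small p-value would indicate the model is incompatible with the data
        # at the chosen significance level.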

  4. When Assessment Data Are Words: Validity Evidence for Qualitative Educational Assessments.

    PubMed

    Cook, David A; Kuper, Ayelet; Hatala, Rose; Ginsburg, Shiphra

    2016-10-01

    Quantitative scores fail to capture all important features of learner performance. This awareness has led to increased use of qualitative data when assessing health professionals. Yet the use of qualitative assessments is hampered by incomplete understanding of their role in forming judgments, and lack of consensus in how to appraise the rigor of judgments therein derived. The authors articulate the role of qualitative assessment as part of a comprehensive program of assessment, and translate the concept of validity to apply to judgments arising from qualitative assessments. They first identify standards for rigor in qualitative research, and then use two contemporary assessment validity frameworks to reorganize these standards for application to qualitative assessment.Standards for rigor in qualitative research include responsiveness, reflexivity, purposive sampling, thick description, triangulation, transparency, and transferability. These standards can be reframed using Messick's five sources of validity evidence (content, response process, internal structure, relationships with other variables, and consequences) and Kane's four inferences in validation (scoring, generalization, extrapolation, and implications). Evidence can be collected and evaluated for each evidence source or inference. The authors illustrate this approach using published research on learning portfolios.The authors advocate a "methods-neutral" approach to assessment, in which a clearly stated purpose determines the nature of and approach to data collection and analysis. Increased use of qualitative assessments will necessitate more rigorous judgments of the defensibility (validity) of inferences and decisions. Evidence should be strategically sought to inform a coherent validity argument.

  5. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    PubMed

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  6. Measuring the Unmeasurable: Upholding Rigor in Quantitative Studies of Personal and Social Development in Outdoor Adventure Education

    ERIC Educational Resources Information Center

    Scrutton, Roger; Beames, Simon

    2015-01-01

    Outdoor adventure education (OAE) has a long history of being credited with the personal and social development (PSD) of its participants. PSD is notoriously difficult to measure quantitatively, yet stakeholders demand statistical evidence that given approaches to eliciting PSD are effective in their methods. Rightly or wrongly, many stakeholders…

  7. Digital pathology and image analysis for robust high-throughput quantitative assessment of Alzheimer disease neuropathologic changes.

    PubMed

    Neltner, Janna Hackett; Abner, Erin Lynn; Schmitt, Frederick A; Denison, Stephanie Kay; Anderson, Sonya; Patel, Ela; Nelson, Peter T

    2012-12-01

    Quantitative neuropathologic methods provide information that is important for both research and clinical applications. The technologic advancement of digital pathology and image analysis offers new solutions to enable valid quantification of pathologic severity that is reproducible between raters regardless of experience. Using an Aperio ScanScope XT and its accompanying image analysis software, we designed algorithms for quantitation of amyloid and tau pathologies on 65 β-amyloid (6F/3D antibody) and 48 phospho-tau (PHF-1)-immunostained sections of human temporal neocortex. Quantitative digital pathologic data were compared with manual pathology counts. There were excellent correlations between manually counted and digitally analyzed neuropathologic parameters (R² = 0.56-0.72). Data were highly reproducible among 3 participants with varying degrees of expertise in neuropathology (intraclass correlation coefficient values, >0.910). Digital quantification also provided additional parameters, including average plaque area, which shows statistically significant differences when samples are stratified according to apolipoprotein E allele status (average plaque area, 380.9 μm² in apolipoprotein E ε4 carriers vs 274.4 μm² for noncarriers; p < 0.001). Thus, digital pathology offers a rigorous and reproducible method for quantifying Alzheimer disease neuropathologic changes and may provide additional insights into morphologic characteristics that were previously more challenging to assess because of technical limitations.
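
    A minimal sketch of the simplest agreement statistic mentioned above, the squared Pearson correlation between manual counts and digital measurements (hypothetical values; the reported intraclass correlation coefficients require a separate ANOVA-based computation):

        import numpy as np

        # Hypothetical plaque counts: manual microscopy vs. digital image analysis
        # for the same set of stained sections.
        manual  = np.array([12, 30, 7, 45, 22, 18, 51, 9])
        digital = np.array([14, 28, 9, 41, 25, 16, 55, 11])

        r = np.corrcoef(manual, digital)[0, 1]
        print(f"Pearson r = {r:.2f}, R^2 = {r**2:.2f}")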

  8. Quantum key distribution with an unknown and untrusted source

    NASA Astrophysics Data System (ADS)

    Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong

    2009-03-01

    The security of a standard bidirectional "plug & play" quantum key distribution (QKD) system has been an open question for a long time. This is mainly because its source is equivalently controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussion of this subject has appeared previously. In this paper, we present the first quantitative security analysis of a general class of QKD protocols whose sources are unknown and untrusted. The security of the standard BB84 protocol, the weak+vacuum decoy-state protocol, and the one-decoy-state protocol with unknown and untrusted sources is rigorously proved. We derive rigorous lower bounds on the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source. Our work is published in [1]. [1] Y. Zhao, B. Qi, and H.-K. Lo, Phys. Rev. A 77, 052327 (2008).

  9. B-ALL minimal residual disease flow cytometry: an application of a novel method for optimization of a single-tube model.

    PubMed

    Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C

    2015-05-01

    Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.

  10. qSR: a quantitative super-resolution analysis tool reveals the cell-cycle dependent organization of RNA Polymerase I in live human cells.

    PubMed

    Andrews, J O; Conway, W; Cho, W -K; Narayanan, A; Spille, J -H; Jayanth, N; Inoue, T; Mullen, S; Thaler, J; Cissé, I I

    2018-05-09

    We present qSR, an analytical tool for the quantitative analysis of single-molecule-based super-resolution data. The software is created as an open-source platform integrating multiple algorithms for rigorous spatial and temporal characterization of protein clusters in super-resolution data of living cells. First, we illustrate qSR using sample live-cell data of RNA Polymerase II (Pol II) as an example of highly dynamic, sub-diffractive clusters. Then we utilize qSR to investigate the organization and dynamics of endogenous RNA Polymerase I (Pol I) in live human cells throughout the cell cycle. Our analysis reveals a previously uncharacterized transient clustering of Pol I. Both stable and transient populations of Pol I clusters co-exist in individual living cells, and their relative fractions vary during the cell cycle in a manner correlating with global gene expression. Thus, qSR serves to facilitate the study of protein organization and dynamics with very high spatial and temporal resolution directly in live cells.
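
    The abstract does not name the specific clustering algorithms qSR integrates, so the following is only a generic sketch of spatial cluster identification on single-molecule localization coordinates using DBSCAN; the localizations and the eps/min_samples settings are hypothetical.

        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(0)

        # Hypothetical 2D localizations (nm): two dense clusters plus background.
        cluster_a = rng.normal(loc=(500, 500), scale=20, size=(200, 2))
        cluster_b = rng.normal(loc=(1500, 800), scale=25, size=(150, 2))
        background = rng.uniform(0, 2000, size=(100, 2))
        localizations = np.vstack([cluster_a, cluster_b, background])

        labels = DBSCAN(eps=50.0, min_samples=10).fit(localizations).labels_
        n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
        print(f"clusters found: {n_clusters}, noise points: {np.sum(labels == -1)}")

        for k in range(n_clusters):
            pts = localizations[labels == k]
            print(f"cluster {k}: {len(pts)} localizations, "
                  f"per-axis spread ~ {pts.std(axis=0).mean():.1f} nm")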

  11. Rigor or Reliability and Validity in Qualitative Research: Perspectives, Strategies, Reconceptualization, and Recommendations.

    PubMed

    Cypress, Brigitte S

    The persistent concern with achieving rigor in qualitative research still raises issues even now in the 21st century. There is also a continuing debate about the analogous terms reliability and validity in naturalistic inquiries as opposed to quantitative investigations. This article presents the concept of rigor in qualitative research, using a phenomenological study as an exemplar to illustrate the process. Elaborating on epistemological and theoretical conceptualizations by Lincoln and Guba, strategies congruent with the qualitative perspective for ensuring validity and establishing the credibility of a study are described. A synthesis of the historical development of validity criteria evident in the literature over the years is also explored. Recommendations are made to use the term rigor instead of trustworthiness, to reconceptualize and renew the use of the concepts of reliability and validity in qualitative research, to build strategies for ensuring rigor into the qualitative research process rather than evaluating it only after the inquiry, and for qualitative researchers and students alike to be proactive and take responsibility for ensuring the rigor of a research study. The insights garnered here will move novice researchers and doctoral students toward a better conceptual grasp of the complexity of reliability and validity and their ramifications for qualitative inquiry.

  12. Quantitatively differentiating microstructural variations of skeletal muscle tissues by multispectral Mueller matrix imaging

    NASA Astrophysics Data System (ADS)

    Dong, Yang; He, Honghui; He, Chao; Ma, Hui

    2016-10-01

    Polarized light is sensitive to the microstructures of biological tissues and can be used to detect physiological changes. Meanwhile, spectral features of the scattered light can also provide abundant microstructural information about tissues. In this paper, we acquire backscattering polarization Mueller matrix images of bovine skeletal muscle tissues over a 24-hour experimental period and analyze their multispectral behavior using quantitative Mueller matrix parameters. During the rigor mortis and proteolysis of the muscle samples, multispectral frequency distribution histograms (FDHs) of the Mueller matrix elements reveal rich qualitative structural information. In addition, we analyze the temporal variations of the samples using multispectral Mueller matrix transformation (MMT) parameters. The experimental results indicate that the different stages of rigor mortis and proteolysis of bovine skeletal muscle samples can be distinguished by these MMT parameters. The results presented in this work show that, combined with the multispectral technique, the FDHs and MMT parameters can characterize the microstructural variation of skeletal muscle tissues. These techniques have the potential to be used as tools for the quantitative assessment of meat quality in the food industry.

  13. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, much less predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.

  14. Bioregional monitoring design and occupancy estimation for two Sierra Nevadan amphibian taxa

    EPA Science Inventory

    Land-management agencies need quantitative, statistically rigorous monitoring data, often at large spatial and temporal scales, to support resource-management decisions. Monitoring designs typically must accommodate multiple ecological, logistical, political, and economic objec...

  15. Separating intrinsic from extrinsic fluctuations in dynamic biological systems

    PubMed Central

    Paulsson, Johan

    2011-01-01

    From molecules in cells to organisms in ecosystems, biological populations fluctuate due to the intrinsic randomness of individual events and the extrinsic influence of changing environments. The combined effect is often too complex for effective analysis, and many studies therefore make simplifying assumptions, for example ignoring either intrinsic or extrinsic effects to reduce the number of model assumptions. Here we mathematically demonstrate how two identical and independent reporters embedded in a shared fluctuating environment can be used to identify intrinsic and extrinsic noise terms, but also how these contributions are qualitatively and quantitatively different from what has been previously reported. Furthermore, we show for which classes of biological systems the noise contributions identified by dual-reporter methods correspond to the noise contributions predicted by correct stochastic models of either intrinsic or extrinsic mechanisms. We find that for broad classes of systems, the extrinsic noise from the dual-reporter method can be rigorously analyzed using models that ignore intrinsic stochasticity. In contrast, the intrinsic noise can be rigorously analyzed using models that ignore extrinsic stochasticity only under very special conditions that rarely hold in biology. Testing whether the conditions are met is rarely possible and the dual-reporter method may thus produce flawed conclusions about the properties of the system, particularly about the intrinsic noise. Our results contribute toward establishing a rigorous framework to analyze dynamically fluctuating biological systems. PMID:21730172

  16. Separating intrinsic from extrinsic fluctuations in dynamic biological systems.

    PubMed

    Hilfinger, Andreas; Paulsson, Johan

    2011-07-19

    From molecules in cells to organisms in ecosystems, biological populations fluctuate due to the intrinsic randomness of individual events and the extrinsic influence of changing environments. The combined effect is often too complex for effective analysis, and many studies therefore make simplifying assumptions, for example ignoring either intrinsic or extrinsic effects to reduce the number of model assumptions. Here we mathematically demonstrate how two identical and independent reporters embedded in a shared fluctuating environment can be used to identify intrinsic and extrinsic noise terms, but also how these contributions are qualitatively and quantitatively different from what has been previously reported. Furthermore, we show for which classes of biological systems the noise contributions identified by dual-reporter methods correspond to the noise contributions predicted by correct stochastic models of either intrinsic or extrinsic mechanisms. We find that for broad classes of systems, the extrinsic noise from the dual-reporter method can be rigorously analyzed using models that ignore intrinsic stochasticity. In contrast, the intrinsic noise can be rigorously analyzed using models that ignore extrinsic stochasticity only under very special conditions that rarely hold in biology. Testing whether the conditions are met is rarely possible and the dual-reporter method may thus produce flawed conclusions about the properties of the system, particularly about the intrinsic noise. Our results contribute toward establishing a rigorous framework to analyze dynamically fluctuating biological systems.
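
    For context, the sketch below computes the conventional dual-reporter estimators, in which intrinsic noise is obtained from the mean-squared difference between two identical reporters x and y and extrinsic noise from their covariance; the paper's point is that interpreting these estimators mechanistically is valid only under particular conditions. The data here are simulated and purely illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulated single-cell snapshots of two identical, independent reporters
        # sharing a fluctuating environment (hypothetical toy model).
        n_cells = 5000
        environment = rng.lognormal(mean=0.0, sigma=0.3, size=n_cells)   # extrinsic factor
        x = rng.poisson(lam=100 * environment)                            # reporter 1
        y = rng.poisson(lam=100 * environment)                            # reporter 2

        mx, my = x.mean(), y.mean()
        eta_int_sq = np.mean((x - y) ** 2) / (2 * mx * my)        # intrinsic estimator
        eta_ext_sq = (np.mean(x * y) - mx * my) / (mx * my)       # extrinsic estimator
        eta_tot_sq = eta_int_sq + eta_ext_sq

        print(f"intrinsic noise^2 ~ {eta_int_sq:.4f}")
        print(f"extrinsic noise^2 ~ {eta_ext_sq:.4f}")
        print(f"total noise^2     ~ {eta_tot_sq:.4f}")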

  17. PRO development: rigorous qualitative research as the crucial foundation.

    PubMed

    Lasch, Kathryn Eilene; Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-10-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity.

  18. PRO development: rigorous qualitative research as the crucial foundation

    PubMed Central

    Marquis, Patrick; Vigneux, Marc; Abetz, Linda; Arnould, Benoit; Bayliss, Martha; Crawford, Bruce; Rosa, Kathleen

    2010-01-01

    Recently published articles have described criteria to assess qualitative research in the health field in general, but very few articles have delineated qualitative methods to be used in the development of Patient-Reported Outcomes (PROs). In fact, how PROs are developed with subject input through focus groups and interviews has been given relatively short shrift in the PRO literature when compared to the plethora of quantitative articles on the psychometric properties of PROs. If documented at all, most PRO validation articles give little for the reader to evaluate the content validity of the measures and the credibility and trustworthiness of the methods used to develop them. Increasingly, however, scientists and authorities want to be assured that PRO items and scales have meaning and relevance to subjects. This article was developed by an international, interdisciplinary group of psychologists, psychometricians, regulatory experts, a physician, and a sociologist. It presents rigorous and appropriate qualitative research methods for developing PROs with content validity. The approach described combines an overarching phenomenological theoretical framework with grounded theory data collection and analysis methods to yield PRO items and scales that have content validity. PMID:20512662

  19. Quantitative imaging biomarker ontology (QIBO) for knowledge representation of biomedical imaging biomarkers.

    PubMed

    Buckler, Andrew J; Liu, Tiffany Ting; Savig, Erica; Suzek, Baris E; Ouellette, M; Danagoulian, J; Wernsing, G; Rubin, Daniel L; Paik, David

    2013-08-01

    A widening array of novel imaging biomarkers is being developed using ever more powerful clinical and preclinical imaging modalities. These biomarkers have demonstrated effectiveness in quantifying biological processes as they occur in vivo and in the early prediction of therapeutic outcomes. However, quantitative imaging biomarker data and knowledge are not standardized, representing a critical barrier to accumulating medical knowledge based on quantitative imaging data. We use an ontology to represent, integrate, and harmonize heterogeneous knowledge across the domain of imaging biomarkers. This advances the goal of developing applications to (1) improve precision and recall of storage and retrieval of quantitative imaging-related data using standardized terminology; (2) streamline the discovery and development of novel imaging biomarkers by normalizing knowledge across heterogeneous resources; (3) effectively annotate imaging experiments thus aiding comprehension, re-use, and reproducibility; and (4) provide validation frameworks through rigorous specification as a basis for testable hypotheses and compliance tests. We have developed the Quantitative Imaging Biomarker Ontology (QIBO), which currently consists of 488 terms spanning the following upper classes: experimental subject, biological intervention, imaging agent, imaging instrument, image post-processing algorithm, biological target, indicated biology, and biomarker application. We have demonstrated that QIBO can be used to annotate imaging experiments with standardized terms in the ontology and to generate hypotheses for novel imaging biomarker-disease associations. Our results established the utility of QIBO in enabling integrated analysis of quantitative imaging data.

  20. Capturing the crystalline phase of two-dimensional nanocrystal superlattices in action.

    PubMed

    Jiang, Zhang; Lin, Xiao-Min; Sprung, Michael; Narayanan, Suresh; Wang, Jin

    2010-03-10

    Critical photonic, electronic, and magnetic applications of two-dimensional nanocrystal superlattices often require nanostructures in perfect single-crystal phases with long-range order and limited defects. Here we discovered a crystalline phase with quasi-long-range positional order for two-dimensional nanocrystal superlattice domains self-assembled at the liquid-air interface during droplet evaporation, using in situ time-resolved X-ray scattering along with rigorous theories on two dimensional crystal structures. Surprisingly, it was observed that drying these superlattice domains preserved only an orientational order but not a long-range positional order, also supported by quantitative analysis of transmission electron microscopy images.

  1. Electronic structure and microscopic model of V(2)GeO(4)F(2)-a quantum spin system with S = 1.

    PubMed

    Rahaman, Badiur; Saha-Dasgupta, T

    2007-07-25

    We present first-principles density functional calculations and downfolding studies of the electronic and magnetic properties of the oxide-fluoride quantum spin system V(2)GeO(4)F(2). We discuss explicitly the nature of the exchange paths and provide quantitative estimates of the magnetic exchange couplings. A microscopic modelling based on analysis of the electronic structure of this system puts it in the interesting class of weakly coupled alternating-chain S = 1 systems. Based on the microscopic model, we make inferences about its spin excitation spectra, which need to be tested by rigorous experimental study.

  2. The physics of osmotic pressure

    NASA Astrophysics Data System (ADS)

    Bowler, M. G.

    2017-09-01

    Osmosis drives the development of a pressure difference of many atmospheres between a dilute solution and pure solvent with which it is in contact through a semi-permeable membrane. The educational importance of this paper is that it presents a novel treatment in terms of fluid mechanics that is quantitative and exact. It is also simple and intuitive, showing vividly how osmotic pressures are generated and maintained in equilibrium, driven by differential solvent pressures. The present rigorous analysis using the virial theorem seems to be little known and can be easily understood, and taught, at various levels. It should be valuable to undergraduates, graduate students and indeed to the general physicist.
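
    For a sense of the magnitudes involved, the short calculation below uses the standard dilute-solution (van't Hoff) relation Pi = cRT; this is not the paper's virial-theorem treatment, but for an ideal dilute solution it gives the same order of magnitude.

        R = 8.314          # J / (mol K)
        T = 298.15         # K
        c = 100.0          # solute concentration, mol / m^3  (i.e. 0.1 mol / L)

        pi_pascal = c * R * T          # van't Hoff: Pi = c R T
        pi_atm = pi_pascal / 101325.0

        print(f"Osmotic pressure ~ {pi_pascal / 1e5:.2f} bar ({pi_atm:.2f} atm)")
        # ~2.5 atm for a 0.1 M ideal solute at room temperature -- consistent with
        # the "many atmospheres" developed by more concentrated solutions.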

  3. Reference condition approach to restoration planning

    USGS Publications Warehouse

    Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.

    2010-01-01

    Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive, but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scaleable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types, best achievable condition (A_BAC), measured magnitude (MM_i, which can be determined at one or many times and places) and desired future condition (A_DFC) that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.

  4. Quantitation of fixative-induced morphologic and antigenic variation in mouse and human breast cancers

    PubMed Central

    Cardiff, Robert D; Hubbard, Neil E; Engelberg, Jesse A; Munn, Robert J; Miller, Claramae H; Walls, Judith E; Chen, Jane Q; Velásquez-García, Héctor A; Galvez, Jose J; Bell, Katie J; Beckett, Laurel A; Li, Yue-Ju; Borowsky, Alexander D

    2013-01-01

    Quantitative Image Analysis (QIA) of digitized whole slide images for morphometric parameters and immunohistochemistry of breast cancer antigens was used to evaluate the technical reproducibility, biological variability, and intratumoral heterogeneity in three transplantable mouse mammary tumor models of human breast cancer. The relative preservation of structure and immunogenicity of the three mouse models and three human breast cancers was also compared when fixed with representatives of four distinct classes of fixatives. The three mouse mammary tumor cell models were an ER+/PR+ model (SSM2), a Her2+ model (NDL), and a triple-negative model (MET1). The four breast cancer antigens were ER, PR, Her2, and Ki67. The fixatives included examples of (1) strong cross-linkers, (2) weak cross-linkers, (3) coagulants, and (4) combination fixatives. Each parameter was quantitatively analyzed using modified Aperio Technologies ImageScope algorithms. Careful pre-analytical adjustments to the algorithms were required to provide accurate results. The QIA permitted rigorous statistical analysis of results and grading by rank order. The analyses suggested excellent technical reproducibility and confirmed biological heterogeneity within each tumor. The strong cross-linker fixatives, such as formalin, consistently ranked higher than weak cross-linker, coagulant and combination fixatives in both the morphometric and immunohistochemical parameters. PMID:23399853

  5. Imaging 2D optical diffuse reflectance in skeletal muscle

    NASA Astrophysics Data System (ADS)

    Ranasinghesagara, Janaka; Yao, Gang

    2007-04-01

    We discovered a unique pattern of optical reflectance from fresh pre-rigor skeletal muscles that cannot be described using existing theories. A numerical fitting function was developed to quantify the equi-intensity contours of acquired reflectance images. Using this model, we studied the changes in the reflectance profile during stretching and the rigor process. We found that the prominent anisotropic features diminished after rigor completion. These results suggest that muscle sarcomere structures play an important role in modulating light propagation in whole muscle. When the sarcomere diffraction was incorporated in a Monte Carlo model, the resulting reflectance profiles quantitatively resembled the experimental observations.

  6. An ounce of prevention or a pound of cure: bioeconomic risk analysis of invasive species.

    PubMed

    Leung, Brian; Lodge, David M; Finnoff, David; Shogren, Jason F; Lewis, Mark A; Lamberti, Gary

    2002-12-07

    Numbers of non-indigenous species (species introduced from elsewhere) are increasing rapidly worldwide, causing both environmental and economic damage. Rigorous quantitative risk-analysis frameworks for invasive species, however, are lacking. We need to evaluate the risks posed by invasive species and quantify the relative merits of different management strategies (e.g. allocation of resources between prevention and control). We present a quantitative bioeconomic modelling framework to analyse risks from non-indigenous species to economic activity and the environment. The model identifies the optimal allocation of resources to prevention versus control, acceptable invasion risks, and the consequences of invasion for optimal investments (e.g. labour and capital). We apply the model to zebra mussels (Dreissena polymorpha) and show that society could benefit by spending up to US$324,000 per year to prevent invasions into a single lake with a power plant. By contrast, the US Fish and Wildlife Service spent US$825,000 in 2001 to manage all aquatic invaders in all US lakes. Thus, greater investment in prevention is warranted.
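
    As a toy illustration of the prevention-versus-control trade-off (not the authors' model, and with wholly invented parameter values), the sketch below grid-searches for the annual prevention spend that minimizes total expected cost, assuming invasion probability falls exponentially with prevention effort and that an invasion incurs a fixed control-plus-damage cost.

        import numpy as np

        # Hypothetical parameters (illustrative only).
        damage_if_invaded = 2_000_000.0   # $/yr control + damages if an invasion occurs
        baseline_p = 0.10                 # annual invasion probability with no prevention
        effectiveness = 1.0 / 150_000.0   # prevention $ needed to cut the risk by a factor e

        prevention = np.linspace(0, 1_000_000, 2001)          # candidate spends, $/yr
        p_invasion = baseline_p * np.exp(-effectiveness * prevention)
        expected_cost = prevention + p_invasion * damage_if_invaded

        best = np.argmin(expected_cost)
        print(f"optimal prevention spend ~ ${prevention[best]:,.0f}/yr")
        print(f"expected total cost      ~ ${expected_cost[best]:,.0f}/yr")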

  7. Single-case synthesis tools II: Comparing quantitative outcome measures.

    PubMed

    Zimmerman, Kathleen N; Pustejovsky, James E; Ledford, Jennifer R; Barton, Erin E; Severini, Katherine E; Lloyd, Blair P

    2018-03-07

    Varying methods for evaluating the outcomes of single-case research designs (SCD) are currently used in reviews and meta-analyses of interventions. Quantitative effect size measures are often presented alongside visual analysis conclusions. Six measures across two classes were compared to determine whether the choice of synthesis method within and across classes impacts conclusions regarding effectiveness: overlap measures (percentage of non-overlapping data, improvement rate difference, and Tau) and parametric within-case effect sizes (standardized mean difference and log response ratio [increasing and decreasing]). The effectiveness of sensory-based interventions (SBI), a commonly used class of treatments for young children, was evaluated. Separately from evaluations of rigor and quality, the authors evaluated behavior change between baseline and SBI conditions. SBI were unlikely to result in positive behavior change across all measures except improvement rate difference. However, subgroup analyses resulted in variable conclusions, indicating that the choice of measures for SCD meta-analyses can impact conclusions. Suggestions for using the log response ratio in SCD meta-analyses and considerations for understanding variability in SCD meta-analysis conclusions are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
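
    To make two of the named outcome measures concrete, here is a hedged sketch of the percentage of non-overlapping data (PND) and the log response ratio (LRR) for a single AB contrast in which an increase in behavior is desired; the data are hypothetical, and the review also covers further measures (improvement rate difference, Tau, standardized mean difference) that are computed differently.

        import math

        def pnd_increase(baseline, treatment):
            """Percentage of treatment-phase points above the highest baseline point
            (appropriate when an increase in the target behavior is desired)."""
            ceiling = max(baseline)
            above = sum(1 for x in treatment if x > ceiling)
            return 100.0 * above / len(treatment)

        def log_response_ratio(baseline, treatment):
            """LRR-increasing: ln(mean(treatment) / mean(baseline))."""
            mean_a = sum(baseline) / len(baseline)
            mean_b = sum(treatment) / len(treatment)
            return math.log(mean_b / mean_a)

        baseline  = [2, 3, 2, 4, 3]       # hypothetical baseline observations
        treatment = [5, 6, 4, 7, 8, 6]    # hypothetical intervention observations

        print(f"PND = {pnd_increase(baseline, treatment):.0f}%")
        print(f"LRR (increasing) = {log_response_ratio(baseline, treatment):.2f}")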

  8. Quantitative high-performance liquid chromatography of nucleosides in biological materials.

    PubMed

    Gehrke, C W; Kuo, K C; Davis, G E; Suits, R D; Waalkes, T P; Borek, E

    1978-03-21

    A rigorous, comprehensive, and reliable reversed-phase high-performance liquid chromatographic (HPLC) method has been developed for the analysis of ribonucleosides in urine (ψ, m1A, m1I, m2G, A, m2,2G). An initial isolation of ribonucleosides with an affinity gel containing an immobilized phenylboronic acid was used to improve selectivity and sensitivity. Response for all nucleosides was linear from 0.1 to 50 nmoles injected and good quantitation was obtained for 25 microliter or less of sample placed on the HPLC column. Excellent precision of analysis for urinary nucleosides was achieved on matrix dependent and independent samples, and the high resolution of the reversed-phase column allowed the complete separation of 9 nucleosides from other unidentified UV absorbing components at the 1-ng level. Supporting experimental data are presented on precision, recovery, chromatographic methods, minimum detection limit, retention time, relative molar response, sample clean-up, stability of nucleosides, boronate gel capacity, and application to analysis of urine from patients with leukemia and breast cancer. This method is now being used routinely for the determination of the concentration and ratios of nucleosides in urine from patients with different types of cancer and in chemotherapy response studies.

  9. Interface Pattern Selection in Directional Solidification

    NASA Technical Reports Server (NTRS)

    Trivedi, Rohit; Tewari, Surendra N.

    2001-01-01

    The central focus of this research is to establish key scientific concepts that govern the selection of cellular and dendritic patterns during the directional solidification of alloys. Ground-based studies have established that the conditions under which cellular and dendritic microstructures form are precisely where convection effects are dominant in bulk samples. Thus, experimental data cannot be obtained terrestrially under a purely diffusive regime. Furthermore, reliable theoretical models that quantitatively incorporate fluid flow into the pattern selection criterion are not yet available. Consequently, microgravity experiments on cellular and dendritic growth are designed to obtain benchmark data under diffusive growth conditions that can be quantitatively analyzed and compared with the rigorous theoretical model to establish the fundamental principles that govern the selection of a specific microstructure and its length scales. In the cellular structure, different cells in an array are strongly coupled so that the cellular pattern evolution is controlled by complex interactions between thermal diffusion, solute diffusion and interface effects. These interactions give an infinity of solutions, and the system selects only a narrow band of solutions. The aim of this investigation is to obtain benchmark data and develop a rigorous theoretical model that will allow us to quantitatively establish the physics of this selection process.

  10. Transportation and the economy national and state perspectives

    DOT National Transportation Integrated Search

    1998-05-01

    In the past months, many years of research and data collection have begun paying off in a rich series of analytical studies paving the way for a strong, rigorous and quantitative explanation of transportation's role in the economy and the power of tr...

  11. Collisional damping rates for plasma waves

    NASA Astrophysics Data System (ADS)

    Tigik, S. F.; Ziebell, L. F.; Yoon, P. H.

    2016-06-01

    The distinction between the plasma dynamics dominated by collisional transport versus collective processes has never been rigorously addressed until recently. A recent paper [P. H. Yoon et al., Phys. Rev. E 93, 033203 (2016)] formulates, for the first time, a unified kinetic theory in which collective processes and collisional dynamics are systematically incorporated from first principles. One of the outcomes of such a formalism is the rigorous derivation of collisional damping rates for Langmuir and ion-acoustic waves, which can be contrasted with the customary heuristic approach. However, the results are given only as formal mathematical expressions. The present brief communication numerically evaluates the rigorous collisional damping rates by considering the case of plasma particles with a Maxwellian velocity distribution function, so as to assess the consequences of the rigorous formalism in a quantitative manner. Comparison with the heuristic ("Spitzer") formula shows that the accurate damping rates are much lower in magnitude than the conventional expression, which implies that the traditional approach over-estimates the importance of attenuation of plasma waves by the collisional relaxation process. Such a finding may have a wide applicability ranging from laboratory to space and astrophysical plasmas.

  12. Bayesian evidence computation for model selection in non-linear geoacoustic inference problems.

    PubMed

    Dettmer, Jan; Dosso, Stan E; Osler, John C

    2010-12-01

    This paper applies a general Bayesian inference approach, based on Bayesian evidence computation, to geoacoustic inversion of interface-wave dispersion data. Quantitative model selection is carried out by computing the evidence (normalizing constants) for several model parameterizations using annealed importance sampling. The resulting posterior probability density estimate is compared to estimates obtained from Metropolis-Hastings sampling to ensure consistent results. The approach is applied to invert interface-wave dispersion data collected on the Scotian Shelf, off the east coast of Canada for the sediment shear-wave velocity profile. Results are consistent with previous work on these data but extend the analysis to a rigorous approach including model selection and uncertainty analysis. The results are also consistent with core samples and seismic reflection measurements carried out in the area.
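
    As a minimal illustration of what computing the evidence means (naive Monte Carlo integration of the likelihood over the prior for a toy one-parameter model, rather than the annealed importance sampling used in the paper, which is far more efficient for realistic geoacoustic parameterizations):

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy problem: data are noisy observations of a single parameter theta.
        true_theta, noise_sd = 1.2, 0.5
        data = true_theta + noise_sd * rng.normal(size=20)

        # Prior: theta ~ Uniform(-5, 5).  The evidence is Z = E_prior[ L(theta) ],
        # estimated here by averaging the likelihood over draws from the prior.
        prior_draws = rng.uniform(-5, 5, size=100_000)
        resid_sq = (data[None, :] - prior_draws[:, None]) ** 2
        log_L = (-0.5 * resid_sq.sum(axis=1) / noise_sd**2
                 - data.size * np.log(noise_sd * np.sqrt(2.0 * np.pi)))

        # Log-sum-exp trick for a numerically stable average.
        log_Z = np.log(np.mean(np.exp(log_L - log_L.max()))) + log_L.max()
        print(f"estimated log evidence: {log_Z:.2f}")
        # Competing model parameterizations are compared via their evidences (Bayes
        # factors); annealed importance sampling replaces this naive prior average.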

  13. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations.

    PubMed

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-15

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the thermodynamical properties of the carriers in a contactless way. However, the analysis of photoluminescence spectra rests on assumptions that are discussed in this work. In particular, we emphasize that the variation of the material absorptivity should be taken into account to access accurate thermodynamical properties of the carriers, especially when the excitation power is varied. The proposed method yields more accurate thermodynamical properties by relying on a rigorous physical description, and it finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.
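
    As a simplified sketch of how a carrier temperature is commonly extracted from a luminescence spectrum (fitting the slope of ln(I/E^2) in the high-energy tail under the frequently made assumption of an energy-independent absorptivity, which is precisely the assumption this paper cautions against), with synthetic data:

        import numpy as np

        kB = 8.617e-5                      # eV / K

        # Synthetic high-energy PL tail: I(E) ~ A(E) * E^2 * exp(-(E - dmu)/(kB*T)),
        # generated here with a constant absorptivity A and known T for illustration.
        T_true, dmu = 450.0, 1.10          # K, eV (hypothetical)
        E = np.linspace(1.40, 1.60, 50)    # photon energies in the tail, eV
        intensity = E**2 * np.exp(-(E - dmu) / (kB * T_true))

        # Boltzmann-tail fit: slope of ln(I/E^2) vs E equals -1/(kB*T)
        # (valid only if the absorptivity is flat over the fitted range).
        slope, _ = np.polyfit(E, np.log(intensity / E**2), 1)
        T_fit = -1.0 / (kB * slope)
        print(f"fitted carrier temperature: {T_fit:.0f} K (true {T_true:.0f} K)")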

  14. Review and Synthesize Completed Research Through Systematic Review.

    PubMed

    Hopp, Lisa; Rittenmeyer, Leslie

    2015-10-01

    The evidence-based health care movement has generated new opportunity for scholars to generate synthesized sources of evidence. Systematic reviews are rigorous forms of synthesized evidence that scholars can conduct if they have requisite skills, time, and access to excellent library resources. Systematic reviews play an important role in synthesizing what is known and unknown about a particular health issue. Thus, they have a synergistic relationship with primary research. They can both inform clinical decisions when the evidence is adequate and identify gaps in knowledge to inform research priorities. Systematic reviews can be conducted of quantitative and qualitative evidence to answer many types of questions. They all share characteristics of rigor that arise from a priori protocol development, transparency, exhaustive searching, dual independent reviewers who critically appraise studies using standardized tools, rigor in synthesis, and peer review at multiple stages in the conduct and reporting of the systematic review. © The Author(s) 2015.

  15. Television camera as a scientific instrument

    NASA Technical Reports Server (NTRS)

    Smokler, M. I.

    1970-01-01

    Rigorous calibration program, coupled with a sophisticated data-processing program that introduced compensation for system response to correct photometry, geometric linearity, and resolution, converted a television camera to a quantitative measuring instrument. The output data are in the forms of both numeric printout records and photographs.

  16. 76 FR 39860 - Proposed Agency Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... ``Broad Program Area Categories'' (BPACs) for purposes of conducting the research. For each evaluation... data collection methods than those prescribed for high-rigor. For example, data may be collected by... methods to produce energy savings and outcome estimates. A range of qualitative, quantitative (survey), on...

  17. The Holy Trinity of Methodological Rigor: A Skeptical View

    ERIC Educational Resources Information Center

    Coryn, Chris L. S.

    2007-01-01

    The author discusses validation hierarchies grounded in the tradition of quantitative research that generally consists of the criteria of validity, reliability and objectivity and compares this with similar criteria developed by the qualitative tradition, described as trustworthiness, dependability and confirmability. Although these quantitative…

  18. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    PubMed

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a topic of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Rigorous numerical modeling of scattering-type scanning near-field optical microscopy and spectroscopy

    NASA Astrophysics Data System (ADS)

    Chen, Xinzhong; Lo, Chiu Fan Bowen; Zheng, William; Hu, Hai; Dai, Qing; Liu, Mengkun

    2017-11-01

    Over the last decade, scattering-type scanning near-field optical microscopy and spectroscopy have been widely used in nano-photonics and materials research due to their fine spatial resolution and broad spectral range. A number of simplified analytical models have been proposed to quantitatively understand the tip-scattered near-field signal. However, a rigorous interpretation of the experimental results is still lacking at this stage. Numerical modeling, on the other hand, is mostly done by simulating the local electric field slightly above the sample surface, which only qualitatively represents the near-field signal rendered by the tip-sample interaction. In this work, we performed a more comprehensive numerical simulation which is based on realistic experimental parameters and signal extraction procedures. By comparing directly with the experiments as well as with other simulation efforts, our methods offer a more accurate quantitative description of the near-field signal, paving the way for future studies of complex systems at the nanoscale.

  20. Rigorous Science: a How-To Guide.

    PubMed

    Casadevall, Arturo; Fang, Ferric C

    2016-11-08

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.

  1. Defining quantitative stream disturbance gradients and the additive role of habitat variation to explain macroinvertebrate taxa richness

    EPA Science Inventory

    Most studies dealing with the use of ecological indicators and other applied ecological research rely on some definition or concept of what constitutes least-, intermediate-, and most-disturbed conditions. Currently, most rigorous methodologies designed to define those conditions...

  2. What Districts Know--and Need to Know--about Their Principals

    ERIC Educational Resources Information Center

    George W. Bush Institute, Education Reform Initiative, 2016

    2016-01-01

    Research on school leadership has highlighted the impact principals have on student achievement through their influence on classroom instruction, organizational conditions, community support, and setting the teaching and learning conditions in schools. However, there is limited rigorous quantitative research on the best district policies and…

  3. A critical methodological review of discourse and conversation analysis studies of family therapy.

    PubMed

    Tseliou, Eleftheria

    2013-12-01

    Discourse (DA) and conversation (CA) analysis, two qualitative research methods, have been recently suggested as potentially promising for the study of family therapy due to common epistemological adherences and their potential for an in situ study of therapeutic dialog. However, to date, there is no systematic methodological review of the few existing DA and CA studies of family therapy. This study aims at addressing this lack by critically reviewing published DA and CA studies of family therapy on methodological grounds. Twenty-eight articles in total are reviewed in relation to certain methodological axes identified in the relevant literature. These include choice of method, framing of research question(s), data/sampling, type of analysis, epistemological perspective, content/type of knowledge claims, and attendance to criteria for good quality practice. It is argued that the reviewed studies show "glimpses" of the methods' potential for family therapy research despite the identification of certain "shortcomings" regarding their methodological rigor. These include unclearly framed research questions and the predominance of case study designs. They also include inconsistencies between choice of method, stated or unstated epistemological orientations and knowledge claims, and limited attendance to criteria for good quality practice. In conclusion, it is argued that DA and CA can add to the existing quantitative and qualitative methods for family therapy research. They can both offer unique ways for a detailed study of the actual therapeutic dialog, provided that future attempts strive for a methodologically rigorous practice and against their uncritical deployment. © FPI, Inc.

  4. Tracking Perfluorocarbon Nanoemulsion Delivery by 19F MRI for Precise High Intensity Focused Ultrasound Tumor Ablation

    PubMed Central

    Shin, Soo Hyun; Park, Eun-Joo; Min, Changki; Choi, Sun Il; Jeon, Soyeon; Kim, Yun-Hee; Kim, Daehong

    2017-01-01

    Perfluorocarbon nanoemulsions (PFCNEs) have recently been undergoing rigorous study to investigate their ability to improve the therapeutic efficacy of tumor ablation by high intensity focused ultrasound (HIFU). For precise control of PFCNE delivery and thermal ablation, their accumulation and distribution in a tumor should be quantitatively analyzed. Here, we used fluorine-19 (19F) magnetic resonance imaging (MRI) to quantitatively track PFCNE accumulation in a tumor, and analyzed how intra-tumoral PFCNE quantities affect the therapeutic efficacy of HIFU treatment. Ablation outcomes were assessed by intra-voxel incoherent motion analysis and bioluminescent imaging up to 14 days after the procedure. Assessment of PFCNE delivery and treatment outcomes showed that 2-3 mg/mL of PFCNE in a tumor produces the largest ablation volume under the same HIFU insonation conditions. Histology showed varying degrees of necrosis depending on the amount of PFCNE delivered. 19F MRI promises to be a valuable platform for precisely guiding PFCNE-enhanced HIFU ablation of tumors. PMID:28255351

  5. Measurement and Prediction of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2008-01-01

    An experimental and numerical investigation into the static and dynamic responses of shape memory alloy hybrid composite (SMAHC) beams is performed to provide quantitative validation of a recently commercialized numerical analysis/design tool for SMAHC structures. The SMAHC beam specimens consist of a composite matrix with embedded pre-strained SMA actuators, which act against the mechanical boundaries of the structure when thermally activated to adaptively stiffen the structure. Numerical results are produced from the numerical model as implemented in the commercial finite element code ABAQUS. A rigorous experimental investigation is undertaken to acquire high fidelity measurements including infrared thermography and projection moire interferometry for full-field temperature and displacement measurements, respectively. High fidelity numerical results are also obtained from the numerical model and incorporate measured parameters, such as geometric imperfection and thermal load. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  6. Harnessing cell-to-cell variations to probe bacterial structure and biophysics

    NASA Astrophysics Data System (ADS)

    Cass, Julie A.

    Advances in microscopy and biotechnology have given us novel insights into cellular biology and physics. While bacteria were long considered to be relatively unstructured, the development of fluorescence microscopy techniques, and spatially and temporally resolved high-throughput quantitative studies, have uncovered that the bacterial cell is highly organized, and its structure rigorously maintained. In this thesis I will describe our gateTool software, designed to harness cell-to-cell variations to probe bacterial structure, and discuss two exciting aspects of structure that we have employed gateTool to investigate: (i) chromosome organization and the cellular mechanisms for controlling DNA dynamics, and (ii) the study of cell wall synthesis, and how the genes in the synthesis pathway impact cellular shape. In the first project, we develop a spatial and temporal mapping of cell-cycle-dependent chromosomal organization, and use this quantitative map to discover that chromosomal loci segregate from midcell with universal dynamics. In the second project, I describe preliminary time-lapse and snapshot imaging analysis suggesting phenotypic coherence across peptidoglycan synthesis pathways.

  7. Scale and Scope Economies of Distance Education in Australian Universities

    ERIC Educational Resources Information Center

    Zhang, Liang-Cheng; Worthington, Andrew C.

    2017-01-01

    Despite compelling qualitative arguments for scale and scope economies in university-level distance education, as distinct from traditional class-based face-to-face instruction, there is little rigorous quantitative evidence in support. In this paper, we explore the scale and scope economies of distance education using a multiplicatively separable…

  8. Problem Orientation in Physical Geography Teaching.

    ERIC Educational Resources Information Center

    Church, Michael

    1988-01-01

    States that the introduction of real, quantitative problems in classroom and field teaching improves scientific rigor and leads more directly to applied studies. Examines the use of problems in an introductory hydrology course, presenting teaching objectives and the full course structure to illustrate their integration with other teaching modes.…

  9. Mixing Methods in Instructional Research. Forum: The Future of Instructional Communication

    ERIC Educational Resources Information Center

    Meluch, Andrea L.

    2017-01-01

    "Communication Education's" commitment to publishing rigorous instructional communication research with a diversity of methods is clear in its publication of quantitative, qualitative, and critical articles (Witt, 2012). Here, Andrea Meluch argues that the trends in the most recent research published within this journal indicate a lack…

  10. Distinctive fingerprints of erosional regimes in terrestrial channel networks

    NASA Astrophysics Data System (ADS)

    Grau Galofre, A.; Jellinek, M.

    2017-12-01

    Satellite imagery and digital elevation maps capture the large-scale morphology of channel networks attributed to long-term erosional processes, such as fluvial, glacial, groundwater sapping and subglacial erosion. Characteristic morphologies associated with each of these styles of erosion have been studied in detail, but there exists a knowledge gap related to their parameterization and quantification. This knowledge gap prevents a rigorous analysis of the dominant processes that shaped a particular landscape, and a comparison across styles of erosion. To address this gap, we use previous morphological descriptions of glaciers, rivers, sapping valleys and tunnel valleys to identify and measure quantitative metrics diagnostic of these distinctive styles of erosion. From digital elevation models, we identify four geometric metrics: the minimum channel width, channel aspect ratio (longest length to channel width at the outlet), presence of undulating longitudinal profiles, and tributary junction angle. We also parameterize channel network complexity in terms of its stream order and fractal dimension. We then perform a statistical classification of the channel networks using a Principal Component Analysis on measurements of these six metrics on a dataset of 70 channelized systems. We show that rivers, glaciers, groundwater seepage and subglacial meltwater erode the landscape in rigorously distinguishable ways. Our methodology can more generally be applied to identify the contributions of different processes involved in carving a channel network. In particular, we are able to identify transitions from fluvial to glaciated landscapes or vice versa.
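
    The classification step described above can be sketched as follows; the metric values and regime labels below are synthetic placeholders (only the metric names come from the abstract), so this is an illustration of the approach rather than the authors' code.

    ```python
    # Minimal PCA-classification sketch for channel-network metrics
    # (synthetic data; not the authors' code or measurements).
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    # Rows: channel networks; columns: six metrics (minimum width, aspect ratio,
    # longitudinal-profile undulation, junction angle, stream order, fractal dimension).
    metrics = rng.normal(size=(70, 6))
    labels = rng.choice(["fluvial", "glacial", "sapping", "subglacial"], size=70)

    # Standardize so each metric contributes comparably, then project to 2 components.
    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(metrics))

    # Group centroids in PC space indicate whether erosional regimes separate.
    for style in np.unique(labels):
        print(style, scores[labels == style].mean(axis=0))
    ```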

  11. A two-factor error model for quantitative steganalysis

    NASA Astrophysics Data System (ADS)

    Böhme, Rainer; Ker, Andrew D.

    2006-02-01

    Quantitative steganalysis refers to the exercise not only of detecting the presence of hidden stego messages in carrier objects, but also of estimating the secret message length. This problem is well studied, with many detectors proposed but only a sparse analysis of errors in the estimators. A deep understanding of the error model, however, is a fundamental requirement for the assessment and comparison of different detection methods. This paper presents a rationale for a two-factor model for sources of error in quantitative steganalysis, and shows evidence from a dedicated large-scale nested experimental set-up with a total of more than 200 million attacks. Apart from general findings about the distribution functions found in both classes of errors, their respective weight is determined, and implications for statistical hypothesis tests in benchmarking scenarios or regression analyses are demonstrated. The results are based on a rigorous comparison of five different detection methods under many different external conditions, such as size of the carrier, previous JPEG compression, and colour channel selection. We include analyses demonstrating the effects of local variance and cover saturation on the different sources of error, as well as presenting the case for a relative bias model for between-image error.
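
    At its simplest, a two-factor error structure of this kind (an image-specific component plus a within-image component) amounts to a variance decomposition, sketched below on synthetic errors; this is a generic illustration, not the authors' model or data.

    ```python
    # Generic between-image / within-image error decomposition for a
    # quantitative steganalyzer (synthetic values, illustrative only).
    import numpy as np

    rng = np.random.default_rng(1)
    n_images, n_embeddings = 200, 20

    image_bias = rng.normal(0.0, 0.02, size=n_images)            # cover-specific bias
    errors = image_bias[:, None] + rng.normal(0.0, 0.01, size=(n_images, n_embeddings))

    per_image_mean = errors.mean(axis=1)
    between_var = per_image_mean.var(ddof=1)                      # image-to-image spread
    within_var = (errors - per_image_mean[:, None]).var(ddof=1)   # message/key noise

    print(f"between-image variance ~ {between_var:.6f}")
    print(f"within-image variance  ~ {within_var:.6f}")
    ```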

  12. Quantitative phosphoproteomic analysis of porcine muscle within 24 h postmortem.

    PubMed

    Huang, Honggang; Larsen, Martin R; Palmisano, Giuseppe; Dai, Jie; Lametsch, René

    2014-06-25

    Protein phosphorylation can regulate most of the important processes in muscle, such as metabolism and contraction. The postmortem (PM) metabolism and rigor mortis have essential effects on meat quality. In order to identify and characterize the protein phosphorylation events involved in meat quality development, a quantitative mass spectrometry-based phosphoproteomic study was performed to analyze the porcine muscle within 24h PM using dimethyl labeling combined with the TiSH phosphopeptide enrichment strategy. In total 305 unique proteins were identified, including 160 phosphoproteins with 784 phosphorylation sites. Among these, 184 phosphorylation sites on 93 proteins had their phosphorylation levels significantly changed. The proteins involved in glucose metabolism and muscle contraction were the two largest clusters of phosphoproteins with significantly changed phosphorylation levels in muscle within 24 h PM. The high phosphorylation level of heat shock proteins (HSPs) in early PM may be an adaptive response to slaughter stress and protect muscle cell from apoptosis, as observed in the serine 84 of HSP27. This work indicated that PM muscle proteins underwent significant changes at the phosphorylation level but were relatively stable at the total protein level, suggesting that protein phosphorylation may have important roles in meat quality development through the regulation of proteins involved in glucose metabolism and muscle contraction, thereby affecting glycolysis and rigor mortis development in PM muscle. The manuscript describes the characterization of postmortem (PM) porcine muscle within 24 h postmortem from the perspective of protein phosphorylation using advanced phosphoproteomic techniques. In the study, the authors employed the dimethyl labeling combined with the TiSH phosphopeptide enrichment and LC-MS/MS strategy. This was the first high-throughput quantitative phosphoproteomic study in PM muscle of farm animals. In the work, both the proteome and phosphoproteome were analyzed, and the large number of identified peptides, phosphopeptides and phosphorylation sites can greatly enrich the current farm animal protein database. The proteins involved in glycometabolism, muscle contraction and heat shock proteins (HSPs) showed significantly changed phosphorylation levels during PM meat development. This work indicated that PM muscle proteins underwent significant changes at phosphorylation level but were relatively stable at the total protein level, suggesting that protein phosphorylation may have important roles in meat development through the regulation of proteins involved in metabolism and muscle contraction, thereby affecting glycolysis and rigor mortis development in PM muscle. The work can promote the understanding of PM muscle metabolism and meat quality development, and be helpful for future meat quality control. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    PubMed

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. © Health Research and Educational Trust.
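
    The chi-square comparison reported above can be illustrated in form, though not with the study's actual counts, by a contingency-table test on hypothetical data:

    ```python
    # Chi-square test comparing the proportion of methodological components
    # reported in two article types (hypothetical counts, not the study's data).
    from scipy.stats import chi2_contingency

    table = [[22, 78],   # mixed methods articles: components present / absent
             [47, 53]]   # quantitative articles:  components present / absent

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4g}")
    ```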

  14. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    PubMed Central

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives: Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources: All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design: All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings: Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion: Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040

  15. Monitoring temporal microstructural variations of skeletal muscle tissues by multispectral Mueller matrix polarimetry

    NASA Astrophysics Data System (ADS)

    Dong, Yang; He, Honghui; He, Chao; Ma, Hui

    2017-02-01

    Mueller matrix polarimetry is a powerful tool for detecting microscopic structures and can therefore be used to monitor physiological changes of tissue samples. Meanwhile, spectral features of scattered light can also provide abundant microstructural information about tissues. In this paper, we acquire 2D multispectral backscattering Mueller matrix images of bovine skeletal muscle tissues and analyze their temporal variation behavior using multispectral Mueller matrix parameters. The 2D images of the Mueller matrix elements are reduced to multispectral frequency distribution histograms (mFDHs) to reveal the dominant structural features of the muscle samples more clearly. For quantitative analysis, the multispectral Mueller matrix transformation (MMT) parameters are calculated to characterize the microstructural variations during the rigor mortis and proteolysis processes of the skeletal muscle tissue samples. The experimental results indicate that the multispectral MMT parameters can be used to distinguish different physiological stages of bovine skeletal muscle tissues over 24 hours and that, combined with the multispectral technique, Mueller matrix polarimetry and FDH analysis can monitor the microstructural variation features of skeletal muscle samples. The techniques may be used for quick assessment and quantitative monitoring of meat quality in the food industry.

  16. Analysis of Mammalian Sphingolipids by Liquid Chromatography Tandem Mass Spectrometry (LC-MS/MS) and Tissue Imaging Mass Spectrometry (TIMS)

    PubMed Central

    Sullards, M. Cameron; Liu, Ying; Chen, Yanfeng; Merrill, Alfred H.

    2011-01-01

    Sphingolipids are a highly diverse category of molecules that serve not only as components of biological structures but also as regulators of numerous cell functions. Because so many of the structural features of sphingolipids give rise to their biological activity, there is a need for comprehensive or “sphingolipidomic” methods for identification and quantitation of as many individual subspecies as possible. This review defines sphingolipids as a class, briefly discusses classical methods for their analysis, and focuses primarily on liquid chromatography tandem mass spectrometry (LC-MS/MS) and tissue imaging mass spectrometry (TIMS). Recently, a set of evolving and expanding methods have been developed and rigorously validated for the extraction, identification, separation, and quantitation of sphingolipids by LC-MS/MS. Quantitation of these biomolecules is made possible via the use of an internal standard cocktail. The compounds that can be readily analyzed are free long-chain (sphingoid) bases, sphingoid base 1-phosphates, and more complex species such as ceramides, ceramide 1-phosphates, sphingomyelins, mono- and di-hexosylceramides sulfatides, and novel compounds such as the 1-deoxy- and 1-(deoxymethyl)-sphingoid bases and their N-acyl-derivatives. These methods can be altered slightly to separate and quantitate isomeric species such as glucosyl/galactosylceramide. Because these techniques require the extraction of sphingolipids from their native environment, any information regarding their localization in histological slices is lost. Therefore, this review also describes methods for TIMS. This technique has been shown to be a powerful tool to determine the localization of individual molecular species of sphingolipids directly from tissue slices. PMID:21749933

  17. Image analysis of the blood cells for cytomorphodiagnostics and control of the effectiveness treatment

    NASA Astrophysics Data System (ADS)

    Zhukotsky, Alexander V.; Kogan, Emmanuil M.; Kopylov, Victor F.; Marchenko, Oleg V.; Lomakin, O. A.

    1994-07-01

    A new method for morphodensitometric analysis of blood cells was applied to medical screening for environmental influences and infectious pathologies. A complex computational image-processing algorithm was created for research on the supramolecular restructuring of the interphase chromatin of lymphocytes. It includes specific staining methods and unifies different quantitative analysis methods. Our experience with the use of a television image analyzer in cytological and immunological studies made it possible to carry out research in the morphometric analysis of chromatin structure in interphase lymphocyte nuclei in genetic and viral pathologies. In our study, to characterize lymphocytes as an image-forming system by a rigorous mathematical description, we used an approach involving evaluation of the topography of the chromatin network in intact and affected lymphocytes. The data can also be digitized, which revealed significant distinctions between control and experiment. The method allows us to observe minute structural changes in chromatin, especially in eu- and heterochromatin, which were previously studied by geneticists only in chromosomes.

  18. A quantitative literature-curated gold standard for kinase-substrate pairs

    PubMed Central

    2011-01-01

    We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431

  19. Complicating Canons: A Critical Literacy Challenge to Common Core Assessment

    ERIC Educational Resources Information Center

    Peel, Anne

    2017-01-01

    The widespread adoption of the Common Core State Standards in the US has prioritized rigorous reading of complex texts. The emphasis on text complexity has led to instructional and assessment materials that constrain critical literacy practices by emphasizing quantitative features of text, such as sentence length, and a static list of text…

  20. Development of rigorous fatty acid near-infrared spectroscopy quantitation methods in support of soybean oil improvement

    USDA-ARS?s Scientific Manuscript database

    The seed of soybean (Glycine max L. Merr) is a valuable source of high quality edible oil and protein. Despite dramatic breeding gains over the past 80 years, soybean seed oil continues to be oxidatively unstable. Until recently, the majority of soybean oil underwent partial chemical hydrogenation. ...

  1. Innovative Mixed-Methods Research: Moving beyond Design Technicalities to Epistemological and Methodological Realizations

    ERIC Educational Resources Information Center

    Riazi, A. Mehdi

    2016-01-01

    Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…

  2. Beyond the Quantitative and Qualitative Divide: Research in Art Education as Border Skirmish.

    ERIC Educational Resources Information Center

    Sullivan, Graeme

    1996-01-01

    Analyzes a research project that utilizes a coherent conceptual model of art education research incorporating the demand for empirical rigor and providing for diverse interpretive frameworks. Briefly profiles the NUD*IST (Non-numerical Unstructured Data Indexing Searching and Theorizing) software system that can organize and retrieve complex…

  3. Adding Rigor to Classroom Assessment Techniques for Non-Traditional Adult Programs: A Lifecycle Improvement Approach

    ERIC Educational Resources Information Center

    Thomas, Jason E.; Hornsey, Philip E.

    2014-01-01

    Formative Classroom Assessment Techniques (CAT) have been well-established instructional tools in higher education since their exposition in the late 1980s (Angelo & Cross, 1993). A large body of literature exists surrounding the strengths and weaknesses of formative CATs. Simpson-Beck (2011) suggested insufficient quantitative evidence exists…

  4. Comprehensive methods for earlier detection and monitoring of forest decline

    Treesearch

    Jennifer Pontius; Richard Hallett

    2014-01-01

    Forested ecosystems are threatened by invasive pests, pathogens, and unusual climatic events brought about by climate change. Earlier detection of incipient forest health problems and a quantitatively rigorous assessment method is increasingly important. Here, we describe a method that is adaptable across tree species and stress agents and practical for use in the...

  5. Poverty in People with Disabilities: Indicators from the Capability Approach

    ERIC Educational Resources Information Center

    Rosano, Aldo; Mancini, Federica; Solipaca, Alessandro

    2009-01-01

    People with disabilities are particularly exposed to poor living conditions: on the one hand, they have more difficulty earning an income because of their impairments; on the other hand, conditions of poverty increase the risk of disability. However, little rigorous quantitative research has been undertaken to measure the real impact of disability on…

  6. Louis Guttman's Contributions to Classical Test Theory

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald

    2005-01-01

    This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…

  7. Using the Principles of "BIO2010" to Develop an Introductory, Interdisciplinary Course for Biology Students

    ERIC Educational Resources Information Center

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2010-01-01

    Modern biological sciences require practitioners to have increasing levels of knowledge, competence, and skills in mathematics and programming. A recent review of the science curriculum at the University of Queensland, a large, research-intensive institution in Australia, resulted in the development of a more quantitatively rigorous undergraduate…

  8. Teacher Efficacy of High School Mathematics Co-Teachers

    ERIC Educational Resources Information Center

    Rimpola, Raquel C.

    2011-01-01

    High school mathematics inclusion classes help provide all students the access to rigorous curriculum. This study provides information about the teacher efficacy of high school mathematics co-teachers. It considers the influence of the amount of collaborative planning time on the efficacy of co-teachers. A quantitative research design was used,…

  9. Quantifying trail erosion and stream sedimentation with sediment tracers

    Treesearch

    Mark S. Riedel

    2006-01-01

    The impacts of forest disturbance and roads on stream sedimentation have been rigorously investigated and documented. While historical research on turbidity and suspended sediments has been thorough, studies of stream bed sedimentation have typically relied on semi-quantitative measures such as embeddedness or marginal pool depth. To directly quantify the...

  10. A Review of Mixed Methods Research on Bullying and Peer Victimization in School

    ERIC Educational Resources Information Center

    Hong, Jun Sung; Espelage, Dorothy L.

    2012-01-01

    Recognizing the negative outcomes associated with experiences in bullying at school, educational researchers, school officials, and policy-makers have called for more rigorous research on bullying in school. Research on bullying behavior in school has primarily been examined using quantitative methods. Mixed methods research in the field of…

  11. Vibrational algorithms for quantitative crystallographic analyses of hydroxyapatite-based biomaterials: II, application to decayed human teeth.

    PubMed

    Adachi, Tetsuya; Pezzotti, Giuseppe; Yamamoto, Toshiro; Ichioka, Hiroaki; Boffelli, Marco; Zhu, Wenliang; Kanamura, Narisato

    2015-05-01

    A systematic investigation, based on highly spectrally resolved Raman spectroscopy, was undertaken to examine the efficacy of vibrational assessments in locating chemical and crystallographic fingerprints for the characterization of dental caries and the early detection of non-cavitated carious lesions. Raman results published by other authors have indicated possible approaches for this method. However, they conspicuously lacked physical insight at the molecular scale and, thus, the rigor necessary to prove the efficacy of this spectroscopy method. After solving basic physical challenges in a companion paper, we apply those solutions here in the form of newly developed Raman algorithms for practical dental research. Relevant differences in mineral crystallite (average) orientation and texture distribution were revealed for diseased enamel at different stages compared with healthy mineralized enamel. Clear spectroscopic features could be directly translated into a rigorous and quantitative classification of the crystallographic and chemical characteristics of diseased enamel structures. The Raman procedure enabled us to trace back otherwise invisible characteristics in early caries, in the translucent zone (i.e., the advancing front of the disease) and in the body of the lesion of cavitated caries.

  12. Food Security, Decision Making and the Use of Remote Sensing in Famine Early Warning Systems

    NASA Technical Reports Server (NTRS)

    Brown, Molly E.

    2008-01-01

    Famine early warning systems use remote sensing in combination with socio-economic and household food economy analysis to provide timely and rigorous information on emerging food security crises. The Famine Early Warning Systems Network (FEWS NET) is the US Agency for International Development's decision support system in 20 African countries, as well as in Guatemala, Haiti and Afghanistan. FEWS NET provides early and actionable policy guidance for the US Government and its humanitarian aid partners. As we move into an era of climate change where weather hazards will become more frequent and severe, understanding how to provide quantitative and actionable scientific information for policy makers using biophysical data is critical for an appropriate and effective response.

  13. Quantitative characterization of gold nanoparticles by size-exclusion and hydrodynamic chromatography, coupled to inductively coupled plasma mass spectrometry and quasi-elastic light scattering.

    PubMed

    Pitkänen, Leena; Montoro Bustos, Antonio R; Murphy, Karen E; Winchester, Michael R; Striegel, André M

    2017-08-18

    The physicochemical characterization of nanoparticles (NPs) is of paramount importance for tailoring and optimizing the properties of these materials as well as for evaluating the environmental fate and impact of the NPs. Characterizing the size and chemical identity of disperse NP sample populations can be accomplished by coupling size-based separation methods to physical and chemical detection methods. Informed decisions regarding the NPs can only be made, however, if the separations themselves are quantitative, i.e., if all or most of the analyte elutes from the column within the course of the experiment. We undertake here the size-exclusion chromatographic characterization of Au NPs spanning a six-fold range in mean size. The main problem which has plagued the size-exclusion chromatography (SEC) analysis of Au NPs, namely lack of quantitation accountability due to generally poor NP recovery from the columns, is overcome by carefully matching eluent formulation with the appropriate stationary phase chemistry, and by the use of on-line inductively coupled plasma mass spectrometry (ICP-MS) detection. Here, for the first time, we demonstrate the quantitative analysis of Au NPs by SEC/ICP-MS, including the analysis of a ternary NP blend. The SEC separations are contrasted to HDC/ICP-MS (HDC: hydrodynamic chromatography) separations employing the same stationary phase chemistry. Additionally, analysis of Au NPs by HDC with on-line quasi-elastic light scattering (QELS) allowed for continuous determination of NP size across the chromatographic profiles, circumventing issues related to the shedding of fines from the SEC columns. The use of chemically homogeneous reference materials with well-defined size range allowed for better assessment of the accuracy and precision of the analyses, and for a more direct interpretation of results, than would be possible employing less rigorously characterized analytes. Published by Elsevier B.V.

  14. Review of rigorous coupled-wave analysis and of homogeneous effective medium approximations for high spatial-frequency surface-relief gratings

    NASA Technical Reports Server (NTRS)

    Glytsis, Elias N.; Brundrett, David L.; Gaylord, Thomas K.

    1993-01-01

    A review of the rigorous coupled-wave analysis as applied to the diffraction of electromagnetic waves by gratings is presented. The analysis is valid for any polarization, angle of incidence, and conical diffraction. Cascaded and/or multiplexed gratings as well as material anisotropy can be incorporated under the same formalism. Small-period rectangular-groove gratings can also be modeled using approximately equivalent uniaxial homogeneous layers (effective media). The ordinary and extraordinary refractive indices of these layers depend on the grating fill factor, the refractive indices of the substrate and superstrate, and the ratio of the free-space wavelength to the grating period. Comparisons of the homogeneous effective medium approximations with the rigorous coupled-wave analysis are presented. Antireflection designs (single-layer or multilayer) using the effective medium models are presented and compared. These ultra-short-period antireflection gratings can also be used to produce soft x-rays. Comparisons of the rigorous coupled-wave analysis with experimental results on soft x-ray generation by gratings are also included.
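
    For orientation, the simplest (zeroth-order) effective-medium description of a binary rectangular-groove grating with fill factor f, ridge index n1, and groove index n2 treats the layer as uniaxial with

    ```latex
    n_{o}^{2} = f\,n_{1}^{2} + (1-f)\,n_{2}^{2},
    \qquad
    \frac{1}{n_{e}^{2}} = \frac{f}{n_{1}^{2}} + \frac{1-f}{n_{2}^{2}} .
    ```

    These textbook expressions hold only in the limit of a period much smaller than the wavelength; the dependence on the wavelength-to-period ratio mentioned in the abstract enters through higher-order corrections that are not reproduced here.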

  15. Rigorous Science: a How-To Guide

    PubMed Central

    Fang, Ferric C.

    2016-01-01

    ABSTRACT Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. PMID:27834205

  16. Coordinated Optimization of Visual Cortical Maps (I) Symmetry-based Analysis

    PubMed Central

    Reichl, Lars; Heide, Dominik; Löwel, Siegrid; Crowley, Justin C.; Kaschube, Matthias; Wolf, Fred

    2012-01-01

    In the primary visual cortex of primates and carnivores, functional architecture can be characterized by maps of various stimulus features such as orientation preference (OP), ocular dominance (OD), and spatial frequency. It is a long-standing question in theoretical neuroscience whether the observed maps should be interpreted as optima of a specific energy functional that summarizes the design principles of cortical functional architecture. A rigorous evaluation of this optimization hypothesis is particularly demanded by recent evidence that the functional architecture of orientation columns precisely follows species invariant quantitative laws. Because it would be desirable to infer the form of such an optimization principle from the biological data, the optimization approach to explain cortical functional architecture raises the following questions: i) What are the genuine ground states of candidate energy functionals and how can they be calculated with precision and rigor? ii) How do differences in candidate optimization principles impact on the predicted map structure and conversely what can be learned about a hypothetical underlying optimization principle from observations on map structure? iii) Is there a way to analyze the coordinated organization of cortical maps predicted by optimization principles in general? To answer these questions we developed a general dynamical systems approach to the combined optimization of visual cortical maps of OP and another scalar feature such as OD or spatial frequency preference. From basic symmetry assumptions we obtain a comprehensive phenomenological classification of possible inter-map coupling energies and examine representative examples. We show that each individual coupling energy leads to a different class of OP solutions with different correlations among the maps such that inferences about the optimization principle from map layout appear viable. We systematically assess whether quantitative laws resembling experimental observations can result from the coordinated optimization of orientation columns with other feature maps. PMID:23144599

  17. Clinically Relevant Subregions of Articular Cartilage of the Hip for Analysis and Reporting Quantitative Magnetic Resonance Imaging: A Technical Note.

    PubMed

    Surowiec, Rachel K; Lucas, Erin P; Wilson, Katharine J; Saroki, Adriana J; Ho, Charles P

    2014-01-01

    Before quantitative imaging techniques can become clinically valuable, the method, and more specifically the regions in which these values are located and reported, should be standardized to enable reproducible comparisons across centers and longitudinal follow-up of individual patients. The purpose of this technical note is to describe a rigorous and reproducible method of locating, analyzing, and reporting quantitative MRI values in hip articular cartilage with an approach that is consistent with current orthopedic literature. To demonstrate this localization and documentation, 3 patients (age, 23 ± 5.1 years; 2 males, 1 female) who presented with symptomatic mixed-type femoroacetabular impingement (α angle, 63.3° ± 2.1°; center edge angle, 39° ± 4.2°) were evaluated with T2-mapping at 3 T MRI prior to hip arthroscopy. Manual segmentation was performed and cartilage of the acetabulum and femur was divided into 12 subregions adapted from the geographic zone method. Bone landmarks in the acetabulum and femur, identifiable both in arthroscopy and MR images, were manually selected and the coordinates exported for division of cartilage. Mean T2 values in each zone are presented. The current work outlines a standardized system to locate and describe quantitative mapping values that could aid in surgical decision making, planning, and the noninvasive longitudinal follow-up of implemented cartilage preservation and restoration techniques.

  18. Optimising hydrogen peroxide measurement in exhaled breath condensate.

    PubMed

    Brooks, Wendy M; Lash, Heath; Kettle, Anthony J; Epton, Michael J

    2006-01-01

    Exhaled breath condensate (EBC) analysis has been proposed as a non-invasive method of assessing airway pathology. A number of substances, including hydrogen peroxide (H2O2), have been measured in EBC, without adequate published details of validation and optimisation. The aim was to explore factors that affect accurate quantitation of H2O2 in EBC. H2O2 was measured in EBC samples using fluorometry with 4-hydroxyphenylacetic acid. A number of factors that might alter quantitation were studied including pH and buffering conditions, reagent storage, and assay temperature. Standard curve slope was significantly altered by pH, leading to a potential difference in H2O2 quantification of up to 42%. These differences were resolved by increasing the buffering capacity of the reaction mix. H2O2 added to EBC remained stable for 1 h when stored on ice. The assay was unaffected by freezing assay reagents. The limit of detection for H2O2 ranged from 3.4 nM to 8.8 nM depending on the buffer used. The reagents required for this assay can be stored for several months, allowing valuable consistency in longitudinal studies. The quantitation of H2O2 in EBC is pH-dependent but increasing assay buffering reduces this effect. Sensitive, reproducible quantitation of H2O2 in EBC requires rigorous optimisation.

  19. Signal and noise modeling in confocal laser scanning fluorescence microscopy.

    PubMed

    Herberich, Gerlind; Windoffer, Reinhard; Leube, Rudolf E; Aach, Til

    2012-01-01

    Fluorescence confocal laser scanning microscopy (CLSM) has revolutionized imaging of subcellular structures in biomedical research by enabling the acquisition of 3D time-series of fluorescently-tagged proteins in living cells, hence forming the basis for an automated quantification of their morphological and dynamic characteristics. Due to the inherently weak fluorescence, CLSM images exhibit a low SNR. We present a novel model for the transfer of signal and noise in CLSM that is both theoretically sound as well as corroborated by a rigorous analysis of the pixel intensity statistics via measurement of the 3D noise power spectra, signal-dependence and distribution. Our model provides a better fit to the data than previously proposed models. Further, it forms the basis for (i) the simulation of the CLSM imaging process indispensable for the quantitative evaluation of CLSM image analysis algorithms, (ii) the application of Poisson denoising algorithms and (iii) the reconstruction of the fluorescence signal.
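
    A common generic form of signal-dependent CLSM noise, scaled Poisson photon noise plus additive Gaussian read noise, can be simulated as below to check that pixel variance grows roughly linearly with the mean; this is a stand-in sketch under those generic assumptions, not the model proposed in the record above.

    ```python
    # Generic signal-dependent noise sketch: gain * Poisson(signal) + offset + Gaussian.
    import numpy as np

    rng = np.random.default_rng(2)
    gain, offset, read_sigma = 2.5, 100.0, 3.0

    photons = np.tile(np.linspace(0.0, 50.0, 256), (256, 1))   # expected counts per pixel
    image = gain * rng.poisson(photons) + offset + rng.normal(0.0, read_sigma, photons.shape)

    # For this model, variance ~ gain * (mean - offset) + read_sigma**2.
    means, variances = image.mean(axis=0), image.var(axis=0, ddof=1)
    slope, intercept = np.polyfit(means, variances, 1)
    print(f"fitted slope ~ {slope:.2f} (true gain {gain})")
    ```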

  20. Qualitative Evaluation Methods in Ethics Education: A Systematic Review and Analysis of Best Practices.

    PubMed

    Watts, Logan L; Todd, E Michelle; Mulhearn, Tyler J; Medeiros, Kelsey E; Mumford, Michael D; Connelly, Shane

    2017-01-01

    Although qualitative research offers some unique advantages over quantitative research, qualitative methods are rarely employed in the evaluation of ethics education programs and are often criticized for a lack of rigor. This systematic review investigated the use of qualitative methods in studies of ethics education. Following a review of the literature in which 24 studies were identified, each study was coded based on 16 best practices characteristics in qualitative research. General thematic analysis and grounded theory were found to be the dominant approaches used. Researchers are effectively executing a number of best practices, such as using direct data sources, structured data collection instruments, non-leading questioning, and expert raters. However, other best practices were rarely present in the courses reviewed, such as collecting data using multiple sources, methods, raters, and timepoints, evaluating reliability, and employing triangulation analyses to assess convergence. Recommendations are presented for improving future qualitative research studies in ethics education.

  1. Quantitative analysis of chromosomal CGH in human breast tumors associates copy number abnormalities with p53 status and patient survival

    PubMed Central

    Jain, Ajay N.; Chin, Koei; Børresen-Dale, Anne-Lise; Erikstein, Bjorn K.; Lonning, Per Eystein; Kaaresen, Rolf; Gray, Joe W.

    2001-01-01

    We present a general method for rigorously identifying correlations between variations in large-scale molecular profiles and outcomes and apply it to chromosomal comparative genomic hybridization data from a set of 52 breast tumors. We identify two loci where copy number abnormalities are correlated with poor survival outcome (gain at 8q24 and loss at 9q13). We also identify a relationship between abnormalities at two loci and the mutational status of p53. Gain at 8q24 and loss at 5q15-5q21 are linked with mutant p53. The 9q and 5q losses suggest the possibility of gene products involved in breast cancer progression. The analytical techniques are general and also are applicable to the analysis of array-based expression data. PMID:11438741
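
    As a generic stand-in for this kind of locus-by-outcome screening, a permutation test of a binary copy-number event against an outcome can be sketched as follows (synthetic, uncensored data; not the paper's method):

    ```python
    # Permutation test: is survival associated with a copy-number gain at one locus?
    # (Synthetic, uncensored data for illustration only.)
    import numpy as np

    rng = np.random.default_rng(4)
    n = 52
    gain = rng.integers(0, 2, size=n).astype(bool)                  # event at the locus
    survival = rng.exponential(scale=np.where(gain, 30.0, 60.0))    # months

    def diff(surv, grp):
        return surv[grp].mean() - surv[~grp].mean()

    observed = diff(survival, gain)
    perms = np.array([diff(survival, rng.permutation(gain)) for _ in range(10_000)])
    p_value = np.mean(np.abs(perms) >= abs(observed))
    print(f"observed difference = {observed:.1f} months, permutation p = {p_value:.4f}")
    ```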

  2. Experimental Design and Bioinformatics Analysis for the Application of Metagenomics in Environmental Sciences and Biotechnology.

    PubMed

    Ju, Feng; Zhang, Tong

    2015-11-03

    Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights of microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation.

  3. Reexamining protein–protein and protein–solvent interactions from Kirkwood-Buff analysis of light scattering in multi-component solutions

    PubMed Central

    Blanco, Marco A.; Sahin, Erinc; Li, Yi; Roberts, Christopher J.

    2011-01-01

    The classic analysis of Rayleigh light scattering (LS) is re-examined for multi-component protein solutions, within the context of Kirkwood-Buff (KB) theory as well as a more generalized canonical treatment. Significant differences arise when traditional treatments that approximate constant pressure and neglect concentration fluctuations in one or more (co)solvent∕co-solute species are compared with more rigorous treatments at constant volume and with all species free to fluctuate. For dilute solutions, it is shown that LS can be used to rigorously and unambiguously obtain values for the osmotic second virial coefficient (B22), in contrast with recent arguments regarding protein interactions deduced from LS experiments. For more concentrated solutions, it is shown that conventional analysis over(under)-estimates the magnitude of B22 for significantly repulsive(attractive) conditions, and that protein-protein KB integrals (G22) are the more relevant quantity obtainable from LS. Published data for α–chymotrypsinogen A and a series of monoclonal antibodies at different pH and salt concentrations are re-analyzed using traditional and new treatments. The results illustrate that while traditional analysis may be sufficient if one is interested in only the sign of B22 or G22, the quantitative values can be significantly in error. A simple approach is illustrated for determining whether protein concentration (c2) is sufficiently dilute for B22 to apply, and for correcting B22 values from traditional LS regression at higher c2 values. The apparent molecular weight M2, app obtained from LS is shown to generally not be equal to the true molecular weight, with the differences arising from a combination of protein-solute and protein-cosolute interactions that may, in principle, also be determined from LS. PMID:21682538
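
    For reference, the dilute-limit relation underlying extraction of the osmotic second virial coefficient from static light scattering is, in its classic Debye/Zimm textbook form (not the generalized Kirkwood-Buff treatment discussed above),

    ```latex
    \frac{K c_{2}}{R_{\theta}} \;=\; \frac{1}{M_{2}} \;+\; 2\,B_{22}\,c_{2} \;+\; \mathcal{O}(c_{2}^{2}),
    ```

    where K is the optical constant, c2 the protein mass concentration, R_theta the excess Rayleigh ratio, and M2 the protein molecular weight; the record above examines when this dilute approximation breaks down and G22 becomes the more relevant quantity.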

  4. Proficient vs. Prepared: Disparities between State Tests and the 2013 National Assessment of Educational Progress (NAEP)

    ERIC Educational Resources Information Center

    Achieve, Inc., 2015

    2015-01-01

    Today's economy demands that all young people develop high-level literacy, quantitative reasoning, problem solving, communication, and collaboration skills, all grounded in a rigorous and content-rich K-12 curriculum. Acquiring these skills ensures that high school graduates are academically prepared to pursue the future of their choosing.…

  5. A Case Study of Resources Management Planning with Multiple Objectives and Projects

    Treesearch

    David L. Peterson; David G. Silsbee; Daniel L. Schmoldt

    1995-01-01

    Each National Park Service unit in the United States produces a resources management plan (RMP) every four years or less. The plans commit budgets and personnel to specific projects for four years, but they are prepared with little quantitative and analytical rigor and without formal decisionmaking tools. We have previously described a multiple objective planning...

  6. Strategic and tactical planning for managing national park resources

    Treesearch

    Daniel L. Schmoldt; David L. Peterson

    2001-01-01

    Each National Park Service unit in the United States produces a resource management plan (RMP) every four years or less. These plans constitute a strategic agenda for a park. Later, tactical plans commit budgets and personnel to specific projects over the planning horizon. Yet neither planning stage incorporates much quantitative or analytical rigor, and both are devoid of...

  7. Breaking into Public Policy Circles for the Benefit of Underserved Communities

    ERIC Educational Resources Information Center

    Flores, Stella M.

    2017-01-01

    Stella M. Flores writes about how she learned to participate in the American political process through lessons from her immigrant family. As a quantitative scholar, she documents the commitment to rigorous, evidence-based research on equity noting that not all datasets are without politics or bias. From this perspective, the story of the Latino in…

  8. Quantifying falsifiability of scientific theories

    NASA Astrophysics Data System (ADS)

    Nemenman, Ilya

    I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor and allows some long-running arguments about the validity of scientific theories to be transformed from philosophical discussions into rigorous mathematical calculations.
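
    The Bayesian model selection machinery invoked here is standard: in textbook form, the evidence for a model M is the marginal likelihood, and two models are compared through the Bayes factor,

    ```latex
    P(D \mid M) \;=\; \int P(D \mid \theta, M)\, P(\theta \mid M)\, d\theta ,
    \qquad
    K_{12} \;=\; \frac{P(D \mid M_{1})}{P(D \mid M_{2})} .
    ```

    A tightly constrained (more falsifiable) model spreads its prior over fewer possible data sets, so it earns higher evidence when it does fit; this is the quantitative Occam's razor the abstract refers to.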

  9. A Historical Survey of the Contributions of Francois-Joseph Servois to the Development of the Rigorous Calculus

    ERIC Educational Resources Information Center

    Petrilli, Salvatore John, Jr.

    2009-01-01

    Historians of mathematics considered the nineteenth century to be the Golden Age of mathematics. During this time period many areas of mathematics, such as algebra and geometry, were being placed on rigorous foundations. Another area of mathematics which experienced fundamental change was analysis. The drive for rigor in calculus began in 1797…

  10. Determining if disease management saves money: an introduction to meta-analysis.

    PubMed

    Linden, Ariel; Adams, John L

    2007-06-01

    Disease management (DM) programmes have long been promoted as a major medical cost-saving mechanism, even though the scant research that exists on the topic has provided conflicting results. In a 2004 literature review, the Congressional Budget Office stated that 'there is insufficient evidence to conclude that disease management programs can generally reduce the overall cost of health care services'. To address this question more accurately, a meta-analysis was warranted. Meta-analysis is the quantitative technique used to pool the results of many studies on the same topic and summarize them statistically. This method is also quite suitable for individual DM firms to assess whether their programmes are effective at the aggregate level. This paper describes the elements of a rigorous meta-analytic process and discusses potential biases. A hypothetical DM organization is then evaluated with a specific emphasis on medical cost-savings, simulating a case in which different populations are served, evaluation methodologies are employed, and diseases are managed.
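
    To make the pooling step concrete, the sketch below performs a standard DerSimonian-Laird random-effects combination of per-study effect estimates; the study effects and variances are hypothetical placeholders, and this is generic meta-analytic arithmetic rather than the specific procedure applied to the hypothetical DM organization in the paper.

        import numpy as np

        # Hypothetical per-study effects (e.g., cost difference per member) and variances.
        effects = np.array([-120.0, -45.0, 30.0, -80.0, -10.0])
        variances = np.array([400.0, 900.0, 2500.0, 1600.0, 625.0])

        # Fixed-effect (inverse-variance) weights and pooled estimate.
        w_fixed = 1.0 / variances
        pooled_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)

        # Cochran's Q and the DerSimonian-Laird estimate of between-study variance tau^2.
        q = np.sum(w_fixed * (effects - pooled_fixed) ** 2)
        df = len(effects) - 1
        c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
        tau2 = max(0.0, (q - df) / c)

        # Random-effects weights, pooled estimate, and its standard error.
        w_random = 1.0 / (variances + tau2)
        pooled_random = np.sum(w_random * effects) / np.sum(w_random)
        se_random = np.sqrt(1.0 / np.sum(w_random))

        print(f"pooled (random effects) = {pooled_random:.1f} +/- {1.96 * se_random:.1f}")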

  11. Handwriting Examination: Moving from Art to Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, K.H.; Hanlen, R.C.; Manzolillo, P.A.

    In this document, we present a method for validating the premises and methodology of forensic handwriting examination. This method is intuitively appealing because it relies on quantitative measurements currently used qualitatively by FDEs in making comparisons, and it is scientifically rigorous because it exploits the power of multivariate statistical analysis. This approach uses measures of both central tendency and variation to construct a profile for a given individual. (Central tendency and variation are important for characterizing an individual's writing and both are currently used by FDEs in comparative analyses). Once constructed, different profiles are then compared for individuality using cluster analysis; they are grouped so that profiles within a group cannot be differentiated from one another based on the measured characteristics, whereas profiles between groups can. The cluster analysis procedure used here exploits the power of multivariate hypothesis testing. The result is not only a profile grouping but also an indication of statistical significance of the groups generated.

  12. Development of a theoretical framework for analyzing cerebrospinal fluid dynamics

    PubMed Central

    Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy

    2009-01-01

    Background To date, hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure-volume models and electric circuit analogs introduced pressure into volume conservation, but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Methods Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach is able to directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and progression to disorder. Results Several examples of meaningful intracranial control volumes and the particular measurement sets needed for the analysis are discussed. Conclusion Control volume analysis provides a framework to guide the type and location of measurements and also a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652
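
    For readers unfamiliar with the technique, the integral mass balance for a fixed control volume CV bounded by the surface CS is the standard starting point (generic fluid mechanics notation, assumed here rather than quoted from the paper):

        \frac{d}{dt}\int_{CV} \rho \, dV \;+\; \int_{CS} \rho\, (\mathbf{v}\cdot\hat{\mathbf{n}})\, dA \;=\; 0

    Analogous balances hold for momentum and energy; clinically measured flows, volumes, and pressures then enter as the surface and volume terms of whichever intracranial control volume is chosen.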

  13. Dynamic Characteristics of Micro-Beams Considering the Effect of Flexible Supports

    PubMed Central

    Zhong, Zuo-Yang; Zhang, Wen-Ming; Meng, Guang

    2013-01-01

    For MEMS beams with flexible supports, the boundaries are normally assumed to allow small deflections and moments. These non-ideal boundary conditions have a significant effect on the qualitative dynamical behavior. In this paper, by employing the principle of energy equivalence, rigorous theoretical solutions for the tangential and rotational equivalent stiffness are derived based on Boussinesq's and Cerruti's displacement equations. The non-dimensional partial differential equation of motion, together with the coupled boundary conditions, is solved analytically using the method of multiple time scales. The closed-form solution provides direct insight into the relationship between the boundary conditions and the vibration characteristics of the dynamic system, in which resonance frequencies increase with the nonlinear mechanical spring effect but decrease with the effect of flexible supports. The obtained frequencies and mode shapes are compared with the cases of ideal boundary conditions, and the differences between them are contrasted on frequency response curves. The influence of the support material properties on the equivalent stiffness and resonance frequency shift is also discussed. It is demonstrated that the flexible-support boundary conditions have a significant effect on the rigorous quantitative dynamical analysis of MEMS beams. Moreover, the proposed analytical solutions are in good agreement with those obtained from finite element analyses.

  14. Qualitative research methods: key features and insights gained from use in infection prevention research.

    PubMed

    Forman, Jane; Creswell, John W; Damschroder, Laura; Kowalski, Christine P; Krein, Sarah L

    2008-12-01

    Infection control professionals and hospital epidemiologists are accustomed to using quantitative research. Although quantitative studies are extremely important in the field of infection control and prevention, often they cannot help us explain why certain factors affect the use of infection control practices and identify the underlying mechanisms through which they do so. Qualitative research methods, which use open-ended techniques, such as interviews, to collect data and nonstatistical techniques to analyze it, provide detailed, diverse insights of individuals, useful quotes that bring a realism to applied research, and information about how different health care settings operate. Qualitative research can illuminate the processes underlying statistical correlations, inform the development of interventions, and show how interventions work to produce observed outcomes. This article describes the key features of qualitative research and the advantages that such features add to existing quantitative research approaches in the study of infection control. We address the goal of qualitative research, the nature of the research process, sampling, data collection and analysis, validity, generalizability of findings, and presentation of findings. Health services researchers are increasingly using qualitative methods to address practical problems by uncovering interacting influences in complex health care environments. Qualitative research methods, applied with expertise and rigor, can contribute important insights to infection prevention efforts.

  15. A hybrid method for determination of the acoustic impedance of an unflanged cylindrical duct for multimode wave

    NASA Astrophysics Data System (ADS)

    Snakowska, Anna; Jurkiewicz, Jerzy; Gorazd, Łukasz

    2017-05-01

    The paper presents a derivation of the impedance matrix based on the rigorous solution of the wave equation obtained by the Wiener-Hopf technique for a semi-infinite unflanged cylindrical duct. The impedance matrix allows, in turn, calculation of the acoustic impedance along the duct and, as a special case, the radiation impedance. The analysis is carried out for a multimode incident wave, accounting for mode coupling at the duct outlet not only qualitatively but also quantitatively for a selected source operating inside. The quantitative evaluation of the acoustic impedance requires setting the mode amplitudes, which were obtained by applying the mode decomposition method to far-field pressure radiation measurements and theoretical formulae for single-mode directivity characteristics of an unflanged duct. Calculation of the acoustic impedance for a non-uniform distribution of the sound pressure and sound velocity over a duct cross section requires determination of the acoustic power transmitted along, and radiated from, the duct. In the paper, the impedance matrix, the power, and the acoustic impedance are derived as functions of the Helmholtz number and the distance from the outlet.

  16. Necromechanics: Death-induced changes in the mechanical properties of human tissues.

    PubMed

    Martins, Pedro A L S; Ferreira, Francisca; Natal Jorge, Renato; Parente, Marco; Santos, Agostinho

    2015-05-01

    After death, the development of rigor mortis, characterized by body stiffening, is one of the most evident changes that occur in the body. In this work, the development of rigor mortis was assessed with a skinfold caliper in human cadavers and in living people by measuring the deformation of the biceps brachii muscle in response to the force applied by the device. Additionally, to simulate the measurements with the finite element method, a two-dimensional model of an arm section was used. The experimental procedure showed a decrease in deformation with increasing postmortem time, which corresponds to an increase in rigidity. As expected, the deformations for the living subjects were higher. The finite element analysis showed a correlation between the c1 parameter of the neo-Hookean model and postmortem time in the 4- to 8-h interval; this was accomplished by adjusting the c1 material parameter to reproduce the measured experimental displacement. Despite being a preliminary study, the results show that combining the proposed experimental procedure with a numerical technique can be very useful in studying the postmortem mechanical modifications of human tissues. Moreover, the use of data from living subjects allows the time of death to be estimated, paving the way to establish this process as an alternative to existing techniques. The approach constitutes a portable, non-invasive method of estimating the postmortem interval from direct quantitative measurements made with a skinfold caliper. The tools and methods described can be used to investigate the subject and to gain epidemiologic knowledge of the rigor mortis phenomenon. © IMechE 2015.
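
    For context on the c1 parameter mentioned above, the incompressible neo-Hookean strain-energy density is conventionally written as below (standard form, not a formula quoted from the paper):

        W = c_1\,(I_1 - 3), \qquad I_1 = \lambda_1^2 + \lambda_2^2 + \lambda_3^2

    where the \lambda_i are the principal stretches; a larger fitted c_1 corresponds to a stiffer tissue, consistent with the decreasing deformation observed at longer postmortem times.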

  17. Isotopic exchange during derivatization of platelet activating factor for gas chromatography-mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haroldsen, P.E.; Gaskell, S.J.; Weintraub, S.T.

    1991-04-01

    One approach to the quantitative analysis of platelet activating factor (PAF, 1-O-alkyl-2-acetyl-sn-glycerol-3-phosphocholine; also referred to as AGEPC, alkyl glyceryl ether phosphocholine) is hydrolytic removal of the phosphocholine group and conversion to an electron-capturing derivative for gas chromatography-negative ion mass spectrometry. (2H3)Acetyl-AGEPC has been commonly employed as an internal standard. When 1-hexadecyl-2-(2H3)acetyl glycerol (obtained by enzymatic hydrolysis of (2H3)-C16:0 AGEPC) is treated with pentafluorobenzoyl chloride at 120 degrees C, the resulting 3-pentafluorobenzoate derivative shows extensive loss of the deuterium label. This exchange is evidently acid-catalyzed since derivatization of 1-hexadecyl-2-acetyl glycerol under the same conditions in the presence of a trace of 2HCl results in the incorporation of up to three deuterium atoms. Isotope exchange can be avoided if the reaction is carried out at low temperature in the presence of base. Direct derivatization of (2H3)-C16:0 AGEPC by treatment with pentafluorobenzoyl chloride or heptafluorobutyric anhydride also results in loss of the deuterium label. The use of (13C2)-C16:0 AGEPC as an internal standard is recommended for rigorous quantitative analysis.

  18. The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.

    PubMed

    Lash, Timothy L

    2017-09-15

    In the last few years, stakeholders in the scientific community have raised alarms about a perceived lack of reproducibility of scientific results. In reaction, guidelines for journals have been promulgated and grant applicants have been asked to address the rigor and reproducibility of their proposed projects. Neither solution addresses a primary culprit, which is the culture of null hypothesis significance testing that dominates statistical analysis and inference. In an innovative research enterprise, selection of results for further evaluation based on null hypothesis significance testing is doomed to yield a low proportion of reproducible results and a high proportion of effects that are initially overestimated. In addition, the culture of null hypothesis significance testing discourages quantitative adjustments to account for systematic errors and quantitative incorporation of prior information. These strategies would otherwise improve reproducibility and have not been previously proposed in the widely cited literature on this topic. Without discarding the culture of null hypothesis significance testing and implementing these alternative methods for statistical analysis and inference, all other strategies for improving reproducibility will yield marginal gains at best. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Toward a quantitative account of pitch distribution in spontaneous narrative: Method and validation

    PubMed Central

    Matteson, Samuel E.; Streit Olness, Gloria; Caplow, Nancy J.

    2013-01-01

    Pitch is well-known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by its application to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions imbedded in realistic time series that drift and include pitch breaks, but the protocol also reveals that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the “e-la”) superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400
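
    The cents parameterization of the pitch interval H used above is the standard musical definition; assuming a reference frequency f_ref (for example, the gamut root discussed in the abstract), it reads:

        H = 1200 \,\log_2\!\left(\frac{f}{f_{\mathrm{ref}}}\right)

    so that an octave corresponds to 1200 cents and an equal-tempered semitone to 100 cents.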

  20. Normalizing the causality between time series.

    PubMed

    Liang, X San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
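
    For readers who want to experiment with the idea, the sketch below implements a bivariate estimator of the (un-normalized) information flow from one series to another in the spirit of Liang's formalism; the exact maximum-likelihood expression used here is an assumption to be checked against the original derivation, and the coupled autoregressive example is purely illustrative.

        import numpy as np

        def liang_information_flow(x1, x2, dt=1.0):
            # Estimate of the information flow T(2->1) from series x2 to series x1.
            # Assumed form of the bivariate maximum-likelihood estimator; a value
            # near zero is read as "no causal influence" in this framework.
            x1 = np.asarray(x1, dtype=float)
            x2 = np.asarray(x2, dtype=float)
            dx1 = (x1[1:] - x1[:-1]) / dt          # Euler-forward estimate of dx1/dt
            x1s, x2s = x1[:-1], x2[:-1]
            c11 = np.var(x1s, ddof=1)
            c22 = np.var(x2s, ddof=1)
            c12 = np.cov(x1s, x2s)[0, 1]
            c1d1 = np.cov(x1s, dx1)[0, 1]
            c2d1 = np.cov(x2s, dx1)[0, 1]
            num = c11 * c12 * c2d1 - c12 ** 2 * c1d1
            den = c11 ** 2 * c22 - c11 * c12 ** 2
            return num / den

        # Toy example: x2 drives x1 through a lagged autoregressive coupling.
        rng = np.random.default_rng(0)
        n = 20000
        x1 = np.zeros(n)
        x2 = np.zeros(n)
        for t in range(n - 1):
            x2[t + 1] = 0.7 * x2[t] + rng.normal(scale=0.5)
            x1[t + 1] = 0.5 * x1[t] + 0.4 * x2[t] + rng.normal(scale=0.5)

        print("T(2->1) =", liang_information_flow(x1, x2))
        print("T(1->2) =", liang_information_flow(x2, x1))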

  1. A single-cell spiking model for the origin of grid-cell patterns

    PubMed Central

    Kempter, Richard

    2017-01-01

    Spatial cognition in mammals is thought to rely on the activity of grid cells in the entorhinal cortex, yet the fundamental principles underlying the origin of grid-cell firing are still debated. Grid-like patterns could emerge via Hebbian learning and neuronal adaptation, but current computational models remained too abstract to allow direct confrontation with experimental data. Here, we propose a single-cell spiking model that generates grid firing fields via spike-rate adaptation and spike-timing dependent plasticity. Through rigorous mathematical analysis applicable in the linear limit, we quantitatively predict the requirements for grid-pattern formation, and we establish a direct link to classical pattern-forming systems of the Turing type. Our study lays the groundwork for biophysically-realistic models of grid-cell activity. PMID:28968386

  2. [Substantiation of medical-engineering specifications for polymethyl siloxane removal by life support systems].

    PubMed

    Pakhomova, A A; Aksel'-Rubinshteĭn, V Z; Mikos, K N; Nikitin, E I

    2009-01-01

    Analysis of experimental data on the quantitative and qualitative chemical make-up of the air in the orbital station Mir and the International Space Station (ISS) showed a permanent presence of silicon. The main source of silicon contaminants appears to be the variety of polymethyl siloxane liquids and siloxane coatings of electronics. The article describes the volatile silicon contaminants detected in space station air. To control silicon concentrations, the existing air purification system needs to be augmented with carbons whose micropore entrances are larger than the diameters of silicon-containing molecules. It is also important to refine the technology for synthesizing polymethyl siloxane liquids so as to reduce the emission of volatile admixtures, and to rigorously observe the pre-flight off-gassing requirements, with special concern for silicon coatings.

  3. Normalizing the causality between time series

    NASA Astrophysics Data System (ADS)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.

  4. A primer on thermodynamic-based models for deciphering transcriptional regulatory logic.

    PubMed

    Dresch, Jacqueline M; Richards, Megan; Ay, Ahmet

    2013-09-01

    A rigorous analysis of transcriptional regulation at the DNA level is crucial to the understanding of many biological systems. Mathematical modeling has offered researchers a new approach to understanding this central process. In particular, thermodynamic-based modeling represents the most biophysically informed approach aimed at connecting DNA level regulatory sequences to the expression of specific genes. The goal of this review is to give biologists a thorough description of the steps involved in building, analyzing, and implementing a thermodynamic-based model of transcriptional regulation. The data requirements for this modeling approach are described, the derivation for a specific regulatory region is shown, and the challenges and future directions for the quantitative modeling of gene regulation are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
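
    As a toy illustration of the thermodynamic-based framework described above, the sketch below computes the equilibrium occupancy of a single regulatory site and a resulting expression read-out; the dissociation constant, basal level, and linear read-out are hypothetical choices, not the specific model derived in the review.

        import numpy as np

        def occupancy(tf_conc, k_d):
            # Equilibrium probability that a single site is bound by a transcription factor.
            # The bound state carries statistical weight [TF]/K_d relative to the unbound
            # state, so P(bound) = ([TF]/K_d) / (1 + [TF]/K_d).
            w = tf_conc / k_d
            return w / (1.0 + w)

        def expression(tf_conc, k_d=50.0, basal=0.05, max_rate=1.0):
            # Hypothetical read-out: expression interpolates between basal and maximal
            # rates in proportion to site occupancy.
            p = occupancy(tf_conc, k_d)
            return basal + (max_rate - basal) * p

        for c in [0.0, 10.0, 50.0, 200.0, 1000.0]:
            print(f"[TF] = {c:7.1f}  ->  predicted expression = {expression(c):.3f}")

    Multi-site thermodynamic models follow the same logic: each configuration of bound factors gets a Boltzmann weight, and expression is a weighted sum over the activating configurations divided by the partition function over all configurations.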

  5. Adding to the Truth of the Matter: The Case for Qualitative Research.

    PubMed

    Florczak, Kristine L

    2017-10-01

    The purpose of this column is to discuss the impact that qualitative research has on translational research, whose aim is to improve the health and well-being of individuals. To that end, rigorous qualitative research is examined; translational research is entertained and the manner in which qualitative research can be a co-equal partner with quantitative research is proposed.

  6. Quantitative Predictions of Binding Free Energy Changes in Drug-Resistant Influenza Neuraminidase

    DTIC Science & Technology

    2012-08-30

    We study mutations in influenza neuraminidase (NA) that confer drug resistance to two antiviral drugs, zanamivir and oseltamivir, while adopting conformations that are virtually identical to WT [10]. We augmented molecular dynamics (MD) with Hamiltonian Replica Exchange to obtain molecular simulations that rigorously model the microscopic structure and thermodynamics of these systems.

  7. The Specificity of Observational Studies in Physical Activity and Sports Sciences: Moving Forward in Mixed Methods Research and Proposals for Achieving Quantitative and Qualitative Symmetry.

    PubMed

    Anguera, M Teresa; Camerino, Oleguer; Castañer, Marta; Sánchez-Algarra, Pedro; Onwuegbuzie, Anthony J

    2017-01-01

    Mixed methods studies are being applied to an increasing diversity of fields. In this paper, we discuss the growing use, and enormous potential, of mixed methods research in the field of sport and physical activity. A second aim is to contribute to strengthening the characteristics of mixed methods research by showing how systematic observation offers rigor within a flexible framework that can be applied to a wide range of situations. Observational methodology is characterized by high scientific rigor and flexibility throughout its different stages and allows the objective study of spontaneous behavior in natural settings, with no external influence. Mixed methods researchers need to make bold yet thoughtful decisions regarding both substantive and procedural issues. We present three fundamental and complementary ideas to guide researchers in this respect: we show why studies of sport and physical activity that use a mixed methods research approach should be included in the field of mixed methods research, we highlight the numerous possibilities offered by observational methodology in this field through the transformation of descriptive data into quantifiable code matrices, and we discuss possible solutions for achieving true integration of qualitative and quantitative findings.

  8. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    PubMed Central

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  9. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-12-15

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists' goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists--as opposed to a completely automatic computer interpretation--focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous--from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects--collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  10. The Relationship Between Professional Burnout and Quality and Safety in Healthcare: A Meta-Analysis.

    PubMed

    Salyers, Michelle P; Bonfils, Kelsey A; Luther, Lauren; Firmin, Ruth L; White, Dominique A; Adams, Erin L; Rollins, Angela L

    2017-04-01

    Healthcare provider burnout is considered a factor in quality of care, yet little is known about the consistency and magnitude of this relationship. This meta-analysis examined relationships between provider burnout (emotional exhaustion, depersonalization, and reduced personal accomplishment) and the quality (perceived quality, patient satisfaction) and safety of healthcare. Publications were identified through targeted literature searches in Ovid MEDLINE, PsycINFO, Web of Science, CINAHL, and ProQuest Dissertations & Theses through March of 2015. Two coders extracted data to calculate effect sizes and potential moderators. We calculated Pearson's r for all independent relationships between burnout and quality measures, using a random effects model. Data were assessed for potential impact of study rigor, outliers, and publication bias. Eighty-two studies including 210,669 healthcare providers were included. Statistically significant negative relationships emerged between burnout and quality (r = -0.26, 95 % CI [-0.29, -0.23]) and safety (r = -0.23, 95 % CI [-0.28, -0.17]). In both cases, the negative relationship implied that greater burnout among healthcare providers was associated with poorer-quality healthcare and reduced safety for patients. Moderators for the quality relationship included dimension of burnout, unit of analysis, and quality data source. Moderators for the relationship between burnout and safety were safety indicator type, population, and country. Rigor of the study was not a significant moderator. This is the first study to systematically, quantitatively analyze the links between healthcare provider burnout and healthcare quality and safety across disciplines. Provider burnout shows consistent negative relationships with perceived quality (including patient satisfaction), quality indicators, and perceptions of safety. Though the effects are small to medium, the findings highlight the importance of effective burnout interventions for healthcare providers. Moderator analyses suggest contextual factors to consider for future study.

  11. Statistical issues in the design, conduct and analysis of two large safety studies.

    PubMed

    Gaffney, Michael

    2016-10-01

    The emergence, post approval, of serious medical events, which may be associated with the use of a particular drug or class of drugs, is an important public health and regulatory issue. The best method to address this issue is through a large, rigorously designed safety study. Therefore, it is important to elucidate the statistical issues involved in these large safety studies. Two such studies are PRECISION and EAGLES. PRECISION is the primary focus of this article. PRECISION is a non-inferiority design with a clinically relevant non-inferiority margin. Statistical issues in the design, conduct and analysis of PRECISION are discussed. Quantitative and clinical aspects of the selection of the composite primary endpoint, the determination and role of the non-inferiority margin in a large safety study and the intent-to-treat and modified intent-to-treat analyses in a non-inferiority safety study are shown. Protocol changes that were necessary during the conduct of PRECISION are discussed from a statistical perspective. Issues regarding the complex analysis and interpretation of the results of PRECISION are outlined. EAGLES is presented as a large, rigorously designed safety study when a non-inferiority margin was not able to be determined by a strong clinical/scientific method. In general, when a non-inferiority margin is not able to be determined, the width of the 95% confidence interval is a way to size the study and to assess the cost-benefit of relative trial size. A non-inferiority margin, when able to be determined by a strong scientific method, should be included in a large safety study. Although these studies could not be called "pragmatic," they are examples of best real-world designs to address safety and regulatory concerns. © The Author(s) 2016.

  12. Rigorous coupled wave analysis of acousto-optics with relativistic considerations.

    PubMed

    Xia, Guoqiang; Zheng, Weijian; Lei, Zhenggang; Zhang, Ruolan

    2015-09-01

    A relativistic analysis of acousto-optics is presented, and a rigorous coupled wave analysis is generalized for the diffraction of the acousto-optical effect. An acoustic wave generates a grating with temporally and spatially modulated permittivity, hindering direct applications of the rigorous coupled wave analysis for the acousto-optical effect. In a reference frame which moves with the acoustic wave, the grating is static, the medium moves, and the coupled wave equations for the static grating may be derived. Floquet's theorem is then applied to cast these equations into an eigenproblem. Using a Lorentz transformation, the electromagnetic fields in the grating region are transformed to the lab frame where the medium is at rest, and relativistic Doppler frequency shifts are introduced into various diffraction orders. In the lab frame, the boundary conditions are considered and the diffraction efficiencies of various orders are determined. This method is rigorous and general, and the plane waves in the resulting expansion satisfy the dispersion relation of the medium and are propagation modes. Properties of various Bragg diffractions are results, rather than preconditions, of this method. Simulations of an acousto-optical tunable filter made of paratellurite (TeO2) are given as examples.
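
    The frequency bookkeeping behind the Doppler shifts mentioned above can be stated compactly: in the lab frame the m-th diffraction order emerges at the incident optical frequency shifted by m times the acoustic frequency, which is the standard acousto-optic result (notation assumed here, not quoted from the paper):

        \omega_m = \omega_0 + m\,\Omega, \qquad m = 0, \pm 1, \pm 2, \ldots

    The relativistic treatment recovers these shifts by Lorentz-transforming the fields from the co-moving frame, in which the grating is static, back to the lab frame.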

  13. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis.

    PubMed

    Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul

    2012-01-01

    Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helmintic diseases which together afflict a large part of humankind.
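
    To make the time-series clustering step concrete, the sketch below z-normalizes hypothetical per-parasite phenotype trajectories and groups them by hierarchical clustering on pairwise Euclidean distances; the feature (a motility score over time), the responder/non-responder split, and the two-cluster cut are placeholders, not the descriptors or similarity measure chosen in the paper.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(1)

        # Hypothetical motility time-series: 12 parasites x 48 time points,
        # half responding to the drug (declining motility), half not.
        t = np.linspace(0.0, 1.0, 48)
        responders = np.exp(-3.0 * t) + rng.normal(scale=0.05, size=(6, 48))
        non_responders = 0.9 + rng.normal(scale=0.05, size=(6, 48))
        series = np.vstack([responders, non_responders])

        # z-normalize each trajectory so clustering reflects shape rather than scale.
        z = (series - series.mean(axis=1, keepdims=True)) / series.std(axis=1, keepdims=True)

        # Agglomerative clustering on pairwise Euclidean distances between trajectories.
        dist = pdist(z, metric="euclidean")
        tree = linkage(dist, method="average")
        labels = fcluster(tree, t=2, criterion="maxclust")

        print("cluster assignments:", labels)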

  14. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helmintic diseases which together afflict a large part of humankind. PMID:22369037

  15. Technological characteristics of pre- and post-rigor deboned beef mixtures from Holstein steers and quality attributes of cooked beef sausage.

    PubMed

    Sukumaran, Anuraj T; Holtcamp, Alexander J; Campbell, Yan L; Burnett, Derris; Schilling, Mark W; Dinh, Thu T N

    2018-06-07

    The objective of this study was to determine the effects of deboning time (pre- and post-rigor), processing steps (grinding - GB; salting - SB; batter formulation - BB), and storage time on the quality of raw beef mixtures and vacuum-packaged cooked sausage, produced using a commercial formulation with 0.25% phosphate. The pH was greater in pre-rigor GB and SB than in post-rigor GB and SB (P < .001). However, deboning time had no effect on metmyoglobin reducing activity, cooking loss, and color of raw beef mixtures. Protein solubility of pre-rigor beef mixtures (124.26 mg/kg) was greater than that of post-rigor beef (113.93 mg/kg; P = .071). TBARS were increased in BB but decreased during vacuum storage of cooked sausage (P ≤ .018). Except for chewiness and saltiness being 52.9 N-mm and 0.3 points greater in post-rigor sausage (P = .040 and 0.054, respectively), texture profile analysis and trained panelists detected no difference in texture between pre- and post-rigor sausage. Published by Elsevier Ltd.

  16. Developing methods for systematic reviewing in health services delivery and organization: an example from a review of access to health care for people with learning disabilities. Part 2. Evaluation of the literature--a practical guide.

    PubMed

    Alborz, Alison; McNally, Rosalind

    2004-12-01

    To develop methods to facilitate the 'systematic' review of evidence from a range of methodologies on diffuse or 'soft' topics, as exemplified by 'access to health care'. Sources were twenty-eight bibliographic databases, research registers, organizational websites or library catalogues; reference lists from identified studies; contact with experts and service users; and current awareness and contents alerting services in the area of learning disabilities. Inclusion criteria were English language literature from 1980 onwards, relating to people with learning disabilities of any age, and all study designs. The main criterion for assessment was relevance to Gulliford's model of access to health care, which was adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour; data were then extracted and the results synthesized. Quality assessment was by an initial set of 'generic' quality indicators. This enabled further evidence selection before evaluation of findings according to specific criteria for qualitative, quantitative or mixed-method studies. Eighty-two studies were fully evaluated. Five studies were rated 'highly rigorous', 22 'rigorous', and 46 'less rigorous'; nine 'poor' papers were retained as the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings. The application of a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or 'soft' topics. Synthesis can be facilitated further by using software, such as the Microsoft Access database, for managing information.

  17. Effects of Conservation Policies on Forest Cover Change in Giant Panda Habitat Regions, China

    PubMed Central

    Li, Yu; Viña, Andrés; Yang, Wu; Chen, Xiaodong; Zhang, Jindong; Ouyang, Zhiyun; Liang, Zai; Liu, Jianguo

    2014-01-01

    After long periods of deforestation, forest transition has occurred globally, but the causes of forest transition in different countries are highly variable. Conservation policies may play important roles in facilitating forest transition around the world, including China. To restore forests and protect the remaining natural forests, the Chinese government initiated two nationwide conservation policies in the late 1990s -- the Natural Forest Conservation Program (NFCP) and the Grain-To-Green Program (GTGP). While some studies have discussed the environmental and socioeconomic effects of each of these policies independently and others have attributed forest recovery to both policies without rigorous and quantitative analysis, it is necessary to rigorously quantify the outcomes of these two conservation policies simultaneously because the two policies have been implemented at the same time. To fill the knowledge gap, this study quantitatively evaluated the effects of the two conservation policies on forest cover change between 2001 and 2008 in 108 townships located in two important giant panda habitat regions -- the Qinling Mountains region in Shaanxi Province and the Sichuan Giant Panda Sanctuary in Sichuan Province. Forest cover change was evaluated using a land-cover product (MCD12Q1) derived from the Moderate Resolution Imaging Spectroradiometer (MODIS). This product proved to be highly accurate in the study region (overall accuracy was ca. 87%, using 425 ground truth points collected in the field), thus suitable for the forest change analysis performed. Results showed that within the timeframe evaluated, most townships in both regions exhibited either increases or no changes in forest cover. After accounting for a variety of socioeconomic and biophysical attributes, an Ordinary Least Square (OLS) regression model suggests that the two policies had statistically significant positive effects on forest cover change after seven years of implementation, while population density, percent agricultural population, road density, and initial forest cover (i.e. in 2001) had significant negative effects. The methods and results from this study will be useful for continuing the implementation of these conservation policies, for the development of future giant panda habitat conservation projects, and for achieving forest sustainability in China and elsewhere. PMID:26146431
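
    A minimal sketch of the kind of OLS regression described above is given below, fitting ordinary least squares to hypothetical township-level data; the variable names, coefficients, and noise level are illustrative stand-ins, not the published estimates.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 108  # townships

        # Hypothetical covariates: policy intensity, population density, road density,
        # percent agricultural population, and initial (2001) forest cover.
        policy = rng.uniform(0, 1, n)
        pop_density = rng.uniform(0, 1, n)
        road_density = rng.uniform(0, 1, n)
        agri_pop = rng.uniform(0, 1, n)
        init_cover = rng.uniform(0, 1, n)

        # Hypothetical response: change in forest cover fraction, 2001-2008.
        y = (0.08 * policy - 0.05 * pop_density - 0.03 * road_density
             - 0.02 * agri_pop - 0.04 * init_cover
             + rng.normal(scale=0.02, size=n))

        # Design matrix with an intercept column; OLS coefficients via least squares.
        X = np.column_stack([np.ones(n), policy, pop_density, road_density, agri_pop, init_cover])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)

        names = ["intercept", "policy", "pop_density", "road_density", "agri_pop", "init_cover"]
        for name, b in zip(names, beta):
            print(f"{name:>12s}: {b:+.4f}")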

  18. Effects of Conservation Policies on Forest Cover Change in Giant Panda Habitat Regions, China.

    PubMed

    Li, Yu; Viña, Andrés; Yang, Wu; Chen, Xiaodong; Zhang, Jindong; Ouyang, Zhiyun; Liang, Zai; Liu, Jianguo

    2013-07-01

    After long periods of deforestation, forest transition has occurred globally, but the causes of forest transition in different countries are highly variable. Conservation policies may play important roles in facilitating forest transition around the world, including China. To restore forests and protect the remaining natural forests, the Chinese government initiated two nationwide conservation policies in the late 1990s -- the Natural Forest Conservation Program (NFCP) and the Grain-To-Green Program (GTGP). While some studies have discussed the environmental and socioeconomic effects of each of these policies independently and others have attributed forest recovery to both policies without rigorous and quantitative analysis, it is necessary to rigorously quantify the outcomes of these two conservation policies simultaneously because the two policies have been implemented at the same time. To fill the knowledge gap, this study quantitatively evaluated the effects of the two conservation policies on forest cover change between 2001 and 2008 in 108 townships located in two important giant panda habitat regions -- the Qinling Mountains region in Shaanxi Province and the Sichuan Giant Panda Sanctuary in Sichuan Province. Forest cover change was evaluated using a land-cover product (MCD12Q1) derived from the Moderate Resolution Imaging Spectroradiometer (MODIS). This product proved to be highly accurate in the study region (overall accuracy was ca. 87%, using 425 ground truth points collected in the field), thus suitable for the forest change analysis performed. Results showed that within the timeframe evaluated, most townships in both regions exhibited either increases or no changes in forest cover. After accounting for a variety of socioeconomic and biophysical attributes, an Ordinary Least Square (OLS) regression model suggests that the two policies had statistically significant positive effects on forest cover change after seven years of implementation, while population density, percent agricultural population, road density, and initial forest cover (i.e. in 2001) had significant negative effects. The methods and results from this study will be useful for continuing the implementation of these conservation policies, for the development of future giant panda habitat conservation projects, and for achieving forest sustainability in China and elsewhere.

  19. Component Design Report: International Transportation Energy Demand Determinants Model

    EIA Publications

    2017-01-01

    This Component Design Report discusses working design elements for a new model to replace the International Transportation Model (ITran) in the World Energy Projection System Plus (WEPS+) that is maintained by the U.S. Energy Information Administration. The key objective of the new International Transportation Energy Demand Determinants (ITEDD) model is to enable more rigorous, quantitative research related to energy consumption in the international transportation sectors.

  20. Closing the Expectations Gap: 2014 Annual Report on the Alignment of State K-12 Policies and Practice with the Demands of College and Careers

    ERIC Educational Resources Information Center

    Achieve, Inc., 2015

    2015-01-01

    Today's economy demands that "all" young people develop high-level literacy, quantitative reasoning, problem solving, communication and collaboration skills, all grounded in a rigorous, content-rich K-12 curriculum. Acquiring this knowledge and these skills ensures that high school graduates are academically prepared to pursue the…

  1. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    PubMed

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Monitoring muscle optical scattering properties during rigor mortis

    NASA Astrophysics Data System (ADS)

    Xia, J.; Ranasinghesagara, J.; Ku, C. W.; Yao, G.

    2007-09-01

    The sarcomere is the fundamental functional unit for force generation in skeletal muscle. Sarcomere structure is also an important factor affecting the eating quality of muscle food, i.e., meat. The sarcomere structure is altered significantly during rigor mortis, the critical stage in transforming muscle into meat. In this paper, we investigated optical scattering changes during the rigor process in Sternomandibularis muscles. The measured optical scattering parameters were analyzed along with simultaneously measured passive tension, pH value, and histology. We found that the temporal changes of optical scattering, passive tension, pH value, and fiber microstructure were closely correlated during the rigor process. These results suggest that sarcomere structure changes during rigor mortis can be monitored and characterized by optical scattering, which may find practical applications in predicting meat quality.

  3. Use of mathematics to guide target selection in systems pharmacology; application to receptor tyrosine kinase (RTK) pathways.

    PubMed

    Benson, Neil; van der Graaf, Piet H; Peletier, Lambertus A

    2017-11-15

    A key element of the drug discovery process is target selection. Although the topic is the subject of much discussion and experimental effort, there are no defined quantitative rules for optimal selection. Often, 'rules of thumb' that have not been subjected to rigorous exploration are used. In this paper we explore the 'rule of thumb' notion that the molecule that initiates a pathway signal is the optimal target. Given the multi-factorial and complex nature of this question, we have simplified an example pathway to its logical minimum of two steps and used a mathematical model of it to explore the different options in the context of typical small and large molecule drugs. We report the conclusions of our analysis and describe the analysis tool and methods used. These provide a platform to enable a more extensive enquiry into this important topic. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Quantitative structure-activity relationships by neural networks and inductive logic programming. I. The inhibition of dihydrofolate reductase by pyrimidines

    NASA Astrophysics Data System (ADS)

    Hirst, Jonathan D.; King, Ross D.; Sternberg, Michael J. E.

    1994-08-01

    Neural networks and inductive logic programming (ILP) have been compared to linear regression for modelling the QSAR of the inhibition of E. coli dihydrofolate reductase (DHFR) by 2,4-diamino-5-(substituted benzyl)pyrimidines, and, in the subsequent paper [Hirst, J.D., King, R.D. and Sternberg, M.J.E., J. Comput.-Aided Mol. Design, 8 (1994) 421], the inhibition of rodent DHFR by 2,4-diamino-6,6-dimethyl-5-phenyl-dihydrotriazines. Cross-validation trials provide a statistically rigorous assessment of the predictive capabilities of the methods, with training and testing data selected randomly and all the methods developed using identical training data. For the ILP analysis, molecules are represented by attributes other than Hansch parameters. Neural networks and ILP perform better than linear regression using the attribute representation, but the difference is not statistically significant. The major benefit from the ILP analysis is the formulation of understandable rules relating the activity of the inhibitors to their chemical structure.
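
    To illustrate the cross-validation protocol described above (randomly selected, identical training data for each method), the sketch below compares linear regression with a small neural network on synthetic descriptor data using scikit-learn; the descriptors, activities, and network size are hypothetical stand-ins, not the pyrimidine data set or the models analyzed in the paper.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import KFold, cross_val_score
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)

        # Hypothetical QSAR table: 60 inhibitors x 6 substituent descriptors,
        # with a mildly non-linear structure-activity relationship.
        X = rng.normal(size=(60, 6))
        y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2] ** 2 + rng.normal(scale=0.3, size=60)

        cv = KFold(n_splits=5, shuffle=True, random_state=0)

        linear = LinearRegression()
        net = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))

        for name, model in [("linear regression", linear), ("neural network", net)]:
            scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
            print(f"{name:>18s}: mean cross-validated R^2 = {scores.mean():.2f}")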

  5. Snippets from the past: the evolution of Wade Hampton Frost's epidemiology as viewed from the American Journal of Hygiene/Epidemiology.

    PubMed

    Morabia, Alfredo

    2013-10-01

    Wade Hampton Frost, who was a Professor of Epidemiology at Johns Hopkins University from 1919 to 1938, spurred the development of epidemiologic methods. His 6 publications in the American Journal of Hygiene, which later became the American Journal of Epidemiology, comprise a 1928 Cutter lecture on a theory of epidemics, a survey-based study of tonsillectomy and immunity to Corynebacterium diphtheriae (1931), 2 papers from a longitudinal study of the incidence of minor respiratory diseases (1933 and 1935), an attack rate ratio analysis of the decline of diphtheria in Baltimore (1936), and a 1936 lecture on the age, time, and cohort analysis of tuberculosis mortality. These 6 American Journal of Hygiene /American Journal of Epidemiology papers attest that Frost's personal evolution mirrored that of the emerging "early" epidemiology: The scope of epidemiology extended beyond the study of epidemics of acute infectious diseases, and rigorous comparative study designs and their associated quantitative methods came to light.

  6. Snippets From the Past: The Evolution of Wade Hampton Frost's Epidemiology as Viewed From the American Journal of Hygiene/Epidemiology

    PubMed Central

    Morabia, Alfredo

    2013-01-01

    Wade Hampton Frost, who was a Professor of Epidemiology at Johns Hopkins University from 1919 to 1938, spurred the development of epidemiologic methods. His 6 publications in the American Journal of Hygiene, which later became the American Journal of Epidemiology, comprise a 1928 Cutter lecture on a theory of epidemics, a survey-based study of tonsillectomy and immunity to Corynebacterium diphtheriae (1931), 2 papers from a longitudinal study of the incidence of minor respiratory diseases (1933 and 1935), an attack rate ratio analysis of the decline of diphtheria in Baltimore (1936), and a 1936 lecture on the age, time, and cohort analysis of tuberculosis mortality. These 6 American Journal of Hygiene/American Journal of Epidemiology papers attest that Frost's personal evolution mirrored that of the emerging “early” epidemiology: The scope of epidemiology extended beyond the study of epidemics of acute infectious diseases, and rigorous comparative study designs and their associated quantitative methods came to light. PMID:24022889

  7. Process Evaluation for Improving K12 Program Effectiveness: Case Study of a National Institutes of Health Building Interdisciplinary Research Careers in Women's Health Research Career Development Program.

    PubMed

    Raymond, Nancy C; Wyman, Jean F; Dighe, Satlaj; Harwood, Eileen M; Hang, Mikow

    2018-06-01

    Process evaluation is an important tool in quality improvement efforts. This article illustrates how a systematic and continuous evaluation process can be used to improve the quality of faculty career development programs by using the University of Minnesota's Building Interdisciplinary Research Careers in Women's Health (BIRCWH) K12 program as an exemplar. Data from a rigorous process evaluation incorporating quantitative and qualitative measurements were analyzed and reviewed by the BIRCWH program leadership on a regular basis. Examples are provided of how this evaluation model and processes were used to improve many aspects of the program, thereby improving scholar, mentor, and advisory committee members' satisfaction and scholar outcomes. A rigorous evaluation plan can increase the effectiveness and impact of a research career development plan.

  8. Testability of evolutionary game dynamics based on experimental economics data

    NASA Astrophysics Data System (ADS)

    Wang, Yijia; Chen, Xiaojie; Wang, Zhijian

    2017-11-01

    Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
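    As a rough illustration of using angular momentum and speed as measurement variables, the sketch below computes both for a toy cyclic trajectory; the estimator and data are assumptions for illustration, not the authors' exact definitions.

    ```python
    # Hypothetical sketch: mean angular momentum and speed of an observed
    # social-state trajectory x(t) about its time average. A persistently
    # non-zero angular momentum indicates cyclic motion.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(200)
    x = np.column_stack([1/3 + 0.1*np.cos(0.2*t), 1/3 + 0.1*np.sin(0.2*t)])  # toy cycle
    x += rng.normal(scale=0.01, size=x.shape)                                 # observation noise

    center = x.mean(axis=0)
    r = x[:-1] - center                                   # position relative to the mean state
    v = np.diff(x, axis=0)                                # one-step velocity
    angular_momentum = r[:, 0]*v[:, 1] - r[:, 1]*v[:, 0]  # z-component of r x v
    speed = np.linalg.norm(v, axis=1)
    print("mean angular momentum:", angular_momentum.mean())
    print("mean speed:", speed.mean())
    ```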

  9. Psychotherapy for cancer patients.

    PubMed

    Chong Guan, Ng; Mohamed, Salina; Kian Tiah, Lai; Kar Mun, Teoh; Sulaiman, Ahmad Hatim; Zainal, Nor Zuraida

    2016-07-01

    Objective Psychotherapy is a common non-pharmacological approach to help cancer patients in their psychological distress. The benefit of psychotherapy has been documented, but the types of psychotherapy proposed are varied. Given that the previous literature review was a decade ago and no quantitative analysis was done on this topic, we again critically and systematically reviewed all published trials on psychotherapy in cancer patients. Method We identified 17 clinical trials on six types of psychotherapy for cancer patients by searching PubMed and EMBASE. Result Four trials involving adjunct psychological therapy were included in the quantitative analysis. Each trial demonstrated that psychotherapy improved the quality of life and coping in cancer patients. There was also a reduction in distress, anxiety, and depression after a psychological intervention. However, the number and quality of clinical trials for each type of psychotherapy were poor. The meta-analysis of the four trials involving adjunct psychological therapy showed no significant change in depression, and only a significant short-term improvement in anxiety that was not sustained at one year: the standardized mean differences were -0.37 (95% confidence interval (CI) = -0.57, -0.16) at 2 months, -0.21 (95% CI = -0.42, -0.01) at 4 months, and 0.03 (95% CI = -0.19, 0.24) at 12 months. Conclusion The evidence on the efficacy of psychotherapy in cancer patients is unsatisfactory. There is a need for more rigorous and well-designed clinical trials on this topic.
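    For readers unfamiliar with how pooled standardized mean differences and their confidence intervals are obtained, the sketch below shows simple inverse-variance (fixed-effect) pooling with hypothetical trial values; the review's actual model may differ.

    ```python
    # Inverse-variance pooling of standardized mean differences (fixed-effect form).
    # Per-trial values below are hypothetical, not the four trials in the review.
    import numpy as np

    smd = np.array([-0.45, -0.30, -0.40, -0.25])   # hypothetical trial SMDs
    se = np.array([0.18, 0.20, 0.15, 0.22])        # hypothetical standard errors

    w = 1.0 / se**2                                # inverse-variance weights
    pooled = np.sum(w * smd) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    print(f"pooled SMD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
    ```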

  10. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries.

    PubMed

    Wu, Jemma X; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P

    2016-07-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  11. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries*

    PubMed Central

    Wu, Jemma X.; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P.

    2016-01-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. PMID:27161445

  12. The importance of task design and behavioral control for understanding the neural basis of cognitive functions.

    PubMed

    Fetsch, Christopher R

    2016-04-01

    The success of systems neuroscience depends on the ability to forge quantitative links between neural activity and behavior. Traditionally, this process has benefited from the rigorous development and testing of hypotheses using tools derived from classical psychophysics and computational motor control. As our capacity for measuring neural activity improves, accompanied by powerful new analysis strategies, it seems prudent to remember what these traditional approaches have to offer. Here I present a perspective on the merits of principled task design and tight behavioral control, along with some words of caution about interpretation in unguided, large-scale neural recording studies. I argue that a judicious combination of new and old approaches is the best way to advance our understanding of higher brain function in health and disease. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Development of a Computer-Assisted Instrumentation Curriculum for Physics Students: Using LabVIEW and Arduino Platform

    NASA Astrophysics Data System (ADS)

    Kuan, Wen-Hsuan; Tseng, Chi-Hung; Chen, Sufen; Wong, Ching-Chang

    2016-06-01

    We propose an integrated curriculum to establish essential abilities of computer programming for the freshmen of a physics department. The implementation of the graphical-based interfaces from Scratch to LabVIEW and then to LabVIEW for Arduino in the curriculum 'Computer-Assisted Instrumentation in the Design of Physics Laboratories' brings rigorous algorithm and syntax protocols together with imagination, communication, scientific applications and experimental innovation. The effectiveness of the curriculum was evaluated via statistical analysis of questionnaires, interview responses, the increase in student numbers majoring in physics, and performance in a competition. The results provide quantitative support that the curriculum removed major barriers to programming that occur in text-based environments, helped students gain knowledge of programming and instrumentation, and increased the students' confidence and motivation to learn physics and computer languages.

  14. Infrared spectroscopic ellipsometry of micrometer-sized SiO2 line gratings

    NASA Astrophysics Data System (ADS)

    Walder, Cordula; Zellmeier, Matthias; Rappich, Jörg; Ketelsen, Helge; Hinrichs, Karsten

    2017-09-01

    For the design and process control of periodic nano-structured surfaces, spectroscopic ellipsometry is already established in the UV-VIS spectral regime. The objective of this work is to show the feasibility of spectroscopic ellipsometry in the infrared, using micrometer-sized SiO2 line gratings grown on silicon wafers as an example. The grating period ranges from 10 to about 34 μm. The IR-ellipsometric spectra of the gratings exhibit complex changes with structure variations. Especially in the spectral range of the oxide stretching modes, the presence of a Rayleigh singularity can lead to pronounced changes of the spectrum with the sample geometry. The IR-ellipsometric spectra of the gratings are well reproduced by calculations with the RCWA method (Rigorous Coupled Wave Analysis). Therefore, infrared spectroscopic ellipsometry allows the quantitative characterization and process control of micrometer-sized structures.

  15. Identification, quantitation, and method validation for flavan-3-ols in fermented ready-to-drink teas from the Italian market using HPLC-UV/DAD and LC-MS/MS.

    PubMed

    Cordero, Chiara; Canale, Francesca; Del Rio, Daniele; Bicchi, Carlo

    2009-11-01

    The present study is focused on flavan-3-ols characterizing the antioxidant properties of fermented tea (Camellia sinensis). These bioactive compounds, the object of nutritional claims in commercial products, should be quantified with rigorous analytical procedures whose accuracy and precision have been stated with a certain level of confidence. An HPLC-UV/DAD method, able to detect and quantify flavan-3-ols in infusions and ready-to-drink teas, has been developed for routine analysis and validated by characterizing several performance parameters. The accuracy assessment was performed through a series of LC-MS/MS analyses. Epigallocatechin, (+)-catechin, (-)-epigallocatechingallate, (-)-epicatechin, (-)-gallocatechingallate, (-)-epicatechingallate, and (-)-catechingallate were chosen as markers of the polyphenolic fraction. Quantitative results showed that samples obtained from tea leaves infusion were richer in polyphenolic antioxidants than those obtained through other industrial processes. The influence of shelf-life and packaging material on the flavan-3-ols content was also considered; markers decreased with an exponential trend as a function of time within the shelf life, while packaging materials were shown to influence the flavan-3-ol fraction composition differently over time. The method presented here provides quantitative results with a certain level of confidence and is suitable for routine quality control of iced teas whose antioxidant properties are the object of nutritional claims.
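    The exponential decline of markers over shelf life can be fitted as sketched below; the concentrations, time points, and single-exponential form are illustrative assumptions, not the study's data.

    ```python
    # Fitting an exponential decrease of a flavan-3-ol marker over shelf life.
    # Concentrations and time points are invented for illustration.
    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, c0, k):
        return c0 * np.exp(-k * t)

    t_months = np.array([0, 2, 4, 6, 9, 12])
    conc = np.array([120.0, 95.0, 78.0, 64.0, 48.0, 37.0])   # hypothetical mg/L

    (c0, k), _ = curve_fit(decay, t_months, conc, p0=(120.0, 0.1))
    print(f"initial concentration: {c0:.1f} mg/L, rate constant: {k:.3f} per month")
    print(f"estimated half-life: {np.log(2)/k:.1f} months")
    ```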

  16. Associations between personal exposures and ambient concentrations of nitrogen dioxide: A quantitative research synthesis

    NASA Astrophysics Data System (ADS)

    Meng, Q. Y.; Svendsgaard, D.; Kotchmar, D. J.; Pinto, J. P.

    2012-09-01

    Although positive associations between ambient NO2 concentrations and personal exposures have generally been found by exposure studies, the strength of the associations varied among studies. Differences in results could be related to differences in study design and in exposure factors. However, the effects of study design, exposure factors, and sampling and measurement errors on the strength of the personal-ambient associations have not been evaluated quantitatively in a systematic manner. A quantitative research synthesis was conducted to examine these issues based on peer-reviewed publications in the past 30 years. Factors affecting the strength of the personal-ambient associations across the studies were also examined with meta-regression. Ambient NO2 was found to be significantly associated with personal NO2 exposures, with estimates of 0.42, 0.16, and 0.72 for overall pooled, longitudinal and daily average correlation coefficients based on random-effects meta-analysis. This conclusion was robust after correction for publication bias with correlation coefficients of 0.37, 0.16 and 0.45. We found that season and some population characteristics, such as pre-existing disease, were significant factors affecting the strength of the personal-ambient associations. More meaningful and rigorous comparisons would be possible if greater detail were published on the study design (e.g. local and indoor sources, housing characteristics, etc.) and data quality (e.g., detection limits and percent of data above detection limits).
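    Pooled correlation estimates of this kind are commonly obtained by Fisher z-transforming the study correlations and applying a DerSimonian-Laird random-effects model, as sketched below with hypothetical study data (not the studies synthesized here).

    ```python
    # DerSimonian-Laird random-effects pooling of correlation coefficients
    # after Fisher z-transformation. Study data are hypothetical.
    import numpy as np

    r = np.array([0.55, 0.30, 0.48, 0.20, 0.62])   # hypothetical study correlations
    n = np.array([40, 120, 60, 200, 35])           # hypothetical sample sizes

    z = np.arctanh(r)                 # Fisher z-transform
    v = 1.0 / (n - 3)                 # within-study variances of z
    w = 1.0 / v

    # DerSimonian-Laird estimate of the between-study variance tau^2
    z_fixed = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(r) - 1)) / c)

    w_re = 1.0 / (v + tau2)
    z_pooled = np.sum(w_re * z) / np.sum(w_re)
    print(f"pooled correlation (random effects): {np.tanh(z_pooled):.2f}")
    ```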

  17. Quantitative proteomics and systems analysis of cultured H9C2 cardiomyoblasts during differentiation over time supports a 'function follows form' model of differentiation.

    PubMed

    Kankeu, Cynthia; Clarke, Kylie; Van Haver, Delphi; Gevaert, Kris; Impens, Francis; Dittrich, Anna; Roderick, H Llewelyn; Passante, Egle; Huber, Heinrich J

    2018-05-17

    The rat cardiomyoblast cell line H9C2 has emerged as a valuable tool for studying cardiac development, mechanisms of disease and toxicology. We present here a rigorous proteomic analysis that monitored the changes in protein expression during differentiation of H9C2 cells into cardiomyocyte-like cells over time. Quantitative mass spectrometry followed by gene ontology (GO) enrichment analysis revealed that early changes in H9C2 differentiation are related to protein pathways of cardiac muscle morphogenesis and sphingolipid synthesis. These changes in the proteome were followed later in the differentiation time-course by alterations in the expression of proteins involved in cation transport and beta-oxidation. Studying the temporal profile of the H9C2 proteome during differentiation in further detail revealed eight clusters of co-regulated proteins that can be associated with early, late, continuous and transient up- and downregulation. Subsequent reactome pathway analysis based on these eight clusters further corroborated and detailed the results of the GO analysis. Specifically, this analysis confirmed that proteins related to pathways in muscle contraction are upregulated early and transiently, and proteins relevant to extracellular matrix organization are downregulated early. In contrast, upregulation of proteins related to cardiac metabolism occurs at later time points. Finally, independent validation of the proteomics results by immunoblotting confirmed hereto unknown regulators of cardiac structure and ionic metabolism. Our results are consistent with a 'function follows form' model of differentiation, whereby early and transient alterations of structural proteins enable subsequent changes that are relevant to the characteristic physiology of cardiomyocytes.
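    Co-regulated temporal clusters like those described above are often obtained by clustering standardized time-courses; the sketch below uses k-means with k = 8 on synthetic profiles as an illustration only, not the authors' pipeline.

    ```python
    # Grouping protein expression time-courses into co-regulated clusters.
    # The data, the choice of k-means, and k = 8 are illustrative assumptions.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    n_proteins, n_timepoints = 500, 6
    profiles = rng.normal(size=(n_proteins, n_timepoints)).cumsum(axis=1)   # toy time-courses

    # Standardize each profile so clustering reflects shape, not absolute abundance
    profiles = (profiles - profiles.mean(axis=1, keepdims=True)) / profiles.std(axis=1, keepdims=True)

    labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(profiles)
    for cluster in range(8):
        mean_profile = profiles[labels == cluster].mean(axis=0)
        print(f"cluster {cluster}: n = {(labels == cluster).sum()}, mean profile = {np.round(mean_profile, 2)}")
    ```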

  18. Impact of cooking and home food preparation interventions among adults: outcomes and implications for future programs

    PubMed Central

    Reicks, Marla; Trofholz, Amanda C.; Stang, Jamie S; Laska, Melissa N.

    2014-01-01

    Objective Cooking programs are growing in popularity; however an extensive review has not examined overall impact. Therefore, this study reviewed previous research on cooking/home food preparation interventions and diet and health-related outcomes among adults and identified implications for practice and research. Design Literature review and descriptive summative method. Main outcome measures Dietary intake, knowledge/skills, cooking attitudes and self-efficacy/confidence, health outcomes. Analysis Articles evaluating effectiveness of interventions that included cooking/home food preparation as the primary aim (January 1980 through December 2011) were identified via OVID MEDLINE, Agricola and Web of Science databases. Studies grouped according to design and outcomes were reviewed for validity using an established coding system. Results were summarized for several outcome categories. Results Of 28 studies identified, 12 included a control group with six as non-randomized and six as randomized controlled trials. Evaluation was done post-intervention for five studies, pre- and post-intervention for 23 and beyond post-intervention for 15. Qualitative and quantitative measures suggested a positive influence on main outcomes. However, non-rigorous study designs, varying study populations, and use of non-validated assessment tools limited stronger conclusions. Conclusions and Implications Well-designed studies are needed that rigorously evaluate long-term impact on cooking behavior, dietary intake, obesity and other health outcomes. PMID:24703245

  19. The quality of the evidence base for clinical pathway effectiveness: room for improvement in the design of evaluation trials.

    PubMed

    Rotter, Thomas; Kinsman, Leigh; James, Erica; Machotta, Andreas; Steyerberg, Ewout W

    2012-06-18

    The purpose of this article is to report on the quality of the existing evidence base regarding the effectiveness of clinical pathway (CPW) research in the hospital setting. The analysis is based on a recently published Cochrane review of the effectiveness of CPWs. An integral component of the review process was a rigorous appraisal of the methodological quality of published CPW evaluations. This allowed the identification of strengths and limitations of the evidence base for CPW effectiveness. We followed the validated Cochrane Effective Practice and Organisation of Care Group (EPOC) criteria for randomized and non-randomized clinical pathway evaluations. In addition, we tested the hypothesis that simple pre-post studies tend to overestimate the reported CPW effects. Out of the 260 primary studies meeting CPW content criteria, only 27 studies met the EPOC study design criteria, with the majority of CPW studies (more than 70%) excluded from the review on the basis that they were simple pre-post evaluations, mostly comparing two or more annual patient cohorts. Methodologically poor study designs are often used to evaluate CPWs, and this compromises the quality of the existing evidence base. Cochrane EPOC methodological criteria, including the selection of rigorous study designs along with detailed descriptions of CPW development and implementation processes, are recommended for quantitative evaluations to improve the evidence base for the use of CPWs in hospitals.

  20. Systematic review and meta-analysis: tools for the information age.

    PubMed

    Weatherall, Mark

    2017-11-01

    The amount of available biomedical information is vast and growing. Natural limitations of the way clinicians and researchers approach this treasure trove of information comprise difficulties locating the information, and once located, cognitive biases may lead to inappropriate use of the information. Systematic reviews and meta-analyses represent important tools in the information age to improve knowledge and action. Systematic reviews represent a census approach to identifying literature to avoid non-response bias. They are a necessary prelude to producing combined quantitative summaries of associations or treatment effects. Meta-analysis comprises the arithmetical techniques for producing combined summaries from individual study reports. Careful, thoughtful and rigorous use of these tools is likely to enhance knowledge and action. Use of standard guidelines, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, or embedding these activities within collaborative groups such as the Cochrane Collaboration, are likely to lead to more useful systematic review and meta-analysis reporting. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  1. Quantitative spectral and orientational analysis in surface sum frequency generation vibrational spectroscopy (SFG-VS)

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Fei; Gan, Wei; Lu, Rong; Rao, Yi; Wu, Bao-Hua

    Sum frequency generation vibrational spectroscopy (SFG-VS) has been proven to be a uniquely effective spectroscopic technique in the investigation of molecular structure and conformations, as well as the dynamics of molecular interfaces. However, the ability to apply SFG-VS to complex molecular interfaces has been limited by the ability to extract quantitative information from SFG-VS experiments. In this review, we try to make assessments of the limitations, issues and techniques as well as methodologies in quantitative orientational and spectral analysis with SFG-VS. Based on these assessments, we also try to summarize recent developments in methodologies on quantitative orientational and spectral analysis in SFG-VS, and their applications to detailed analysis of SFG-VS data of various vapour/neat liquid interfaces. A rigorous formulation of the polarization null angle (PNA) method is given for accurate determination of the orientational parameter D = ⟨cos θ⟩/⟨cos³ θ⟩, and a comparison of the PNA method with the commonly used polarization intensity ratio (PIR) method is discussed. The polarization and incident angle dependencies of the SFG-VS intensity are also reviewed, in the light of how experimental arrangements can be optimized to effectively extract crucial information from the SFG-VS experiments. The values and models of the local field factors in the molecular layers are discussed. In order to examine the validity and limitations of the bond polarizability derivative model, the general expressions for molecular hyperpolarizability tensors and their expression with the bond polarizability derivative model for C3v, C2v and C∞v molecular groups are given in the two appendixes. We show that the bond polarizability derivative model can quantitatively describe many aspects of the intensities observed in the SFG-VS spectrum of the vapour/neat liquid interfaces in different polarizations. Using the polarization analysis in SFG-VS, polarization selection rules or guidelines are developed for assignment of the SFG-VS spectrum. Using the selection rules, SFG-VS spectra of vapour/diol and vapour/n-normal alcohol (n = 1-8) interfaces are assigned, and some of the ambiguity and confusion, as well as their implications in previous IR and Raman assignment, are duly discussed. The ability to assign an SFG-VS spectrum using the polarization selection rules makes SFG-VS not only an effective and useful vibrational spectroscopy technique for interface studies, but also a complementary vibrational spectroscopy method in general condensed phase studies. These developments will put quantitative orientational and spectral analysis in SFG-VS on a more solid foundation. The formulations, concepts and issues discussed in this review are expected to find broad applications for investigations on molecular interfaces in the future.
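    For reference, the orientational parameter determined by the PNA method is conventionally defined as the ratio of the first and third moments of the molecular tilt angle θ, as restated below (the notation may differ slightly from the review):

    ```latex
    % Orientational parameter determined via the polarization null angle (PNA) method
    D \;=\; \frac{\langle \cos\theta \rangle}{\langle \cos^{3}\theta \rangle}
    ```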

  2. Rethinking research in the medical humanities: a scoping review and narrative synthesis of quantitative outcome studies.

    PubMed

    Dennhardt, Silke; Apramian, Tavis; Lingard, Lorelei; Torabi, Nazi; Arntfield, Shannon

    2016-03-01

    The rise of medical humanities teaching in medical education has introduced pressure to prove efficacy and utility. Review articles on the available evidence have been criticised for poor methodology and unwarranted conclusions. To support a more nuanced discussion of how the medical humanities work, we conducted a scoping review of quantitative studies of medical humanities teaching. Using a search strategy involving MEDLINE, EMBASE and ERIC, and hand searching, our scoping review located 11 045 articles that referred to the use of medical humanities teaching in medical education. Of these, 62 studies using quantitative evaluation methods were selected for review. Three iterations of analysis were performed: descriptive, conceptual, and discursive. Descriptive analysis revealed that the medical humanities as a whole cannot be easily systematised based on simple descriptive categories. Conceptual analysis supported the development of a conceptual framework in which the foci of the arts and humanities in medical education can be mapped alongside their related epistemic functions for teaching and learning. Within the framework, art functioned as expertise, as dialogue or as a means of expression and transformation. In the discursive analysis, we found three main ways in which the relationship between the arts and humanities and medicine was constructed as, respectively, intrinsic, additive and curative. This review offers a nuanced framework of how different types of medical humanities work. The epistemological assumptions and discursive positioning of medical humanities teaching frame the forms of outcomes research that are considered relevant to curriculum decision making, and shed light on why dominant review methodologies make some functions of medical humanities teaching visible and render others invisible. We recommend the use of this framework to improve the rigor and relevance of future explorations of the efficacy and utility of medical humanities teaching. © 2016 John Wiley & Sons Ltd.

  3. A case of instantaneous rigor?

    PubMed

    Pirch, J; Schulz, Y; Klintschar, M

    2013-09-01

    The question of whether instantaneous rigor mortis (IR), the hypothetical sudden occurrence of stiffening of the muscles upon death, actually exists has been controversially debated over the last 150 years. While modern German forensic literature rejects this concept, the contemporary British literature is more willing to embrace it. We present the case of a young woman who suffered from diabetes and who was found dead in an upright standing position with her back and shoulders leaning against a punchbag and a cupboard. Rigor mortis was fully established, and livor mortis was pronounced and consistent with the position in which the body was found. After autopsy and toxicological analysis, it was stated that death most probably occurred due to a ketoacidotic coma with markedly increased values of glucose and lactate in the cerebrospinal fluid as well as acetone in blood and urine. Whereas the position of the body is most unusual, a detailed analysis revealed that it is a stable position even without rigor mortis. Therefore, this case does not further support the controversial concept of IR.

  4. Study Quality in SLA: A Cumulative and Developmental Assessment of Designs, Analyses, Reporting Practices, and Outcomes in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2011-01-01

    I began this study with two assumptions. Assumption 1: Study quality matters. If the means by which researchers design, carry out, and report on their studies lack in rigor or transparency, theory and practice are likely to be misguided or at least decelerated. Assumption 2 is an implication of Assumption 1: Quality should be measured rather than…

  5. Nine Criteria for a Measure of Scientific Output

    PubMed Central

    Kreiman, Gabriel; Maunsell, John H. R.

    2011-01-01

    Scientific research produces new knowledge, technologies, and clinical treatments that can lead to enormous returns. Often, the path from basic research to new paradigms and direct impact on society takes time. Precise quantification of scientific output in the short-term is not an easy task but is critical for evaluating scientists, laboratories, departments, and institutions. While there have been attempts to quantify scientific output, we argue that current methods are not ideal and suffer from solvable difficulties. Here we propose criteria that a metric should have to be considered a good index of scientific output. Specifically, we argue that such an index should be quantitative, based on robust data, rapidly updated and retrospective, presented with confidence intervals, normalized by number of contributors, career stage and discipline, impractical to manipulate, and focused on quality over quantity. Such an index should be validated through empirical testing. The purpose of quantitatively evaluating scientific output is not to replace careful, rigorous review by experts but rather to complement those efforts. Because it has the potential to greatly influence the efficiency of scientific research, we have a duty to reflect upon and implement novel and rigorous ways of evaluating scientific output. The criteria proposed here provide initial steps toward the systematic development and validation of a metric to evaluate scientific output. PMID:22102840

  6. The Specificity of Observational Studies in Physical Activity and Sports Sciences: Moving Forward in Mixed Methods Research and Proposals for Achieving Quantitative and Qualitative Symmetry

    PubMed Central

    Anguera, M. Teresa; Camerino, Oleguer; Castañer, Marta; Sánchez-Algarra, Pedro; Onwuegbuzie, Anthony J.

    2017-01-01

    Mixed methods studies are increasingly being applied to a diversity of fields. In this paper, we discuss the growing use—and enormous potential—of mixed methods research in the field of sport and physical activity. A second aim is to contribute to strengthening the characteristics of mixed methods research by showing how systematic observation offers rigor within a flexible framework that can be applied to a wide range of situations. Observational methodology is characterized by high scientific rigor and flexibility throughout its different stages and allows the objective study of spontaneous behavior in natural settings, with no external influence. Mixed methods researchers need to take bold yet thoughtful decisions regarding both substantive and procedural issues. We present three fundamental and complementary ideas to guide researchers in this respect: we show why studies of sport and physical activity that use a mixed methods research approach should be included in the field of mixed methods research, we highlight the numerous possibilities offered by observational methodology in this field through the transformation of descriptive data into quantifiable code matrices, and we discuss possible solutions for achieving true integration of qualitative and quantitative findings. PMID:29312061

  7. Using multiple methods to assess learning and outcomes in an online degree-granting dental hygiene program.

    PubMed

    Springfield, Emily; Gwozdek, Anne E; Peet, Melissa; Kerschbaum, Wendy E

    2012-04-01

    Program evaluation is a necessary component of curricular change and innovation. It ascertains whether an innovation has met benchmarks and contributes to the body of knowledge about educational methodologies and supports the use of evidence-based practice in teaching. Education researchers argue that rigorous program evaluation should utilize a mixed-method approach, triangulating both qualitative and quantitative methods to understand program effectiveness. This approach was used to evaluate the University of Michigan Dental Hygiene Degree Completion E-Learning (online) Program. Quantitative data included time spent on coursework, grades, publications, course evaluation results, and survey responses. Qualitative data included student and faculty responses in focus groups and on surveys as well as students' portfolio reflections. The results showed the program was academically rigorous, fostering students' ability to connect theory with practice and apply evidence-based practice principles. These results also demonstrated that the students had learned to critically reflect on their practice and develop expanded professional identities; going beyond the role of clinician, they began to see themselves as educators, advocates, and researchers. This evaluation model is easily adaptable and is applicable to any health science or other professional degree program. This study also raised important questions regarding the effect of meta-reflection on student confidence and professional behavior.

  8. CellShape: A user-friendly image analysis tool for quantitative visualization of bacterial cell factories inside.

    PubMed

    Goñi-Moreno, Ángel; Kim, Juhyun; de Lorenzo, Víctor

    2017-02-01

    Visualization of the intracellular constituents of individual bacteria while they perform as live biocatalysts is in principle feasible through more or less sophisticated fluorescence microscopy. Unfortunately, rigorous quantitation of the wealth of data embodied in the resulting images requires bioinformatic tools that are not widely available within the community; moreover, they are often subject to licensing that impedes software reuse. In this context we have developed CellShape, a user-friendly platform for image analysis with subpixel precision and a double-threshold segmentation system for quantification of fluorescent signals stemming from single cells. CellShape is entirely coded in Python, a free, open-source programming language with widespread community support. For a developer, CellShape enhances extensibility (ease of software improvements) by acting as an interface to access and use existing Python modules; for an end-user, CellShape presents standalone executable files ready to open without installation. We have adopted this platform to analyse in unprecedented detail the three-dimensional distribution of the constituents of the gene expression flow (DNA, RNA polymerase, mRNA and ribosomal proteins) in individual cells of the industrial platform strain Pseudomonas putida KT2440. While the first release of CellShape (v0.8) is readily operational, users and developers can expand the platform further. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
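    A minimal double-threshold segmentation sketch in the spirit described above is shown below; it is an illustrative stand-in using generic NumPy/SciPy calls, not CellShape's actual implementation.

    ```python
    # Double-threshold segmentation: a strict threshold seeds objects, a
    # permissive threshold grows them. Image and threshold values are synthetic.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    image = rng.normal(loc=100, scale=10, size=(128, 128))
    image[40:60, 40:90] += 80          # a synthetic bright "cell"

    low, high = 130.0, 160.0           # permissive and strict thresholds (assumed values)
    strong = image > high              # confident foreground seeds
    weak = image > low                 # candidate foreground

    # Keep only weak regions that contain at least one strong seed
    labels, n = ndimage.label(weak)
    keep = np.unique(labels[strong])
    mask = np.isin(labels, keep[keep > 0])

    print("segmented pixels:", int(mask.sum()))
    print("mean fluorescence in segmented region:", float(image[mask].mean()))
    ```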

  9. Quali-quantitative analysis (QQA): why it could open new frontiers for holistic health practice.

    PubMed

    Bell, Erica

    2006-12-15

    Holistic health practice is often described as being about understanding the larger contexts of patients, their health services, and their communities. Yet do traditional quantitative and qualitative health research methods produce the best possible evidence for the holistic practices of doctors, nurses, and allied health professionals? This paper argues "no", and examines the potential of a cutting-edge, social science research method--Quali-Quantitative Research (QQA)--for providing better evidence for holistic practice, particularly in small-N populations, such as rural and remote communities. It does so with reference to the international literature on holistic medicine, as well as three holistic health projects conducted in Tasmania: about prevention of falls in older people, adolescent substance abuse, and interventions for children aged 0-5 exposed to domestic violence. The findings suggest that much health research fails to capture rigorously the contextual complexity of holistic health challenges: the multiple different needs of individual patients, and the interprofessional approaches needed to deliver multidisciplinary and multiservice health interventions tailored to meet those needs in particular community contexts. QQA offers a "configurational", case-based, diversity-oriented approach to analysing data that combines qualitative and quantitative techniques to overcome the limitations of both research traditions. The author concludes that QQA could open new frontiers for holistic health by helping doctors, nurses, and allied health professionals answer a fundamental question presented by complex health challenges: "Given this set of whole-of-patient needs, what elements of which interventions in what services would work best in this particular community?"

  10. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.

    PubMed

    Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.

  11. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
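    The selection step can be illustrated with a toy genetic algorithm that searches for a transition subset maximizing agreement between targeted and comprehensive quantification; the data, fitness function (Pearson correlation), and GA settings below are assumptions, not the published workflow.

    ```python
    # Toy genetic algorithm selecting precursor-to-fragment transitions whose
    # summed comprehensive signal best agrees with the targeted reference values.
    import numpy as np

    rng = np.random.default_rng(0)
    n_transitions, n_samples = 20, 30
    targeted = rng.normal(size=n_samples)                        # reference (targeted) quantities
    comprehensive = targeted[None, :] + rng.normal(scale=1.0, size=(n_transitions, n_samples))

    def fitness(mask):
        if not mask.any():
            return -1.0
        summed = comprehensive[mask].sum(axis=0)                 # quantify from the selected transitions
        return float(np.corrcoef(summed, targeted)[0, 1])

    population = rng.integers(0, 2, size=(40, n_transitions)).astype(bool)
    for generation in range(50):
        scores = np.array([fitness(ind) for ind in population])
        parents = population[np.argsort(scores)[-20:]]           # keep the fittest half
        # single-point crossover between random parent pairs, followed by mutation
        idx_a, idx_b = rng.integers(0, 20, size=20), rng.integers(0, 20, size=20)
        cuts = rng.integers(1, n_transitions, size=20)
        children = parents[idx_a].copy()
        for i, cut in enumerate(cuts):
            children[i, cut:] = parents[idx_b[i], cut:]
        flips = rng.random(children.shape) < 0.05
        children[flips] = ~children[flips]
        population = np.vstack([parents, children])

    best = max(population, key=fitness)
    print("selected transitions:", np.flatnonzero(best), " fitness:", round(fitness(best), 3))
    ```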

  12. Response to Ridgeway, Dunston, and Qian: On Methodological Rigor: Has Rigor Mortis Set In?

    ERIC Educational Resources Information Center

    Baldwin, R. Scott; Vaughn, Sharon

    1993-01-01

    Responds to an article in the same issue of the journal presenting a meta-analysis of reading research. Expresses concern that the authors' conclusions will promote a slavish adherence to a methodology and a rigidity of thought that reading researchers can ill afford. (RS)

  13. An Assessment of Cost Improvements in the NASA COTS - CRS Program and Implications for Future NASA Missions

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar

    2017-01-01

    This review brings rigorous life cycle cost (LCC) analysis into discussions about COTS program costs. We gather publicly available cost data, review the data for credibility, check for consistency among sources, and rigorously define and analyze specific cost metrics.

  14. Systemic Planning: An Annotated Bibliography and Literature Guide. Exchange Bibliography No. 91.

    ERIC Educational Resources Information Center

    Catanese, Anthony James

    Systemic planning is an operational approach to using scientific rigor and qualitative judgment in a complementary manner. It integrates rigorous techniques and methods from systems analysis, cybernetics, decision theory, and work programing. The annotated reference sources in this bibliography include those works that have been most influential…

  15. Toward a Mixed-Methods Research Approach to Content Analysis in The Digital Age: The Combined Content-Analysis Model and its Applications to Health Care Twitter Feeds.

    PubMed

    Hamad, Eradah O; Savundranayagam, Marie Y; Holmes, Jeffrey D; Kinsella, Elizabeth Anne; Johnson, Andrew M

    2016-03-08

    Twitter's 140-character microblog posts are increasingly used to access information and facilitate discussions among health care professionals and between patients with chronic conditions and their caregivers. Recently, efforts have emerged to investigate the content of health care-related posts on Twitter. This marks a new area for researchers to investigate and apply content analysis (CA). In current infodemiology, infoveillance and digital disease detection research initiatives, quantitative and qualitative Twitter data are often combined, and there are no clear guidelines for researchers to follow when collecting and evaluating Twitter-driven content. The aim of this study was to identify studies on health care and social media that used Twitter feeds as a primary data source and CA as an analysis technique. We evaluated the resulting 18 studies based on a narrative review of previous methodological studies and textbooks to determine the criteria and main features of quantitative and qualitative CA. We then used the key features of CA and mixed-methods research designs to propose the combined content-analysis (CCA) model as a solid research framework for designing, conducting, and evaluating investigations of Twitter-driven content. We conducted a PubMed search to collect studies published between 2010 and 2014 that used CA to analyze health care-related tweets. The PubMed search and reference list checks of selected papers identified 21 papers. We excluded 3 papers and further analyzed 18. Results suggest that the methods used in these studies were not purely quantitative or qualitative, and the mixed-methods design was not explicitly chosen for data collection and analysis. A solid research framework is needed for researchers who intend to analyze Twitter data through the use of CA. We propose the CCA model as a useful framework that provides a straightforward approach to guide Twitter-driven studies and that adds rigor to health care social media investigations. We provide suggestions for the use of the CCA model in elder care-related contexts.

  16. Toward a Mixed-Methods Research Approach to Content Analysis in The Digital Age: The Combined Content-Analysis Model and its Applications to Health Care Twitter Feeds

    PubMed Central

    Hamad, Eradah O; Savundranayagam, Marie Y; Holmes, Jeffrey D; Kinsella, Elizabeth Anne

    2016-01-01

    Background Twitter’s 140-character microblog posts are increasingly used to access information and facilitate discussions among health care professionals and between patients with chronic conditions and their caregivers. Recently, efforts have emerged to investigate the content of health care-related posts on Twitter. This marks a new area for researchers to investigate and apply content analysis (CA). In current infodemiology, infoveillance and digital disease detection research initiatives, quantitative and qualitative Twitter data are often combined, and there are no clear guidelines for researchers to follow when collecting and evaluating Twitter-driven content. Objective The aim of this study was to identify studies on health care and social media that used Twitter feeds as a primary data source and CA as an analysis technique. We evaluated the resulting 18 studies based on a narrative review of previous methodological studies and textbooks to determine the criteria and main features of quantitative and qualitative CA. We then used the key features of CA and mixed-methods research designs to propose the combined content-analysis (CCA) model as a solid research framework for designing, conducting, and evaluating investigations of Twitter-driven content. Methods We conducted a PubMed search to collect studies published between 2010 and 2014 that used CA to analyze health care-related tweets. The PubMed search and reference list checks of selected papers identified 21 papers. We excluded 3 papers and further analyzed 18. Results Results suggest that the methods used in these studies were not purely quantitative or qualitative, and the mixed-methods design was not explicitly chosen for data collection and analysis. A solid research framework is needed for researchers who intend to analyze Twitter data through the use of CA. Conclusions We propose the CCA model as a useful framework that provides a straightforward approach to guide Twitter-driven studies and that adds rigor to health care social media investigations. We provide suggestions for the use of the CCA model in elder care-related contexts. PMID:26957477

  17. Measuring coherence with entanglement concurrence

    NASA Astrophysics Data System (ADS)

    Qi, Xianfei; Gao, Ting; Yan, Fengli

    2017-07-01

    Quantum coherence is a fundamental manifestation of the quantum superposition principle. Recently, Baumgratz et al (2014 Phys. Rev. Lett. 113 140401) presented a rigorous framework to quantify coherence from the viewpoint of the theory of physical resources. Here we propose a new valid quantum coherence measure, a convex roof measure for a quantum system of arbitrary dimension, constructed essentially from the generalized Gell-Mann matrices. A rigorous proof shows that the proposed coherence measure, coherence concurrence, fulfills all the requirements dictated by the resource theory of quantum coherence measures. Moreover, strong links between the resource frameworks of coherence concurrence and entanglement concurrence are derived, showing that any degree of coherence with respect to some reference basis can be converted to entanglement via incoherent operations. Our work provides a clear quantitative and operational connection between coherence and entanglement based on two kinds of concurrence. This new coherence measure, coherence concurrence, may also be beneficial to the study of quantum coherence.
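    For context, the convex-roof construction referred to above has the standard form below; the specific pure-state quantifier built from the generalized Gell-Mann matrices in the paper is not reproduced here.

    ```latex
    % Generic convex-roof extension of a pure-state coherence quantifier C(|\psi\rangle)
    C(\rho) \;=\; \min_{\{p_i,\,|\psi_i\rangle\}} \sum_i p_i \, C(|\psi_i\rangle),
    \qquad \rho \;=\; \sum_i p_i \, |\psi_i\rangle\!\langle\psi_i|
    ```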

  18. Establishing Competitive Production Sources. A Handbook for Program Managers

    DTIC Science & Technology

    1984-08-01

    The purpose of this handbook is to provide the program manager and other acquisition officials with a systematic guide to the assessment, implementation, and execution of production competition. The program manager must assess the feasibility and effectiveness of production competition in a highly detailed, rigorous, and quantitative way.

  19. All biology is computational biology.

    PubMed

    Markowetz, Florian

    2017-03-01

    Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.

  20. High and low rigor temperature effects on sheep meat tenderness and ageing.

    PubMed

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

    Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.001). The mean sarcomere length values of meat samples for 18 and 35°C rigor at each ageing time were significantly different (P<0.001), the samples at 35°C being shorter. When the short sarcomere length values and corresponding shear force values were removed for further data analysis, the shear force values for the 35°C rigor meat were still significantly greater. Thus the toughness of 35°C rigor meat was not a consequence of muscle shortening and appears to reflect both a faster rate of tenderisation and a greater ultimate extent of tenderisation in the meat held at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  1. qF-SSOP: real-time optical property corrected fluorescence imaging

    PubMed Central

    Valdes, Pablo A.; Angelo, Joseph P.; Choi, Hak Soo; Gioux, Sylvain

    2017-01-01

    Fluorescence imaging is well suited to provide image guidance during resections in oncologic and vascular surgery. However, the distorting effects of tissue optical properties on the emitted fluorescence are poorly compensated for on even the most advanced fluorescence image guidance systems, leading to subjective and inaccurate estimates of tissue fluorophore concentrations. Here we present a novel fluorescence imaging technique that performs real-time (i.e., video rate) optical property corrected fluorescence imaging. We perform full field of view simultaneous imaging of tissue optical properties using Single Snapshot of Optical Properties (SSOP) and fluorescence detection. The estimated optical properties are used to correct the emitted fluorescence with a quantitative fluorescence model to provide quantitative fluorescence-Single Snapshot of Optical Properties (qF-SSOP) images with less than 5% error. The technique is rigorous, fast, and quantitative, enabling ease of integration into the surgical workflow with the potential to improve molecular guidance intraoperatively. PMID:28856038

  2. Mixed methods research design for pragmatic psychoanalytic studies.

    PubMed

    Tillman, Jane G; Clemence, A Jill; Stevens, Jennifer L

    2011-10-01

    Calls for more rigorous psychoanalytic studies have increased over the past decade. The field has been divided by those who assert that psychoanalysis is properly a hermeneutic endeavor and those who see it as a science. A comparable debate is found in research methodology, where qualitative and quantitative methods have often been seen as occupying orthogonal positions. Recently, Mixed Methods Research (MMR) has emerged as a viable "third community" of research, pursuing a pragmatic approach to research endeavors through integrating qualitative and quantitative procedures in a single study design. Mixed Methods Research designs and the terminology associated with this emerging approach are explained, after which the methodology is explored as a potential integrative approach to a psychoanalytic human science. Both qualitative and quantitative research methods are reviewed, as well as how they may be used in Mixed Methods Research to study complex human phenomena.

  3. Nuclear magnetic relaxation by the dipolar EMOR mechanism: General theory with applications to two-spin systems.

    PubMed

    Chang, Zhiwei; Halle, Bertil

    2016-02-28

    In aqueous systems with immobilized macromolecules, including biological tissue, the longitudinal spin relaxation of water protons is primarily induced by exchange-mediated orientational randomization (EMOR) of intra- and intermolecular magnetic dipole-dipole couplings. We have embarked on a systematic program to develop, from the stochastic Liouville equation, a general and rigorous theory that can describe relaxation by the dipolar EMOR mechanism over the full range of exchange rates, dipole coupling strengths, and Larmor frequencies. Here, we present a general theoretical framework applicable to spin systems of arbitrary size with symmetric or asymmetric exchange. So far, the dipolar EMOR theory is only available for a two-spin system with symmetric exchange. Asymmetric exchange, when the spin system is fragmented by the exchange, introduces new and unexpected phenomena. Notably, the anisotropic dipole couplings of non-exchanging spins break the axial symmetry in spin Liouville space, thereby opening up new relaxation channels in the locally anisotropic sites, including longitudinal-transverse cross relaxation. Such cross-mode relaxation operates only at low fields; at higher fields it becomes nonsecular, leading to an unusual inverted relaxation dispersion that splits the extreme-narrowing regime into two sub-regimes. The general dipolar EMOR theory is illustrated here by a detailed analysis of the asymmetric two-spin case, for which we present relaxation dispersion profiles over a wide range of conditions as well as analytical results for integral relaxation rates and time-dependent spin modes in the zero-field and motional-narrowing regimes. The general theoretical framework presented here will enable a quantitative analysis of frequency-dependent water-proton longitudinal relaxation in model systems with immobilized macromolecules and, ultimately, will provide a rigorous link between relaxation-based magnetic resonance image contrast and molecular parameters.

  4. Nuclear magnetic relaxation by the dipolar EMOR mechanism: General theory with applications to two-spin systems

    NASA Astrophysics Data System (ADS)

    Chang, Zhiwei; Halle, Bertil

    2016-02-01

    In aqueous systems with immobilized macromolecules, including biological tissue, the longitudinal spin relaxation of water protons is primarily induced by exchange-mediated orientational randomization (EMOR) of intra- and intermolecular magnetic dipole-dipole couplings. We have embarked on a systematic program to develop, from the stochastic Liouville equation, a general and rigorous theory that can describe relaxation by the dipolar EMOR mechanism over the full range of exchange rates, dipole coupling strengths, and Larmor frequencies. Here, we present a general theoretical framework applicable to spin systems of arbitrary size with symmetric or asymmetric exchange. So far, the dipolar EMOR theory is only available for a two-spin system with symmetric exchange. Asymmetric exchange, when the spin system is fragmented by the exchange, introduces new and unexpected phenomena. Notably, the anisotropic dipole couplings of non-exchanging spins break the axial symmetry in spin Liouville space, thereby opening up new relaxation channels in the locally anisotropic sites, including longitudinal-transverse cross relaxation. Such cross-mode relaxation operates only at low fields; at higher fields it becomes nonsecular, leading to an unusual inverted relaxation dispersion that splits the extreme-narrowing regime into two sub-regimes. The general dipolar EMOR theory is illustrated here by a detailed analysis of the asymmetric two-spin case, for which we present relaxation dispersion profiles over a wide range of conditions as well as analytical results for integral relaxation rates and time-dependent spin modes in the zero-field and motional-narrowing regimes. The general theoretical framework presented here will enable a quantitative analysis of frequency-dependent water-proton longitudinal relaxation in model systems with immobilized macromolecules and, ultimately, will provide a rigorous link between relaxation-based magnetic resonance image contrast and molecular parameters.

  5. Binding proteins enhance specific uptake rate by increasing the substrate-transporter encounter rate.

    PubMed

    Bosdriesz, Evert; Magnúsdóttir, Stefanía; Bruggeman, Frank J; Teusink, Bas; Molenaar, Douwe

    2015-06-01

    Microorganisms rely on binding-protein assisted, active transport systems to scavenge for scarce nutrients. Several advantages of using binding proteins in such uptake systems have been proposed. However, a systematic, rigorous and quantitative analysis of the function of binding proteins is lacking. By combining knowledge of selection pressure and physicochemical constraints, we derive kinetic, thermodynamic, and stoichiometric properties of binding-protein dependent transport systems that enable a maximal import activity per amount of transporter. Under the hypothesis that this maximal specific activity of the transport complex is the selection objective, binding protein concentrations should exceed the concentration of both the scarce nutrient and the transporter. This increases the encounter rate of transporter with loaded binding protein at low substrate concentrations, thereby enhancing the affinity and specific uptake rate. These predictions are experimentally testable, and a number of observations confirm them. © 2015 FEBS.

  6. Optimizing Hybrid Metrology: Rigorous Implementation of Bayesian and Combined Regression.

    PubMed

    Henn, Mark-Alexander; Silver, Richard M; Villarrubia, John S; Zhang, Nien Fan; Zhou, Hui; Barnes, Bryan M; Ming, Bin; Vladár, András E

    2015-01-01

    Hybrid metrology, e.g., the combination of several measurement techniques to determine critical dimensions, is an increasingly important approach to meet the needs of the semiconductor industry. A proper use of hybrid metrology may yield not only more reliable estimates for the quantitative characterization of 3-D structures but also a more realistic estimation of the corresponding uncertainties. Recent developments at the National Institute of Standards and Technology (NIST) feature the combination of optical critical dimension (OCD) measurements and scanning electron microscope (SEM) results. The hybrid methodology offers the potential to make measurements of essential 3-D attributes that may not be otherwise feasible. However, combining techniques gives rise to essential challenges in error analysis and comparing results from different instrument models, especially the effect of systematic and highly correlated errors in the measurement on the χ² function that is minimized. Both hypothetical examples and measurement data are used to illustrate solutions to these challenges.
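
    A minimal way to see how correlated errors enter the combination is generalized least squares on two estimates of the same critical dimension: the χ²-minimizing combination weights the measurements by the inverse of their full covariance matrix. The sketch below is generic and every number (values, uncertainties, correlation coefficient) is invented; it is not the NIST implementation.

```python
# Minimal sketch: combine two correlated measurements of one critical
# dimension by generalized least squares (the chi^2-minimizing estimate).
import numpy as np

y = np.array([32.1, 31.6])       # nm: OCD-like and SEM-like estimates (assumed)
sigma = np.array([0.4, 0.6])     # standard uncertainties (assumed)
rho = 0.3                        # assumed correlation between the two methods

cov = np.array([[sigma[0]**2,            rho*sigma[0]*sigma[1]],
                [rho*sigma[0]*sigma[1],  sigma[1]**2]])
w = np.linalg.solve(cov, np.ones(2))     # Sigma^{-1} * 1
x_hat = w @ y / w.sum()                  # combined estimate
u_x = np.sqrt(1.0 / w.sum())             # combined standard uncertainty
print(f"combined CD = {x_hat:.2f} nm +/- {u_x:.2f} nm")
```

    The same machinery extends to a full parametric model by replacing the vector of ones with a design matrix, which is where the systematic-error structure discussed above becomes critical.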

  7. Interpretation of HCMM images: A regional study

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Potential users of HCMM data, especially those with only a cursory background in thermal remote sensing, are familiarized with the kinds of information contained in the images that can be extracted with some reliability solely from inspection of such standard products as those generated at NASA/GSFC and now archived in the National Space Science Data Center. Visual analysis of photoimagery is prone to various misimpressions and outright errors brought on by unawareness of the influence of physical factors as well as by sometimes misleading tonal patterns introduced during photoprocessing. The quantitative approach, which relies on computer processing of digital HCMM data, field measurements, and integration of rigorous mathematical models, can usually be used to identify, compensate for, or correct the contributions from at least some of the natural factors and those associated with photoprocessing. Color composite, day-IR, night-IR and visible images of California and Nevada are examined.

  8. Dependence of exponents on text length versus finite-size scaling for word-frequency distributions

    NASA Astrophysics Data System (ADS)

    Corral, Álvaro; Font-Clos, Francesc

    2017-08-01

    Some authors have recently argued that a finite-size scaling law for the text-length dependence of word-frequency distributions cannot be conceptually valid. Here we give solid quantitative evidence for the validity of this scaling law, using both careful statistical tests and analytical arguments based on the generalized central-limit theorem applied to the moments of the distribution (and obtaining a novel derivation of Heaps' law as a by-product). We also find that the picture of word-frequency distributions with power-law exponents that decrease with text length [X. Yan and P. Minnhagen, Physica A 444, 828 (2016), 10.1016/j.physa.2015.10.082] does not stand up to rigorous statistical analysis. Instead, we show that the distributions are perfectly described by power-law tails with stable exponents, whose values are close to 2, in agreement with the classical Zipf's law. Some misconceptions about scaling are also clarified.
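
    The stability claim can be probed with a very simple experiment: estimate the tail exponent of the word-frequency distribution for nested text lengths and check whether it drifts. The sketch below uses the continuous maximum-likelihood (Hill-type) estimator on a synthetic Zipf-like token stream; it is an illustration of the mechanics, not the statistical tests used in the paper, and the cutoff and generator parameters are assumptions.

```python
# Sketch: continuous MLE of a power-law tail exponent applied to word
# frequencies above a cutoff, for nested text lengths (synthetic data).
import numpy as np
from collections import Counter

def tail_exponent(freqs, fmin=5):
    """alpha_hat = 1 + n / sum(ln(f_i / fmin)) over frequencies >= fmin
    (continuous approximation; discrete corrections are ignored)."""
    tail = np.array([f for f in freqs if f >= fmin], dtype=float)
    return 1.0 + len(tail) / np.sum(np.log(tail / fmin))

rng = np.random.default_rng(0)
tokens = rng.zipf(2.0, size=200_000)             # synthetic Zipf-like "word" ids
for n in (25_000, 50_000, 100_000, 200_000):      # nested text lengths
    freqs = Counter(tokens[:n]).values()
    print(n, round(tail_exponent(freqs), 3))      # exponent should stay roughly stable
```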

  9. Human life support during interplanetary travel and domicile. II - Generic Modular Flow Schematic modeling

    NASA Technical Reports Server (NTRS)

    Farral, Joseph F.; Seshan, P. K.; Rohatgi, Naresh K.

    1991-01-01

    This paper describes the Generic Modular Flow Schematic (GMFS) architecture capable of encompassing all functional elements of a physical/chemical life support system (LSS). The GMFS can be implemented to synthesize, model, analyze, and quantitatively compare many configurations of LSSs, from a simple, completely open-loop to a very complex closed-loop. The GMFS model is coded in ASPEN, a state-of-the-art chemical process simulation program, to accurately compute the material, heat, and power flow quantities for every stream in each of the subsystem functional elements (SFEs) in the chosen configuration of a life support system. The GMFS approach integrates the various SFEs and subsystems in a hierarchical and modular fashion facilitating rapid substitutions and reconfiguration of a life support system. The comprehensive ASPEN material and energy balance output is transferred to a systems and technology assessment spreadsheet for rigorous system analysis and trade studies.

  10. Molecular details of dimerization kinetics reveal negligible populations of transient µ-opioid receptor homodimers at physiological concentrations.

    PubMed

    Meral, Derya; Provasi, Davide; Prada-Gracia, Diego; Möller, Jan; Marino, Kristen; Lohse, Martin J; Filizola, Marta

    2018-05-16

    Various experimental and computational techniques have been employed over the past decade to provide structural and thermodynamic insights into G Protein-Coupled Receptor (GPCR) dimerization. Here, we use multiple microsecond-long, coarse-grained, biased and unbiased molecular dynamics simulations (a total of ~4 milliseconds) combined with multi-ensemble Markov state models to elucidate the kinetics of homodimerization of a prototypic GPCR, the µ-opioid receptor (MOR), embedded in a 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine (POPC)/cholesterol lipid bilayer. Analysis of these computations identifies kinetically distinct macrostates comprising several different short-lived dimeric configurations of either inactive or activated MOR. Calculated kinetic rates and fractions of dimers at different MOR concentrations suggest a negligible population of MOR homodimers at physiological concentrations, which is supported by acceptor photobleaching fluorescence resonance energy transfer (FRET) experiments. This study provides a rigorous, quantitative explanation for some conflicting experimental data on GPCR oligomerization.
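
    A back-of-the-envelope way to appreciate why dimer populations can be negligible at physiological receptor densities is a simple mass-action calculation for 2M ⇌ D: when the total receptor concentration is far below the dissociation constant, almost all copies are monomeric. The sketch below is only this equilibrium illustration, with arbitrary units and an assumed Kd; it is not the Markov-state-model analysis performed in the paper.

```python
# Mass-action estimate of the fraction of receptor copies in dimers for
# 2M <-> D with dissociation constant Kd (illustrative numbers only).
import numpy as np

def dimer_fraction(c_total, kd):
    """Solve c_total = m + 2*m^2/kd for the free monomer m, then return
    the fraction of receptor copies residing in dimers."""
    m = (kd / 4.0) * (np.sqrt(1.0 + 8.0 * c_total / kd) - 1.0)
    d = m ** 2 / kd
    return 2.0 * d / c_total

kd = 100.0                              # assumed dissociation constant (arbitrary units)
for c in (0.1, 1.0, 10.0, 100.0):       # receptor densities, same units as kd
    print(c, round(dimer_fraction(c, kd), 4))
# When c_total << Kd the dimer fraction is tiny, consistent with a
# negligible dimer population at low receptor densities.
```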

  11. Immaterial Boys? A Large-Scale Exploration of Gender-Based Differences in Child Sexual Exploitation Service Users.

    PubMed

    Cockbain, Ella; Ashby, Matthew; Brayley, Helen

    2017-10-01

    Child sexual exploitation is increasingly recognized nationally and internationally as a pressing child protection, crime prevention, and public health issue. In the United Kingdom, for example, a recent series of high-profile cases has fueled pressure on policy makers and practitioners to improve responses. Yet, prevailing discourse, research, and interventions around child sexual exploitation have focused overwhelmingly on female victims. This study was designed to help redress fundamental knowledge gaps around boys affected by sexual exploitation. This was achieved through rigorous quantitative analysis of individual-level data for 9,042 users of child sexual exploitation services in the United Kingdom. One third of the sample were boys, and gender was associated with statistically significant differences on many variables. The results of this exploratory study highlight the need for further targeted research and more nuanced and inclusive counter-strategies.

  12. Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.

    PubMed

    Montalvo-Acosta, Joel José; Cecchini, Marco

    2016-12-01

    The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
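
    As a point of reference for the discussion above, the quantities that all of the reviewed approaches ultimately target can be written in their textbook forms (these are standard relations, not the paper's specific derivation):

```latex
K_b \;=\; \frac{[\mathrm{PL}]}{[\mathrm{P}]\,[\mathrm{L}]},
\qquad
\Delta G^{\circ}_{\mathrm{bind}} \;=\; -\,k_{\mathrm{B}}T\,\ln\!\bigl(K_b\,C^{\circ}\bigr),
```

    where C° is the standard concentration (1 mol L⁻¹) that makes the argument of the logarithm dimensionless; rigorous statistical-mechanical treatments then express K_b as a ratio of configurational integrals over the bound and unbound states.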

  13. Science curriculum effects in high school: A quantitative synthesis

    NASA Astrophysics Data System (ADS)

    Weinstein, Thomas; Boulanger, F. David; Walberg, Herbert J.

    To assess the impact of the innovative precollege science curricula of the past twenty years on learning, a search was conducted using the computer-assisted Bibliographic Retrieval System (BRS), the ERIC Annual Summaries of Research in Science Education, and Dissertation Abstracts International. A total of 151 effect sizes were obtained from 33 studies representing 19,149 junior and senior high school students in the United States, Great Britain, and Israel. Study-weighted analysis yielded an overall mean effect size of 0.31 significantly favorable to the innovative curricula [t(25) = 2.183, p < 0.05] on all outcomes. Student performance in innovative curricula averaged in the 62nd percentile relative to the control norm. Tabulation of signed comparisons indicated that sixty-four out of eighty-one unweighted outcomes were favorable to the innovative curricula. Separate analyses for test content bias, methodological rigor, type of learning, and student characteristics showed no significant differences across these categories.
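
    The arithmetic behind a study-weighted synthesis of this kind is compact: compute a standardized mean difference per study, average across studies, and translate the mean effect into a percentile of the control distribution via the normal CDF. The numbers below are invented for illustration; with a mean effect of 0.31 the percentile works out to roughly 62, matching the figure quoted above.

```python
# Sketch of effect-size pooling: standardized mean differences, their
# mean, and the percentile interpretation (illustrative numbers only).
from statistics import NormalDist

def cohens_d(mean_treat, mean_ctrl, sd_pooled):
    """Standardized mean difference (treatment minus control)."""
    return (mean_treat - mean_ctrl) / sd_pooled

effects = [cohens_d(52, 50, 10), cohens_d(55, 50, 12), cohens_d(49, 50, 8)]
mean_effect = sum(effects) / len(effects)
percentile = NormalDist().cdf(mean_effect) * 100
print(f"mean effect size = {mean_effect:.2f}")
print(f"average treated student at the {percentile:.0f}th percentile of controls")
```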

  14. Gene expression of Caenorhabditis elegans neurons carries information on their synaptic connectivity.

    PubMed

    Kaufman, Alon; Dror, Gideon; Meilijson, Isaac; Ruppin, Eytan

    2006-12-08

    The claim that genetic properties of neurons significantly influence their synaptic network structure is a common notion in neuroscience. The nematode Caenorhabditis elegans provides an exciting opportunity to approach this question in a large-scale quantitative manner. Its synaptic connectivity network has been identified, and, combined with cellular studies, we currently have characteristic connectivity and gene expression signatures for most of its neurons. By using two complementary analysis assays we show that the expression signature of a neuron carries significant information about its synaptic connectivity signature, and identify a list of putative genes predicting neural connectivity. The current study rigorously quantifies the relation between gene expression and synaptic connectivity signatures in the C. elegans nervous system and identifies subsets of neurons where this relation is highly marked. The results presented and the genes identified provide a promising starting point for further, more detailed computational and experimental investigations.

  15. Imaging the Localized Plasmon Resonance Modes in Graphene Nanoribbons

    DOE PAGES

    Hu, F.; Luan, Y.; Fei, Z.; ...

    2017-08-14

    Here, we report a nanoinfrared (IR) imaging study of the localized plasmon resonance modes of graphene nanoribbons (GNRs) using a scattering-type scanning near-field optical microscope (s-SNOM). By comparing the imaging data of GNRs that are aligned parallel and perpendicular to the in-plane component of the excitation laser field, we observed symmetric and asymmetric plasmonic interference fringes, respectively. Theoretical analysis indicates that the asymmetric fringes are formed due to the interplay between the localized surface plasmon resonance (SPR) mode excited by the GNRs and the propagative surface plasmon polariton (SPP) mode launched by the s-SNOM tip. With rigorous simulations, we reproduce the observed fringe patterns and address quantitatively the role of the s-SNOM tip on both the SPR and SPP modes. Moreover, we have seen real-space signatures of both the dipole and higher-order SPR modes by varying the ribbon width.

  16. Quantitative validation of carbon-fiber laminate low velocity impact simulations

    DOE PAGES

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    2015-09-26

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provide qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed and described in conjunction with the simulations. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.
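
    One generic way to compare a simulation ensemble (which carries parameter uncertainty) against a handful of experimental replicates is to contrast summary statistics and apply a two-sample test on the distributions. The sketch below does exactly that with invented energy-absorption numbers; it is not the specific validation metric or statistical procedure used in the paper.

```python
# Generic comparison of a simulated response distribution against
# experimental replicates: summary statistics plus a two-sample KS test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sim_energy = rng.normal(12.0, 0.8, size=200)              # J, ensemble of model runs (assumed)
exp_energy = np.array([11.4, 12.1, 12.6, 11.9, 12.3])     # J, experiments (assumed)

print("sim mean/std :", sim_energy.mean().round(2), sim_energy.std().round(2))
print("exp mean/std :", exp_energy.mean().round(2), exp_energy.std().round(2))
ks = stats.ks_2samp(sim_energy, exp_energy)
print("KS statistic =", round(ks.statistic, 3), " p =", round(ks.pvalue, 3))
```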

  17. Quantitative comparison of cognitive behavioral therapy and music therapy research: a methodological best-practices analysis to guide future investigation for adult psychiatric patients.

    PubMed

    Silverman, Michael J

    2008-01-01

    While the music therapy profession is relatively young and small in size, it can treat a variety of clinical populations and has established a diverse research base. However, although the profession originated working with persons diagnosed with mental illnesses, there is a considerable lack of quantitative research concerning the effects of music therapy with this population. Music therapy clinicians and researchers have reported on this lack of evidence and the difficulty in conducting psychosocial research on their interventions (Choi, 1997; Silverman, 2003a). While published studies have provided suggestions for future research, no studies have provided detailed propositions for the methodology and design of meticulous, high-quality randomized controlled psychiatric music therapy research. How have other psychotherapies built their research bases, and could the music therapy field borrow from their rigorous "methodological best practices" to strengthen its own literature base? Therefore, as the National Institute of Mental Health states that the treatment of choice for evidence-based psychotherapy is cognitive behavioral therapy (CBT), aspects of this psychotherapy's literature base were analyzed. The purpose of this literature analysis was to (a) analyze and identify components of high-quality quantitative CBT research for adult psychiatric consumers, (b) analyze and identify the variables and other elements of existing quantitative psychiatric music therapy research for adult consumers, and (c) compare the two data sets to identify the best methodological designs and variables for future quantitative music therapy research with the mental health population. A table analyzing randomized and thoroughly controlled studies involving the use of CBT for persons with severe mental illnesses is included to determine chief components of high-quality experimental research designs and implementation of quantitative clinical research. The table also shows the same analyzed components for existing quantitative psychiatric music therapy research with adult consumers, thus highlighting potential areas and elements for future investigations. A second table depicts a number of potential dependent measures and their sources to be evaluated in future music therapy studies. A third table providing suggestions for future research is derived from a synthesis of the tables and is included to guide researchers and encourage the advancement and expansion of the current literature base. The body of the paper is a discussion of the results of the literature analysis derived from the tables, meta-analyses, and reviews of literature. It is hoped that this report will lead to the addition of future high-quality quantitative research to the psychiatric music therapy literature base and thus provide evidence-based services to as many persons with mental illnesses as possible.

  18. Space radiator simulation system analysis

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A transient heat transfer analysis was carried out on a space radiator heat rejection system exposed to an arbitrarily prescribed combination of aerodynamic heating, solar, albedo, and planetary radiation. A rigorous analysis was carried out for the radiation panel and tubes lying in one plane and an approximate analysis was used to extend the rigorous analysis to the case of a curved panel. The analysis permits the consideration of both gaseous and liquid coolant fluids, including liquid metals, under prescribed, time dependent inlet conditions. The analysis provided a method for predicting: (1) transient and steady-state, two dimensional temperature profiles, (2) local and total heat rejection rates, (3) coolant flow pressure in the flow channel, and (4) total system weight and protection layer thickness.

  19. Rethinking Meta-Analysis: Applications for Air Pollution Data and Beyond

    PubMed Central

    Goodman, Julie E; Petito Boyce, Catherine; Sax, Sonja N; Beyer, Leslie A; Prueitt, Robyn L

    2015-01-01

    Meta-analyses offer a rigorous and transparent systematic framework for synthesizing data that can be used for a wide range of research areas, study designs, and data types. Both the outcome of meta-analyses and the meta-analysis process itself can yield useful insights for answering scientific questions and making policy decisions. Development of the National Ambient Air Quality Standards illustrates many potential applications of meta-analysis. These applications demonstrate the strengths and limitations of meta-analysis, issues that arise in various data realms, how meta-analysis design choices can influence interpretation of results, and how meta-analysis can be used to address bias and heterogeneity. Reviewing available data from a meta-analysis perspective can provide a useful framework and impetus for identifying and refining strategies for future research. Moreover, increased pervasiveness of a meta-analysis mindset—focusing on how the pieces of the research puzzle fit together—would benefit scientific research and data syntheses regardless of whether or not a quantitative meta-analysis is undertaken. While an individual meta-analysis can only synthesize studies addressing the same research question, the results of separate meta-analyses can be combined to address a question encompassing multiple data types. This observation applies to any scientific or policy area where information from a variety of disciplines must be considered to address a broader research question. PMID:25969128
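
    For readers who want the quantitative core of such a synthesis, the classic DerSimonian-Laird random-effects calculation pools per-study effect estimates with weights that account for both within-study and between-study variance. The sketch below uses invented effects and variances and is a textbook illustration, not an analysis of any air pollution data set discussed above.

```python
# Minimal DerSimonian-Laird random-effects meta-analysis of per-study
# effect estimates and variances (made-up numbers).
import numpy as np

y = np.array([0.20, 0.35, 0.10, 0.50])      # study effect estimates (assumed)
v = np.array([0.010, 0.020, 0.015, 0.030])  # within-study variances (assumed)

w = 1.0 / v
y_fe = np.sum(w * y) / np.sum(w)                   # fixed-effect pooled mean
q = np.sum(w * (y - y_fe) ** 2)                    # Cochran's Q heterogeneity
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)            # between-study variance
w_re = 1.0 / (v + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)             # random-effects pooled mean
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled effect = {y_re:.3f} +/- {1.96 * se_re:.3f} (95% CI), tau^2 = {tau2:.4f}")
```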

  20. Analysis of the National Ignition Facility Ignition Hohlraum Energetics Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Town, R J; Rosen, M D; Michel, P A

    2010-11-22

    A series of forty experiments on the National Ignition Facility (NIF) [E. I. Moses et al., Phys. Plasmas 16, 041006 (2009)] to study energy balance and implosion symmetry in reduced- and full-scale ignition hohlraums was shot at energies up to 1.3 MJ. This paper reports the findings of the analysis of the ensemble of experimental data obtained that has produced an improved model for simulating ignition hohlraums. Last year the first observation in a NIF hohlraum of energy transfer between cones of beams as a function of wavelength shift between those cones was reported [P. Michel et al., Phys. Plasmas 17, 056305 (2010)]. Detailed analysis of hohlraum wall emission as measured through the laser entrance hole (LEH) has allowed the amount of energy transferred versus wavelength shift to be quantified. The change in outer beam brightness is found to be quantitatively consistent with LASNEX [G. B. Zimmerman and W. L. Kruer, Comments Plasma Phys. Control. Fusion 2, 51 (1975)] simulations using the predicted energy transfer when possible saturation of the plasma wave mediating the transfer is included. The effect of the predicted energy transfer on implosion symmetry is also found to be in good agreement with gated x-ray framing camera images. Hohlraum energy balance, as measured by x-ray power escaping the LEH, is quantitatively consistent with revised estimates of backscatter and incident laser energy combined with a more rigorous non-local-thermodynamic-equilibrium atomic physics model with greater emissivity than the simpler average-atom model used in the original design of NIF targets.

  1. Evaluating and learning from RNA pseudotorsional space: quantitative validation of a reduced representation for RNA structure.

    PubMed

    Wadley, Leven M; Keating, Kevin S; Duarte, Carlos M; Pyle, Anna Marie

    2007-09-28

    Quantitatively describing RNA structure and conformational elements remains a formidable problem. Seven standard torsion angles and the sugar pucker are necessary to characterize the conformation of an RNA nucleotide completely. Progress has been made toward understanding the discrete nature of RNA structure, but classifying simple and ubiquitous structural elements such as helices and motifs remains a difficult task. One approach for describing RNA structure in a simple, mathematically consistent, and computationally accessible manner involves the invocation of two pseudotorsions, eta (C4'(n-1), P(n), C4'(n), P(n+1)) and theta (P(n), C4'(n), P(n+1), C4'(n+1)), which can be used to describe RNA conformation in much the same way that phi and psi are used to describe backbone configuration of proteins. Here, we conduct an exploration and statistical evaluation of pseudotorsional space and of the Ramachandran-like eta-theta plot. We show that, through the rigorous quantitative analysis of the eta-theta plot, the pseudotorsional descriptors eta and theta, together with sugar pucker, are sufficient to describe RNA backbone conformation fully in most cases. These descriptors are also shown to contain considerable information about nucleotide base conformation, revealing a previously uncharacterized interplay between backbone and base orientation. A window function analysis is used to discern statistically relevant regions of density in the eta-theta scatter plot and then nucleotides in colocalized clusters in the eta-theta plane are shown to have similar 3-D structures through RMSD analysis of the RNA structural constituents. We find that major clusters in the eta-theta plot are few, underscoring the discrete nature of RNA backbone conformation. Like the Ramachandran plot, the eta-theta plot is a valuable system for conceptualizing biomolecular conformation, it is a useful tool for analyzing RNA tertiary structures, and it is a vital component of new approaches for solving the 3-D structures of large RNA molecules and RNA assemblies.
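
    Each pseudotorsion is just an ordinary dihedral angle computed from four backbone atoms, so the underlying geometry reduces to the standard four-point torsion formula. The sketch below computes such an angle from hypothetical coordinates (not taken from any real structure); the atom names in the comment follow the eta definition quoted above.

```python
# Torsion angle from four 3-D points, e.g. the RNA pseudotorsion
# eta = C4'(n-1) - P(n) - C4'(n) - P(n+1). Coordinates here are made up.
import numpy as np

def dihedral(p0, p1, p2, p3):
    """Signed torsion angle in degrees defined by points p0-p1-p2-p3."""
    b0 = p0 - p1
    b1 = (p2 - p1) / np.linalg.norm(p2 - p1)
    b2 = p3 - p2
    v = b0 - np.dot(b0, b1) * b1          # projections onto the plane normal to b1
    w = b2 - np.dot(b2, b1) * b1
    return np.degrees(np.arctan2(np.dot(np.cross(b1, v), w), np.dot(v, w)))

c4_prev = np.array([0.0, 0.0, 0.0])   # hypothetical C4'(n-1)
p_n     = np.array([1.5, 0.0, 0.0])   # hypothetical P(n)
c4_n    = np.array([2.0, 1.4, 0.0])   # hypothetical C4'(n)
p_next  = np.array([3.2, 1.6, 1.1])   # hypothetical P(n+1)
print(round(dihedral(c4_prev, p_n, c4_n, p_next), 1))
```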

  2. A quantitative systematic review of the efficacy of mobile phone interventions to improve medication adherence.

    PubMed

    Park, Linda G; Howie-Esquivel, Jill; Dracup, Kathleen

    2014-09-01

    To evaluate the characteristics and efficacy of mobile phone interventions to improve medication adherence. Secondary aims are to explore participants' acceptability and satisfaction with mobile phone interventions and to evaluate the selected studies in terms of study rigour, impact, cost and resource feasibility, generalizability and implications for nursing practice and research. Medication non-adherence is a major global challenge. Mobile phones are the most commonly used form of technology worldwide and have the potential to promote medication adherence. Guidelines from the Centre for Reviews and Dissemination were followed for this systematic review. A comprehensive search of databases (PubMed, Web of Science, CINAHL, PsycInfo, Google Chrome and Cochrane) and bibliographies from related articles was performed from January 2002-January 2013 to identify the included studies. A quantitative systematic review without meta-analysis was conducted and the selected studies were critically evaluated to extract and summarize pertinent characteristics and outcomes. The literature search produced 29 quantitative research studies related to mobile phones and medication adherence. The studies were conducted for prevention purposes as well as management of acute and chronic illnesses. All of the studies used text messaging. Eighteen studies found significant improvement in medication adherence. While the majority of investigators found improvement in medication adherence, long-term studies characterized by rigorous research methodologies, appropriate statistical and economic analyses and the test of theory-based interventions are needed to determine the efficacy of mobile phones to influence medication adherence. © 2014 John Wiley & Sons Ltd.

  3. A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.

    PubMed

    Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao

    2015-06-15

    ChIP-seq is a powerful technology to measure the protein binding or histone modification strength in the whole genome scale. Although there are a number of methods available for single ChIP-seq data analysis (e.g. 'peak detection'), rigorous statistical method for quantitative comparison of multiple ChIP-seq datasets with the considerations of data from control experiment, signal to noise ratios, biological variations and multiple-factor experimental designs is under-developed. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then union them to form a single set of candidate regions. The read counts from IP experiment at the candidate regions are assumed to follow Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through the hypothesis testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results compared with existing ones. An R software package ChIPComp is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
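
    The modeling idea sketched in the abstract, read counts treated as Poisson with rates that separate experiment-specific artifacts from biological signal, can be illustrated for a single candidate region with a Poisson log-linear model and library-size offsets. The toy example below mimics that flavour only; it is not the ChIPComp implementation, and all counts and library sizes are invented.

```python
# Toy test of differential binding at one candidate region: Poisson GLM
# with a condition effect and log library-size offsets.
import numpy as np
import statsmodels.api as sm

counts  = np.array([85, 92, 150, 160])              # IP read counts in the region (assumed)
libsize = np.array([1.0e7, 1.2e7, 0.9e7, 1.1e7])    # total reads per sample (assumed)
group   = np.array([0, 0, 1, 1])                    # 0 = condition A, 1 = condition B

X = sm.add_constant(group.astype(float))            # intercept + condition effect
fit = sm.GLM(counts, X, family=sm.families.Poisson(),
             offset=np.log(libsize)).fit()
print("log fold change:", round(fit.params[1], 3))
print("p-value        :", float(fit.pvalues[1]))
```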

  4. Simulated Patients in Physical Therapy Education: Systematic Review and Meta-Analysis.

    PubMed

    Pritchard, Shane A; Blackstock, Felicity C; Nestel, Debra; Keating, Jenny L

    2016-09-01

    Traditional models of physical therapy clinical education are experiencing unprecedented pressures. Simulation-based education with simulated (standardized) patients (SPs) is one alternative that has significant potential value, and implementation is increasing globally. However, no review evaluating the effects of SPs on professional (entry-level) physical therapy education is available. The purpose of this study was to synthesize and critically appraise the findings of empirical studies evaluating the contribution of SPs to entry-level physical therapy education, compared with no SP interaction or an alternative education strategy, on any outcome relevant to learning. A systematic search was conducted of Ovid MEDLINE, PubMed, AMED, ERIC, and CINAHL Plus databases and reference lists of included articles, relevant reviews, and gray literature up to May 2015. Articles reporting quantitative or qualitative data evaluating the contribution of SPs to entry-level physical therapy education were included. Two reviewers independently extracted study characteristics, intervention details, and quantitative and qualitative evaluation data from the 14 articles that met the eligibility criteria. Pooled random-effects meta-analysis indicated that replacing up to 25% of authentic patient-based physical therapist practice with SP-based education results in comparable competency (mean difference=1.55/100; 95% confidence interval=-1.08, 4.18; P=.25). Thematic analysis of qualitative data indicated that students value learning with SPs. Assumptions were made to enable pooling of data, and the search strategy was limited to English. Simulated patients appear to have an effect comparable to that of alternative educational strategies on development of physical therapy clinical practice competencies and serve a valuable role in entry-level physical therapy education. However, available research lacks the rigor required for confidence in findings. Given the potential advantages for students, high-quality studies that include an economic analysis should be conducted. © 2016 American Physical Therapy Association.

  5. Phenomenological Models and Animations of Welding and their Impact

    NASA Astrophysics Data System (ADS)

    DebRoy, Tarasankar

    Professor Robertson's recognized research on metallurgical thermodynamics and kinetics for over 40 years facilitated the emergence of rigorous quantitative understanding of many complex metallurgical processes. The author had the opportunity to work with Professor Robertson on liquid metals in the 1970s. This paper is intended to review the advances in the quantitative understanding of welding processes and weld metal attributes in recent decades. Over this period, phenomenological models have been developed to better understand and control various welding processes and the structure and properties of welded materials. Numerical models and animations of melting, solidification and the evolution of micro and macro-structural features will be presented to critically examine their impact on the practice of welding and the underlying science.

  6. Quantitative Acoustic Model for Adhesion Evaluation of Pmma/silicon Film Structures

    NASA Astrophysics Data System (ADS)

    Ju, H. S.; Tittmann, B. R.

    2010-02-01

    A Poly-methyl-methacrylate (PMMA) film on a silicon substrate is a main structure for photolithography in semiconductor manufacturing processes. This paper presents the potential of scanning acoustic microscopy (SAM) for nondestructive evaluation of the PMMA/Si film structure, whose adhesion failure is commonly encountered during the fabrication and post-fabrication processes. A physical model employing a partial discontinuity in displacement is developed for rigorous, quantitative evaluation of the interfacial weakness. The model is implemented in the matrix method for the surface acoustic wave (SAW) propagation in anisotropic media. Our results predict that variations in the SAW velocity and reflectance are sensitive to the adhesion condition. Experimental results by the v(z) technique and SAW velocity reconstruction verify the prediction.

  7. Management of natural resources through automatic cartographic inventory. [Spanish Catalonia and Landes of Gascony (France)

    NASA Technical Reports Server (NTRS)

    Rey, P. A.; Gourinard, Y.; Cambou, F. (Principal Investigator); Guyader, J. C.; Gouaux, P.; Letoan, T.; Monchant, M.; Donville, B.; Loubet, D.

    1973-01-01

    The author has identified the following significant results. Significant results of the ARNICA program (February - December 1973) were: (1) The quantitative processing of ERTS-1 data was developed along two lines: the study of geological structures and lineaments of Spanish Catalonia, and the phytogeographical study of the forest region of the Landes of Gascony (France). In both cases it is shown that the ERTS-1 imagery can be used in establishing zonings of equal quantitative interpretation value. (2) In keeping with the operational transfer program proposed in previous reports between exploration of the imagery and charting of the object, a precise data processing method was developed, concerning in particular the selection of digital equidensity samples, computer display, and rigorous referencing.

  8. The Nature of Electrochemical Delithiation of Li-Mg Alloy Electrodes: Neutron Computed Tomography and Modeling of Li Diffusion and Delithiation Phenomenon

    DOE PAGES

    Zhang, Yuxuan; Chandran, K.S. Ravi; Jagannathan, M.; ...

    2016-12-05

    Li-Mg alloys are promising as positive electrodes (anodes) for Li-ion batteries due to the high Li storage capacity and the relatively lower volume change during the lithiation/delithiation process. They also present a unique opportunity to image the Li distribution through the electrode thickness at various delithiation states. In this work, spatial distributions of Li in electrochemically delithiated Li-Mg alloy electrodes have been quantitatively determined using neutron tomography. Specifically, the Li concentration profiles along thickness direction are determined. A rigorous analytical model to quantify the diffusion-controlled delithiation, accompanied by phase transition and boundary movement, has also been developed to explain the delithiation mechanism. The analytical modeling scheme successfully predicted the Li concentration profiles which agreed well with the experimental data. It is demonstrated that during discharge Li is removed by diffusion through the solid solution Li-Mg phases and this proceeds with β→α phase transition and the associated phase boundary movement through the thickness of the electrode. This is also accompanied by electrode thinning due to the change in molar volume during delithiation. In conclusion, following the approaches developed here, one can develop a rigorous and quantitative understanding of electrochemical delithiation in electrodes of electrochemical cells, similar to that in the present Li-Mg electrodes.
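
    For readers unfamiliar with moving-boundary diffusion problems of the kind invoked here, the generic one-dimensional formulation couples Fick's second law in each phase to a mass balance (Stefan condition) at the α/β interface. This is a textbook statement under one common sign convention, not the authors' specific model:

```latex
\frac{\partial c}{\partial t} \;=\; D\,\frac{\partial^{2} c}{\partial x^{2}}
\quad \text{in each phase},
\qquad
\bigl(c_{\beta}-c_{\alpha}\bigr)\,\frac{dX}{dt}
\;=\; D_{\alpha}\left.\frac{\partial c}{\partial x}\right|_{X^{-}}
\;-\; D_{\beta}\left.\frac{\partial c}{\partial x}\right|_{X^{+}},
```

    where X(t) is the interface position and c_α, c_β are the equilibrium Li concentrations on either side of it.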

  9. The Nature of Electrochemical Delithiation of Li-Mg Alloy Electrodes: Neutron Computed Tomography and Modeling of Li Diffusion and Delithiation Phenomenon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yuxuan; Chandran, K.S. Ravi; Jagannathan, M.

    Li-Mg alloys are promising as positive electrodes (anodes) for Li-ion batteries due to the high Li storage capacity and the relatively lower volume change during the lithiation/delithiation process. They also present a unique opportunity to image the Li distribution through the electrode thickness at various delithiation states. In this work, spatial distributions of Li in electrochemically delithiated Li-Mg alloy electrodes have been quantitatively determined using neutron tomography. Specifically, the Li concentration profiles along thickness direction are determined. A rigorous analytical model to quantify the diffusion-controlled delithiation, accompanied by phase transition and boundary movement, has also been developed to explain the delithiation mechanism. The analytical modeling scheme successfully predicted the Li concentration profiles which agreed well with the experimental data. It is demonstrated that during discharge Li is removed by diffusion through the solid solution Li-Mg phases and this proceeds with β→α phase transition and the associated phase boundary movement through the thickness of the electrode. This is also accompanied by electrode thinning due to the change in molar volume during delithiation. In conclusion, following the approaches developed here, one can develop a rigorous and quantitative understanding of electrochemical delithiation in electrodes of electrochemical cells, similar to that in the present Li-Mg electrodes.

  10. Semi-physical Simulation of the Airborne InSAR based on Rigorous Geometric Model and Real Navigation Data

    NASA Astrophysics Data System (ADS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; yuquan, Liu; Xijuan, Yue; Yinghui, Zhao

    2014-03-01

    Raw signal simulation is a useful tool for the system design, mission planning, processing algorithm testing, and inversion algorithm design of Synthetic Aperture Radar (SAR). Due to the wide and high-frequency variation of the aircraft's trajectory and attitude, and the low accuracy of the Position and Orientation System (POS)'s recording data, it is difficult to quantitatively study the sensitivity of the key parameters, i.e., the baseline length and inclination, absolute phase and the orientation of the antennas etc., of the airborne Interferometric SAR (InSAR) system, resulting in challenges for its applications. Furthermore, the imprecise estimation of the installation offset between the Global Positioning System (GPS), Inertial Measurement Unit (IMU) and the InSAR antennas compounds the issue. An airborne interferometric SAR (InSAR) simulation based on the rigorous geometric model and real navigation data is proposed in this paper, providing a way for quantitatively studying the key parameters and for evaluating the effect of these parameters on the applications of airborne InSAR, such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation by Differential InSAR technology, etc. The simulation can also provide reference for the optimal design of the InSAR system and the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval, etc.

  11. ASSESSMENT OF MAST IN EUROPEAN PATIENT-CENTERED TELEMEDICINE PILOTS.

    PubMed

    Ekeland, Anne Granstrøm; Grøttland, Astrid

    2015-01-01

    Model for ASsessment of Telemedicine Applications (MAST) is a health technology assessment (HTA) inspired framework for assessing the effectiveness and contribution to quality of telemedicine applications based on rigorous, scientific data. This study reports on how it was used and perceived in twenty-one pilots of the European project RENEWING HEALTH (RH). The objectives of RH were to implement large-scale, real-life test beds for the validation and subsequent evaluation of innovative patient-centered telemedicine services. The study is a contribution to the appraisal of HTA methods. A questionnaire was administered to the project leaders of the pilots. It included questions about use and usefulness of MAST for (i) preceding considerations, (ii) evaluation of outcomes within seven domains, and (iii) considerations of transferability. Free text spaces allowed for proposals of improvement. The responses covered all pilots. A quantitative summary of use and a qualitative analysis of usefulness were performed. MAST was used and considered useful for pilot evaluations. Challenges included problems in scientifically determining alternative service options and outcome within the seven domains. Proposals for improvement included process studies and adding domains of technological usability, responsible innovation, health literacy, behavior change, caregiver perspectives and motivational issues of professionals. MAST was used according to its structure. Its usefulness in patient-centered pilots can be improved by adding new stakeholder groups. Interdependencies between scientific rigor, resources and timeliness should be addressed. Operational options for improvements include process studies, literature reviews and sequential mini-HTAs for identification of areas for more elaborate investigations.

  12. Towards rigorous analysis of the Levitov-Mirlin-Evers recursion

    NASA Astrophysics Data System (ADS)

    Fyodorov, Y. V.; Kupiainen, A.; Webb, C.

    2016-12-01

    This paper aims to develop a rigorous asymptotic analysis of an approximate renormalization group recursion for inverse participation ratios P_q of critical power-law random band matrices. The recursion goes back to the work by Mirlin and Evers (2000 Phys. Rev. B 62 7920) and earlier works by Levitov (1990 Phys. Rev. Lett. 64 547, 1999 Ann. Phys. 8 697-706) and is aimed at describing the ensuing multifractality of the eigenvectors of such matrices. We point out both similarities and dissimilarities between the LME recursion and those appearing in the theory of multiplicative cascades and branching random walks and show that the methods developed in those fields can be adapted to the present case. In particular the LME recursion is shown to exhibit a phase transition, which we expect is a freezing transition, where the role of temperature is played by the exponent q. However, the LME recursion has features that make its rigorous analysis considerably harder and we point out several open problems for further study.

  13. Rigorous evaluation of chemical measurement uncertainty: liquid chromatographic analysis methods using detector response factor calibration

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Bedner, Mary

    2017-06-01

    Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).
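
    The Monte Carlo (GUM Supplement 1) side of such an evaluation amounts to sampling each input of the measurement equation from its assigned distribution and propagating the samples through it. The sketch below does this for a single-point response-factor equation C_sample = (A_sample / A_standard) * C_standard; all distributions and values are invented and the example ignores the hierarchical, between-run effects handled by the Bayesian model described above.

```python
# Minimal Monte Carlo propagation of assumed input uncertainties through
# a single-point response-factor measurement equation (illustrative only).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
c_std  = rng.normal(10.00, 0.05, n)    # standard concentration, ug/mL (assumed)
a_std  = rng.normal(1.000, 0.010, n)   # standard peak area, relative units (assumed)
a_samp = rng.normal(0.823, 0.010, n)   # sample peak area (assumed)

c_samp = (a_samp / a_std) * c_std
lo, hi = np.percentile(c_samp, [2.5, 97.5])
print(f"C_sample = {c_samp.mean():.3f} ug/mL, u = {c_samp.std(ddof=1):.3f}")
print(f"95% coverage interval: [{lo:.3f}, {hi:.3f}]")
```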

  14. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series

    NASA Astrophysics Data System (ADS)

    Liang, X. S.

    2017-12-01

    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. In particular, for linear systems, the maximum likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = [C11 C12 C2,d1 − C12² C1,d1] / [C11² C22 − C11 C12²], where Cij (i,j = 1,2) is the sample covariance between Xi and Xj, and Ci,dj the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evade the classical approaches such as the Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming, on paleoclimate scales the cause-effect relation may be completely reversed. Key words: Causation, Information flow, Uncertainty Generation, El Niño, IOD, CO2/Global warming. Reference: Liang, 2014: Unraveling the cause-effect relation between time series. PRE 90, 052150. News Report: http://scitation.aip.org/content/aip/magazine/physicstoday/news/10.1063/PT.5.7124
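
    The estimator quoted above can be coded directly from sample covariances. The sketch below applies it to a synthetic pair of series with one-way coupling (X1 is driven by X2 but not vice versa); the AR coefficients, noise, and sample size are arbitrary choices for illustration, and the discrete-time test is only an approximation of the continuous-time setting assumed by the formula.

```python
# Direct implementation of the quoted information-flow estimator,
# tested on a synthetic one-way-coupled pair of series.
import numpy as np

def liang_T(x_to, x_from, dt=1.0):
    """Estimate T_{from->to}; x_to plays the role of X1, x_from of X2."""
    dx1 = (x_to[1:] - x_to[:-1]) / dt          # Euler-forward dX1/dt
    x1, x2 = x_to[:-1], x_from[:-1]
    C = np.cov(np.vstack([x1, x2, dx1]))       # 3x3 sample covariance matrix
    c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
    c1d1, c2d1 = C[0, 2], C[1, 2]
    return (c11 * c12 * c2d1 - c12**2 * c1d1) / (c11**2 * c22 - c11 * c12**2)

rng = np.random.default_rng(0)
n = 20_000
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(n - 1):                          # X2 drives X1, not the reverse
    x2[t + 1] = 0.7 * x2[t] + rng.normal()
    x1[t + 1] = 0.6 * x1[t] + 0.4 * x2[t] + rng.normal()
print("T_{2->1} =", round(liang_T(x1, x2), 4))  # clearly non-zero
print("T_{1->2} =", round(liang_T(x2, x1), 4))  # close to zero
```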

  15. The accurate assessment of small-angle X-ray scattering data

    DOE PAGES

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...

    2015-01-23

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations is defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  16. Skill Assessment in Ocean Biological Data Assimilation

    NASA Technical Reports Server (NTRS)

    Gregg, Watson W.; Friedrichs, Marjorie A. M.; Robinson, Allan R.; Rose, Kenneth A.; Schlitzer, Reiner; Thompson, Keith R.; Doney, Scott C.

    2008-01-01

    There is growing recognition that rigorous skill assessment is required to understand the ability of ocean biological models to represent ocean processes and distributions. Statistical analysis of model results with observations represents the most quantitative form of skill assessment, and this principle serves as well for data assimilation models. However, skill assessment for data assimilation requires special consideration. This is because there are three sets of information: the free-run model, the data, and the assimilation model, which uses information from both the free-run model and the data. Intercomparison of results among the three sets of information is important and useful for assessment, but is not conclusive since the three information sets are intertwined. An independent data set is necessary for an objective determination. Other useful measures of ocean biological data assimilation assessment include responses of unassimilated variables to the data assimilation, performance outside the prescribed region/time of interest, forecasting, and trend analysis. Examples of each approach from the literature are provided. A comprehensive list of ocean biological data assimilation studies and their applications of skill assessment, in both ecosystem/biogeochemical and fisheries efforts, is summarized.
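
    The most common univariate skill statistics referred to above (bias, root-mean-square difference, and correlation between model output and observations) can be computed in a few lines. The numbers below are placeholders, not ocean-colour or assimilation output, and many assessments add further metrics (e.g., log-transformed statistics or target diagrams) beyond this minimal set.

```python
# Basic model-versus-observation skill statistics (illustrative numbers).
import numpy as np

obs   = np.array([0.21, 0.35, 0.18, 0.40, 0.27])   # e.g. chlorophyll, mg m^-3 (assumed)
model = np.array([0.25, 0.30, 0.20, 0.45, 0.24])   # co-located model values (assumed)

bias = np.mean(model - obs)
rmsd = np.sqrt(np.mean((model - obs) ** 2))
r    = np.corrcoef(model, obs)[0, 1]
print(f"bias = {bias:.3f}, RMSD = {rmsd:.3f}, r = {r:.3f}")
```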

  17. Water Adsorption and Dissociation on Polycrystalline Copper Oxides: Effects of Environmental Contamination and Experimental Protocol

    DOE PAGES

    Trotochaud, Lena; Head, Ashley R.; Pletincx, Sven; ...

    2017-11-02

    We use ambient-pressure X-ray photoelectron spectroscopy (APXPS) to study chemical changes, including hydroxylation and water adsorption, at copper oxide surfaces from ultrahigh vacuum to ambient relative humidities of ~5%. Polycrystalline CuO and Cu2O surfaces were prepared by selective oxidation of metallic copper foils. For both oxides, hydroxylation occurs readily, even at high-vacuum conditions. Hydroxylation on both oxides plateaus near ~0.01% relative humidity (RH) at a coverage of ~1 monolayer. In contrast to previous studies, neither oxide shows significant accumulation of molecular water; rather, both surfaces show a high affinity for adventitious carbon contaminants. Results of isobaric and isothermic experiments are compared, and the strengths and potential drawbacks of each method are discussed. We also provide critical evaluations of the effects of the hot filament of the ion pressure gauge on the reactivity of gas-phase species, the peak fitting procedure on the quantitative analysis of spectra, and rigorous accounting of carbon contamination on data analysis and interpretation. Lastly, this work underscores the importance of considering experimental design and data analysis protocols during APXPS experiments with water vapor in order to minimize misinterpretations arising from these factors.

  18. Enhancing the quality and credibility of qualitative analysis.

    PubMed

    Patton, M Q

    1999-12-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems.

  19. PyXRF: Python-based X-ray fluorescence analysis package

    NASA Astrophysics Data System (ADS)

    Li, Li; Yan, Hanfei; Xu, Wei; Yu, Dantong; Heroux, Annie; Lee, Wah-Keat; Campbell, Stuart I.; Chu, Yong S.

    2017-09-01

    We developed a python-based fluorescence analysis package (PyXRF) at the National Synchrotron Light Source II (NSLS-II) for the X-ray fluorescence-microscopy beamlines, including the Hard X-ray Nanoprobe (HXN) and the Submicron Resolution X-ray Spectroscopy (SRX) beamlines. This package contains a high-level fitting engine, a comprehensive command-line/GUI design, rigorous physics calculations, and a visualization interface. PyXRF offers a method of automatically finding elements, so that users do not need to spend extra time selecting elements manually. Moreover, PyXRF provides a convenient and interactive way of adjusting fitting parameters with physical constraints. This will help us perform quantitative analysis, and find an appropriate initial guess for fitting. Furthermore, we also create an advanced mode for expert users to construct their own fitting strategies with a full control of each fitting parameter. PyXRF runs single-pixel fitting at a fast speed, which opens up the possibilities of viewing the results of fitting in real time during experiments. A convenient I/O interface was designed to obtain data directly from NSLS-II's experimental database. PyXRF is under open-source development and designed to be an integral part of NSLS-II's scientific computation library.

  20. Water Adsorption and Dissociation on Polycrystalline Copper Oxides: Effects of Environmental Contamination and Experimental Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trotochaud, Lena; Head, Ashley R.; Pletincx, Sven

    We use ambient-pressure X-ray photoelectron spectroscopy (APXPS) to study chemical changes, including hydroxylation and water adsorption, at copper oxide surfaces from ultrahigh vacuum to ambient relative humidities of ~5%. Polycrystalline CuO and Cu2O surfaces were prepared by selective oxidation of metallic copper foils. For both oxides, hydroxylation occurs readily, even at high-vacuum conditions. Hydroxylation on both oxides plateaus near ~0.01% relative humidity (RH) at a coverage of ~1 monolayer. In contrast to previous studies, neither oxide shows significant accumulation of molecular water; rather, both surfaces show a high affinity for adventitious carbon contaminants. Results of isobaric and isothermic experiments are compared, and the strengths and potential drawbacks of each method are discussed. We also provide critical evaluations of the effects of the hot filament of the ion pressure gauge on the reactivity of gas-phase species, the peak fitting procedure on the quantitative analysis of spectra, and rigorous accounting of carbon contamination on data analysis and interpretation. Lastly, this work underscores the importance of considering experimental design and data analysis protocols during APXPS experiments with water vapor in order to minimize misinterpretations arising from these factors.

  1. Statistical thermodynamics unveils the dissolution mechanism of cellobiose.

    PubMed

    Nicol, Thomas W J; Isobe, Noriyuki; Clark, James H; Shimizu, Seishi

    2017-08-30

    In the study of the cellulose dissolution mechanism opinion is still divided. Here, the solution interaction components of the most prominent hypotheses for the driving force of cellulose dissolution were evaluated quantitatively. Combining a rigorous statistical thermodynamic theory and cellobiose solubility data in the presence of chloride salts, whose cations progress in the Hofmeister series (KCl, NaCl, LiCl and ZnCl2), we have shown that cellobiose solubilization is driven by the preferential accumulation of salts around the solutes which is stronger than cellobiose hydration. Yet contrary to the classical chaotropy hypothesis, increasing salt concentration leads to cellobiose dehydration in the presence of the strongest solubilizer ZnCl2. However, thanks to cellobiose dehydration, cellobiose-salt interaction still remains preferential despite weakening salt accumulation. Based on such insights, the previous hypotheses based on hydrophobicity and polymer charging have also been evaluated quantitatively. Thus, our present study successfully paved a way towards identifying the basic driving forces for cellulose solubilization in a quantitative manner for the first time. When combined with unit additivity methods this quantitative information could lead to a full understanding of cellulose solubility.

  2. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    PubMed

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
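
    The overestimation effect described above can be reproduced with a small Monte Carlo sketch. The inactivation and growth magnitudes, dose-response parameter, and initial contamination below are hypothetical, chosen only to illustrate a "drastic" scenario in which a concentration-based model inflates risk relative to an integer-count model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim = 1_000_000

# Hypothetical drastic scenario: mean contamination of 5 cells per unit, 5-log
# inactivation followed by 4-log growth, exponential dose-response with r = 0.001.
# All values are illustrative, not taken from the cited study.
mean_initial = 5.0
log_inactivation = 5.0
log_growth = 4.0
r = 1e-3

# Approach 1: carry a continuous concentration through the process.
conc = mean_initial * 10 ** (-log_inactivation) * 10 ** log_growth
risk_concentration = 1 - np.exp(-r * conc)

# Approach 2: track integer numbers of cells per unit.
counts = rng.poisson(mean_initial, n_sim)
survivors = rng.binomial(counts, 10 ** (-log_inactivation))  # cells survive individually
grown = survivors * 10 ** log_growth                         # growth only where cells remain
risk_counts = np.mean(1 - np.exp(-r * grown))

print(f"risk, concentration model: {risk_concentration:.2e}")
print(f"risk, integer-count model: {risk_counts:.2e}")
print(f"overestimation factor (noisy Monte Carlo estimate): "
      f"{risk_concentration / risk_counts:.1f}x")
```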

  3. Using the Principles of BIO2010 to Develop an Introductory, Interdisciplinary Course for Biology Students

    PubMed Central

    Adams, Peter; Goos, Merrilyn

    2010-01-01

    Modern biological sciences require practitioners to have increasing levels of knowledge, competence, and skills in mathematics and programming. A recent review of the science curriculum at the University of Queensland, a large, research-intensive institution in Australia, resulted in the development of a more quantitatively rigorous undergraduate program. Inspired by the National Research Council's BIO2010 report, a new interdisciplinary first-year course (SCIE1000) was created, incorporating mathematics and computer programming in the context of modern science. In this study, the perceptions of biological science students enrolled in SCIE1000 in 2008 and 2009 are measured. Analysis indicates that, as a result of taking SCIE1000, biological science students gained a positive appreciation of the importance of mathematics in their discipline. However, the data revealed that SCIE1000 did not contribute positively to gains in appreciation for computing and only slightly influenced students' motivation to enroll in upper-level quantitative-based courses. Further comparisons between 2008 and 2009 demonstrated the positive effect of using genuine, real-world contexts to enhance student perceptions toward the relevance of mathematics. The results support the recommendation from BIO2010 that mathematics should be introduced to biology students in first-year courses using real-world examples, while challenging the benefits of introducing programming in first-year courses. PMID:20810961

  4. Placebos in 19th century medicine: a quantitative analysis of the BMJ.

    PubMed

    Raicek, Jacqueline E; Stone, Bradley H; Kaptchuk, Ted J

    2012-12-18

    To provide the first quantitative data on the use of the term "placebo" in the 19th century. Computer search of BMJ's archival database from January 1840 (the first issue) through December 1899 for uses of the words "placebo(s)." Grounded theory was used to categorise the implications of uses of the term. 71 citations contained the term "placebo(s)." Of these, 22 (31%) used the term to mean "no effect" or as a general pejorative term, 18 (25%) portrayed placebo treatment as permitting the unfolding of the natural history (the normal waxing and waning of illness), 14 (20%) described placebo as important to satisfy patients, 7 (10%) described it as fulfilling a physician's performance role, 3 (4%) described its use to buy time, 3 (4%) described its use for financial gain, 2 (3%) used it in a manner similar to a placebo control, and only one implied that placebo could have a clinical effect. Only one citation mentioned telling the patient about his placebo treatment. Nineteenth century physicians had diverse a priori assumptions about placebos. These findings remind us that contemporary medicine needs to use rigorous science to separate fact from its own beliefs concerning the "provision of care." As in previous generations, ethical issues concerning placebos continue to challenge medicine.

  5. Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.

    PubMed

    Morse, Janice M

    2015-09-01

    Criteria for determining the trustworthiness of qualitative research were introduced by Guba and Lincoln in the 1980s when they replaced terminology for achieving rigor, reliability, validity, and generalizability with dependability, credibility, and transferability. Strategies for achieving trustworthiness were also introduced. This landmark contribution to qualitative research remains in use today, with only minor modifications in format. Despite the significance of this contribution over the past four decades, the strategies recommended to achieve trustworthiness have not been critically examined. Recommendations for where, why, and how to use these strategies have not been developed, and how well they achieve their intended goal has not been examined. We do not know, for example, what impact these strategies have on the completed research. In this article, I critique these strategies. I recommend that qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability. I then make recommendations for the appropriate use of the strategies recommended to achieve rigor: prolonged engagement, persistent observation, and thick, rich description; inter-rater reliability, negative case analysis; peer review or debriefing; clarifying researcher bias; member checking; external audits; and triangulation. © The Author(s) 2015.

  6. Endobiogeny: a global approach to systems biology (part 1 of 2).

    PubMed

    Lapraz, Jean-Claude; Hedayat, Kamyar M

    2013-01-01

    Endobiogeny is a global systems approach to human biology that may offer an advancement in clinical medicine based in scientific principles of rigor and experimentation and the humanistic principles of individualization of care and alleviation of suffering with minimization of harm. Endobiogeny is neither a movement away from modern science nor an uncritical embracing of pre-rational methods of inquiry but a synthesis of quantitative and qualitative relationships reflected in a systems-approach to life and based on new mathematical paradigms of pattern recognition.

  7. Rapid Creation and Quantitative Monitoring of High Coverage shRNA Libraries

    PubMed Central

    Bassik, Michael C.; Lebbink, Robert Jan; Churchman, L. Stirling; Ingolia, Nicholas T.; Patena, Weronika; LeProust, Emily M.; Schuldiner, Maya; Weissman, Jonathan S.; McManus, Michael T.

    2009-01-01

    Short hairpin RNA (shRNA) libraries are limited by the low efficacy of many shRNAs, giving false negatives, and off-target effects, giving false positives. Here we present a strategy for rapidly creating expanded shRNA pools (∼30 shRNAs/gene) that are analyzed by deep-sequencing (EXPAND). This approach enables identification of multiple effective target-specific shRNAs from a complex pool, allowing a rigorous statistical evaluation of whether a gene is a true hit. PMID:19448642

  8. Distinguishing cause from correlation in tokamak experiments to trigger edge-localised plasma instabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Anthony J.; CCFE, Culham Science Centre, Abingdon OX14 3DB

    2014-11-15

    The generic question is considered: How can we determine the probability of an otherwise quasi-random event having been triggered by an external influence? A specific problem is the quantification of the success of techniques to trigger, and hence control, edge-localised plasma instabilities (ELMs) in magnetically confined fusion (MCF) experiments. The development of such techniques is essential to ensure tolerable heat loads on components in large MCF devices, and is necessary for their development into economically successful power plants. Bayesian probability theory is used to rigorously formulate the problem and to provide a formal solution. Accurate but pragmatic methods are developed to estimate triggering probabilities, and are illustrated with experimental data. These allow results from experiments to be quantitatively assessed, and rigorously quantified conclusions to be formed. Example applications include assessing whether triggering of ELMs is a statistical or deterministic process, and the establishment of thresholds to ensure that ELMs are reliably triggered.
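
    A minimal illustration of this kind of triggering analysis is sketched below; it is not the formulation used in the paper, and the coincidence window, background ELM rate, and counts are invented for the example. The idea is to infer the probability that a trigger attempt truly causes an ELM, separating genuine triggering from chance coincidence with the natural ELM rate.

```python
import numpy as np
from scipy import stats

# Hypothetical numbers, for illustration only (not data from the cited experiment):
n_pellets = 200          # trigger attempts
k_followed = 150         # attempts followed by an ELM within the window dt
dt = 0.002               # s, coincidence window after each attempt
background_rate = 50.0   # Hz, natural ELM rate without triggering

# Chance that a natural (untriggered) ELM lands in the window anyway.
p_chance = 1.0 - np.exp(-background_rate * dt)

# Model: each attempt truly triggers an ELM with unknown probability q; an ELM is
# seen in the window if the attempt succeeds OR a natural ELM happens to occur.
# Posterior for q with a flat prior, evaluated on a grid.
q = np.linspace(0.0, 1.0, 2001)
p_seen = q + (1 - q) * p_chance
log_post = stats.binom.logpmf(k_followed, n_pellets, p_seen)
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, q)

q_map = q[np.argmax(post)]
# For one ELM observed inside the window: probability it was actually triggered.
p_triggered_given_seen = q_map / (q_map + (1 - q_map) * p_chance)

print(f"chance coincidence probability: {p_chance:.3f}")
print(f"MAP trigger probability q:      {q_map:.3f}")
print(f"P(triggered | ELM in window):   {p_triggered_given_seen:.3f}")
```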

  9. Regulatory observations in bioanalytical determinations.

    PubMed

    Viswanathan, C T

    2010-07-01

    The concept of measuring analytes in biological media is a long-established area of the quantitative sciences that is employed in many sectors. While academic research and R&D units of private firms have been in the forefront of developing complex methodologies, it is the regulatory environment that has brought the focus and rigor to the quality control of the quantitative determination of drug concentration in biological samples. In this article, the author examines the regulatory findings discovered during the course of several years of auditing bioanalytical work. The outcomes of these findings underscore the importance of quality method validation to ensure the reliability of the data generated. The failure to ensure the reliability of these data can lead to potential risks in the health management of millions of people in the USA.

  10. Floquet spectrum and driven conductance in Dirac materials: Effects of Landau-Zener-Stuckelberg-Majorana interferometry

    NASA Astrophysics Data System (ADS)

    Rodionov, Yaroslav; Kugel, Kliment; Nori, Franco

    Using the Landau-Zener-Stückelberg-Majorana-type (LZSM) semiclassical approach, we study both graphene and a thin film of a Weyl semimetal subjected to a strong ac electromagnetic field. The spectrum of quasienergies in the Weyl semimetal turns out to be similar to that of a graphene sheet. It has been predicted qualitatively that the transport properties of strongly irradiated graphene oscillate as a function of the radiation intensity. Here we obtain rigorous quantitative results for a driven linear conductance of graphene and a thin film of a Weyl semimetal. The exact quantitative structure of oscillations exhibits two contributions. The first one is a manifestation of the Ramsauer-Townsend effect, while the second contribution is a consequence of the LZSM interference defining the spectrum of quasienergies.

  11. A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses

    PubMed Central

    Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert

    2011-01-01

    Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325

  12. Which Interventions Have the Greatest Effect on Student Learning in Sub-Saharan Africa? "A Meta-Analysis of Rigorous Impact Evaluations"

    ERIC Educational Resources Information Center

    Conn, Katharine

    2014-01-01

    In the last three decades, there has been a large increase in the number of rigorous experimental and quasi-experimental evaluations of education programs in developing countries. These impact evaluations have taken place all over the globe, including a large number in Sub-Saharan Africa (SSA). The fact that the developing world is socially and…

  13. Virus replication as a phenotypic version of polynucleotide evolution.

    PubMed

    Antoneli, Fernando; Bosco, Francisco; Castro, Diogo; Janini, Luiz Mario

    2013-04-01

    In this paper, we revisit and adapt to viral evolution an approach based on the theory of branching processes advanced by Demetrius et al. (Bull. Math. Biol. 46:239-262, 1985) in their study of polynucleotide evolution. By taking into account beneficial effects, we obtain a non-trivial multivariate generalization of their single-type branching process model. Perturbative techniques allow us to obtain analytical asymptotic expressions for the main global parameters of the model, which lead to the following rigorous results: (i) a new criterion for "no sure extinction", (ii) a generalization and proof, for this particular class of models, of the lethal mutagenesis criterion proposed by Bull et al. (J. Virol. 18:2930-2939, 2007), (iii) a new proposal for the notion of relaxation time with a quantitative prescription for its evaluation, (iv) the quantitative description of the evolution of the expected values in four distinct "stages": extinction threshold, lethal mutagenesis, stationary "equilibrium", and transient. Finally, based on these quantitative results, we are able to draw some qualitative conclusions.
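
    For readers unfamiliar with multitype branching processes, the classical "no sure extinction" condition can be stated as a Perron-root test on the mean offspring matrix. The sketch below applies that generic criterion to a hypothetical three-type matrix; it is not the authors' specific viral model.

```python
import numpy as np

# Mean offspring matrix M[i, j]: expected number of type-j offspring produced by a
# type-i individual per generation. Types here might represent viral genomes in
# different fitness classes; the numbers are hypothetical.
M = np.array([
    [1.6, 0.5, 0.0],   # fit genomes: replicate, sometimes mutate to class 1
    [0.0, 0.9, 0.4],   # mildly deleterious: sub-critical alone, mutate further
    [0.0, 0.0, 0.3],   # heavily mutated: far below replacement
])

# Perron root (largest eigenvalue in modulus) of the mean matrix.
rho = max(abs(np.linalg.eigvals(M)))

# Classical criterion: extinction is certain iff rho <= 1 (for non-singular processes).
print(f"Perron root rho = {rho:.3f}")
print("no sure extinction (supercritical)" if rho > 1 else "extinction is certain")
```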

  14. Systemic errors in quantitative polymerase chain reaction titration of self-complementary adeno-associated viral vectors and improved alternative methods.

    PubMed

    Fagone, Paolo; Wright, J Fraser; Nathwani, Amit C; Nienhuis, Arthur W; Davidoff, Andrew M; Gray, John T

    2012-02-01

    Self-complementary AAV (scAAV) vector genomes contain a covalently closed hairpin derived from a mutated inverted terminal repeat that connects the two monomer single-stranded genomes into a head-to-head or tail-to-tail dimer. We found that during quantitative PCR (qPCR) this structure inhibits the amplification of proximal amplicons and causes the systemic underreporting of copy number by as much as 10-fold. We show that cleavage of scAAV vector genomes with restriction endonuclease to liberate amplicons from the covalently closed terminal hairpin restores quantitative amplification, and we implement this procedure in a simple, modified qPCR titration method for scAAV vectors. In addition, we developed and present an AAV genome titration procedure based on gel electrophoresis that requires minimal sample processing and has low interassay variability, and as such is well suited for the rigorous quality control demands of clinical vector production facilities.
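
    A minimal sketch of the underlying titration arithmetic is shown below: copy numbers are interpolated from a plasmid standard curve, and comparing the same sample before and after restriction digestion exposes the underreporting caused by the closed hairpin. The standard curve and all Cq values are hypothetical, used only to show the calculation.

```python
import numpy as np

# Hypothetical standard curve: Cq measured for serial dilutions of a linear plasmid standard.
std_copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3])     # copies per reaction
std_cq = np.array([16.1, 19.5, 22.8, 26.2, 29.6])    # measured Cq values

# Fit Cq = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_cq(cq):
    """Interpolate copy number from the standard curve."""
    return 10 ** ((cq - intercept) / slope)

# Same scAAV sample titrated before and after restriction digestion of the
# covalently closed hairpin; Cq values are illustrative.
cq_undigested = 24.9
cq_digested = 21.6

undigested = copies_from_cq(cq_undigested)
digested = copies_from_cq(cq_digested)
print(f"amplification efficiency: {efficiency:.2f}")
print(f"undigested titer: {undigested:.2e} copies/reaction")
print(f"digested titer:   {digested:.2e} copies/reaction")
print(f"underreporting factor without digestion: {digested / undigested:.1f}x")
```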

  15. Space radiator simulation manual for computer code

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis which analyzes a symmetrical fin panel and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady state performance, including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described, and several examples of program output are provided. Sample output includes the radiator performance during ascent, reentry and orbit.

  16. Analysis of neoplastic lesions in magnetic resonance imaging using self-organizing maps.

    PubMed

    Mei, Paulo Afonso; de Carvalho Carneiro, Cleyton; Fraser, Stephen J; Min, Li Li; Reis, Fabiano

    2015-12-15

    To provide an improved method for the identification and analysis of brain tumors in MRI scans using a semi-automated computational approach that has the potential to provide a more objective, precise and quantitatively rigorous analysis compared to human visual analysis. Self-Organizing Maps (SOM) is an unsupervised, exploratory data analysis tool which can automatically partition an image into self-similar regions or clusters based on measures of similarity. It can be used to segment brain tissue on MR images without prior knowledge. We used SOM to analyze T1, T2 and FLAIR acquisitions from two MRI machines in our service from 14 patients with brain tumors confirmed by biopsies--three lymphomas, six glioblastomas, one meningioma, one ganglioglioma, two oligoastrocytomas and one astrocytoma. The SOM software was used to analyze the data from the three image acquisitions from each patient and generated a self-organized map for each containing 25 clusters. Damaged tissue was separated from the normal tissue using the SOM technique. Furthermore, in some cases the method allowed different areas within the tumor--such as edema/peritumoral infiltration and necrosis--to be separated. In lesions with less precise boundaries in FLAIR, the estimated damaged-tissue area in the resulting map appears larger. Our results showed that SOM has the potential to be a powerful MR imaging analysis technique for the assessment of brain tumors. Copyright © 2015. Published by Elsevier B.V.
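
    For readers unfamiliar with the technique, the sketch below implements a minimal self-organizing map in NumPy and clusters synthetic three-channel (T1/T2/FLAIR-like) voxel intensities into map nodes. It is not the SOM software used in the study; the grid size, training schedule, and data are simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic voxels with three channels (T1, T2, FLAIR intensities), three tissue-like
# clusters standing in for normal tissue, edema, and necrosis. Purely illustrative.
centers = np.array([[0.8, 0.3, 0.3],
                    [0.4, 0.7, 0.9],
                    [0.2, 0.9, 0.5]])
voxels = np.vstack([c + 0.05 * rng.standard_normal((300, 3)) for c in centers])

# Minimal self-organizing map: a 5x5 grid of prototype vectors.
grid = 5
weights = rng.random((grid, grid, 3))
coords = np.stack(np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij"), axis=-1)

n_epochs = 20
for epoch in range(n_epochs):
    lr = 0.5 * (1 - epoch / n_epochs) + 0.01       # decaying learning rate
    sigma = 2.0 * (1 - epoch / n_epochs) + 0.5     # decaying neighbourhood radius
    for x in rng.permutation(voxels):
        # Best-matching unit (BMU): the prototype closest to this voxel.
        dist = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dist), dist.shape)
        # Pull the BMU and its grid neighbours towards the voxel.
        grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        h = np.exp(-grid_dist2 / (2 * sigma ** 2))[..., None]
        weights += lr * h * (x - weights)

# Assign each voxel to its BMU; groups of BMUs correspond to tissue classes.
bmu_index = np.array([np.argmin(np.linalg.norm(weights - v, axis=-1)) for v in voxels])
print("occupied SOM nodes:", np.unique(bmu_index).size, "of", grid * grid)
```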

  17. Comparison of analytical methods for profiling N- and O-linked glycans from cultured cell lines

    PubMed Central

    Togayachi, Akira; Azadi, Parastoo; Ishihara, Mayumi; Geyer, Rudolf; Galuska, Christina; Geyer, Hildegard; Kakehi, Kazuaki; Kinoshita, Mitsuhiro; Karlsson, Niclas G.; Jin, Chunsheng; Kato, Koichi; Yagi, Hirokazu; Kondo, Sachiko; Kawasaki, Nana; Hashii, Noritaka; Kolarich, Daniel; Stavenhagen, Kathrin; Packer, Nicolle H.; Thaysen-Andersen, Morten; Nakano, Miyako; Taniguchi, Naoyuki; Kurimoto, Ayako; Wada, Yoshinao; Tajiri, Michiko; Yang, Pengyuan; Cao, Weiqian; Li, Hong; Rudd, Pauline M.; Narimatsu, Hisashi

    2016-01-01

    The Human Disease Glycomics/Proteome Initiative (HGPI) is an activity in the Human Proteome Organization (HUPO) supported by leading researchers from international institutes and aims at development of disease-related glycomics/glycoproteomics analysis techniques. Since 2004, the initiative has conducted three pilot studies. The first two were N- and O-glycan analyses of purified transferrin and immunoglobulin-G and assessed the most appropriate analytical approach employed at the time. This paper describes the third study, which was conducted to compare different approaches for quantitation of N- and O-linked glycans attached to proteins in crude biological samples. The preliminary analysis on cell pellets resulted in wildly varied glycan profiles, which was probably the consequence of variations in the pre-processing sample preparation methodologies. However, the reproducibility of the data was not improved dramatically in the subsequent analysis on cell lysate fractions prepared in a specified method by one lab. The study demonstrated the difficulty of carrying out a complete analysis of the glycome in crude samples by any single technology and the importance of rigorous optimization of the course of analysis from preprocessing to data interpretation. It suggests that another collaborative study employing the latest technologies in this rapidly evolving field will help to realize the requirements of carrying out the large-scale analysis of glycoproteins in complex cell samples. PMID:26511985

  18. Optimal design and evaluation of a color separation grating using rigorous coupled wave analysis

    NASA Astrophysics Data System (ADS)

    Nagayoshi, Mayumi; Oka, Keiko; Klaus, Werner; Komai, Yuki; Kodate, Kashiko

    2006-02-01

    In recent years, technology that separates white light into the three primary colors of red (R), green (G) and blue (B), adjusts the intensity of each, and recombines them to display arbitrary colors has become essential to the development and spread of color visual equipment. Various color separation devices have been proposed and put to practical use in such equipment. We have focused on a small, light grating-type device that offers the potential for low cost and large-scale production and that generates only the three primary colors of R, G and B, so that a high saturation level can be obtained. To perform a rigorous analysis and design of color separation gratings, our group has developed a program based on the Rigorous Coupled Wave Analysis (RCWA). We then calculated the parameters needed to obtain a diffraction efficiency higher than 70% and a color gamut of about 70%. We report on the design, fabrication and evaluation of color separation gratings that have been optimized for fabrication by laser drawing.
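
    While RCWA itself is beyond a short example, the basic color-separating behaviour of such a grating follows from the grating equation. The sketch below computes first-order diffraction angles for R, G and B wavelengths at normal incidence; the grating period and wavelengths are hypothetical illustrative values, not the fabricated design.

```python
import numpy as np

# First-order diffraction angles from the grating equation sin(theta_m) = m * lambda / d,
# at normal incidence. Period and wavelengths are illustrative, not the paper's design.
period_nm = 1600.0
wavelengths_nm = {"R": 640.0, "G": 530.0, "B": 460.0}

for name, wl in wavelengths_nm.items():
    angle = np.degrees(np.arcsin(wl / period_nm))   # m = +1 order
    print(f"{name} ({wl:.0f} nm): +1st-order angle = {angle:.1f} deg")
```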

  19. [Prognostic value of chosen parameters of mechanical and bioelectrical uterine activity in prediction of threatening preterm labour].

    PubMed

    Zietek, Jerzy; Sikora, Jerzy; Horoba, Krzysztof; Matonia, Adam; Jezewski, Janusz; Magnucki, Jacek; Kobielska, Lucyna

    2009-03-01

    To record and analyse bioelectrical activity of the uterine muscle in the course of physiological pregnancy, labour and threatening premature labour; to define which parameters from the analysis of both electrohysterogram and mechanical activity signal allow us to predict threatening premature labour. Material comprised 62 pregnant women: Group I--27 patients in their first physiological pregnancy, Group II--21 patients in their first pregnancy with symptoms of threatening premature labour, and Group III--14 patients in the first labour period. The on-line analysis of the mechanical (TOCO) and electrical (EHG) contraction activity relied on determination of quantitative parameters of detected uterine contractions. The obtained statistical results demonstrated a possibility to differentiate between Group I and II through the amplitude and contraction area for EHG signal, and only the contraction amplitude for TOCO signal. Additionally, significant differentiating parameters for electrohysterogram are: contraction power and its median frequency. Analyzing Group I and III, significant differences were noted for contraction amplitude and area obtained both from EHG and TOCO signals. Similarly, the contraction power (from EHG) enables us to assign the contractions either to records from Group I or to labour type. There was no significant difference noted between Group II and III. Identification of pregnant women at risk of premature labour should lead to their inclusion in rigorous perinatal surveillance. This requires novel, more sensitive methods that are able to detect early symptoms of the uterine contraction activity increase. Electrohysterography provides complete information on principles of bioelectrical uterine activity. Quantitative parameters of EHG analysis enable the detection of records (contractions) with the symptoms of premature uterine contraction activity.
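
    The kind of quantitative contraction parameters mentioned above (amplitude, area, power, median frequency) can be computed as in the sketch below, which analyzes a simulated EHG burst with SciPy. The sampling rate, analysis band, and signal are illustrative assumptions, not the study's acquisition settings.

```python
import numpy as np
from scipy.signal import welch

fs = 20.0                       # Hz, assumed EHG sampling rate (illustrative)
t = np.arange(0, 60, 1 / fs)    # one 60 s contraction segment

rng = np.random.default_rng(3)
# Simulated EHG burst: a ~0.45 Hz oscillation under a contraction-shaped envelope.
envelope = np.exp(-0.5 * ((t - 30) / 10) ** 2)
ehg = envelope * np.sin(2 * np.pi * 0.45 * t) + 0.05 * rng.standard_normal(t.size)

# Quantitative parameters analogous to those used to characterise contractions.
amplitude = ehg.max() - ehg.min()
area = np.trapz(np.abs(ehg), dx=1 / fs)

freqs, psd = welch(ehg, fs=fs, nperseg=256)
band = (freqs >= 0.1) & (freqs <= 3.0)          # illustrative EHG-relevant band
power = np.trapz(psd[band], freqs[band])
cum = np.cumsum(psd[band])
median_freq = freqs[band][np.searchsorted(cum, cum[-1] / 2)]

print(f"amplitude:        {amplitude:.2f} a.u.")
print(f"contraction area: {area:.2f} a.u.*s")
print(f"band power:       {power:.4f} a.u.^2")
print(f"median frequency: {median_freq:.2f} Hz")
```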

  20. Rapid Quantitative Analysis of Microcystins in Raw Surface Waters with MALDI MS Utilizing Easily Synthesized Internal Standards

    PubMed Central

    Roegner, Amber F.; Schirmer, Macarena Pírez; Puschner, Birgit; Brena, Beatriz; Gonzalez-Sapienza, Gualberto

    2014-01-01

    The freshwater cyanotoxins, microcystins (MCs), pose a global public health threat as potent hepatotoxins in cyanobacterial blooms; their persistence in drinking and recreational water has been associated with potential chronic effects in addition to acute intoxications. Rapid and accurate detection of the over 80 structural congeners is challenged by the rigorous and time-consuming cleanup required to overcome interference found in raw water samples. MALDI-MS has shown promise for rapid quantification of individual congeners in raw water samples, with very low operating cost, but so far limited sensitivity and the lack of available and versatile internal standards (ISs) have limited its use. Two easily synthesized S-hydroxyethyl-Cys(7)-MC-LR and -RR ISs were used to generate linear standard curves in a reflectron MALDI instrument, reproducible across several orders of magnitude for MC-LR, -RR and -YR. Minimum quantification limits in direct water samples, with no cleanup or concentration step involved, were consistently below 7 μg/L, with recoveries from spiked samples between 80 and 119%. This method improves sensitivity 30-fold over previous reports of quantitative MALDI-TOF applications to MCs and provides a salient option for rapid-throughput analysis of multiple MC congeners in untreated raw surface water blooms as a means to identify source public health threats and target intervention strategies within a watershed. As demonstrated by analysis of a set of samples from Uruguay, utilizing the reaction of different MC congeners with alternate sulfhydryl compounds, the m/z of the IS can be customized to avoid overlap with interfering compounds in local surface water samples. PMID:24388801
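
    A minimal sketch of internal-standard quantification is shown below: the analyte/IS intensity ratio is regressed against spiked concentration, and the resulting calibration is inverted for unknown samples. The intensities, concentrations, and sample values are hypothetical, not data from the study.

```python
import numpy as np

# Calibration standards: known MC-LR concentrations (ug/L) spiked into raw water,
# each with a fixed amount of internal standard (IS). Intensities are hypothetical.
conc = np.array([1, 5, 10, 25, 50, 100], dtype=float)
analyte_signal = np.array([210, 980, 2050, 4900, 10100, 19800], dtype=float)
is_signal = np.array([5100, 4950, 5020, 5080, 4890, 5010], dtype=float)

ratio = analyte_signal / is_signal
slope, intercept = np.polyfit(conc, ratio, 1)

def quantify(sample_analyte, sample_is):
    """Convert a sample's analyte/IS intensity ratio to concentration (ug/L)."""
    return ((sample_analyte / sample_is) - intercept) / slope

# Unknown bloom sample measured in the same run (hypothetical intensities).
print(f"estimated MC-LR: {quantify(3500, 4980):.1f} ug/L")
```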

  1. Qualitative research within trials: developing a standard operating procedure for a clinical trials unit

    PubMed Central

    2013-01-01

    Background Qualitative research methods are increasingly used within clinical trials to address broader research questions than can be addressed by quantitative methods alone. These methods enable health professionals, service users, and other stakeholders to contribute their views and experiences to evaluation of healthcare treatments, interventions, or policies, and influence the design of trials. Qualitative data often contribute information that is better able to reform policy or influence design. Methods Health services researchers, including trialists, clinicians, and qualitative researchers, worked collaboratively to develop a comprehensive portfolio of standard operating procedures (SOPs) for the West Wales Organisation for Rigorous Trials in Health (WWORTH), a clinical trials unit (CTU) at Swansea University, which has recently achieved registration with the UK Clinical Research Collaboration (UKCRC). Although the UKCRC requires a total of 25 SOPs from registered CTUs, WWORTH chose to add an additional qualitative-methods SOP (QM-SOP). Results The qualitative methods SOP (QM-SOP) defines good practice in designing and implementing qualitative components of trials, while allowing flexibility of approach and method. Its basic principles are that: qualitative researchers should be contributors from the start of trials with qualitative potential; the qualitative component should have clear aims; and the main study publication should report on the qualitative component. Conclusions We recommend that CTUs consider developing a QM-SOP to enhance the conduct of quantitative trials by adding qualitative data and analysis. We judge that this improves the value of quantitative trials, and contributes to the future development of multi-method trials. PMID:23433341

  2. Rigorous Electromagnetic Analysis of the Focusing Action of Refractive Cylindrical Microlens

    NASA Astrophysics Data System (ADS)

    Liu, Juan; Gu, Ben-Yuan; Dong, Bi-Zhen; Yang, Guo-Zhen

    The focusing action of refractive cylindrical microlenses is investigated based on rigorous electromagnetic theory with the use of the boundary element method. The focusing behaviors of these refractive microlenses with continuous and multilevel surface envelopes are characterized in terms of total electric-field patterns, the electric-field intensity distributions on the focal plane, and their diffraction efficiencies at the focal spots. The obtained results are also compared with those obtained from Kirchhoff's scalar diffraction theory. The present numerical and graphical results may provide useful information for the analysis and design of refractive elements in micro-optics.

  3. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research.

    PubMed

    Bandyopadhyay, Mridula

    2011-11-25

    The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people's social and cultural lives. I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health.

  4. The importance of stories in understanding people's relationship to food: narrative inquiry methodology has much to offer the public health nutrition researcher and practitioner.

    PubMed

    O'Kane, Gabrielle; Pamphilon, Barbara

    2016-03-01

    Despite the usefulness of quantitative research, qualitative research methodologies are equally needed to allow researchers to better understand the important social and environmental factors affecting food choice and eating habits. The present paper contributes insights from narrative inquiry, a well-established qualitative methodology, to a food-related doctoral research study. The connections between food shoppers and the producer, family, friends and others in the food system, between eaters and the earth, and how these connections affect people's meaning-making of food and pathways to food citizenship, were explored in the research. The research used narrative inquiry methodology and focus groups for data collection. Five different food-ways in the Canberra region of Australia were selected for the present research; that is, community gardens, community-supported agriculture, farmers' markets, fresh food markets and supermarkets. Fifty-two people voluntarily attended eight focus groups with four to nine participants in each. From a practical perspective, the present paper offers a guide to the way in which narrative inquiry has been applied to one research project. The paper describes the application of narrative inquiry methodology, revealing the important place of narratives in generating new knowledge. The paper further outlines how phased narrative analysis can lead to a defensible and rigorous interpretive framework grounded in the data generated from people's stories and meaning-making. We argue that individual, social and system change will not be possible without further rigorous qualitative studies to inform and complement the empirical basis of public health nutrition practice.

  5. Staff Acceptance of Tele-ICU Coverage

    PubMed Central

    Chan, Paul S.; Cram, Peter

    2011-01-01

    Background: Remote coverage of ICUs is increasing, but staff acceptance of this new technology is incompletely characterized. We conducted a systematic review to summarize existing research on acceptance of tele-ICU coverage among ICU staff. Methods: We searched for published articles pertaining to critical care telemedicine systems (aka, tele-ICU) between January 1950 and March 2010 using PubMed, Cumulative Index to Nursing and Allied Health Literature, Global Health, Web of Science, and the Cochrane Library and abstracts and presentations delivered at national conferences. Studies were included if they provided original qualitative or quantitative data on staff perceptions of tele-ICU coverage. Studies were imported into content analysis software and coded by tele-ICU configuration, methodology, participants, and findings (eg, positive and negative staff evaluations). Results: Review of 3,086 citations yielded 23 eligible studies. Findings were grouped into four categories of staff evaluation: overall acceptance level of tele-ICU coverage (measured in 70% of studies), impact on patient care (measured in 96%), impact on staff (measured in 100%), and organizational impact (measured in 48%). Overall acceptance was high, despite initial ambivalence. Favorable impact on patient care was perceived by > 82% of participants. Staff impact referenced enhanced collaboration, autonomy, and training, although scrutiny, malfunctions, and contradictory advice were cited as potential barriers. Staff perceived the organizational impact to vary. An important limitation of available studies was a lack of rigorous methodology and validated survey instruments in many studies. Conclusions: Initial reports suggest high levels of staff acceptance of tele-ICU coverage, but more rigorous methodologic study is required. PMID:21051386

  6. Rigorous diffraction analysis using geometrical theory of diffraction for future mask technology

    NASA Astrophysics Data System (ADS)

    Chua, Gek S.; Tay, Cho J.; Quan, Chenggen; Lin, Qunying

    2004-05-01

    Advanced lithographic techniques such as phase shift masks (PSMs) and optical proximity correction (OPC) result in more complex mask designs and technology. In contrast to binary masks, which have only transparent and opaque regions, phase shift masks also include transparent features with a different optical thickness and hence a modified phase of the transmitted light. PSMs are well known to show prominent diffraction effects, which cannot be described by the assumption of an infinitely thin mask (the Kirchhoff approach) used in many commercial photolithography simulators. Correct prediction of sidelobe printability, process windows and linearity of OPC masks requires the application of rigorous diffraction theory. The problem of aerial image intensity imbalance through focus with alternating phase shift masks (altPSMs) is analyzed and compared between a time-domain finite-difference (TDFD) algorithm (TEMPEST) and the geometrical theory of diffraction (GTD). Using GTD, with the solution to the canonical problems, we obtained a relationship between an edge on the mask and the resulting disturbance in image space. The main interest is to develop useful formulations that can be readily applied to solve rigorous diffraction problems for future mask technology. Analysis of rigorous diffraction effects for altPSMs using the GTD approach is discussed.

  7. A broader understanding of care managers' attitudes of advance care planning: a concurrent nested design.

    PubMed

    Aultman, Julie; Baughman, Kristin R; Ludwick, Ruth

    2018-05-18

    To examine barriers of advance care planning (ACP) experienced by care managers (CMs) through a mixed methods approach. A concurrent nested design was used to acquire a deeper understanding of ACP and to identify nuances between quantitative and qualitative data. Past quantitative studies on providers have identified barriers related to time, culture, knowledge, responsibility, and availability of legal documents. These barriers, and accompanying attitudes and feelings, have been taken at face value without rich qualitative data to identify under what conditions and to what extent a barrier impacts care. A two-part, multisite, mixed methods study was conducted using surveys and focus groups. Surveys were completed by 458 CMs at 10 Area Agencies on Aging, and 62 CMs participated in one of 8 focus groups. Data were analyzed using a concurrent nested design with individual data analysis and a merged data approach. There were three main distinctions between the quantitative and qualitative data. First, while CMs reported on the survey that ACP was not too time consuming, focus group data revealed that time was an issue, especially in relation to competing priorities. Second, on the survey 60% of the CMs reported they had enough knowledge, but the qualitative data revealed more nuance. Last, the comfort levels reported in the quantitative data were less overt in the qualitative data, where additional feelings and attitudes were revealed, e.g., frustration with families and preferences for more physician involvement. CMs reported their attitudes about ACP, clarified through a rigorous mixed methods analysis. Care managers can successfully lead ACP discussions, but require further education, resources, and team-based guidance. ACP is essential for reducing emotional, social, and financial burdens associated with health care decision-making, and CMs can positively impact ACP discussions when appropriately supported by the clinical community. The many nuances in the ACP process that we found illustrate the need for ongoing discussions, education and research on this important topic. This article is protected by copyright. All rights reserved.

  8. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach for land subsidence monitoring using measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, routinely surveys its fields with state-of-the-art and innovative techniques, so a method able to integrate the results is an important and timely topic. Monitoring today relies on multi-sensor platforms, and measurement integration is strictly necessary. Combining the different data sources should be done in a considered way, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources and attempts to separate the contributions to subsidence. With this purpose, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to combine all the different data collected in the surveys. This article presents some significant examples showing the potential of this tool in oil and gas activity: a hydrocarbon storage field where the comparison between SAR and production volumes highlights a correlation between the two measures in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data contextually. In the integrated analysis performed with PISAV, a mathematically rigorous study is not always possible, and a semi-quantitative approach is then the only method for interpreting the results. In the first test case, a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis has several advantages in monitoring land subsidence: it permits a first qualitative separation of the natural and anthropic components of subsidence, and it also gives more reliability and coverage to each measurement by exploiting the strong points of each technique.

  9. A transformative model for undergraduate quantitative biology education.

    PubMed

    Usher, David C; Driscoll, Tobin A; Dhurjati, Prasad; Pelesko, John A; Rossi, Louis F; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions.

  10. A Transformative Model for Undergraduate Quantitative Biology Education

    PubMed Central

    Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions. PMID:20810949

  11. Bayesian networks and information theory for audio-visual perception modeling.

    PubMed

    Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis

    2010-09-01

    Thanks to their different senses, human observers acquire multiple information coming from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in term of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
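
    A small sketch of the mutual-information step is given below: on a synthetic chain A -> B -> C, the conditional mutual information I(A;C|B) drops to near zero, which is the conditional-independence signal used to guide model elicitation. The data, noise levels, and discretization are illustrative assumptions, not the authors' audio-visual dataset.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(7)
n = 5000

# Synthetic chain A -> B -> C: given B, variables A and C should be (nearly) independent.
a = rng.integers(0, 2, n)
b = (a + (rng.random(n) < 0.2)) % 2          # noisy copy of A
c = (b + (rng.random(n) < 0.2)) % 2          # noisy copy of B

def conditional_mi(x, y, z):
    """I(X;Y | Z) for discrete variables, by averaging MI within each stratum of Z."""
    total = 0.0
    for value in np.unique(z):
        mask = z == value
        total += mask.mean() * mutual_info_score(x[mask], y[mask])
    return total

print(f"I(A;C)     = {mutual_info_score(a, c):.4f} nats")
print(f"I(A;C | B) = {conditional_mi(a, c, b):.4f} nats  (~ 0 suggests conditional independence)")
```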

  12. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology

    PubMed Central

    Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B.; Hewitt, Stephen M.

    2017-01-01

    The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future. PMID:28584625

  13. Geoscience Education Research Methods: Thinking About Sample Size

    NASA Astrophysics Data System (ADS)

    Slater, S. J.; Slater, T. F.; CenterAstronomy; Physics Education Research

    2011-12-01

    Geoscience education research is at a critical point in which conditions are sufficient to propel our field forward toward meaningful improvements in geosciences education practices. Our field has now reached a point where the outcomes of our research are deemed important to end users and funding agencies, and where we now have a large number of scientists who are either formally trained in geosciences education research or who have dedicated themselves to excellence in this domain. At this point we must collectively work through our epistemology, our rules for what methodologies will be considered sufficiently rigorous, and what data and analysis techniques will be acceptable for constructing evidence. In particular, we have to work out our answer to that most difficult of research questions: "How big should my 'N' be?" This paper presents a very brief answer to that question, addressing both quantitative and qualitative methodologies. Research question/methodology alignment, effect size and statistical power will be discussed, in addition to a defense of the notion that bigger is not always better.
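
    For the quantitative side of the "how big should N be" question, a standard normal-approximation power calculation is sketched below. The effect sizes, alpha, and target power are the conventional textbook values, used here only for illustration rather than as recommendations specific to geoscience education research.

```python
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Approximate sample size per group for a two-sided, two-sample comparison of means."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return 2 * ((z_alpha + z_beta) / effect_size) ** 2

for d in (0.2, 0.5, 0.8):   # Cohen's small, medium, large effects
    print(f"effect size d = {d}: ~{n_per_group(d):.0f} participants per group")
```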

  14. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology.

    PubMed

    Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M

    2017-04-01

    The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.

  15. Palaeostress perturbations near the El Castillo de las Guardas fault (SW Iberian Massif)

    NASA Astrophysics Data System (ADS)

    García-Navarro, Encarnación; Fernández, Carlos

    2010-05-01

    Use of stress inversion methods on faults measured at 33 sites located in the northwestern part of the South Portuguese Zone (Variscan Iberian Massif), together with analysis of the basic dyke attitudes in this same region, has revealed a prominent perturbation of the stress trajectories around some large, crustal-scale faults, like the El Castillo de las Guardas fault. The results are compared with the predictions of theoretical models of palaeostress deviations near master faults. According to this comparison, the El Castillo de las Guardas fault, an old structure that probably reversed its slip sense several times, can be considered a sinistral strike-slip fault during the Moscovian. These results also point out the main shortcomings that still hinder a rigorous quantitative use of the theoretical models of stress perturbations around major faults: the spatial variation in the parameters governing the brittle behaviour of the continental crust, and the possibility of oblique slip along outcrop-scale faults in regions subjected to general, non-plane strain.

  16. Proposal for a biometrics of the cortical surface: a statistical method for relative surface distance metrics

    NASA Astrophysics Data System (ADS)

    Bookstein, Fred L.

    1995-08-01

    Recent advances in computational geometry have greatly extended the range of neuroanatomical questions that can be approached by rigorous quantitative methods. One of the major current challenges in this area is to describe the variability of human cortical surface form and its implications for individual differences in neurophysiological functioning. Existing techniques for representation of stochastically invaginated surfaces do not conduce to the necessary parametric statistical summaries. In this paper, following a hint from David Van Essen and Heather Drury, I sketch a statistical method customized for the constraints of this complex data type. Cortical surface form is represented by its Riemannian metric tensor and averaged according to parameters of a smooth averaged surface. Sulci are represented by integral trajectories of the smaller principal strains of this metric, and their statistics follow the statistics of that relative metric. The diagrams visualizing this tensor analysis look like alligator leather but summarize all aspects of cortical surface form in between the principal sulci, the reliable ones; no flattening is required.
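
    As a rough numerical illustration of representing surface form by its Riemannian metric tensor, the sketch below estimates the first fundamental form of a synthetic folded sheet by finite differences and integrates it to recover surface area. The surface is a toy stand-in for a cortical patch, not derived from anatomical data, and the approach is generic rather than the specific method proposed in the paper.

```python
import numpy as np

# First fundamental form (metric tensor) of a parameterized surface X(u, v) in R^3,
# estimated by finite differences on a synthetic "folded sheet".
u = np.linspace(0, 1, 200)
v = np.linspace(0, 1, 200)
U, V = np.meshgrid(u, v, indexing="ij")

# Embedding with a sulcus-like fold along u (purely illustrative geometry).
X = np.stack([U, V, 0.15 * np.sin(6 * np.pi * U)], axis=-1)

du, dv = u[1] - u[0], v[1] - v[0]
Xu = np.gradient(X, du, axis=0)   # partial derivatives of the embedding
Xv = np.gradient(X, dv, axis=1)

# Metric tensor components g = [[E, F], [F, G]] at every grid point.
E = np.sum(Xu * Xu, axis=-1)
F = np.sum(Xu * Xv, axis=-1)
G = np.sum(Xv * Xv, axis=-1)

# Surface area from the metric: integral of sqrt(det g) du dv.
area = np.sum(np.sqrt(E * G - F ** 2)) * du * dv
print(f"patch area from metric tensor: {area:.3f} (a flat sheet would give 1.000)")
```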

  17. Optimizing Hybrid Metrology: Rigorous Implementation of Bayesian and Combined Regression

    PubMed Central

    Henn, Mark-Alexander; Silver, Richard M.; Villarrubia, John S.; Zhang, Nien Fan; Zhou, Hui; Barnes, Bryan M.; Ming, Bin; Vladár, András E.

    2015-01-01

    Hybrid metrology, e.g., the combination of several measurement techniques to determine critical dimensions, is an increasingly important approach to meet the needs of the semiconductor industry. A proper use of hybrid metrology may yield not only more reliable estimates for the quantitative characterization of 3-D structures but also a more realistic estimation of the corresponding uncertainties. Recent developments at the National Institute of Standards and Technology (NIST) feature the combination of optical critical dimension (OCD) measurements and scanning electron microscope (SEM) results. The hybrid methodology offers the potential to make measurements of essential 3-D attributes that may not be otherwise feasible. However, combining techniques gives rise to essential challenges in error analysis and comparing results from different instrument models, especially the effect of systematic and highly correlated errors in the measurement on the χ2 function that is minimized. Both hypothetical examples and measurement data are used to illustrate solutions to these challenges. PMID:26681991
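
    The χ² issue with correlated errors can be illustrated with a small generalized-least-squares sketch: two hypothetical techniques measure the same critical dimension, each with its own systematic (block-correlated) error, and the combined estimate weights the data by the full covariance matrix. All values are invented for illustration and are not NIST measurement data.

```python
import numpy as np

# Two techniques (e.g., an optical and an SEM-like measurement) each measure the same
# critical dimension several times; values and uncertainties below are hypothetical.
y = np.array([25.4, 25.6, 25.3,   # technique A repeats (nm)
              24.8, 24.9, 24.7])  # technique B repeats (nm)

# Covariance: independent random errors on the diagonal plus a common systematic
# error shared within each technique (block-correlated off-diagonal terms).
random_var = np.array([0.10, 0.10, 0.10, 0.05, 0.05, 0.05]) ** 2
sys_a, sys_b = 0.20 ** 2, 0.25 ** 2
cov = np.diag(random_var)
cov[:3, :3] += sys_a
cov[3:, 3:] += sys_b

# Generalized least squares for a single common parameter mu:
# chi^2(mu) = (y - mu)^T C^{-1} (y - mu), minimized in closed form.
ones = np.ones_like(y)
cinv = np.linalg.inv(cov)
mu = (ones @ cinv @ y) / (ones @ cinv @ ones)
mu_var = 1.0 / (ones @ cinv @ ones)
chi2 = (y - mu) @ cinv @ (y - mu)

print(f"combined estimate: {mu:.3f} +/- {np.sqrt(mu_var):.3f} nm  (chi2 = {chi2:.2f})")
```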

  18. The Importance Of Integrating Narrative Into Health Care Decision Making.

    PubMed

    Dohan, Daniel; Garrett, Sarah B; Rendle, Katharine A; Halley, Meghan; Abramson, Corey

    2016-04-01

    When making health care decisions, patients and consumers use data but also gather stories from family and friends. When advising patients, clinicians consult the medical evidence but also use professional judgment. These stories and judgments, as well as other forms of narrative, shape decision making but remain poorly understood. Furthermore, qualitative research methods to examine narrative are rarely included in health science research. We illustrate how narratives shape decision making and explain why it is difficult but necessary to integrate qualitative research on narrative into the health sciences. We draw on social-scientific insights on rigorous qualitative research and our ongoing studies of decision making by patients with cancer, and we describe new tools and approaches that link qualitative research findings with the predominantly quantitative health science scholarship. Finally, we highlight the benefits of more fully integrating qualitative research and narrative analysis into the medical evidence base and into evidence-based medical practice. Project HOPE—The People-to-People Health Foundation, Inc.

  19. Methods for Kinetic and Thermodynamic Analysis of Aminoacyl-tRNA Synthetases

    PubMed Central

    Francklyn, Christopher S.; First, Eric A.; Perona, John J.; Hou, Ya-Ming

    2008-01-01

    The accuracy of protein synthesis relies on the ability of aminoacyl-tRNA synthetases (aaRSs) to discriminate between true and near-cognate substrates. To date, analysis of aaRS function, including identification of the aaRS residues participating in amino acid and tRNA discrimination, has largely relied on the steady state kinetic pyrophosphate exchange and aminoacylation assays. Pre-steady state kinetic studies investigating a more limited set of aaRS systems have also been undertaken to assess the energetic contributions of individual enzyme-substrate interactions, particularly in the adenylation half reaction. More recently, a renewed interest in the use of rapid kinetics approaches for aaRSs has led to their application to several new aaRS systems, resulting in the identification of mechanistic differences that distinguish the two structurally distinct aaRS classes. Here, we review the techniques for thermodynamic and kinetic analysis of aaRS function. Following a brief survey of methods for the preparation of materials and for steady state kinetic analysis, this review will describe pre-steady state kinetic methods employing rapid quench and stopped-flow fluorescence for analysis of the activation and aminoacyl transfer reactions. Application of these methods to any aaRS system allows the investigator to derive detailed kinetic mechanisms for the activation and aminoacyl transfer reactions, permitting issues of substrate specificity, stereochemical mechanism, and inhibitor interaction to be addressed in a rigorous and quantitative fashion. PMID:18241792

  20. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
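
    For discretized time series, Transfer Entropy reduces to a difference of conditional entropies that can be estimated by counting joint states. The following is a crude, illustrative binned estimator with one step of history and synthetic data; it is not the estimator or the glycolytic model used by the authors.

      import numpy as np
      from collections import Counter

      def transfer_entropy(x, y, bins=8):
          """Crude binned estimate of transfer entropy T(X -> Y) in bits,
          using one step of history for each series (illustrative only)."""
          xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
          yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
          n = len(xd) - 1
          p_xyz = Counter(zip(yd[1:], yd[:-1], xd[:-1]))   # (y_next, y_now, x_now)
          p_yz  = Counter(zip(yd[1:], yd[:-1]))            # (y_next, y_now)
          p_zx  = Counter(zip(yd[:-1], xd[:-1]))           # (y_now, x_now)
          p_z   = Counter(yd[:-1])                         # (y_now,)
          te = 0.0
          for (y1, y0, x0), c in p_xyz.items():
              p_joint = c / n
              p_cond_full = c / p_zx[(y0, x0)]             # p(y1 | y0, x0)
              p_cond_self = p_yz[(y1, y0)] / p_z[y0]       # p(y1 | y0)
              te += p_joint * np.log2(p_cond_full / p_cond_self)
          return te

      # Example: y is partly driven by the past of x, so T(x->y) should exceed T(y->x)
      rng = np.random.default_rng(0)
      x = rng.normal(size=5000)
      y = np.zeros_like(x)
      for t in range(1, len(x)):
          y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + 0.3 * rng.normal()
      print("T(x->y) =", transfer_entropy(x, y))
      print("T(y->x) =", transfer_entropy(y, x))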

  1. Measuring use of research evidence in public health policy: a policy content analysis

    PubMed Central

    2014-01-01

    Background There are few Australian studies showing how research evidence is used to inform the development of public health policy. International research has shown that compensation for injury rehabilitation can have negative impacts on health outcomes. This study examined transport injury compensation policy in the Australian state of Victoria to determine the type and purpose of references to information sources, and to identify the extent of reference to academic research evidence in transport-related injury rehabilitation compensation policy. Methods Quantitative content analysis of injury rehabilitation compensation policies (N = 128) from the Victorian state government transport accident compensation authority. Results The most commonly referenced types of information were Internal Policy (median = 6 references per policy), Clinical/Medical (2.5), and Internal Legislation (1). Academic Research Evidence was the least often referenced source of information. The main purpose of reference to information was to support injury treatment and rehabilitation compensation claims decision-making. Conclusions Transport injury compensation policy development is complex, with multiple sources of information cited, including legislation, internal policy, external policy and clinical/medical evidence. There is limited use of academic research evidence in Victorian state government injury treatment and rehabilitation compensation policies. Decisions regarding compensation for injury treatment and rehabilitation services could benefit from greater use of academic research evidence. This study is one of the first to examine the use of research evidence in existing Australian public health policy decision-making using rigorous quantitative methods. It provides a practical example of how use of research evidence in public health policy can be objectively measured. PMID:24886092

  2. Improved key-rate bounds for practical decoy-state quantum-key-distribution systems

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng

    2017-01-01

    The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
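
    The underlying statistical step is to bound the unknown expected count from a finite observation at a fixed failure probability; the Gaussian approximation and a Chernoff-style tail bound give noticeably different answers at small counts. The sketch below illustrates this with the standard multiplicative Chernoff lower-tail bound and made-up counts; it is not the paper's tighter bound or its key-rate formula.

      import numpy as np
      from scipy.optimize import brentq
      from scipy.stats import norm

      def gaussian_upper(k, eps):
          # Gaussian approximation: mu <= k + z_eps * sqrt(k)
          return k + norm.isf(eps) * np.sqrt(k)

      def chernoff_upper(k, eps):
          # Smallest mu such that P(X <= k) <= eps under the multiplicative
          # Chernoff bound P(X <= (1 - d) mu) <= exp(-d^2 mu / 2)
          def tail(mu):
              d = 1.0 - k / mu
              return np.exp(-d * d * mu / 2.0) - eps
          return brentq(tail, k + 1e-9, 100 * k + 100)

      eps = 1e-10   # illustrative failure probability
      for k in (100, 10_000, 1_000_000):   # observed counts (made up)
          print(k, round(gaussian_upper(k, eps), 1), round(chernoff_upper(k, eps), 1))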

  3. Central nervous system antiretroviral efficacy in HIV infection: a qualitative and quantitative review and implications for future research.

    PubMed

    Cysique, Lucette A; Waters, Edward K; Brew, Bruce J

    2011-11-22

    There is conflicting information as to whether antiretroviral drugs with better central nervous system (CNS) penetration (neuroHAART) assist in improving neurocognitive function and suppressing cerebrospinal fluid (CSF) HIV RNA. The current review aims to better synthesise existing literature by using an innovative two-phase review approach (qualitative and quantitative) to overcome methodological differences between studies. Sixteen studies, all observational, were identified using a standard citation search. They fulfilled the following inclusion criteria: conducted in the HAART era; sample size > 10; treatment effect involved more than one antiretroviral; and none had a retrospective design. The qualitative phase of review of these studies consisted of (i) a blind assessment rating studies on features such as sample size, statistical methods and definitions of neuroHAART, and (ii) a non-blind assessment of the sensitivity of the neuropsychological methods to HIV-associated neurocognitive disorder (HAND). During quantitative evaluation we assessed the statistical power of studies that achieved a high rating in the qualitative analysis. The objective of the power analysis was to determine the studies' ability to assess their proposed research aims. After studies with at least three limitations were excluded in the qualitative phase, six studies remained. All six found a positive effect of neuroHAART on neurocognitive function or CSF HIV suppression. Of these six studies, only two had statistical power of at least 80%. Studies assessed as using more rigorous methods found that neuroHAART was effective in improving neurocognitive function and decreasing CSF viral load, but only two of those studies were adequately statistically powered. Because all of these studies were observational, they represent a less compelling evidence base than randomised control trials for assessing treatment effect. Therefore, large randomised trials are needed to determine the robustness of any neuroHAART effect. However, such trials must be longitudinal, include the full spectrum of HAND, ideally carefully control for co-morbidities, and be based on optimal neuropsychology methods.

  4. Identification of suitable internal controls to study expression of a Staphylococcus aureus multidrug resistance system by quantitative real-time PCR.

    PubMed

    Theis, Torsten; Skurray, Ronald A; Brown, Melissa H

    2007-08-01

    Quantitative real-time PCR (qRT-PCR) has become a routine technique for gene expression analysis. Housekeeping genes are customarily used as endogenous references for the relative quantification of genes of interest. The aim of this study was to develop a quantitative real-time PCR assay to analyze gene expression in multidrug resistant Staphylococcus aureus in the presence of cationic lipophilic substrates of multidrug transport proteins. Eleven different housekeeping genes were analyzed for their expression stability in the presence of a range of concentrations of four structurally different antimicrobial compounds. This analysis demonstrated that the genes rho, pyk and proC were least affected by rhodamine 6G and crystal violet, whereas fabD, tpiA and gyrA or fabD, proC and pyk were stably expressed in cultures grown in the presence of ethidium or berberine, respectively. Subsequently, these housekeeping genes were used as internal controls to analyze expression of the multidrug transport protein QacA and its transcriptional regulator QacR in the presence of the aforementioned compounds. Expression of qacA was induced by all four compounds, whereas qacR expression was found to be unaffected, reduced or enhanced. This study demonstrates that staphylococcal gene expression, including housekeeping genes previously used to normalize qRT-PCR data, is affected by growth in the presence of different antimicrobial compounds. Thus, identification of suitable genes usable as a control set requires rigorous testing. Identification of such a set enabled the genes to be utilized as internal standards for accurate quantification of transcripts of the qac multidrug resistance system from S. aureus grown under different inducing conditions. Moreover, the qRT-PCR assay presented in this study may also be applied to gene expression studies of other multidrug transporters from S. aureus.
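
    With a validated reference-gene set in hand, relative quantification is commonly carried out with the 2^-ddCt calculation against the mean Ct of the controls. The sketch below uses invented Ct values and assumes ideal amplification efficiency; it is shown only to illustrate the normalization step, not the authors' exact analysis.

      import numpy as np

      # Hypothetical Ct values (threshold cycles) illustrating relative
      # quantification of a target gene (e.g. qacA) against a reference-gene
      # set, using the common 2^-ddCt approach.  Numbers are made up.
      def relative_expression(ct_target, ct_refs, ct_target_ctrl, ct_refs_ctrl):
          dct_treated = ct_target - np.mean(ct_refs)        # normalize treated sample
          dct_control = ct_target_ctrl - np.mean(ct_refs_ctrl)  # normalize control
          ddct = dct_treated - dct_control
          return 2.0 ** (-ddct)

      # Treated culture (grown with an inducing compound) vs untreated control
      fold_change = relative_expression(
          ct_target=22.1, ct_refs=[18.3, 19.0, 18.7],         # e.g. rho, pyk, proC
          ct_target_ctrl=25.4, ct_refs_ctrl=[18.4, 19.1, 18.6])
      print(f"target gene fold change: {fold_change:.1f}x")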

  5. Quantitative local analysis of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Topcu, Ufuk

    This thesis investigates quantitative methods for local robustness and performance analysis of nonlinear dynamical systems with polynomial vector fields. We propose measures to quantify systems' robustness against uncertainties in initial conditions (regions-of-attraction) and external disturbances (local reachability/gain analysis). S-procedure and sum-of-squares relaxations are used to translate Lyapunov-type characterizations to sum-of-squares optimization problems. These problems are typically bilinear/nonconvex (due to local analysis rather than global) and their size grows rapidly with state/uncertainty space dimension. Our approach is based on exploiting system theoretic interpretations of these optimization problems to reduce their complexity. We propose a methodology incorporating simulation data in formal proof construction enabling more reliable and efficient search for robustness and performance certificates compared to the direct use of general purpose solvers. This technique is adapted both to region-of-attraction and reachability analysis. We extend the analysis to uncertain systems by taking an intentionally simplistic and potentially conservative route, namely employing parameter-independent rather than parameter-dependent certificates. The conservatism is simply reduced by a branch-and-bound type refinement procedure. The main thrust of these methods is their suitability for parallel computing achieved by decomposing otherwise challenging problems into relatively tractable smaller ones. We demonstrate the proposed methods on several small/medium size examples in each chapter and apply each method to a benchmark example with an uncertain short period pitch axis model of an aircraft. Additional practical issues leading to a more rigorous basis for the proposed methodology as well as promising further research topics are also addressed. We show that stability of linearized dynamics is not only necessary but also sufficient for the feasibility of the formulations in region-of-attraction analysis. Furthermore, we generalize an upper bound refinement procedure in local reachability/gain analysis which effectively generates non-polynomial certificates from polynomial ones. Finally, broader applicability of optimization-based tools strongly depends on the availability of scalable/hierarchical algorithms. As an initial step toward this direction, we propose a local small-gain theorem and apply it to stability region analysis in the presence of unmodeled dynamics.
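
    As a much simpler stand-in for the sum-of-squares machinery, the role that simulation or sampling data can play is easy to illustrate: sample states, evaluate a quadratic Lyapunov candidate V and its derivative along the vector field, and take the largest sublevel set of V containing no sampled point where the derivative is non-negative. The sketch below does this for the reversed Van der Pol benchmark; the Lyapunov matrix and sampling bounds are illustrative choices, and the result is only a sampled estimate, not a certificate.

      import numpy as np

      rng = np.random.default_rng(1)
      # Candidate V(x) = x' P x; this P solves the linearized Lyapunov equation
      P = np.array([[1.5, -0.5],
                    [-0.5, 1.0]])

      # Reversed Van der Pol system: x1' = -x2, x2' = x1 + (x1^2 - 1) x2
      samples = rng.uniform(-2.0, 2.0, size=(100_000, 2))
      x1, x2 = samples[:, 0], samples[:, 1]
      f = np.stack([-x2, x1 + (x1**2 - 1.0) * x2], axis=1)

      V = np.einsum('ni,ij,nj->n', samples, P, samples)
      Vdot = 2.0 * np.einsum('ni,ij,nj->n', samples, P, f)

      # Largest sampled level c such that V(x) <= c implies Vdot(x) < 0
      violations = V[Vdot >= 0.0]
      c_est = violations.min() if violations.size else np.inf
      print("sampled region-of-attraction estimate: V(x) <=", round(float(c_est), 2))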

  6. Failure-Modes-And-Effects Analysis Of Software Logic

    NASA Technical Reports Server (NTRS)

    Garcia, Danny; Hartline, Thomas; Minor, Terry; Statum, David; Vice, David

    1996-01-01

    Rigorous analysis applied early in design effort. Method of identifying potential inadequacies and modes and effects of failures caused by inadequacies (failure-modes-and-effects analysis or "FMEA" for short) devised for application to software logic.

  7. The Aharonov-Bohm effect and Tonomura et al. experiments: Rigorous results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballesteros, Miguel; Weder, Ricardo

    The Aharonov-Bohm effect is a fundamental issue in physics. It describes the physically important electromagnetic quantities in quantum mechanics. Its experimental verification constitutes a test of the theory of quantum mechanics itself. The remarkable experiments of Tonomura et al. ['Observation of Aharonov-Bohm effect by electron holography', Phys. Rev. Lett. 48, 1443 (1982) and 'Evidence for Aharonov-Bohm effect with magnetic field completely shielded from electron wave', Phys. Rev. Lett. 56, 792 (1986)] are widely considered as the only experimental evidence of the physical existence of the Aharonov-Bohm effect. Here we give the first rigorous proof that the classical ansatz of Aharonov and Bohm of 1959 ['Significance of electromagnetic potentials in the quantum theory', Phys. Rev. 115, 485 (1959)], that was tested by Tonomura et al., is a good approximation to the exact solution to the Schroedinger equation. This also proves that the electron, as represented by the exact solution, is not accelerated, in agreement with the recent experiment of Caprez et al. in 2007 ['Macroscopic test of the Aharonov-Bohm effect', Phys. Rev. Lett. 99, 210401 (2007)], which shows that the results of the Tonomura et al. experiments cannot be explained by the action of a force. Under the assumption that the incoming free electron is a Gaussian wave packet, we estimate the exact solution to the Schroedinger equation for all times. We provide a rigorous, quantitative error bound for the difference in norm between the exact solution and the Aharonov-Bohm Ansatz. Our bound is uniform in time. We also prove that on the Gaussian asymptotic state the scattering operator is given by a constant phase shift, up to a quantitative error bound that we provide. Our results show that for intermediate size electron wave packets, smaller than the ones used in the Tonomura et al. experiments, quantum mechanics predicts the results observed by Tonomura et al. with an error bound smaller than 10^(-99). It would be quite interesting to perform experiments with electron wave packets of intermediate size. Furthermore, we provide a physical interpretation of our error bound.

  8. The DOZZ formula from the path integral

    NASA Astrophysics Data System (ADS)

    Kupiainen, Antti; Rhodes, Rémi; Vargas, Vincent

    2018-05-01

    We present a rigorous proof of the Dorn, Otto, Zamolodchikov, Zamolodchikov formula (the DOZZ formula) for the 3 point structure constants of Liouville Conformal Field Theory (LCFT) starting from a rigorous probabilistic construction of the functional integral defining LCFT given earlier by the authors and David. A crucial ingredient in our argument is a probabilistic derivation of the reflection relation in LCFT based on a refined tail analysis of Gaussian multiplicative chaos measures.

  9. Facial patterns in a tropical social wasp correlate with colony membership

    NASA Astrophysics Data System (ADS)

    Baracchi, David; Turillazzi, Stefano; Chittka, Lars

    2016-10-01

    Social insects excel in discriminating nestmates from intruders, typically relying on colony odours. Remarkably, some wasp species achieve such discrimination using visual information. However, while it is universally accepted that odours mediate group-level recognition, the ability to recognise colony members visually has been considered possible only via individual recognition, by which wasps discriminate 'friends' and 'foes'. Using geometric morphometric analysis, a technique based on a rigorous statistical theory of shape that allows quantitative multivariate analyses of structure shapes, we first quantified facial marking variation of Liostenogaster flavolineata wasps. We then compared this facial variation with that of chemical profiles (generated by cuticular hydrocarbons) within and between colonies. Principal component analysis and discriminant analysis applied to sets of variables containing pure shape information showed that despite appreciable intra-colony variation, the faces of females belonging to the same colony resemble one another more than those of outsiders. This colony-specific variation in facial patterns was on a par with that observed for odours. While the occurrence of face discrimination at the colony level remains to be tested by behavioural experiments, overall our results suggest that, in this species, wasp faces display adequate information that might be potentially perceived and used by wasps for colony level recognition.
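
    The statistical pipeline (shape variables, then principal components, then discriminant analysis of colony membership) can be sketched generically. The code below uses synthetic stand-in landmark data with an artificial colony effect, so the numbers mean nothing biologically; it only illustrates the PCA-plus-LDA step.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      # Hypothetical stand-in for Procrustes-aligned facial landmark coordinates:
      # 90 wasps from 3 colonies, 10 landmarks (x, y) each, with a small
      # colony-specific shift added so the classes are separable.
      rng = np.random.default_rng(0)
      n_per_colony, n_landmarks = 30, 10
      colony_effect = rng.normal(0, 0.05, size=(3, 2 * n_landmarks))
      X = np.vstack([colony_effect[c] + rng.normal(0, 0.1, size=(n_per_colony, 2 * n_landmarks))
                     for c in range(3)])
      y = np.repeat([0, 1, 2], n_per_colony)

      # Reduce the shape variables to a few principal components, then ask a
      # discriminant analysis how well colony membership can be predicted.
      pcs = PCA(n_components=5).fit_transform(X)
      acc = cross_val_score(LinearDiscriminantAnalysis(), pcs, y, cv=5).mean()
      print(f"cross-validated colony classification accuracy: {acc:.2f}")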

  10. Universality and diversity of folding mechanics for three-helix bundle proteins.

    PubMed

    Yang, Jae Shick; Wallin, Stefan; Shakhnovich, Eugene I

    2008-01-22

    In this study we evaluate, at full atomic detail, the folding processes of two small helical proteins, the B domain of protein A and the Villin headpiece. Folding kinetics are studied by performing a large number of ab initio Monte Carlo folding simulations using a single transferable all-atom potential. Using these trajectories, we examine the relaxation behavior, secondary structure formation, and transition-state ensembles (TSEs) of the two proteins and compare our results with experimental data and previous computational studies. To obtain detailed structural information on the folding dynamics viewed as an ensemble process, we perform a clustering analysis procedure based on graph theory. Moreover, rigorous p(fold) analysis is used to obtain representative samples of the TSEs and a good quantitative agreement between experimental and simulated Phi values is obtained for protein A. Phi values for Villin are also obtained and left as predictions to be tested by future experiments. Our analysis shows that the two-helix hairpin is a common partially stable structural motif that gets formed before entering the TSE in the studied proteins. These results together with our earlier study of Engrailed Homeodomain and recent experimental studies provide a comprehensive, atomic-level picture of folding mechanics of three-helix bundle proteins.

  11. Quantitative Analysis of Guanine Nucleotide Exchange Factors (GEFs) as Enzymes

    PubMed Central

    Randazzo, Paul A; Jian, Xiaoying; Chen, Pei-Wen; Zhai, Peng; Soubias, Olivier; Northup, John K

    2014-01-01

    The proteins that possess guanine nucleotide exchange factor (GEF) activity, which include ~800 G protein coupled receptors (GPCRs) [1], 15 Arf GEFs [2], 81 Rho GEFs [3], 8 Ras GEFs [4], and others for other families of GTPases [5], catalyze the exchange of GTP for GDP on all regulatory guanine nucleotide binding proteins. Despite their importance as catalysts, relatively few exchange factors (we are aware of only eight for ras superfamily members) have been rigorously characterized kinetically [5-13]. In some cases, kinetic analysis has been simplistic, leading to erroneous conclusions about mechanism (as discussed in a recent review [14]). In this paper, we compare two approaches for determining the kinetic properties of exchange factors: (i) examining individual equilibria, and (ii) analyzing the exchange factors as enzymes. Each approach, when thoughtfully used [14,15], provides important mechanistic information about the exchange factors. The analysis as enzymes is described in further detail. With the focus on the production of the biologically relevant guanine nucleotide binding protein complexed with GTP (G•GTP), we believe it is conceptually simpler to connect the kinetic properties to cellular effects. Further, the experiments are often more tractable than those used to analyze the equilibrium system and, therefore, more widely accessible to scientists interested in the function of exchange factors. PMID:25332840
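
    Treating an exchange factor as an enzyme means, in the simplest case, fitting initial rates of G•GTP production to the Michaelis-Menten form to extract kcat, Km and kcat/Km. A hypothetical sketch with invented rate data follows; the concentrations, values, and even the assumption of simple Michaelis-Menten behaviour are illustrative only, not the authors' data or protocol.

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical initial-rate data for an exchange factor treated as an
      # enzyme: the "substrate" is the GTPase*GDP complex, the product is
      # GTPase*GTP, and the initial rate follows v = kcat * [E] * S / (Km + S).
      E_total = 0.01                                      # uM exchange factor
      S = np.array([0.5, 1, 2, 5, 10, 20, 50])            # uM GTPase*GDP
      v = np.array([0.8, 1.4, 2.3, 3.6, 4.4, 5.0, 5.4])   # uM GTP-loaded / min

      def mm(S, kcat, Km):
          return kcat * E_total * S / (Km + S)

      (kcat, Km), _ = curve_fit(mm, S, v, p0=(600.0, 5.0))
      print(f"kcat = {kcat:.0f} min^-1, Km = {Km:.1f} uM, "
            f"kcat/Km = {kcat / Km:.0f} uM^-1 min^-1")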

  12. Enhancing the quality and credibility of qualitative analysis.

    PubMed Central

    Patton, M Q

    1999-01-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems. PMID:10591279

  13. A Flexible Analysis Tool for the Quantitative Acoustic Assessment of Infant Cry

    PubMed Central

    Reggiannini, Brian; Sheinkopf, Stephen J.; Silverman, Harvey F.; Li, Xiaoxue; Lester, Barry M.

    2015-01-01

    Purpose In this article, the authors describe and validate the performance of a modern acoustic analyzer specifically designed for infant cry analysis. Method Utilizing known algorithms, the authors developed a method to extract acoustic parameters describing infant cries from standard digital audio files. They used a frame rate of 25 ms with a frame advance of 12.5 ms. Cepstral-based acoustic analysis proceeded in 2 phases, computing frame-level data and then organizing and summarizing this information within cry utterances. Using signal detection methods, the authors evaluated the accuracy of the automated system to determine voicing and to detect fundamental frequency (F0) as compared to voiced segments and pitch periods manually coded from spectrogram displays. Results The system detected F0 with 88% to 95% accuracy, depending on tolerances set at 10 to 20 Hz. Receiver operating characteristic analyses demonstrated very high accuracy at detecting voicing characteristics in the cry samples. Conclusions This article describes an automated infant cry analyzer with high accuracy to detect important acoustic features of cry. A unique and important aspect of this work is the rigorous testing of the system’s accuracy as compared to ground-truth manual coding. The resulting system has implications for basic and applied research on infant cry development. PMID:23785178
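
    A minimal version of the frame-based cepstral F0 estimation described above can be sketched as follows, using the stated 25 ms window and 12.5 ms advance on a synthetic harmonic signal; it is an illustration of the general technique, not the authors' analyzer or its validated parameters.

      import numpy as np

      fs = 16000
      frame_len, hop = int(0.025 * fs), int(0.0125 * fs)   # 25 ms frames, 12.5 ms advance
      t = np.arange(fs)                                    # 1 s of signal
      signal = np.sign(np.sin(2 * np.pi * 450 * t / fs))   # harmonic-rich 450 Hz test tone

      f0_track = []
      for start in range(0, len(signal) - frame_len, hop):
          frame = signal[start:start + frame_len] * np.hamming(frame_len)
          spectrum = np.abs(np.fft.rfft(frame)) + 1e-12
          cepstrum = np.fft.irfft(np.log(spectrum))        # real cepstrum of the frame
          # Search the quefrency range corresponding to 100-1000 Hz
          qmin, qmax = int(fs / 1000), int(fs / 100)
          q_peak = qmin + np.argmax(cepstrum[qmin:qmax])
          f0_track.append(fs / q_peak)

      print(f"median F0 estimate: {np.median(f0_track):.0f} Hz")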

  14. Theoretical modeling of the uranium 4f XPS for U(VI) and U(IV) oxides

    NASA Astrophysics Data System (ADS)

    Bagus, Paul S.; Nelin, Connie J.; Ilton, Eugene S.

    2013-12-01

    A rigorous study is presented of the physical processes related to X-Ray photoelectron spectroscopy, XPS, in the 4f level of U oxides, which, as well as being of physical interest in themselves, are representative of XPS in heavy metal oxides. In particular, we present compelling evidence for a new view of the screening of core-holes that extends prior understandings. Our analysis of the screening focuses on the covalent mixing of high lying U and O orbitals as opposed to the, more common, use of orbitals that are nominally pure U or pure O. It is shown that this covalent mixing is quite different for the initial and final, core-hole, configurations and that this difference is directly related to the XPS satellite intensity. Furthermore, we show that the high-lying U d orbitals as well as the U(5f) orbital may both contribute to the core-hole screening, in contrast with previous work that has only considered screening through the U(5f) shell. The role of modifying the U-O interaction by changing the U-O distance has been investigated and an unexpected correlation between U-O distance and XPS satellite intensity has been discovered. The role of fluorite and octahedral crystal structures for U(IV) oxides has been examined and relationships established between XPS features and the covalent interactions in the different structures. The physical views of XPS satellites as arising from shake processes or as arising from ligand to metal charge transfers are contrasted; our analysis provides strong support that shake processes give a more fundamental physical understanding than charge transfer. Our theoretical studies are based on rigorous, strictly ab initio determinations of the electronic structure of embedded cluster models of U oxides with formal U(VI) and U(IV) oxidation states. Our results provide a foundation that makes it possible to establish quantitative relationships between features of the XPS spectra and materials properties.

  15. Scientific approaches to science policy.

    PubMed

    Berg, Jeremy M

    2013-11-01

    The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.

  16. Space Shuttle Communications Coverage Analysis for Thermal Tile Inspection

    NASA Technical Reports Server (NTRS)

    Kroll, Quin D.; Hwu, Shian U.; Upanavage, Matthew; Boster, John P.; Chavez, Mark A.

    2009-01-01

    The space shuttle ultra-high frequency Space-to-Space Communication System has to provide adequate communication coverage for astronauts who are performing thermal tile inspection and repair on the underside of the space shuttle orbiter (SSO). Careful planning and quantitative assessment are necessary to ensure successful system operations and mission safety in this work environment. This study assesses communication systems performance for astronauts who are working in the underside, non-line-of-sight shadow region on the space shuttle. All of the space shuttle and International Space Station (ISS) transmitting antennas are blocked by the SSO structure. To ensure communication coverage at planned inspection worksites, the signal strength and link margin between the SSO/ISS antennas and the extravehicular activity astronauts, whose line-of-sight is blocked by vehicle structure, was analyzed. Investigations were performed using rigorous computational electromagnetic modeling techniques. Signal strength was obtained by computing the reflected and diffracted fields along the signal propagation paths between transmitting and receiving antennas. Radio frequency (RF) coverage was determined for thermal tile inspection and repair missions using the results of this computation. Analysis results from this paper are important in formulating the limits on reliable communication range and RF coverage at planned underside inspection and repair worksites.
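
    The bookkeeping behind such a coverage assessment is a link budget: transmit power plus antenna gains minus propagation and blockage losses, compared against receiver sensitivity. The sketch below uses free-space loss plus an assumed fixed blockage term and illustrative numbers; the actual study computed the reflected and diffracted fields rigorously rather than assuming a lumped loss.

      import numpy as np

      def free_space_path_loss_db(distance_m, freq_hz):
          c = 3.0e8
          return 20 * np.log10(4 * np.pi * distance_m * freq_hz / c)

      freq = 414.2e6            # UHF space-to-space radio frequency (illustrative value)
      p_tx_dbm = 30.0           # transmit power (assumed)
      g_tx_db, g_rx_db = 3.0, 0.0
      blockage_db = 25.0        # lumped extra loss behind the orbiter structure (assumed)
      rx_sensitivity_dbm = -95.0

      for d in (10, 30, 60, 120):   # metres along a hypothetical inspection worksite
          p_rx = (p_tx_dbm + g_tx_db + g_rx_db
                  - free_space_path_loss_db(d, freq) - blockage_db)
          margin = p_rx - rx_sensitivity_dbm
          print(f"d = {d:4d} m   received = {p_rx:6.1f} dBm   margin = {margin:5.1f} dB")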

  17. How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.

    PubMed

    Levitt, Heidi M

    2018-05-01

    Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Through the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.

  18. A model of pathways to artificial superintelligence catastrophe for risk and decision analysis

    NASA Astrophysics Data System (ADS)

    Barrett, Anthony M.; Baum, Seth D.

    2017-03-01

    An artificial superintelligence (ASI) is an artificial intelligence that is significantly more intelligent than humans in all respects. Whilst ASI does not currently exist, some scholars propose that it could be created sometime in the future, and furthermore that its creation could cause a severe global catastrophe, possibly even resulting in human extinction. Given the high stakes, it is important to analyze ASI risk and factor the risk into decisions related to ASI research and development. This paper presents a graphical model of major pathways to ASI catastrophe, focusing on ASI created via recursive self-improvement. The model uses the established risk and decision analysis modelling paradigms of fault trees and influence diagrams in order to depict combinations of events and conditions that could lead to AI catastrophe, as well as intervention options that could decrease risks. The events and conditions include select aspects of the ASI itself as well as the human process of ASI research, development and management. Model structure is derived from published literature on ASI risk. The model offers a foundation for rigorous quantitative evaluation and decision-making on the long-term risk of ASI catastrophe.
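
    Fault-tree evaluation with independent basic events reduces to combining probabilities through AND and OR gates. The sketch below shows that arithmetic on a made-up miniature pathway; the events, structure and probabilities are purely illustrative and are not taken from the paper's model.

      # Toy fault-tree evaluation with independent basic events.
      def or_gate(*ps):   # top event occurs if any input event occurs
          q = 1.0
          for p in ps:
              q *= (1.0 - p)
          return 1.0 - q

      def and_gate(*ps):  # top event occurs only if all input events occur
          q = 1.0
          for p in ps:
              q *= p
          return q

      # Basic events (illustrative numbers only)
      p_seed_created      = 0.10   # a self-improving seed AI is built
      p_spec_error        = 0.20   # goals mis-specified at design time
      p_value_drift       = 0.15   # goals drift during self-improvement
      p_confinement_fails = 0.40   # technical confinement fails
      p_oversight_fails   = 0.60   # human oversight / intervention fails

      # Unsafe goals can arise in either of two ways (OR gate); catastrophe
      # additionally requires the seed AI and the failure of both safety layers.
      p_goals_unsafe = or_gate(p_spec_error, p_value_drift)
      p_catastrophe = and_gate(p_seed_created, p_goals_unsafe,
                               p_confinement_fails, p_oversight_fails)
      print(f"pathway probability: {p_catastrophe:.4f}")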

  19. A data grid for imaging-based clinical trials

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng; Chao, Sander S.; Lee, Jasper; Liu, Brent; Documet, Jorge; Huang, H. K.

    2007-03-01

    Clinical trials play a crucial role in testing new drugs or devices in modern medicine. Medical imaging has also become an important tool in clinical trials because images provide a unique and fast diagnosis with visual observation and quantitative assessment. A typical imaging-based clinical trial consists of: 1) A well-defined rigorous clinical trial protocol, 2) a radiology core that has a quality control mechanism, a biostatistics component, and a server for storing and distributing data and analysis results; and 3) many field sites that generate and send image studies to the radiology core. As the number of clinical trials increases, it becomes a challenge for a radiology core servicing multiple trials to have a server robust enough to administrate and quickly distribute information to participating radiologists/clinicians worldwide. The Data Grid can satisfy the aforementioned requirements of imaging based clinical trials. In this paper, we present a Data Grid architecture for imaging-based clinical trials. A Data Grid prototype has been implemented in the Image Processing and Informatics (IPI) Laboratory at the University of Southern California to test and evaluate performance in storing trial images and analysis results for a clinical trial. The implementation methodology and evaluation protocol of the Data Grid are presented.

  20. Quantitative analysis of major elements in silicate minerals and glasses by micro-PIXE

    USGS Publications Warehouse

    Campbell, J.L.; Czamanske, G.K.; MacDonald, L.; Teesdale, W.J.

    1997-01-01

    The Guelph micro-PIXE facility has been modified to accommodate a second Si(Li) X-ray detector which records the spectrum due to light major elements (11 ≤ Z ≤ 20) with no deleterious effects from scattered 3 MeV protons. Spectra have been recorded from 30 well-characterized materials, including a broad range of silicate minerals and both natural and synthetic glasses. Sodium is mobile in some of the glasses, but not in the studied mineral lattices. The mean value of the instrumental constant H for each of the elements Mg, Al, and Si in these materials is systematically 6-8% lower than the H-value measured for the pure metals. Normalization factors are derived which permit the matrix corrections requisite for trace-element measurements in silicates to be based upon pure metal standards for Mg, Al and Si, supplemented by well-established, silicate mineral standards for the elements Na, K and Ca. Rigorous comparisons of electron microprobe and micro-PIXE analyses for the entire, 30-sample suite demonstrate the ability of micro-PIXE to produce accurate analysis for the light major elements in silicates. © 1997 Elsevier Science B.V.

  1. High-Resolution Free-Energy Landscape Analysis of α-Helical Protein Folding: HP35 and Its Double Mutant

    PubMed Central

    2013-01-01

    The free-energy landscape can provide a quantitative description of folding dynamics, if determined as a function of an optimally chosen reaction coordinate. Here, we construct the optimal coordinate and the associated free-energy profile for all-helical proteins HP35 and its norleucine (Nle/Nle) double mutant, based on realistic equilibrium folding simulations [Piana et al. Proc. Natl. Acad. Sci. U.S.A. 2012, 109, 17845]. From the obtained profiles, we directly determine such basic properties of folding dynamics as the configurations of the minima and transition states (TS), the formation of secondary structure and hydrophobic core during the folding process, the value of the pre-exponential factor and its relation to the transition path times, the relation between the autocorrelation times in TS and minima. We also present an investigation of the accuracy of the pre-exponential factor estimation based on the transition-path times. Four different estimations of the pre-exponential factor for both proteins give k0^-1 values of approximately a few tens of nanoseconds. Our analysis gives detailed information about folding of the proteins and can serve as a rigorous common language for extensive comparison between experiment and simulation. PMID:24348206

  2. High-Resolution Free-Energy Landscape Analysis of α-Helical Protein Folding: HP35 and Its Double Mutant.

    PubMed

    Banushkina, Polina V; Krivov, Sergei V

    2013-12-10

    The free-energy landscape can provide a quantitative description of folding dynamics, if determined as a function of an optimally chosen reaction coordinate. Here, we construct the optimal coordinate and the associated free-energy profile for all-helical proteins HP35 and its norleucine (Nle/Nle) double mutant, based on realistic equilibrium folding simulations [Piana et al. Proc. Natl. Acad. Sci. U.S.A. 2012, 109, 17845]. From the obtained profiles, we directly determine such basic properties of folding dynamics as the configurations of the minima and transition states (TS), the formation of secondary structure and hydrophobic core during the folding process, the value of the pre-exponential factor and its relation to the transition path times, the relation between the autocorrelation times in TS and minima. We also present an investigation of the accuracy of the pre-exponential factor estimation based on the transition-path times. Four different estimations of the pre-exponential factor for both proteins give k0^-1 values of approximately a few tens of nanoseconds. Our analysis gives detailed information about folding of the proteins and can serve as a rigorous common language for extensive comparison between experiment and simulation.

  3. Near Identifiability of Dynamical Systems

    NASA Technical Reports Server (NTRS)

    Hadaegh, F. Y.; Bekey, G. A.

    1987-01-01

    Concepts regarding approximate mathematical models treated rigorously. Paper presents new results in analysis of structural identifiability, equivalence, and near equivalence between mathematical models and physical processes they represent. Helps establish rigorous mathematical basis for concepts related to structural identifiability and equivalence revealing fundamental requirements, tacit assumptions, and sources of error. "Structural identifiability," as used by workers in this field, loosely translates as meaning ability to specify unique mathematical model and set of model parameters that accurately predict behavior of corresponding physical system.

  4. Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.

    PubMed Central

    Mulvany, M J

    1975-01-01

    1. Methods have been developed for describing the length:tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2 lo was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistance and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023

  5. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have occurred in the process industries. Owing to its computational complexity, very few software packages, such as SAFETI, can make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and concentration fluctuations, which is quite different from the real situations of a chemical process plant. All these models usually over-predict the hazardous regions in order to maintain their conservativeness, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as the computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot resolve the complexity of risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed via the combination of results of a series of CFD simulations with some post-processing procedures to obtain the 3D individual risk iso-surfaces. It is believed that such a technique will not only be limited to risk analysis at ground level, but also be extended into aerial, submarine, or space risk analyses in the near future.

  6. Evaluating oversight systems for emerging technologies: a case study of genetically engineered organisms.

    PubMed

    Kuzma, Jennifer; Najmaie, Pouya; Larson, Joel

    2009-01-01

    The U.S. oversight system for genetically engineered organisms (GEOs) was evaluated to develop hypotheses and derive lessons for oversight of other emerging technologies, such as nanotechnology. Evaluation was based upon quantitative expert elicitation, semi-standardized interviews, and historical literature analysis. Through an interdisciplinary policy analysis approach, blending legal, ethical, risk analysis, and policy sciences viewpoints, criteria were used to identify strengths and weaknesses of GEOs oversight and explore correlations among its attributes and outcomes. From the three sources of data, hypotheses and broader conclusions for oversight were developed. Our analysis suggests several lessons for oversight of emerging technologies: the importance of reducing complexity and uncertainty in oversight for minimizing financial burdens on small product developers; consolidating multi-agency jurisdictions to avoid gaps and redundancies in safety reviews; consumer benefits for advancing acceptance of GEO products; rigorous and independent pre- and post-market assessment for environmental safety; early public input and transparency for ensuring public confidence; and the positive role of public input in system development, informed consent, capacity, compliance, incentives, and data requirements and stringency in promoting health and environmental safety outcomes, as well as the equitable distribution of health impacts. Our integrated approach is instructive for more comprehensive analyses of oversight systems, developing hypotheses for how features of oversight systems affect outcomes, and formulating policy options for oversight of future technological products, especially nanotechnology products.

  7. Modeling of 2D diffusion processes based on microscopy data: parameter estimation and practical identifiability analysis.

    PubMed

    Hock, Sabrina; Hasenauer, Jan; Theis, Fabian J

    2013-01-01

    Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. The spatial gradients caused by diffusion can now be assessed in vitro and in vivo using microscopy-based imaging techniques. The resulting time series of two-dimensional, high-resolution images in combination with mechanistic models enable the quantitative analysis of the underlying mechanisms. However, such a model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties of the model parameters. We introduce a likelihood function for image-based measurements with log-normal distributed noise. Based upon this likelihood function we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. As proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example of haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for the estimation of model parameters from image data, as well as the proposed identifiability analysis approach, is widely applicable to diffusion processes. The profile likelihood based method provides more rigorous uncertainty bounds in contrast to local approximation methods.
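
    The profile-likelihood idea can be illustrated on a much smaller problem than the PDE-constrained one in the paper: fix the parameter of interest, maximize the log-normal likelihood over the nuisance parameters, and trace the resulting curve to read off a confidence interval. The toy 1-D diffusion sketch below uses a hypothetical model and synthetic data purely to show the mechanics.

      import numpy as np

      # Toy 1-D analogue: a point source spreading by diffusion, observed with
      # multiplicative (log-normal) noise.  Profile likelihood for the diffusion
      # coefficient D, maximizing over amplitude and noise level at each fixed D.
      def model(x, t, D):
          return np.exp(-x**2 / (4 * D * t)) / np.sqrt(4 * np.pi * D * t)

      rng = np.random.default_rng(0)
      x = np.linspace(-3, 3, 40)
      t_obs, D_true, A_true, sigma = 1.0, 0.5, 2.0, 0.2
      y = A_true * model(x, t_obs, D_true) * np.exp(sigma * rng.normal(size=x.size))

      def profile_loglik(D):
          r = np.log(y) - np.log(model(x, t_obs, D))
          log_A = r.mean()                        # analytic optimum of the amplitude
          s2 = np.mean((r - log_A) ** 2)          # analytic optimum of the variance
          # log-likelihood up to a constant that does not depend on D
          return -0.5 * x.size * (np.log(2 * np.pi * s2) + 1.0)

      D_grid = np.linspace(0.2, 1.2, 201)
      ll = np.array([profile_loglik(D) for D in D_grid])
      D_hat = D_grid[ll.argmax()]
      # Approximate 95% profile-likelihood interval: drop of 1.92 log-likelihood units
      inside = D_grid[ll > ll.max() - 1.92]
      print(f"D_hat = {D_hat:.3f}, 95% interval ~ [{inside.min():.3f}, {inside.max():.3f}]")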

  8. Trans-dimensional and hierarchical Bayesian approaches toward rigorous estimation of seismic sources and structures in the Northeast Asia

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean

    2016-04-01

    A framework is presented within which we provide rigorous estimates of seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimations of models and their uncertainties based on data information. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in the Bayesian inversions. Hence reliable estimation of model parameters and their uncertainties is possible, thus avoiding arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For the source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data of the North Korean nuclear explosion tests. By the combination of new Bayesian techniques and the structural model, coupled with meaningful uncertainties related to each of the processes, more quantitative monitoring and discrimination of seismic events is possible.

  9. Interventions to prevent youth violence in Latin America: a systematic review.

    PubMed

    Atienzo, Erika E; Baxter, Susan K; Kaltenthaler, Eva

    2017-01-01

    This review aims to summarise evidence on the effectiveness of interventions to prevent youth violence in Latin America. A systematic search of 13 academic databases was conducted to locate studies evaluating a primary or secondary prevention intervention in Latin America. Studies could use any type of quantitative design to assess outcomes related to youth violence. A search of websites, references and citation searching was also carried out. The quality of each study was assessed. Nine studies were identified. Most documented positive effects of the interventions on the perception of youth violence present in the community/school. Evidence was found of a reduction in homicides and juvenile crimes in three studies, two of which evaluated a community-based intervention. There were mixed results for self-reported participation in violent acts. The majority of the studies lacked a rigorous design. Most of the interventions had some promising results, including the reduction of homicides within communities. Community-based programmes were the most consistent in terms of effectiveness in preventing violence. However, the evidence for Latin America is still scarce and relies on non-rigorously designed studies.

  10. Survivorship care plan preferences of cancer survivors and health care providers: a systematic review and quality appraisal of the evidence.

    PubMed

    Klemanski, Dori L; Browning, Kristine K; Kue, Jennifer

    2016-02-01

    The purpose of this systematic review was to describe and examine the current use of treatment summaries and survivorship care plans (TSs/SCPs) for cancer survivors, as well as to summarize and critically assess relevant literature regarding their preferences and usefulness. There is a knowledge gap regarding the preferences of stakeholders as to what is useful on a treatment summary or survivorship care plan. A systematic review of eligible manuscripts was conducted using preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines. Relevant studies were identified via PubMed, CINAHL Plus, and the Cochrane Library from 2005 through 2013. Eligible studies were critically appraised with qualitative and quantitative appraisal tools. There were 29 studies included in this review; 19 were quantitative. Survivors and primary care physicians preferred a printable format delivered 0 to 6 months posttreatment and highlighting signs and symptoms of recurrence, late and long-term effects, and recommendations for healthy living. Oncology providers supported the concept of treatment summaries and survivorship care plans but reported significant barriers to their provision. No studies incorporated caregiver perspectives on treatment summaries and survivorship care plans. This systematic review did not reveal conclusive evidence about the needs of survivors or providers regarding treatment summaries and survivorship care plans. A lack of rigorous studies contributed to this. Treatment summaries and survivorship care plans are useful for cancer survivors; however, future rigorous studies should be conducted to identify and prioritize the preferences of survivors regarding these documents.

  11. In situ quantitative characterisation of the ocean water column using acoustic multibeam backscatter data

    NASA Astrophysics Data System (ADS)

    Lamarche, G.; Le Gonidec, Y.; Lucieer, V.; Lurton, X.; Greinert, J.; Dupré, S.; Nau, A.; Heffron, E.; Roche, M.; Ladroit, Y.; Urban, P.

    2017-12-01

    Detecting liquid, solid or gaseous features in the ocean is generating considerable interest in the geoscience community, because of their potentially high economic values (oil & gas, mining), their significance for environmental management (oil/gas leakage, biodiversity mapping, greenhouse gas monitoring) as well as their potential cultural and traditional values (food, freshwater). Enhancing people's capability to quantify and manage the natural capital present in the ocean water goes hand in hand with the development of marine acoustic technology, as marine echosounders provide the most reliable and technologically advanced means to develop quantitative studies of water column backscatter data. This capability has not been developed to its full potential because of (i) the complexity of the physics involved in relation to the constantly changing marine environment, and (ii) the rapid technological evolution of high-resolution multibeam echosounder (MBES) water-column imaging systems. The Water Column Imaging Working Group is working on a series of multibeam echosounder (MBES) water column datasets acquired in a variety of environments, using a range of frequencies, and imaging a number of water-column features such as gas seeps, oil leaks, suspended particulate matter, vegetation and freshwater springs. Access to data from different acoustic frequencies and ocean dynamics enables us to discuss and test multifrequency approaches, which are the most promising means to develop a quantitative analysis of the physical properties of acoustic scatterers, providing rigorous cross calibration of the acoustic devices. In addition, high redundancy of multibeam data, such as is available for some datasets, will allow us to develop data processing techniques, leading to quantitative estimates of water column gas seeps. Each of the datasets has supporting ground-truthing data (underwater videos and photos, physical oceanography measurements) which provide information on the origin and chemistry of the seep content. This is of primary importance when assessing the physical properties of water column scatterers from backscatter acoustic measurement.

  12. Patient and Healthcare Provider Barriers to Hypertension Awareness, Treatment and Follow Up: A Systematic Review and Meta-Analysis of Qualitative and Quantitative Studies

    PubMed Central

    Khatib, Rasha; Schwalm, Jon-David; Yusuf, Salim; Haynes, R. Brian; McKee, Martin; Khan, Maheer; Nieuwlaat, Robby

    2014-01-01

    Background Although the importance of detecting, treating, and controlling hypertension has been recognized for decades, the majority of patients with hypertension remain uncontrolled. The path from evidence to practice contains many potential barriers, but their role has not been reviewed systematically. This review aimed to synthesize and identify important barriers to hypertension control as reported by patients and healthcare providers. Methods Electronic databases MEDLINE, EMBASE and Global Health were searched systematically up to February 2013. Two reviewers independently selected eligible studies. Two reviewers categorized barriers based on a theoretical framework of behavior change. The theoretical framework suggests that a change in behavior requires a strong commitment to change [intention], the necessary skills and abilities to adopt the behavior [capability], and an absence of health system and support constraints. Findings Twenty-five qualitative studies and 44 quantitative studies met the inclusion criteria. In qualitative studies, health system barriers were most commonly discussed in studies of patients and health care providers. Quantitative studies identified disagreement with clinical recommendations as the most common barrier among health care providers. Quantitative studies of patients yielded different results: lack of knowledge was the most common barrier to hypertension awareness. Stress, anxiety and depression were most commonly reported as barriers that hindered or delayed adoption of a healthier lifestyle. In terms of hypertension treatment adherence, patients mostly reported forgetting to take their medication. Finally, priority setting barriers were most commonly reported by patients in terms of following up with their health care providers. Conclusions This review identified a wide range of barriers facing patients and health care providers pursuing hypertension control, indicating the need for targeted multi-faceted interventions. More methodologically rigorous studies that encompass the range of barriers and that include low- and middle-income countries are required in order to inform policies to improve hypertension control. PMID:24454721

  13. Rigorous quantitative elemental microanalysis by scanning electron microscopy/energy dispersive x-ray spectrometry (SEM/EDS) with spectrum processing by NIST DTSA-II

    NASA Astrophysics Data System (ADS)

    Newbury, Dale E.; Ritchie, Nicholas W. M.

    2014-09-01

    Quantitative electron-excited x-ray microanalysis by scanning electron microscopy/silicon drift detector energy dispersive x-ray spectrometry (SEM/SDD-EDS) is capable of achieving high accuracy and high precision equivalent to that of the high spectral resolution wavelength dispersive x-ray spectrometer even when severe peak interference occurs. The throughput of the SDD-EDS enables high count spectra to be measured that are stable in calibration and resolution (peak shape) across the full deadtime range. With this high spectral stability, multiple linear least squares peak fitting is successful for separating overlapping peaks and spectral background. Careful specimen preparation is necessary to remove topography on unknowns and standards. The standards-based matrix correction procedure embedded in the NIST DTSA-II software engine returns quantitative results supported by a complete error budget, including estimates of the uncertainties from measurement statistics and from the physical basis of the matrix corrections. NIST DTSA-II is available free for Java platforms at http://www.cstl.nist.gov/div837/837.02/epq/dtsa2/index.html.
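    The peak-fitting step described above can be illustrated with a short, hedged sketch: the snippet below is not the DTSA-II algorithm, but a minimal ordinary least-squares fit of an unknown spectrum to hypothetical reference peak shapes plus a background term (real EDS fitting additionally weights channels by counting statistics).

```python
import numpy as np

def mlls_fit(unknown, references, background):
    """Fit an unknown EDS spectrum as a linear combination of measured
    reference peak shapes plus a background shape (ordinary least squares).

    unknown    : (n_channels,) counts in the unknown spectrum
    references : (n_refs, n_channels) reference peak-shape spectra
    background : (n_channels,) modeled bremsstrahlung background
    Returns the fitted coefficients (one per reference, plus background).
    """
    design = np.vstack([references, background]).T        # (n_channels, n_refs + 1)
    coeffs, *_ = np.linalg.lstsq(design, unknown, rcond=None)
    return coeffs

# Synthetic illustration: two overlapping Gaussian "peaks" plus a flat background.
channels = np.arange(1024)
peak = lambda c: np.exp(-0.5 * ((channels - c) / 12.0) ** 2)
refs = np.array([peak(300.0), peak(320.0)])               # deliberately overlapping
bg = np.ones_like(channels, dtype=float)
truth = 5.0 * refs[0] + 2.0 * refs[1] + 0.1 * bg
noisy = np.random.default_rng(0).poisson(truth * 100) / 100.0
print(mlls_fit(noisy, refs, bg))                          # ~ [5.0, 2.0, 0.1]
```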

  14. Methods Beyond Methods: A Model for Africana Graduate Methods Training.

    PubMed

    Best, Latrica E; Byrd, W Carson

    2014-06-01

    A holistic graduate education can impart not just tools and knowledge, but critical positioning to fulfill many of the original missions of Africana Studies programs set forth in the 1960s and 1970s. As an interdisciplinary field with many approaches to examining the African Diaspora, the methodological training of graduate students can vary across graduate programs. Although qualitative methods courses are often required of graduate students in Africana Studies programs, and these programs offer such courses, rarely if ever are students required to take quantitative methods courses, let alone offered such courses in-house. These courses can offer Africana Studies graduate students new tools for their own research and, more importantly, improve their knowledge of quantitative research on diasporic communities. These tools and knowledge can assist with identifying flawed arguments about African-descended communities and their members. This article explores the importance of requiring and offering critical quantitative methods courses in graduate programs in Africana Studies, and discusses the methods requirements of one graduate program in the field as an example of more rigorous training that other programs could offer graduate students.

  15. Development of a comprehensive analytical platform for the detection and quantitation of food fraud using a biomarker approach. The oregano adulteration case study.

    PubMed

    Wielogorska, Ewa; Chevallier, Olivier; Black, Connor; Galvin-King, Pamela; Delêtre, Marc; Kelleher, Colin T; Haughey, Simon A; Elliott, Christopher T

    2018-01-15

    Due to the increasing number of food fraud incidents, there is an inherent need for the development and implementation of analytical platforms enabling the detection and quantitation of adulteration. In this study, a set of unique biomarkers of commonly found oregano adulterants became the targets in the development of an LC-MS/MS method which underwent rigorous in-house validation. The method presented very high selectivity and specificity, excellent linearity (R² > 0.988), low decision limits and detection capabilities (<2%), acceptable accuracy (intra-assay 92-113%, inter-assay 69-138%) and precision (CV < 20%). The method was compared with an established FTIR screening assay and revealed a good correlation of qualitative and quantitative results (R² > 0.81). An assessment of 54 suspected adulterated oregano samples revealed that almost 90% of them contained at least one bulking agent, with a median level of adulteration of 50%. Such innovative methodologies need to be established as routine testing procedures to detect and ultimately deter food fraud. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Quantifying trends in disease impact to produce a consistent and reproducible definition of an emerging infectious disease.

    PubMed

    Funk, Sebastian; Bogich, Tiffany L; Jones, Kate E; Kilpatrick, A Marm; Daszak, Peter

    2013-01-01

    The proper allocation of public health resources for research and control requires quantification of both a disease's current burden and the trend in its impact. Infectious diseases that have been labeled as "emerging infectious diseases" (EIDs) have received heightened scientific and public attention and resources. However, the label 'emerging' is rarely backed by quantitative analysis and is often used subjectively. This can lead to over-allocation of resources to diseases that are incorrectly labelled "emerging," and insufficient allocation of resources to diseases for which evidence of an increasing or high sustained impact is strong. We suggest a simple quantitative approach, segmented regression, to characterize the trends and emergence of diseases. Segmented regression identifies one or more trends in a time series and determines the most statistically parsimonious split(s) (or joinpoints) in the time series. These joinpoints in the time series indicate time points when a change in trend occurred and may identify periods in which drivers of disease impact change. We illustrate the method by analyzing temporal patterns in incidence data for twelve diseases. This approach provides a way to classify a disease as currently emerging, re-emerging, receding, or stable based on temporal trends, as well as to pinpoint the time when the change in these trends happened. We argue that quantitative approaches to defining emergence based on the trend in impact of a disease can, with appropriate context, be used to prioritize resources for research and control. Implementing this more rigorous definition of an EID will require buy-in and enforcement from scientists, policy makers, peer reviewers and journal editors, but has the potential to improve resource allocation for global health.
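    As an illustration of the segmented-regression idea (not the joinpoint software or the analysis used in the paper), the sketch below brute-forces a single breakpoint in a synthetic incidence series by fitting a continuous two-segment line at each candidate year; production joinpoint analyses also select the number of breakpoints with a formal model-selection criterion.

```python
import numpy as np

def one_joinpoint_fit(years, incidence):
    """Brute-force search for a single joinpoint: fit two connected line
    segments at each candidate break year and keep the break minimising the
    residual sum of squares. Returns (break_year, sse_segmented, sse_linear)."""
    years = np.asarray(years, dtype=float)
    y = np.asarray(incidence, dtype=float)
    sse_linear = np.sum((y - np.poly1d(np.polyfit(years, y, 1))(years)) ** 2)

    best = (None, np.inf)
    for i in range(2, len(years) - 2):               # keep >= 2 points per segment
        t = years[i]
        # Continuous piecewise-linear basis: intercept, slope, slope change after t
        X = np.column_stack([np.ones_like(years), years, np.maximum(years - t, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ beta) ** 2)
        if sse < best[1]:
            best = (t, sse)
    return best[0], best[1], sse_linear

# Toy series: stable incidence for a decade, then an emerging trend.
yrs = np.arange(1990, 2010)
cases = np.where(yrs < 2000, 10.0, 10.0 + 3.0 * (yrs - 2000)) \
        + np.random.default_rng(1).normal(0, 1, yrs.size)
print(one_joinpoint_fit(yrs, cases))                 # breakpoint found near 2000
```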

  17. Selection of Valid Reference Genes for Reverse Transcription Quantitative PCR Analysis in Heliconius numata (Lepidoptera: Nymphalidae)

    PubMed Central

    Chouteau, Mathieu; Whibley, Annabel; Joron, Mathieu; Llaurens, Violaine

    2016-01-01

    Identifying the genetic basis of adaptive variation is challenging in non-model organisms, and quantitative real-time PCR is a useful tool for validating predictions regarding the expression of candidate genes. However, comparing expression levels in different conditions requires rigorous experimental design and statistical analyses. Here, we focused on the neotropical passion-vine butterflies Heliconius, non-model species studied in evolutionary biology for their adaptive variation in wing color patterns involved in mimicry and in the signaling of their toxicity to predators. We aimed at selecting stable reference genes to be used for normalization of gene expression data in RT-qPCR analyses from developing wing discs according to the minimal guidelines described in the Minimum Information for publication of Quantitative Real-Time PCR Experiments (MIQE). To design internal RT-qPCR controls, we studied the stability of expression of nine candidate reference genes (actin, annexin, eF1α, FK506BP, PolyABP, PolyUBQ, RpL3, RPS3A, and tubulin) at two developmental stages (prepupal and pupal) using three widely used programs (GeNorm, NormFinder and BestKeeper). Results showed that, despite differences in statistical methods, the genes RpL3, eF1α, polyABP, and annexin were stably expressed in wing discs in late larval and pupal stages of Heliconius numata. This combination of genes may be used as a reference for a reliable study of differential expression in wings, for instance for genes involved in important phenotypic variation such as wing color pattern variation. Through this example, we provide generally useful technical recommendations as well as relevant statistical strategies for evolutionary biologists aiming to identify candidate genes involved in adaptive variation in non-model organisms. PMID:27271971
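    For illustration only, the following sketch computes a simplified geNorm-style stability measure M from hypothetical Cq values (it is not the GeNorm, NormFinder, or BestKeeper software, and it assumes equal amplification efficiencies so that expression is proportional to 2^-Cq); lower M indicates a more stably expressed candidate reference gene.

```python
import numpy as np

def genorm_m_values(cq, gene_names):
    """Simplified geNorm-style stability measure M for candidate reference genes.

    cq : (n_samples, n_genes) array of quantification-cycle (Cq) values.
    With equal efficiencies, the log2 expression ratio of genes j and k in a
    sample is Cq[:, k] - Cq[:, j]. M_j is the mean, over all other genes k,
    of the standard deviation of that ratio across samples.
    """
    cq = np.asarray(cq, dtype=float)
    n_genes = cq.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        sds = [np.std(cq[:, k] - cq[:, j], ddof=1) for k in range(n_genes) if k != j]
        m[j] = np.mean(sds)
    return sorted(zip(gene_names, m), key=lambda g: g[1])   # most stable first

# Hypothetical Cq values for 6 wing-disc samples x 4 candidate genes.
rng = np.random.default_rng(2)
cq_demo = 20 + rng.normal(0, [0.2, 0.3, 0.8, 0.25], size=(6, 4))
print(genorm_m_values(cq_demo, ["RpL3", "eF1a", "actin", "annexin"]))
```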

  18. Investigation of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian A.

    2005-01-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical model. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. Excellent agreement is achieved between the predicted and measured results, thereby quantitatively validating the numerical tool.

  19. Floquet spectrum and driven conductance in Dirac materials: Effects of Landau-Zener-Stückelberg-Majorana interferometry

    NASA Astrophysics Data System (ADS)

    Rodionov, Ya. I.; Kugel, K. I.; Nori, Franco

    2016-11-01

    Using the Landau-Zener-Stückelberg-Majorana-type (LZSM) semiclassical approach, we study both graphene and a thin film of a Weyl semimetal subjected to a strong ac electromagnetic field. The spectrum of quasienergies in the Weyl semimetal turns out to be similar to that of a graphene sheet. It has been predicted qualitatively that the transport properties of strongly irradiated graphene oscillate as a function of the radiation intensity [S. V. Syzranov et al., Phys. Rev. B 88, 241112 (2013)], 10.1103/PhysRevB.88.241112. Here we obtain rigorous quantitative results for a driven linear conductance of graphene and a thin film of a Weyl semimetal. The exact quantitative structure of oscillations exhibits two contributions. The first one is a manifestation of the Ramsauer-Townsend effect, while the second contribution is a consequence of the LZSM interference defining the spectrum of quasienergies.

  20. Enhancing managerial effectiveness in dietetics.

    PubMed

    Hoover, L W

    1983-01-01

    Environmental pressures from such sources as economic conditions, the government, third-party payers, and inter-institutional competition create managerial challenges. Although cost-containment has received considerable attention, long-term cost-effectiveness is probably the significant issue. Dietitians must become more cost-conscious and effective in resource management to attain desired performance outcomes. Some of the skills and characteristics essential to managerial effectiveness are a marketing orientation, systems design skill, quantitative operations management techniques, financial expertise, and leadership. These abilities facilitate decision-making and achievement of long-term cost-effectiveness. Curriculum enhancement and continuing education are two strategies for improving managerial competency in the dietetics profession. In dietetics education, study of management topics should be enhanced to provide more advanced coverage of management theories and quantitative models so that managerial performance can be at a higher level of sophistication and competency. To assure the viability of the dietetics profession, the emphasis on management must be more comprehensive and rigorous.

  1. Simultaneous determination of effective carrier lifetime and resistivity of Si wafers using the nonlinear nature of photocarrier radiometric signals

    NASA Astrophysics Data System (ADS)

    Sun, Qiming; Melnikov, Alexander; Wang, Jing; Mandelis, Andreas

    2018-04-01

    A rigorous treatment of the nonlinear behavior of photocarrier radiometric (PCR) signals is presented theoretically and experimentally for the quantitative characterization of semiconductor photocarrier recombination and transport properties. A frequency-domain model based on the carrier rate equation and the classical carrier radiative recombination theory was developed. The derived concise expression reveals different functionalities of the PCR amplitude and phase channels: the phase bears a direct quantitative correlation with the carrier effective lifetime, while the dependence of the amplitude on the estimated photocarrier density can be used to extract the equilibrium majority carrier density and thus the resistivity. An experimental ‘ripple’ optical excitation mode (small modulation depth compared to the dc level) was introduced to bypass the complicated ‘modulated lifetime’ problem so as to simplify theoretical interpretation and guarantee measurement self-consistency and reliability. Two Si wafers with known resistivity values were tested to validate the method.
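    As a loosely related illustration (not the paper's nonlinear model), the textbook single-pole frequency response S ∝ 1/(1 + iωτ) has phase -arctan(ωτ), so an effective lifetime can be read off a measured phase at a known modulation frequency; the sketch below uses a hypothetical phase reading.

```python
import numpy as np

def lifetime_from_phase(phase_rad, modulation_freq_hz):
    """Invert the textbook single-pole response S ~ 1/(1 + i*omega*tau), whose
    phase is -arctan(omega*tau), to obtain an effective carrier lifetime."""
    omega = 2.0 * np.pi * modulation_freq_hz
    return np.tan(-np.asarray(phase_rad)) / omega

# Hypothetical reading: a -35 degree PCR phase at a 10 kHz modulation frequency.
tau = lifetime_from_phase(np.deg2rad(-35.0), 1.0e4)
print(f"effective lifetime ~ {tau * 1e6:.1f} microseconds")
```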

  2. Methodological quality and scientific impact of quantitative nursing education research over 18 months.

    PubMed

    Yucha, Carolyn B; Schneider, Barbara St Pierre; Smyer, Tish; Kowalski, Susan; Stowers, Eva

    2011-01-01

    The methodological quality of nursing education research has not been rigorously studied. The purpose of this study was to evaluate the methodological quality and scientific impact of nursing education research reports. The methodological quality of 133 quantitative nursing education research articles published between July 2006 and December 2007 was evaluated using the Medical Education Research Study Quality Instrument (MERSQI). The mean (+/- SD) MERSQI score was 9.8 +/- 2.2. It correlated (p < .05) with several scientific impact indicators: citation counts from Scopus (r = .223), Google Scholar (r = .224), and journal impact factor (r = .216); it was not associated with Web of Science citation count, funding, or h-index. The similarities between this study's MERSQI ratings for nursing literature and those reported for the medical literature, coupled with the association with citation counts, suggest that the MERSQI is an appropriate instrument to evaluate the quality of nursing education research.

  3. Interventions to Reduce Perceived Stress Among Graduate Students: A Systematic Review With Implications for Evidence-Based Practice.

    PubMed

    Stillwell, Susan B; Vermeesch, Amber L; Scott, Jane G

    2017-12-01

    Stress is a part of daily life for graduate students, including graduate nursing students. Contemporary graduate nursing students are facing unprecedented challenges to meet rigorous academic standards as they prepare for their advanced professional role to meet the demands of the nation's complex and ever-changing healthcare system. Empowering graduate nursing students to ease their perceived stress and minimize undesirable health effects may benefit their capacity to adapt and successfully manage perceived stress in their future healthcare role. To conduct a systematic review to evaluate the existing evidence with the aim of identifying evidence-based self-care interventions for coping with perceived stress. We conducted a systematic review, searching CINAHL Plus with Full Text, PsycINFO, and MEDLINE. Inclusion criteria included self-care, graduate students, perceived stress as measured by Perceived Stress Scale, quantitative analysis, conducted within the United States, English language, and peer reviewed. Two authors completed an asynchronous review of the articles, and one expert evidence-based practice mentor and one wellness expert conducted rigorous appraisal of the eight identified studies. Evidence was evaluated and synthesized, and recommendations for practice were determined. Eight studies meeting the criteria for this systematic review were critically appraised. The interventions varied from a stress management course to mind-body-stress-reduction (MBSR) techniques, such as yoga, breath work, meditation, and mindfulness. All studies measured the outcome of stress with the Perceived Stress Scale. Each study demonstrated a reduction in perceived stress postintervention. Most effective self-care MBSR interventions include (a) a didactic component, (b) a guided MBSR practice session, and (c) homework. Consideration should be given to a trained or certified MBSR instructor to teach the intervention. © 2017 Sigma Theta Tau International.

  4. Evidence-Based Practice Point-of-Care Resources: A Quantitative Evaluation of Quality, Rigor, and Content.

    PubMed

    Campbell, Jared M; Umapathysivam, Kandiah; Xue, Yifan; Lockwood, Craig

    2015-12-01

    Clinicians and other healthcare professionals need access to summaries of evidence-based information in order to provide effective care to their patients at the point-of-care. Evidence-based practice (EBP) point-of-care resources have been developed and are available online to meet this need. This study aimed to develop a comprehensive list of available EBP point-of-care resources and evaluate their processes and policies for the development of content, in order to provide a critical analysis based upon rigor, transparency and measures of editorial quality to inform healthcare providers and promote quality improvement amongst publishers of EBP resources. A comprehensive and systematic search (Pubmed, CINAHL, and Cochrane Central) was undertaken to identify available EBP point-of-care resources, defined as "web-based medical compendia specifically designed to deliver predigested, rapidly accessible, comprehensive, periodically updated, and evidence-based information (and possibly also guidance) to clinicians." A pair of investigators independently extracted information on general characteristics, content presentation, editorial quality, evidence-based methodology, and breadth and volume. Twenty-seven summary resources were identified, of which 22 met the predefined inclusion criteria for EBP point-of-care resources, and 20 could be accessed for description and assessment. Overall, the upper quartile of EBP point-of-care providers was assessed to be UpToDate, Nursing Reference Centre, Mosby's Nursing Consult, BMJ Best Practice, and JBI COnNECT+. The choice of which EBP point-of-care resources are suitable for an organization is a decision that depends heavily on the unique requirements of that organization and the resources it has available. However, the results presented in this study should enable healthcare providers to make that assessment in a clear, evidence-based manner, and provide a comprehensive list of the available options. © 2015 Sigma Theta Tau International.

  5. Refining Estimates of Bird Collision and Electrocution Mortality at Power Lines in the United States

    PubMed Central

    Loss, Scott R.; Will, Tom; Marra, Peter P.

    2014-01-01

    Collisions and electrocutions at power lines are thought to kill large numbers of birds in the United States annually. However, existing estimates of mortality are either speculative (for electrocution) or based on extrapolation of results from one study to all U.S. power lines (for collision). Because national-scale estimates of mortality and comparisons among threats are likely to be used for prioritizing policy and management strategies and for identifying major research needs, these estimates should be based on systematic and transparent assessment of rigorously collected data. We conducted a quantitative review that incorporated data from 14 studies meeting our inclusion criteria to estimate that between 12 and 64 million birds are killed each year at U.S. power lines, with between 8 and 57 million birds killed by collision and between 0.9 and 11.6 million birds killed by electrocution. Sensitivity analyses indicate that the majority of uncertainty in our estimates arises from variation in mortality rates across studies; this variation is due in part to the small sample of rigorously conducted studies that can be used to estimate mortality. Little information is available to quantify species-specific vulnerability to mortality at power lines; the available literature over-represents particular bird groups and habitats, and most studies only sample and present data for one or a few species. Furthermore, additional research is needed to clarify whether, to what degree, and in what regions populations of different bird species are affected by power line-related mortality. Nonetheless, our data-driven analysis suggests that the amount of bird mortality at U.S. power lines is substantial and that conservation management and policy is necessary to reduce this mortality. PMID:24991997
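    A hedged sketch of the kind of uncertainty propagation such an estimate requires (not the authors' actual analysis, and using made-up per-km rates and line length) is shown below: study-level mortality rates are resampled and scaled to a national exposure to yield a central estimate with an interval.

```python
import numpy as np

def mortality_interval(per_km_rates, km_of_line, n_draws=100_000, seed=0):
    """Monte Carlo sketch of a national mortality estimate: resample study-level
    per-km mortality rates (with replacement), average them, and multiply by the
    total line length, returning a median and a 95% uncertainty interval."""
    rng = np.random.default_rng(seed)
    rates = np.asarray(per_km_rates, dtype=float)
    draws = rng.choice(rates, size=(n_draws, rates.size), replace=True).mean(axis=1)
    totals = draws * km_of_line
    return np.median(totals), np.percentile(totals, [2.5, 97.5])

# Hypothetical per-km collision rates from a handful of studies (birds/km/year)
# and a hypothetical total line length; not the values used in the paper.
print(mortality_interval([2.1, 15.0, 7.5, 30.2, 4.4, 11.8], km_of_line=80_000))
```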

  6. Feasibility, appropriateness, meaningfulness and effectiveness of patient participation at bedside shift reporting: mixed-method research protocol.

    PubMed

    Malfait, Simon; Eeckloo, Kristof; Lust, Elisa; Van Biesen, Wim; Van Hecke, Ann

    2017-02-01

    To evaluate the feasibility, appropriateness, meaningfulness and effectiveness of bedside shift reporting in a minimum of five interventions and five control wards. Hospitals continually improve their quality of care. Next to improvements in clinical performance, more patient participation is stimulated through different methods. Methods to enhance patient participation such as bedside shift reporting lack rigorously performed research to determine their feasibility, appropriateness, meaningfulness and effectiveness. Small-scale research and a previous pilot study indicate that bedside shift reporting improves patient participation, nurse-nurse communication and nurse-patient communication. The development, implementation and evaluation of bedside shift report are based on the Medical Research Council framework for complex interventions in health care. A matched, controlled, mixed-method, longitudinal study design will be used. The Feasibility-Appropriateness-Meaningfulness-Effectiveness framework will be applied for the quantitative and qualitative evaluation of bedside shift report. A tailored intervention and implementation process for bedside shift report will be developed using diagnostic interviews, co-design and acceptability testing. The intervention will be evaluated before implementation and three times after implementation. Individual and focus group interviews will be performed. Questionnaires, observations and analysis of the medical records and administrative databases will be completed. This study was funded in October 2015. Research Ethics Committee approval was granted in March 2016. There is a pressing need for rigorous research into the effects of interventions for improving patient participation. This study addresses the significance of bedside shift report as an intervention to improve quality of care, communication and patient participation within a large-scale, matched, controlled research design. © 2016 John Wiley & Sons Ltd.

  7. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research

    PubMed Central

    2011-01-01

    Objective The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people’s social and cultural lives. Approach I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. Results I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. Conclusion When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Implication Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health. PMID:22168509

  8. A novel design process for selection of attributes for inclusion in discrete choice experiments: case study exploring variation in clinical decision-making about thrombolysis in the treatment of acute ischaemic stroke.

    PubMed

    De Brún, Aoife; Flynn, Darren; Ternent, Laura; Price, Christopher I; Rodgers, Helen; Ford, Gary A; Rudd, Matthew; Lancsar, Emily; Simpson, Stephen; Teah, John; Thomson, Richard G

    2018-06-22

    A discrete choice experiment (DCE) is a method used to elicit participants' preferences and the relative importance of different attributes and levels within a decision-making process. DCEs have become popular in healthcare; however, approaches to identifying the attributes/levels that influence a decision of interest, and methods for selecting those to include in a DCE, are under-reported. Our objectives were: to explore the development process used to select/present attributes/levels from the identified range that may be influential; to describe a systematic and rigorous development process for the design of a DCE in the context of thrombolytic therapy for acute stroke; and to discuss the advantages of our five-stage approach in enhancing current guidance for developing DCEs. A five-stage DCE development process was undertaken. Methods employed included literature review, qualitative analysis of interview and ethnographic data, expert panel discussions, a quantitative structured prioritisation (ranking) exercise and pilot testing of the DCE using a 'think aloud' approach. The five-stage process reported helped to reduce the list of 22 initial patient-related factors to a final set of nine variable factors and six fixed factors for inclusion in a testable DCE using a vignette model of presentation. In order for the data and conclusions generated by DCEs to be deemed valid, it is crucial that the methods of design and development are documented and reported. This paper has detailed a rigorous and systematic approach to DCE development which may be useful to researchers seeking to establish methods for reducing and prioritising attributes for inclusion in future DCEs.

  9. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was, however, not to happen. Early in the Apollo program, estimates of the probability for a successful roundtrip human mission to the moon yielded disappointingly low (and suspect) values and NASA became discouraged from further performing quantitative risk analyses until some two decades later when the methods were more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on the Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) methods for system safety assessment.

  10. On analyticity of linear waves scattered by a layered medium

    NASA Astrophysics Data System (ADS)

    Nicholls, David P.

    2017-10-01

    The scattering of linear waves by periodic structures is a crucial phenomenon in many branches of applied physics and engineering. In this paper we establish rigorous analytic results necessary for the proper numerical analysis of a class of High-Order Perturbation of Surfaces methods for simulating such waves. More specifically, we prove a theorem on the existence and uniqueness of solutions to a system of partial differential equations which model the interaction of linear waves with a multiply layered periodic structure in three dimensions. This result provides hypotheses under which a rigorous numerical analysis could be conducted for recent generalizations to the methods of Operator Expansions, Field Expansions, and Transformed Field Expansions.

  11. How measurement science can improve confidence in research results.

    PubMed

    Plant, Anne L; Becker, Chandler A; Hanisch, Robert J; Boisvert, Ronald F; Possolo, Antonio M; Elliott, John T

    2018-04-01

    The current push for rigor and reproducibility is driven by a desire for confidence in research results. Here, we suggest a framework for a systematic process, based on consensus principles of measurement science, to guide researchers and reviewers in assessing, documenting, and mitigating the sources of uncertainty in a study. All study results have associated ambiguities that are not always clarified by simply establishing reproducibility. By explicitly considering sources of uncertainty, noting aspects of the experimental system that are difficult to characterize quantitatively, and proposing alternative interpretations, the researcher provides information that enhances comparability and reproducibility.

  12. Chemical Characterization and Determination of the Anti-Oxidant Capacity of Two Brown Algae with Respect to Sampling Season and Morphological Structures Using Infrared Spectroscopy and Multivariate Analyses.

    PubMed

    Beratto, Angelo; Agurto, Cristian; Freer, Juanita; Peña-Farfal, Carlos; Troncoso, Nicolás; Agurto, Andrés; Castillo, Rosario Del P

    2017-10-01

    Brown algae biomass has been shown to be a highly important industrial source for the production of alginates and different nutraceutical products. The characterization of this biomass is necessary in order to allocate its use to specific applications according to the chemical and biological characteristics of this highly variable resource. The methods commonly used for algae characterization require long analysis times and rigorous sample pretreatments. In this work, nondestructive and fast analyses of different morphological structures from Lessonia spicata and Macrocystis pyrifera, which were collected during different seasons, were performed using Fourier transform infrared (FT-IR) techniques in combination with chemometric methods. Mid-infrared (IR) and near-infrared (NIR) spectral ranges were tested to evaluate the spectral differences between the species, seasons, and morphological structures of algae using a principal component analysis (PCA). Quantitative analyses of the polyphenol and alginate contents and the anti-oxidant capacity of the samples were performed using partial least squares (PLS) regression with both spectral ranges in order to build a predictive model for the rapid quantification of these parameters for industrial purposes. The PCA mainly showed differences in the samples based on seasonal sampling, where changes were observed in the bands corresponding to polysaccharides, proteins, and lipids. The obtained PLS models had high correlation coefficients (r) for the polyphenol content and anti-oxidant capacity (r > 0.9) and lower values for the alginate determination (0.7 < r < 0.8). Fourier transform infrared-based techniques were suitable tools for the rapid characterization of algae biomass, in which high variability in the samples was incorporated for the qualitative and quantitative analyses, and have the potential to be used on an industrial scale.
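    A minimal sketch of the PLS calibration step, using synthetic stand-in spectra and reference values rather than the study's data, might look like the following (the number of latent variables would normally be chosen by cross-validation):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical stand-in data: rows are FT-IR spectra (absorbance per wavenumber),
# y is the reference polyphenol content from wet-chemistry assays.
rng = np.random.default_rng(3)
X = rng.normal(size=(40, 600))                      # 40 algae samples x 600 wavenumbers
y = 2.0 * X[:, 100] - 1.0 * X[:, 350] + rng.normal(scale=0.1, size=40)

pls = PLSRegression(n_components=5)                 # latent variables fixed for brevity
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()   # 5-fold cross-validated predictions
r = np.corrcoef(y, y_cv)[0, 1]                      # correlation coefficient, as reported (r)
print(f"cross-validated r = {r:.3f}")
```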

  13. Digital morphogenesis via Schelling segregation

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2018-04-01

    Schelling’s model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent-based models studied by economists, and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis; prior results from the literature generally pertain to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
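    A minimal agent-based sketch of the unperturbed dynamics (illustrative only; the parameter choices and the segregation measure are ad hoc and not taken from the paper) is given below.

```python
import numpy as np

def schelling(n=50, frac_empty=0.1, intolerance=0.5, steps=50_000, seed=0):
    """Minimal Schelling simulation on a torus: a randomly chosen unhappy agent
    (fewer than `intolerance` of its occupied neighbours share its type) jumps
    to a random empty cell. Grid values: 0 = empty, 1 / 2 = the two agent types."""
    rng = np.random.default_rng(seed)
    p_type = (1.0 - frac_empty) / 2.0
    grid = rng.choice([0, 1, 2], size=(n, n), p=[frac_empty, p_type, p_type])
    for _ in range(steps):
        x, y = rng.integers(n, size=2)
        if grid[x, y] == 0:
            continue
        neighbours = [grid[(x + dx) % n, (y + dy) % n]
                      for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
        occupied = [v for v in neighbours if v != 0]
        if occupied and sum(v == grid[x, y] for v in occupied) / len(occupied) < intolerance:
            empties = np.argwhere(grid == 0)
            ex, ey = empties[rng.integers(len(empties))]
            grid[ex, ey], grid[x, y] = grid[x, y], 0      # move to the empty cell
    return grid

final = schelling()
# Crude segregation measure: fraction of occupied cells matching their right-hand neighbour.
left, right = final[:, :-1], final[:, 1:]
print("same-type neighbour fraction:", round(float(np.mean((left == right) & (left != 0))), 2))
```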

  14. Using mixed methods effectively in prevention science: designs, procedures, and examples.

    PubMed

    Zhang, Wanqing; Watanabe-Galloway, Shinobu

    2014-10-01

    There is growing interest in using a combination of quantitative and qualitative methods to generate evidence about the effectiveness of health prevention, services, and intervention programs. With the emerging importance of mixed methods research across the social and health sciences, there has been an increased recognition of the value of using mixed methods for addressing research questions in different disciplines. We illustrate the mixed methods approach in prevention research, showing design procedures used in several published research articles. In this paper, we focused on two commonly used mixed methods designs: concurrent and sequential mixed methods designs. We discuss the types of mixed methods designs, the reasons for, and advantages of using a particular type of design, and the procedures of qualitative and quantitative data collection and integration. The studies reviewed in this paper show that the essence of qualitative research is to explore complex dynamic phenomena in prevention science, and the advantage of using mixed methods is that quantitative data can yield generalizable results and qualitative data can provide extensive insights. However, the emphasis of methodological rigor in a mixed methods application also requires considerable expertise in both qualitative and quantitative methods. Besides the necessary skills and effective interdisciplinary collaboration, this combined approach also requires an open-mindedness and reflection from the involved researchers.

  15. IMPROVING ALTERNATIVES FOR ENVIRONMENTAL IMPACT ASSESSMENT. (R825758)

    EPA Science Inventory

    Environmental impact assessment (EIA), in the US, requires an objective and rigorous analysis of alternatives. Yet the choice of alternatives for that analysis can be subjective and arbitrary. Alternatives often reflect narrow project objectives, agency agendas, and predilecti...

  16. FORMAL SCENARIO DEVELOPMENT FOR ENVIRONMENTAL IMPACT ASSESSMENT STUDIES

    EPA Science Inventory

    Scenario analysis is a process of evaluating possible future events through the consideration of alternative plausible (though not equally likely) outcomes (scenarios). The analysis is designed to enable improved decision-making and assessment through a more rigorous evaluation o...

  17. Testing the Community-Based Learning Collaborative (CBLC) implementation model: a study protocol.

    PubMed

    Hanson, Rochelle F; Schoenwald, Sonja; Saunders, Benjamin E; Chapman, Jason; Palinkas, Lawrence A; Moreland, Angela D; Dopp, Alex

    2016-01-01

    High rates of youth exposure to violence, either through direct victimization or witnessing, result in significant health/mental health consequences and high associated lifetime costs. Evidence-based treatments (EBTs), such as Trauma-Focused Cognitive Behavioral Therapy (TF-CBT), can prevent and/or reduce these negative effects, yet these treatments are not standard practice for therapists working with children identified by child welfare or mental health systems as needing services. While research indicates that collaboration among child welfare and mental health services sectors improves availability and sustainment of EBTs for children, few implementation strategies designed specifically to promote and sustain inter-professional collaboration (IC) and inter-organizational relationships (IOR) have undergone empirical investigation. A potential candidate for evaluation is the Community-Based Learning Collaborative (CBLC) implementation model, an adaptation of the Learning Collaborative which includes strategies designed to develop and strengthen inter-professional relationships between brokers and providers of mental health services to promote IC and IOR and achieve sustained implementation of EBTs for children within a community. This non-experimental, mixed methods study involves two phases: (1) analysis of existing prospective quantitative and qualitative quality improvement and project evaluation data collected pre and post, weekly, and monthly from 998 participants in one of seven CBLCs conducted as part of a statewide initiative; and (2) Phase 2 collection of new quantitative and qualitative (key informant interviews) data during the funded study period to evaluate changes in relations among IC, IOR, social networks and the penetration and sustainment of TF-CBT in targeted communities. Recruitment for Phase 2 is from the pool of 998 CBLC participants to achieve a targeted enrollment of n = 150. Study aims include: (1) Use existing quality improvement (weekly/monthly online surveys; pre-post surveys; interviews) and newly collected quantitative (monthly surveys) and qualitative (key informant interviews) data and social network analysis to test whether CBLC strategies are associated with penetration and sustainment of TF-CBT; and (2) Use existing quantitative quality improvement (weekly/monthly on-line surveys; pre/post surveys) and newly collected qualitative (key informant interviews) data and social network analysis to test whether CBLC strategies are associated with increased IOR and IC intensity. The proposed research leverages an on-going, statewide implementation initiative to generate evidence about implementation strategies needed to make trauma-focused EBTs more accessible to children. This study also provides feasibility data to inform an effectiveness trial that will utilize a time-series design to rigorously evaluate the CBLC model as a mechanism to improve access and sustained use of EBTs for children.

  18. Climate Change Accuracy: Requirements and Economic Value

    NASA Astrophysics Data System (ADS)

    Wielicki, B. A.; Cooke, R.; Mlynczak, M. G.; Lukashin, C.; Thome, K. J.; Baize, R. R.

    2014-12-01

    Higher than normal accuracy is required to rigorously observe decadal climate change. But what level is needed? How can this be quantified? This presentation will summarize a new, more rigorous and quantitative approach to determining the required accuracy for climate change observations (Wielicki et al., 2013, BAMS). Most current global satellite observations cannot meet this accuracy level. A proposed new satellite mission to resolve this challenge is CLARREO (Climate Absolute Radiance and Refractivity Observatory). CLARREO is designed to achieve accuracy advances of a factor of 10 for reflected solar spectra and a factor of 3 to 5 for thermal infrared spectra (Wielicki et al., Oct. 2013 BAMS). The CLARREO spectrometers are designed to serve as SI-traceable benchmarks for the Global Satellite Intercalibration System (GSICS) and to greatly improve the utility of a wide range of LEO and GEO infrared and reflected solar passive satellite sensors for climate change observations (e.g. CERES, MODIS, VIIRS, CrIS, IASI, Landsat, SPOT, etc.). Providing more accurate decadal change trends can in turn lead to more rapid narrowing of key climate science uncertainties such as cloud feedback and climate sensitivity. A study has been carried out to quantify the economic benefits of such an advance as part of a rigorous and complete climate observing system. The study concludes that the economic value is US$12 trillion in net present value for a nominal discount rate of 3% (Cooke et al. 2013, J. Env. Sys. Dec.). A brief summary of these two studies and their implications for the future of climate science will be presented.
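    The net-present-value figure quoted above rests on standard discounting; as a purely illustrative sketch with made-up benefit numbers (not the Cooke et al. analysis), discounting a benefit stream at 3% looks like this:

```python
def net_present_value(annual_benefits, discount_rate=0.03):
    """Discount a stream of annual benefits (index 0 = year 1) to present value."""
    return sum(b / (1 + discount_rate) ** (t + 1) for t, b in enumerate(annual_benefits))

# Hypothetical illustration: a constant benefit of 1.0 (arbitrary units) per year
# for 50 years at the 3% nominal discount rate quoted in the abstract.
print(f"NPV of 50 years of unit benefits at 3%: {net_present_value([1.0] * 50):.2f}")
```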

  19. Stigma-related experiences in non-communicable respiratory diseases: A systematic review.

    PubMed

    Rose, Shiho; Paul, Christine; Boyes, Allison; Kelly, Brian; Roach, Della

    2017-08-01

    The stigma of non-communicable respiratory diseases (NCRDs), whether perceived or otherwise, can be an important element of a patient's experience of his/her illness and a contributing factor to poor psychosocial, treatment and clinical outcomes. This systematic review examines the evidence regarding the associations between stigma-related experiences and patient outcomes, comparing findings across a range of common NCRDs. Electronic databases and manual searches were conducted to identify original quantitative research published to December 2015. Articles that focused on adult patient samples diagnosed with asthma, chronic obstructive pulmonary disease (COPD), cystic fibrosis, lung cancer or mesothelioma, and that included a measurement of stigma-related experience (i.e. perceived stigma, shame, blame or guilt), were eligible for inclusion. Included articles were described for study characteristics, outcome scores, correlates between stigma-related experiences and patient outcomes, and methodological rigor. Twenty-five articles were eligible for this review, with most (n = 20) related to lung cancer. No articles for cystic fibrosis were identified. Twenty unique scales were used, with low to moderate stigma-related experiences reported overall. The stigma-related experiences significantly correlated with all six patient-related domains explored (psychosocial, quality of life, behavioral, physical, treatment and work), which were investigated more widely in COPD and lung cancer samples. No studies adequately met all criteria for methodological rigor. The inter-connectedness of stigma-related experiences with other aspects of patient experience highlights that an integrated approach is needed to address this important issue. Future studies should adopt more rigorous methodology, including streamlining measures, to provide robust evidence.

  20. Development of rigor mortis is not affected by muscle volume.

    PubMed

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  1. CSF analysis

    MedlinePlus

    ... A, Sancesario GM, Esposito Z, et al. Plasmin system of Alzheimer's disease: CSF analysis. J Neural Transm (Vienna).

  2. Exploring simple, transparent, interpretable and predictive QSAR models for classification and quantitative prediction of rat toxicity of ionic liquids using OECD recommended guidelines.

    PubMed

    Das, Rudra Narayan; Roy, Kunal; Popelier, Paul L A

    2015-11-01

    The present study explores the chemical attributes of diverse ionic liquids responsible for their cytotoxicity in a rat leukemia cell line (IPC-81) by developing predictive classification as well as regression-based mathematical models. Simple and interpretable descriptors derived from a two-dimensional representation of the chemical structures along with quantum topological molecular similarity indices have been used for model development, employing unambiguous modeling strategies that strictly obey the guidelines of the Organization for Economic Co-operation and Development (OECD) for quantitative structure-activity relationship (QSAR) analysis. The structure-toxicity relationships that emerged from both classification and regression-based models were in accordance with the findings of some previous studies. The models suggested that the cytotoxicity of ionic liquids is dependent on the cationic surfactant action, long alkyl side chains, cationic lipophilicity as well as aromaticity, the presence of a dialkylamino substituent at the 4-position of the pyridinium nucleus and a bulky anionic moiety. The models have been transparently presented in the form of equations, thus allowing their easy transferability in accordance with the OECD guidelines. The models have also been subjected to rigorous validation tests proving their predictive potential and can hence be used for designing novel and "greener" ionic liquids. The major strength of the present study lies in the use of a diverse and large dataset, use of simple reproducible descriptors and compliance with the OECD norms. Copyright © 2015 Elsevier Ltd. All rights reserved.
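    As an illustration of the regression-based QSAR workflow (with synthetic stand-in descriptors and toxicity values, not the study's dataset or its actual descriptors), a sketch with ordinary least squares and a leave-one-out cross-validated Q² might look like this:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical stand-in descriptors (e.g., alkyl chain length, cation lipophilicity,
# aromaticity flag, anion bulk) and log-cytotoxicity responses; not the study's data.
rng = np.random.default_rng(4)
X = rng.normal(size=(60, 4))
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(scale=0.2, size=60)

model = LinearRegression().fit(X, y)
y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())        # leave-one-out predictions
q2 = 1 - np.sum((y - y_loo) ** 2) / np.sum((y - y.mean()) ** 2)  # cross-validated Q^2
print("coefficients:", np.round(model.coef_, 2), " Q2 =", round(q2, 3))
```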

  3. Quantitative analyses and modelling to support achievement of the 2020 goals for nine neglected tropical diseases.

    PubMed

    Hollingsworth, T Déirdre; Adams, Emily R; Anderson, Roy M; Atkins, Katherine; Bartsch, Sarah; Basáñez, María-Gloria; Behrend, Matthew; Blok, David J; Chapman, Lloyd A C; Coffeng, Luc; Courtenay, Orin; Crump, Ron E; de Vlas, Sake J; Dobson, Andy; Dyson, Louise; Farkas, Hajnal; Galvani, Alison P; Gambhir, Manoj; Gurarie, David; Irvine, Michael A; Jervis, Sarah; Keeling, Matt J; Kelly-Hope, Louise; King, Charles; Lee, Bruce Y; Le Rutte, Epke A; Lietman, Thomas M; Ndeffo-Mbah, Martial; Medley, Graham F; Michael, Edwin; Pandey, Abhishek; Peterson, Jennifer K; Pinsent, Amy; Porco, Travis C; Richardus, Jan Hendrik; Reimer, Lisa; Rock, Kat S; Singh, Brajendra K; Stolk, Wilma; Swaminathan, Subramanian; Torr, Steve J; Townsend, Jeffrey; Truscott, James; Walker, Martin; Zoueva, Alexandra

    2015-12-09

    Quantitative analysis and mathematical models are useful tools in informing strategies to control or eliminate disease. Currently, there is an urgent need to develop these tools to inform policy to achieve the 2020 goals for neglected tropical diseases (NTDs). In this paper we give an overview of a collection of novel model-based analyses which aim to address key questions on the dynamics of transmission and control of nine NTDs: Chagas disease, visceral leishmaniasis, human African trypanosomiasis, leprosy, soil-transmitted helminths, schistosomiasis, lymphatic filariasis, onchocerciasis and trachoma. Several common themes resonate throughout these analyses, including: the influence of epidemiological setting on the success of interventions; targeting groups who are at highest risk of infection or re-infection; and reaching populations who are not accessing interventions and may act as a reservoir for infection. The results also highlight the challenge of maintaining elimination 'as a public health problem' when true elimination is not reached. The models elucidate the factors that may be contributing most to the persistence of disease and discuss the requirements for eventually achieving true elimination, if that is possible. Overall this collection presents new analyses to inform current control initiatives. These papers form a base from which further development of the models, and more rigorous validation against a variety of datasets, can help to give more detailed advice. At the moment, the models' predictions are being considered as the world prepares for a final push towards control or elimination of neglected tropical diseases by 2020.

  4. Volumetric CT in lung cancer: an example for the qualification of imaging as a biomarker.

    PubMed

    Buckler, Andrew J; Mozley, P David; Schwartz, Lawrence; Petrick, Nicholas; McNitt-Gray, Michael; Fenimore, Charles; O'Donnell, Kevin; Hayes, Wendy; Kim, Hyun J; Clarke, Laurence; Sullivan, Daniel

    2010-01-01

    New ways to understand biology as well as increasing interest in personalized treatments requires new capabilities for the assessment of therapy response. The lack of consensus methods and qualification evidence needed for large-scale multicenter trials, and in turn the standardization that allows them, are widely acknowledged to be the limiting factor in the deployment of qualified imaging biomarkers. The Quantitative Imaging Biomarker Alliance is organized to establish a methodology whereby multiple stakeholders collaborate. It has charged the Volumetric Computed Tomography (CT) Technical Subcommittee with investigating the technical feasibility and clinical value of quantifying changes over time in either volume or other parameters as biomarkers. The group selected solid tumors of the chest in subjects with lung cancer as its first case in point. Success is defined as sufficiently rigorous improvements in CT-based outcome measures to allow individual patients in clinical settings to switch treatments sooner if they are no longer responding to their current regimens, and reduce the costs of evaluating investigational new drugs to treat lung cancer. The team has completed a systems engineering analysis, has begun a roadmap of experimental groundwork, documented profile claims and protocols, and documented a process for imaging biomarker qualification as a general paradigm for qualifying other imaging biomarkers as well. This report addresses a procedural template for the qualification of quantitative imaging biomarkers. This mechanism is cost-effective for stakeholders while simultaneously advancing the public health by promoting the use of measures that prove effective.

  5. Measurement and prediction of the thermomechanical response of shape memory alloy hybrid composite beams

    NASA Astrophysics Data System (ADS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2005-05-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  6. Measurement and Prediction of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2005-01-01

    Previous work at NASA Langley Research Center (LaRC) involved fabrication and testing of composite beams with embedded, pre-strained shape memory alloy (SMA) ribbons within the beam structures. That study also provided comparison of experimental results with numerical predictions from a research code making use of a new thermoelastic model for shape memory alloy hybrid composite (SMAHC) structures. The previous work showed qualitative validation of the numerical model. However, deficiencies in the experimental-numerical correlation were noted and hypotheses for the discrepancies were given for further investigation. The goal of this work is to refine the experimental measurement and numerical modeling approaches in order to better understand the discrepancies, improve the correlation between prediction and measurement, and provide rigorous quantitative validation of the numerical analysis/design tool. The experimental investigation is refined by a more thorough test procedure and incorporation of higher fidelity measurements such as infrared thermography and projection moire interferometry. The numerical results are produced by a recently commercialized version of the constitutive model as implemented in ABAQUS and are refined by incorporation of additional measured parameters such as geometric imperfection. Thermal buckling, post-buckling, and random responses to thermal and inertial (base acceleration) loads are studied. The results demonstrate the effectiveness of SMAHC structures in controlling static and dynamic responses by adaptive stiffening. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  7. PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study

    NASA Astrophysics Data System (ADS)

    Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.

    2009-12-01

    The storage of contaminant material in surface or near-surface repositories, such as tailing piles for mine waste or disposal sites for low- and intermediate-level nuclear waste, poses a potential threat to the surrounding biosphere. These risks can be minimized by supporting decision-makers with quantitative tools capable of incorporating all sources of uncertainty within a rigorous probabilistic framework. A case study is presented in which we assess the risks associated with the surface storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, which involves many events with different spatial and temporal scales and many uncertain parameters, is overcome by using a formal PRA (probabilistic risk assessment) procedure that decomposes the system into a number of key events. The failure of the system is directly linked to the potential contamination of one of three main receptors: the underlying karst aquifer, a surface stream that flows near the storage piles, and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault tree that incorporates different events, including the failure of the engineered system (e.g. the cover of the piles) and the failure of the geological barrier (e.g. the clay layer that separates the bottom of the pile from the karst formation). Finally, the probability of failure is quantitatively assessed by combining individual independent or conditional probabilities that are computed numerically or taken from reliability databases.
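
    As a rough illustration of the cut-set logic described in this abstract, the sketch below combines hypothetical basic-event probabilities into a system failure probability under an independence assumption. The event names, probabilities, and cut sets are invented for illustration and are not taken from the case study.

```python
# Hypothetical sketch: combining minimal cut sets into a system failure
# probability, assuming independent basic events (not the study's actual data).

from itertools import combinations

# Basic-event probabilities (illustrative values only)
p = {
    "cover_failure": 0.05,      # engineered cover of the tailing pile fails
    "clay_breach": 0.02,        # clay layer separating pile from karst fails
    "stream_overflow": 0.01,    # runoff reaches the nearby stream
}

# Minimal cut sets: the system fails if every event in any one set occurs
cut_sets = [
    {"cover_failure", "clay_breach"},   # contamination of the karst aquifer
    {"stream_overflow"},                # contamination of the stream
]

def joint_probability(events):
    """P(all listed events occur), assuming independence."""
    prob = 1.0
    for event in events:
        prob *= p[event]
    return prob

def system_failure_probability(cut_sets):
    """Inclusion-exclusion over the cut sets gives P(system failure)."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            union = set().union(*combo)
            total += (-1) ** (k + 1) * joint_probability(union)
    return total

print(f"P(failure) = {system_failure_probability(cut_sets):.4f}")
```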

  8. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images with more than a gigapixel are drawing more and more attention in the clinical, biomedical research, and computer vision fields. Among the many observable features spanning multiple scales in pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result, it is also the most widely studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report human-verified segmentations of thousands of nuclei, whereas a single whole-slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate more quantitatively validated studies in current and future histopathology image analysis.

  9. University Students' Strategies for Constructing Hypothesis when Tackling Paper-and-Pencil Tasks in Physics

    NASA Astrophysics Data System (ADS)

    Guisasola, Jenaro; Ceberio, Mikel; Zubimendi, José Luis

    2006-09-01

    This study explores how first-year engineering students formulate hypotheses in order to construct their own problem-solving structure when confronted with problems in physics. Under a constructivist perspective on the teaching-learning process, the formulation of hypotheses plays a key role in testing the coherence of students' ideas against the theoretical frame. The main research instrument used to identify students' reasoning was the written report by each student on how they attempted four problem-solving tasks in which they were explicitly asked to formulate hypotheses. The protocols used in the assessment of the solutions consisted of a semi-quantitative study based on grids designed for the analysis of written answers. In this paper we include two of the tasks used and the corresponding scheme for categorising the answers; details of the other two tasks are also outlined. According to our findings, the majority of students judge a hypothesis to be plausible if it is congruent with their previous knowledge, without rigorously checking it against the theoretical framework explained in class.

  10. High Resolution Gamma Ray Analysis of Medical Isotopes

    NASA Astrophysics Data System (ADS)

    Chillery, Thomas

    2015-10-01

    Compton-suppressed high-purity germanium detectors at the University of Massachusetts Lowell have been used to study medical radioisotopes produced at the Brookhaven Linac Isotope Producer (BLIP), in particular isotopes such as Pt-191 used for cancer therapy in patients. The ability to precisely analyze the concentrations of such radioisotopes is essential both for production facilities such as Brookhaven and for consumer hospitals across the U.S. Without accurate knowledge of the quantities and strengths of these isotopes, doctors may administer incorrect dosages to patients, leading to undesired results. Samples have been produced at Brookhaven and shipped to UML, and the advanced electronics and data acquisition capabilities at UML have been used to extract peak areas from the gamma decay spectra. Levels of Pt isotopes in diluted samples have been quantified, and reaction cross-sections deduced from the irradiation parameters. These provide both cross-checks with published work and a rigorous quantitative framework using high-quality, state-of-the-art detection apparatus in use in the experimental nuclear physics community.
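
    For orientation, the sketch below shows the standard thin-target activation relation that links a measured gamma-peak area to a production cross-section via the irradiation parameters. All numbers (efficiency, branching ratio, flux, half-life, atom count) are illustrative assumptions, not the Pt-191 values from this work, and decay during counting is ignored for simplicity.

```python
# Sketch of the standard activation relation used to deduce a production
# cross-section from a measured gamma peak area and irradiation parameters.
# All values are illustrative, not taken from this study.

import math

peak_area = 1.2e5            # net counts in the gamma peak
efficiency = 0.02            # detector efficiency at the gamma energy
branching = 0.5              # gamma emission probability per decay
live_time = 3600.0           # counting live time [s]

half_life = 2.8 * 24 * 3600  # product half-life [s]
lam = math.log(2) / half_life

# Average activity during the counting period (decay during counting ignored)
activity = peak_area / (efficiency * branching * live_time)

target_atoms = 1.0e20        # atoms in the irradiated sample
flux = 1.0e13                # beam particle flux [1/(cm^2 s)]
t_irr = 7200.0               # irradiation time [s]

# Thin-target activation: A = sigma * phi * N * (1 - exp(-lambda * t_irr))
sigma_cm2 = activity / (flux * target_atoms * (1 - math.exp(-lam * t_irr)))
sigma_mb = sigma_cm2 * 1e27  # 1 mb = 1e-27 cm^2
print(f"sigma ~ {sigma_mb:.2f} mb")
```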

  11. Influence of gas compressibility on a burning accident in a mining passage

    NASA Astrophysics Data System (ADS)

    Demir, Sinan; Calavay, Anish Raman; Akkerman, V'yacheslav

    2018-03-01

    A recent predictive scenario of a methane/air/coal dust fire in a mining passage is extended by incorporating the effect of gas compressibility into the analysis. The compressible and incompressible formulations are compared, qualitatively and quantitatively, in both the two-dimensional planar and cylindrical-axisymmetric geometries, and a detailed parametric study accounting for coal-dust combustion is performed. It is shown that gas compression moderates flame acceleration, and its impact depends on the type of the fuel, its various thermal-chemical parameters as well as on the geometry of the problem. While the effect of gas compression is relatively minor for the lean and rich flames, providing 5-25% reduction in the burning velocity and thereby justifying the incompressible formulation in that case, such a reduction appears significant, up to 70% for near-stoichiometric methane-air combustion, and therefore it should be incorporated into a rigorous formulation. It is demonstrated that the flame tip velocity remains noticeably subsonic in all the cases considered, which is opposite to the prediction of the incompressible formulation, but qualitatively agrees with the experimental predictions from the literature.

  12. Cultural adaptation of a supportive care needs measure for Hispanic men cancer survivors.

    PubMed

    Martinez Tyson, Dinorah; Medina-Ramirez, Patricia; Vázquez-Otero, Coralia; Gwede, Clement K; Bobonis, Margarita; McMillan, Susan C

    2018-01-01

    Research with ethnic minority populations requires instrumentation that is culturally and linguistically relevant. The aim of this study was to translate and culturally adapt the Cancer Survivor Unmet Needs measure into Spanish. We describe the iterative, community-engaged, consensus-building approaches used to adapt the instrument for Hispanic male cancer survivors. We used an exploratory sequential mixed-methods study design. Methods included translation and back-translation, focus groups with cancer survivors (n = 18) and providers (n = 5), use of cognitive interview techniques to evaluate the comprehension and acceptability of the adapted instrument with survivors (n = 12), ongoing input from the project's community advisory board, and preliminary psychometric analysis (n = 84). The process emphasized conceptual, content, semantic, and technical equivalence. Combining qualitative and quantitative approaches offered a rigorous, systematic, and contextualized alternative to translation alone and supported the cultural adaptation of this measure in a purposeful and relevant manner. Our findings highlight the importance of going beyond translation when adapting measures for cross-cultural populations and illustrate the importance of taking culture, literacy, and language into consideration.

  13. A ground truth based comparative study on clustering of gene expression data.

    PubMed

    Zhu, Yitan; Wang, Zuyi; Miller, David J; Clarke, Robert; Xuan, Jianhua; Hoffman, Eric P; Wang, Yue

    2008-05-01

    Given the variety of available clustering methods for gene expression data analysis, it is important to develop an appropriate and rigorous validation scheme to assess the performance and limitations of the most widely used clustering algorithms. In this paper, we present a ground truth based comparative study on the functionality, accuracy, and stability of five data clustering methods, namely hierarchical clustering, K-means clustering, self-organizing maps, standard finite normal mixture fitting, and a caBIG toolkit (VIsual Statistical Data Analyzer--VISDA), tested on sample clustering of seven published microarray gene expression datasets and one synthetic dataset. We examined the performance of these algorithms in both data-sufficient and data-insufficient cases using quantitative performance measures, including cluster number detection accuracy and mean and standard deviation of partition accuracy. The experimental results showed that VISDA, an interactive coarse-to-fine maximum likelihood fitting algorithm, is a solid performer on most of the datasets, while K-means clustering and self-organizing maps optimized by the mean squared compactness criterion generally produce more stable solutions than the other methods.
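
    The comparison described here hinges on agreement measures between inferred clusters and known labels. The sketch below illustrates the idea on synthetic data with two of the methods mentioned (k-means and hierarchical clustering), using the adjusted Rand index as a stand-in for the paper's partition-accuracy measure; the data set, sizes, and settings are invented.

```python
# Illustrative sketch (not the authors' VISDA pipeline): comparing two of the
# clustering methods mentioned above on synthetic "expression-like" data and
# scoring agreement with the known labels.

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import adjusted_rand_score

# Synthetic dataset: 60 samples, 200 "genes", 3 ground-truth classes
X, y_true = make_blobs(n_samples=60, n_features=200, centers=3, random_state=0)

methods = {
    "k-means": KMeans(n_clusters=3, n_init=10, random_state=0),
    "hierarchical": AgglomerativeClustering(n_clusters=3),
}

for name, model in methods.items():
    labels = model.fit_predict(X)
    score = adjusted_rand_score(y_true, labels)
    print(f"{name:12s} ARI vs. ground truth: {score:.3f}")
```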

  14. Understanding the low uptake of bone-anchored hearing aids: a review.

    PubMed

    Powell, R; Wearden, A; Pardesi, S M; Green, K

    2017-03-01

    Bone-anchored hearing aids improve hearing for patients for whom conventional behind-the-ear aids are problematic. However, uptake of bone-anchored hearing aids is low and it is important to understand why this is the case. A narrative review was conducted. Studies examining why people accept or decline bone-anchored hearing aids and satisfaction levels of people with bone-anchored hearing aids were reviewed. Reasons for declining bone-anchored hearing aids included limited perceived benefits, concerns about surgery, aesthetic concerns and treatment cost. No studies providing in-depth analysis of the reasons for declining or accepting bone-anchored hearing aids were identified. Studies of patient satisfaction showed that most participants reported benefits with bone-anchored hearing aids. However, most studies used cross-sectional and/or retrospective designs and only included people with bone-anchored hearing aids. Important avenues for further research are in-depth qualitative research designed to fully understand the decision-making process for bone-anchored hearing aids and rigorous quantitative research comparing satisfaction of people who receive bone-anchored hearing aids with those who receive alternative (or no) treatments.

  15. Development and Justification of a Risk Evaluation Matrix To Guide Chemical Testing Necessary To Select and Qualify Plastic Components Used in Production Systems for Pharmaceutical Products.

    PubMed

    Jenke, Dennis

    2015-01-01

    An accelerating trend in the pharmaceutical industry is the use of plastic components in systems used to produce an active pharmaceutical ingredient or a finished drug product. If the active pharmaceutical ingredient, the finished drug product, or any solution used to generate them (for example, a process stream such as media, buffers, eluents, and the like) is contacted by a plastic component at any time during the production process, substances leached from the component may accumulate in the active pharmaceutical ingredient or finished drug product, affecting its safety and/or efficacy. In this article the author develops and justifies a semi-quantitative risk evaluation matrix that is used to determine the amount and rigor of component testing necessary and appropriate to establish that the component is chemically suitable for its intended use. By considering key properties of the component, the contact medium, the contact conditions, and the active pharmaceutical ingredient's or finished drug product's clinical conditions of use, use of the risk evaluation matrix produces a risk score whose magnitude reflects the accumulated risk that the component will interact with the contact solution to such an extent that component-related extractables will accumulate in the active pharmaceutical ingredient or finished drug product as leachables at levels sufficiently high to adversely affect user safety. The magnitude of the risk score establishes the amount and rigor of the testing that is required to select and qualify the component, and such testing is broadly grouped into three categories: baseline assessment, general testing, and full testing (extractables profiling). Production suites used to generate pharmaceuticals can include plastic components. It is possible that substances in the components could leach into manufacturing solutions and accumulate in the pharmaceutical product. In this article the author develops and justifies a semi-quantitative risk evaluation matrix that can be used to determine the amount and rigor of component testing that may be necessary and appropriate to establish that the component is suitable for its intended use. Use of the risk evaluation matrix allows a user of a component to determine the type and amount of testing that should be performed to establish the patient safety risk associated with using that component in order to manufacture an active pharmaceutical ingredient or a finished drug product. © PDA, Inc. 2015.
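
    To make the idea of a semi-quantitative risk score concrete, the sketch below sums hypothetical factor scores and bands the total into the three testing categories named above. The factor names, scores, and thresholds are illustrative assumptions and do not reproduce the author's published matrix.

```python
# Hypothetical sketch of a semi-quantitative risk score of the kind described
# above. Factor names, scores, and banding thresholds are invented.

factors = {
    "component_surface_area": 3,      # 1 = small, 5 = large
    "contact_duration": 4,            # 1 = minutes, 5 = weeks
    "contact_temperature": 2,         # 1 = cold, 5 = hot
    "solvent_strength": 3,            # 1 = aqueous, 5 = strongly organic
    "proximity_to_final_product": 5,  # 1 = early upstream, 5 = final fill
    "clinical_exposure": 4,           # 1 = topical/short term, 5 = parenteral/chronic
}

risk_score = sum(factors.values())

# Illustrative banding of the score into the three testing categories
if risk_score <= 10:
    testing = "baseline assessment"
elif risk_score <= 20:
    testing = "general testing"
else:
    testing = "full testing (extractables profiling)"

print(f"risk score = {risk_score} -> {testing}")
```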

  16. Retention in STEM: Understanding the Effectiveness of Science Posse

    NASA Astrophysics Data System (ADS)

    Godsoe, Kimberly

    One of the major areas of debate in higher education is how best to support underrepresented racial minority students in their study of Science, Technology, Engineering, and Math. In 2008, Brandeis University began a new program in conjunction with the Posse Foundation for students interested in studying science at the college level. The research used a mixed-methods design. A detailed quantitative analysis was conducted to understand how being part of Science Posse affected the probability of doing well in initial science classes, influenced perceptions of the difficulty of studying science, and predicted the probability of majoring in STEM at Brandeis. The qualitative data were drawn from 89 student interviews, including 38 Science Posse Scholars, 24 students from backgrounds similar to the Scholars, and 25 students from well-resourced families. The qualitative analysis demonstrated how students had been exposed to the sciences prior to enrollment, how they navigated the sciences at Brandeis, and how they demonstrated resilience when science became challenging. This research study had four key findings. The first came from the quantitative analysis, which demonstrated that Science Posse Scholars experience strong feelings of doubt about their academic abilities; based on previous research, this should have resulted in their not declaring majors in STEM disciplines. Instead, Science Posse Scholars were more likely to earn a B+ or above in their entry-level science courses and declare a major in a STEM discipline, even when factors such as math and verbal SAT scores were included in the analysis. The second finding, from the qualitative analysis, was that the cohort model in which Science Posse Scholars participate was instrumental to their success. The third finding was that students who attended academically less rigorous high schools could succeed in the sciences at a highly selective research institution such as Brandeis without academic remediation. The fourth finding was that neither Science Posse Scholars nor underrepresented students agreed with the idea that competition in the sciences was positive. While well-resourced students described this competition as good because it fostered individual success, Science Posse Scholars and underrepresented students placed greater emphasis on group success.

  17. Topic model-based mass spectrometric data analysis in cancer biomarker discovery studies.

    PubMed

    Wang, Minkun; Tsai, Tsung-Heng; Di Poto, Cristina; Ferrarini, Alessia; Yu, Guoqiang; Ressom, Habtom W

    2016-08-18

    A fundamental challenge in the quantitation of biomolecules for cancer biomarker discovery arises from the heterogeneous nature of human biospecimens. Although this issue has been a subject of discussion in cancer genomic studies, it has not yet been rigorously investigated in mass spectrometry-based proteomic and metabolomic studies. Purification of mass spectrometric data is highly desirable prior to subsequent analysis, e.g., quantitative comparison of the abundance of biomolecules in biological samples. We investigated topic models to computationally analyze mass spectrometric data considering both integrated peak intensities and scan-level features, i.e., extracted ion chromatograms (EICs). Probabilistic generative models enable flexible representation of the data structure and infer sample-specific pure sources. Scan-level modeling helps alleviate information loss during data preprocessing. We evaluated the capability of the proposed models to capture mixture proportions of contaminants and cancer profiles on LC-MS based serum proteomic and GC-MS based tissue metabolomic datasets acquired from patients with hepatocellular carcinoma (HCC) and liver cirrhosis, as well as on synthetic data we generated based on the serum proteomic data. The results obtained from the synthetic data demonstrated that both intensity-level and scan-level purification models can accurately infer the mixture proportions and the underlying true cancerous sources, with small average error ratios (<7%) between estimation and ground truth. By applying the topic model-based purification to mass spectrometric data, we found more proteins and metabolites with significant changes between HCC cases and cirrhotic controls. Candidate biomarkers selected after purification yielded biologically meaningful pathway analysis results and improved disease discrimination power, in terms of the area under the ROC curve, compared to the results found prior to purification. We investigated topic model-based inference methods to computationally address the heterogeneity issue in samples analyzed by LC/GC-MS. We observed that incorporation of scan-level features has the potential to lead to more accurate purification results by alleviating the loss of information that results from integrating peaks. We believe cancer biomarker discovery studies that use mass spectrometric analysis of human biospecimens can greatly benefit from topic model-based purification of the data prior to statistical and pathway analyses.
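
    As a minimal sketch of the general idea (decomposing nonnegative intensity data into sample-specific mixtures of pure sources), the example below uses scikit-learn's NMF on synthetic data. The authors' actual method is a probabilistic topic model with scan-level features, so this is only an analogy; all data and settings are invented.

```python
# Minimal sketch of inferring mixture proportions from nonnegative intensity
# data. NMF is substituted here for the authors' probabilistic topic models.

import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Two "pure" sources (e.g., cancerous tissue vs. contaminating background)
# over 50 features (integrated peak intensities); values are synthetic.
pure_sources = rng.gamma(shape=2.0, scale=1.0, size=(2, 50))

# Each sample is a mixture of the two sources plus small noise
true_props = np.array([[0.8, 0.2], [0.5, 0.5], [0.3, 0.7], [0.9, 0.1]])
X = true_props @ pure_sources + rng.normal(0, 0.01, size=(4, 50)).clip(min=0)

model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
W = model.fit_transform(X)                  # sample-by-source loadings
props = W / W.sum(axis=1, keepdims=True)    # normalize rows to proportions

print(np.round(props, 2))   # compare against true_props (up to column order)
```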

  18. Spa therapy and balneotherapy for treating low back pain: meta-analysis of randomized trials.

    PubMed

    Pittler, M H; Karagülle, M Z; Karagülle, M; Ernst, E

    2006-07-01

    Low back pain is a major public health concern and complementary treatments are frequently used for this condition. The objective of this systematic review and meta-analysis was to assess the evidence for or against the effectiveness of spa therapy and balneotherapy for treating low back pain. Systematic searches were conducted on Medline, Embase, AMED, Cochrane Central, the UK National Research Register and ClinicalTrials.gov (all until July 2005). Hand searches were performed and experts contacted. Methodological quality was assessed using a standard scale. Five randomized clinical trials met all inclusion criteria. Quantitative data synthesis was performed. The data for spa therapy, assessed on a 100 mm visual analogue scale (VAS), suggest significant beneficial effects compared with waiting-list control groups (weighted mean difference 26.6 mm, 95% confidence interval 20.4-32.8, n=442) for patients with chronic low back pain. For balneotherapy the data, assessed on a 100 mm VAS, also suggest beneficial effects compared with control groups (weighted mean difference 18.8 mm, 95% confidence interval 10.3-27.3, n=138). Even though the data are scarce, there is encouraging evidence suggesting that spa therapy and balneotherapy may be effective for treating patients with low back pain. These data are not compelling but warrant rigorous large-scale trials.
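
    For readers unfamiliar with how a weighted mean difference like the ones quoted above is pooled, the sketch below computes a fixed-effect inverse-variance pooled estimate with a 95% confidence interval. The trial-level numbers are hypothetical, not those of the five included trials.

```python
# Sketch of a fixed-effect pooled (inverse-variance weighted) mean difference
# with a 95% CI. Trial-level numbers are invented for illustration.

import math

# (mean difference on the 100 mm VAS, standard error) per trial
trials = [(25.0, 4.0), (30.0, 5.5), (22.0, 6.0)]

weights = [1.0 / se**2 for _, se in trials]          # inverse-variance weights
pooled = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled WMD = {pooled:.1f} mm (95% CI {lo:.1f} to {hi:.1f})")
```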

  19. Nonlinear Trimodal Regression Analysis of Radiodensitometric Distributions to Quantify Sarcopenic and Sequelae Muscle Degeneration

    PubMed Central

    Árnadóttir, Í.; Gíslason, M. K.; Carraro, U.

    2016-01-01

    Muscle degeneration has been consistently identified as an independent risk factor for high mortality in both aging populations and individuals suffering from neuromuscular pathology or injury. While there is much extant literature on its quantification and correlation to comorbidities, a quantitative gold standard for analyses in this regard remains undefined. Herein, we hypothesize that rigorously quantifying entire radiodensitometric distributions elicits more muscle quality information than average values reported in extant methods. This study reports the development and utility of a nonlinear trimodal regression analysis method utilized on radiodensitometric distributions of upper leg muscles from CT scans of a healthy young adult, a healthy elderly subject, and a spinal cord injury patient. The method was then employed with a THA cohort to assess pre- and postsurgical differences in their healthy and operative legs. Results from the initial representative models elicited high degrees of correlation to HU distributions, and regression parameters highlighted physiologically evident differences between subjects. Furthermore, results from the THA cohort echoed physiological justification and indicated significant improvements in muscle quality in both legs following surgery. Altogether, these results highlight the utility of novel parameters from entire HU distributions that could provide insight into the optimal quantification of muscle degeneration. PMID:28115982
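
    The sketch below illustrates one way a trimodal curve can be fitted to a radiodensity (Hounsfield unit) histogram, here as a sum of three Gaussian components fitted with SciPy on synthetic data. The paper's exact nonlinear functional form, parameters, and cohorts differ; this is only meant to make the fitting idea concrete.

```python
# Illustrative sketch: fitting a three-component (trimodal) curve to a
# synthetic Hounsfield-unit histogram; not the paper's exact model.

import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Synthetic HU samples: fat (~ -80), low-density muscle (~ 0), muscle (~ 50)
hu = np.concatenate([
    rng.normal(-80, 15, 2000),
    rng.normal(0, 20, 1000),
    rng.normal(50, 10, 3000),
])
counts, edges = np.histogram(hu, bins=120, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

def trimodal(x, a1, m1, s1, a2, m2, s2, a3, m3, s3):
    g = lambda a, m, s: a * np.exp(-((x - m) ** 2) / (2 * s ** 2))
    return g(a1, m1, s1) + g(a2, m2, s2) + g(a3, m3, s3)

p0 = [0.01, -80, 15, 0.005, 0, 20, 0.02, 50, 10]   # rough initial guess
popt, _ = curve_fit(trimodal, centers, counts, p0=p0, maxfev=20000)
print("fitted component means:", np.round(popt[[1, 4, 7]], 1))
```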

  20. Limitations of the ideal phase-Doppler system: Extension to spatially and temporally inhomogeneous particle flows with an application to diesel sprays

    NASA Astrophysics Data System (ADS)

    Marx, K. D.; Edwards, C. F.

    1992-12-01

    The effect of the single-particle constraint on the response of phase-Doppler instruments is determined for particle flows which are spatially nonuniform and time-dependent. Poisson statistics are applied to particle positions and arrival times within the phase-Doppler probe volume to determine the probability that a particle is measured successfully. It is shown that the single-particle constraint can be viewed as applying spatial and temporal filters to the particle flow. These filters have the same meaning as those that were defined previously for uniform, steady-state sprays, but in space- and time-dependent form. Criteria are developed for determining when a fully inhomogeneous analysis of a flow is required and when a quasi-steady analysis will suffice. A new bias due to particle arrival time displacement is identified and the conditions under which it must be considered are established. The present work provides the means to rigorously investigate the response of phase-Doppler measurement systems to transient sprays such as those which occur in diesel engines. To this end, the results are applied to a numerical simulation of a diesel spray. The calculated hypothetical response of the ideal instrument provides a quantitative demonstration of the regimes within which measurements can accurately be made in such sprays.
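
    A minimal numerical illustration of the Poisson argument summarized above: if particle positions are random with number density n, the probability that a probe volume V contains no other particle when one arrives is approximately exp(-nV). The density and probe-volume values below are invented for illustration, not taken from the paper.

```python
# Simple sketch of the Poisson occupancy argument for the single-particle
# constraint. Values are illustrative only.

import numpy as np

n = 1.0e11          # particle number density [1/m^3], illustrative
V = 2.0e-12         # effective probe volume [m^3], illustrative

lam = n * V                      # mean occupancy of the probe volume
p_accept = np.exp(-lam)          # probability a particle is measured alone

print(f"mean occupancy = {lam:.2f}, acceptance probability = {p_accept:.2f}")

# Sweeping density shows how validation rates fall off as the spray becomes
# locally dense (the "filter" behavior described above).
for density in [1e10, 1e11, 5e11, 1e12]:
    print(f"n = {density:.0e} 1/m^3 -> P(accept) ~ {np.exp(-density * V):.3f}")
```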

  1. Using qualitative mixed methods to study small health care organizations while maximising trustworthiness and authenticity.

    PubMed

    Phillips, Christine B; Dwan, Kathryn; Hepworth, Julie; Pearce, Christopher; Hall, Sally

    2014-11-19

    The primary health care sector delivers the majority of health care in western countries through small, community-based organizations. However, research into these healthcare organizations is limited by the time constraints and pressures facing them, and by staff concern that research is peripheral to their work. We developed Q-RARA (Qualitative Rapid Appraisal, Rigorous Analysis) to study small primary health care organizations in a way that is efficient, acceptable to participants, and methodologically rigorous. Q-RARA comprises a site visit, semi-structured interviews, structured and unstructured observations, photographs, floor plans, and social scanning data. Data were collected over the course of one day per site, and the qualitative analysis was integrated and iterative. We found Q-RARA to be acceptable to participants and effective in collecting data on organizational function at multiple sites without disrupting the practice, while maintaining a balance between speed and trustworthiness. The Q-RARA approach is capable of providing a richly textured, rigorous understanding of the processes of a primary care practice while also allowing researchers to develop an organizational perspective. For these reasons the approach is recommended for use in small-scale organizations both within and outside the primary health care sector.

  2. A traits-based approach for prioritizing species for monitoring and surrogacy selection

    DOE PAGES

    Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...

    2016-11-28

    The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.
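
    A toy sketch of the surrogate-selection step described above: represent each species by a standardized trait vector and choose, among common candidate species, the one closest to the rare target in trait space. The species names and trait values are invented; the actual study uses a fish traits database and multivariate statistics.

```python
# Hedged sketch of trait-space surrogate selection. All values are invented.

import numpy as np

species = ["rare_target", "candidate_A", "candidate_B", "candidate_C"]
# columns: body length (cm), swim-bladder index, fecundity (log eggs), benthic (0/1)
traits = np.array([
    [18.0, 1.0, 3.2, 1.0],   # rare_target
    [20.0, 1.0, 3.4, 1.0],   # candidate_A
    [45.0, 0.0, 5.1, 0.0],   # candidate_B
    [17.0, 1.0, 2.9, 1.0],   # candidate_C
])

z = (traits - traits.mean(axis=0)) / traits.std(axis=0)   # standardize traits
dist = np.linalg.norm(z[1:] - z[0], axis=1)               # distance to the target

best = species[1:][int(dist.argmin())]
print("closest surrogate:", best, "| distances:", np.round(dist, 2))
```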

  3. A traits-based approach for prioritizing species for monitoring and surrogacy selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.

    The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.

  4. Geographical, landscape and host associations of Trypanosoma cruzi DTUs and lineages.

    PubMed

    Izeta-Alberdi, Amaia; Ibarra-Cerdeña, Carlos N; Moo-Llanes, David A; Ramsey, Janine M

    2016-12-07

    The evolutionary history and ecological associations of Trypanosoma cruzi, the need to identify genetic markers that can distinguish parasite subpopulations, and understanding the parasite's evolutionary and selective processes have been the subject of a significant number of publications since 1998, the year when the first DNA sequence analysis for the species was published. The current analysis systematizes and re-analyzes this original research, focusing on critical methodological and analytical variables and results that have given rise to interpretations of putative patterns of genetic diversity and diversification of T. cruzi lineages, discrete typing units (DTUs), and populations, and their associations with hosts, vectors, and geographical distribution that have been interpreted as evidence for parasite subpopulation specificities. Few studies use hypothesis-driven or quantitative analysis for T. cruzi phylogeny (16/58 studies) or phylogeography (10/13). Among these, only one phylogenetic and five phylogeographic studies analyzed molecular markers directly from tissues (i.e. not from isolates). Analysis of T. cruzi DTU or lineage niche and its geographical projection demonstrate extensive sympatry among all clades across the continent and no significant niche differences among DTUs. DTU beta-diversity was high, indicating diverse host assemblages across regions, while host dissimilarity was principally due to host species turnover and to a much lesser degree to nestedness. DTU-host order specificities appear related to trophic or microenvironmental interactions. More rigorous study designs and analyses will be required to discern evolutionary processes and the impact of landscape modification on population dynamics and risk for T. cruzi transmission to humans.

  5. Validation of Normalizations, Scaling, and Photofading Corrections for FRAP Data Analysis

    PubMed Central

    Kang, Minchul; Andreani, Manuel; Kenworthy, Anne K.

    2015-01-01

    Fluorescence Recovery After Photobleaching (FRAP) has been a versatile tool to study transport and reaction kinetics in live cells. Since the fluorescence data generated by fluorescence microscopy are on a relative scale, a wide variety of scalings and normalizations are used in quantitative FRAP analysis. Scaling and normalization are often required to account for inherent properties of the diffusing biomolecules of interest or photochemical properties of the fluorescent tag, such as the mobile fraction or photofading during image acquisition. In some cases, scaling and normalization are also used for computational simplicity. However, to the best of our knowledge, the validity of these various forms of scaling and normalization has not been studied in a rigorous manner. In this study, we investigate the validity of various scalings and normalizations that have appeared in the literature to calculate mobile fractions and correct for photofading, and we assess their consistency with FRAP equations. As a test case, we consider linear or affine scaling of normal or anomalous diffusion FRAP equations in combination with scaling for immobile fractions. We also consider exponential scaling of either FRAP equations or FRAP data to correct for photofading. Using a combination of theoretical and experimental approaches, we show that compatible scaling schemes should be applied in the correct sequential order; otherwise, erroneous results may be obtained. We propose a hierarchical workflow to carry out FRAP data analysis and discuss the broader implications of our findings for FRAP data analysis using a variety of kinetic models. PMID:26017223
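
    To make the ordering issue concrete, the sketch below applies two common corrections to a synthetic FRAP curve: photofading correction against a reference region, followed by full-scale normalization between the post-bleach minimum and the pre-bleach level. The recovery time constant, fading rate, and fractions are invented, and this is only one possible ordering of the steps discussed above, not the paper's recommended workflow.

```python
# Synthetic illustration of reference-based photofading correction followed
# by full-scale normalization of a FRAP curve. All parameters are invented.

import numpy as np

t = np.linspace(0, 100, 201)                      # time after bleach [s]
recovery = 0.2 + 0.7 * (1 - np.exp(-t / 20))      # bleached-region signal, plateau 0.9
fading = np.exp(-t / 300)                         # photofading during acquisition

frap_raw = recovery * fading                      # measured bleached-region trace
ref_raw = 1.0 * fading                            # measured reference-region trace

# Step 1: correct photofading using the reference region
frap_corr = frap_raw / ref_raw

# Step 2: normalize between the post-bleach minimum and the pre-bleach level (1.0)
f0 = frap_corr[0]
frap_norm = (frap_corr - f0) / (1.0 - f0)

# Expected mobile fraction for these parameters: (0.9 - 0.2) / (1.0 - 0.2) = 0.875
print(f"apparent mobile fraction ~ {frap_norm[-1]:.2f}")
```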

  6. Preserving pre-rigor meat functionality for beef patty production.

    PubMed

    Claus, J R; Sørheim, O

    2006-06-01

    Three methods were examined for preserving pre-rigor meat functionality in beef patties. Hot-boned semimembranosus muscles were processed as follows: (1) pre-rigor ground, salted, patties immediately cooked; (2) pre-rigor ground, salted and stored overnight; (3) pre-rigor injected with brine; and (4) post-rigor ground and salted. Raw patties contained 60% lean beef, 19.7% beef fat trim, 1.7% NaCl, 3.6% starch, and 15% water. Pre-rigor processing occurred at 3-3.5h postmortem. Patties made from pre-rigor ground meat had higher pH values; greater protein solubility; firmer, more cohesive, and chewier texture; and substantially lower cooking losses than the other treatments. Addition of salt was sufficient to reduce the rate and extent of glycolysis. Brine injection of intact pre-rigor muscles resulted in some preservation of the functional properties but not as pronounced as with salt addition to pre-rigor ground meat.

  7. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
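
    The defining requirement mentioned above, that a kernel applied to all pairs of subjects yields a positive semidefinite matrix, can be checked directly. The sketch below builds a linear kernel from synthetic genotype data and inspects the eigenvalues of the resulting similarity matrix; the data dimensions and 0/1/2 encoding are illustrative assumptions.

```python
# Small sketch of the kernel idea: a linear kernel on synthetic genotype data,
# with a direct eigenvalue check of positive semidefiniteness.

import numpy as np

rng = np.random.default_rng(42)
G = rng.integers(0, 3, size=(20, 500)).astype(float)   # 20 subjects x 500 SNPs (0/1/2)
G -= G.mean(axis=0)                                    # center each SNP

K = G @ G.T / G.shape[1]     # linear kernel: larger value = more similar pair

eigvals = np.linalg.eigvalsh(K)
print("min eigenvalue:", eigvals.min())   # >= 0 (up to rounding) => PSD
```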

  8. Quantitative performance characterization of three-dimensional noncontact fluorescence molecular tomography

    NASA Astrophysics Data System (ADS)

    Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis

    2016-02-01

    Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. The use of these probes in whole body in vivo imaging would allow extending the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture.

  9. Methods Beyond Methods: A Model for Africana Graduate Methods Training

    PubMed Central

    Best, Latrica E.; Byrd, W. Carson

    2018-01-01

    A holistic graduate education can impart not just tools and knowledge, but critical positioning to fulfill many of the original missions of Africana Studies programs set forth in the 1960s and 1970s. Because Africana Studies is an interdisciplinary field with many approaches to examining the African Diaspora, the methodological training of graduate students can vary across graduate programs. Although qualitative methods courses are often required of graduate students in Africana Studies programs, and these programs offer such courses, graduate students in these programs are rarely if ever required to take quantitative methods courses, let alone offered such courses in-house. These courses can offer Africana Studies graduate students new tools for their own research, but more importantly, improve their knowledge of quantitative research on diasporic communities. These tools and knowledge can assist with identifying flawed arguments about African-descended communities and their members. This article explores the importance of requiring and offering critical quantitative methods courses in graduate programs in Africana Studies, and discusses the methods requirements of one graduate program in the field as an example of more rigorous training that other programs could offer graduate students. PMID:29710883

  10. Optimization of Almandine Garnet Analyses Performed with a Scanning Electron Microscope (Optimisation d'analyses de grenat almandin réalisées au microscope électronique à balayage)

    NASA Astrophysics Data System (ADS)

    Larose, Miguel

    The electron microprobe (EMP) is considered the gold standard for the collection of precise and representative chemical compositions of minerals in rocks, but data of similar quality should be obtainable with a scanning electron microscope (SEM). This thesis presents an analytical protocol aimed at optimizing the operational parameters of an SEM paired with an EDS Si(Li) X-ray detector (JEOL JSM-840A) for the imaging, quantitative chemical analysis, and compositional X-ray mapping of almandine garnet found in pelitic schists from the Canadian Cordillera. Results are then compared to those obtained for the same samples on a JEOL JXA 8900 EMP. For imaging purposes, the secondary electron and backscattered electron signals were used to obtain topographic and chemical contrast of the samples, respectively. The SEM allows the acquisition of images with higher resolution than the EMP when working at high magnifications. However, for millimetric-size minerals requiring very low magnifications, the EMP can usually match the imaging capabilities of an SEM. When optimizing images for both signals, the optimal operational parameters to show similar contrasts are not restricted to a unique combination of values. Optimization of operational parameters for quantitative chemical analysis resulted in analytical data with similar precision and good correlation to those obtained with an EMP. Optimization of operational parameters for compositional X-ray maps aimed at maximizing the collected intensity within a pixel as well as complying with the spatial resolution criterion, in order to obtain a qualitative compositional map representative of the chemical variation within the grain. Even though various corrections were needed (such as for the shadow effect and background noise removal), and the spatial resolution criterion could not be met because of the limited pixel density available on the SEM, the compositional X-ray maps show good correlation with those obtained with the EMP, even for concentrations as low as 0.5%. When paired with a rigorous analytical protocol, the use of an SEM equipped with an EDS Si(Li) X-ray detector allows the collection of qualitative and quantitative results similar to those obtained with an EMP for all three of the applications considered.

  11. The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research

    ERIC Educational Resources Information Center

    Harwell, Michael

    2018-01-01

    The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However the role of this software in facilitating supererogated statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…

  12. Using Content Analysis to Examine the Verbal or Written Communication of Stakeholders within Early Intervention.

    ERIC Educational Resources Information Center

    Johnson, Lawrence J.; LaMontagne, M. J.

    1993-01-01

    This paper describes content analysis as a data analysis technique useful for examining written or verbal communication within early intervention. The article outlines the use of referential or thematic recording units derived from interview data, identifies procedural guidelines, and addresses issues of rigor and validity. (Author/JDD)

  13. Integrated Sensitivity Analysis Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  14. Addressing the Challenge of Molecular Change: An Interim Report

    NASA Astrophysics Data System (ADS)

    Levine, Raphael D.

    2018-04-01

    Invited by the editorial committee of the Annual Review of Physical Chemistry to "contribute my autobiography," I present it here, as I understand the term. It is about my parents, my mentors, my coworkers, and my friends in learning and the scientific problems that we tried to address. Courtesy of the editorial assistance of Annual Reviews, some of the science is in the figure captions and sidebars. I am by no means done: I am currently trying to fuse the quantitative rigor of physical chemistry with systems biology while also dealing with a post-Born-Oppenheimer regime in electronic dynamics and am attempting to instruct molecules to perform advanced logic.

  15. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  16. Building confidence in quantitative systems pharmacology models: An engineer's guide to exploring the rationale in model design and development.

    PubMed

    Timmis, J; Alden, K; Andrews, P; Clark, E; Nellis, A; Naylor, B; Coles, M; Kaye, P

    2017-03-01

    This tutorial promotes good practice for exploring the rationale of systems pharmacology models. A safety systems engineering inspired notation approach provides much needed rigor and transparency in development and application of models for therapeutic discovery and design of intervention strategies. Structured arguments over a model's development, underpinning biological knowledge, and analyses of model behaviors are constructed to determine the confidence that a model is fit for the purpose for which it will be applied. © 2016 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  17. Addressing the Challenge of Molecular Change: An Interim Report.

    PubMed

    Levine, Raphael D

    2018-04-20

    Invited by the editorial committee of the Annual Review of Physical Chemistry to "contribute my autobiography," I present it here, as I understand the term. It is about my parents, my mentors, my coworkers, and my friends in learning and the scientific problems that we tried to address. Courtesy of the editorial assistance of Annual Reviews, some of the science is in the figure captions and sidebars. I am by no means done: I am currently trying to fuse the quantitative rigor of physical chemistry with systems biology while also dealing with a post-Born-Oppenheimer regime in electronic dynamics and am attempting to instruct molecules to perform advanced logic.

  18. Alternative approaches to research in physical therapy: positivism and phenomenology.

    PubMed

    Shepard, K F; Jensen, G M; Schmoll, B J; Hack, L M; Gwyer, J

    1993-02-01

    This article presents philosophical approaches to research in physical therapy. A comparison is made to demonstrate how the research purpose, research design, research methods, and research data differ when one approaches research from the philosophical perspective of positivism (predominantly quantitative) as compared with the philosophical perspective of phenomenology (predominantly qualitative). Differences between the two approaches are highlighted by examples from research articles published in Physical Therapy. The authors urge physical therapy researchers to become familiar with the tenets, rigor, and knowledge gained from the use of both approaches in order to increase their options in conducting research relevant to the practice of physical therapy.

  19. A Meta-Analysis of Single-Subject Research on Behavioral Momentum to Enhance Success in Students with Autism.

    PubMed

    Cowan, Richard J; Abel, Leah; Candel, Lindsay

    2017-05-01

    We conducted a meta-analysis of single-subject research studies investigating the effectiveness of antecedent strategies grounded in behavioral momentum for improving compliance and on-task performance for students with autism. First, we assessed the research rigor of those studies meeting our inclusionary criteria. Next, in order to apply a universal metric to help determine the effectiveness of this category of antecedent strategies investigated via single-subject research methods, we calculated effect sizes via omnibus improvement rate differences (IRDs). Outcomes provide additional support for behavioral momentum, especially interventions incorporating the high-probability command sequence. Implications for research and practice are discussed, including the consideration of how single-subject research is systematically reviewed to assess the rigor of studies and assist in determining overall intervention effectiveness.
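
    As a rough illustration of an improvement rate difference (IRD) of the kind used above, the sketch below scores invented baseline and intervention phase data with a simplified overlap convention. Published IRD procedures remove the minimum number of overlapping points before computing the two improvement rates, so this is an approximation for illustration only.

```python
# Hedged sketch of an improvement rate difference (IRD) calculation on
# invented single-subject phase data, using a simplified overlap convention.

baseline = [20, 25, 30, 28, 22]            # percent compliance, baseline phase
intervention = [55, 60, 45, 70, 65, 30]    # percent compliance, intervention phase

# Simplified convention: an intervention point is "improved" if it exceeds
# every baseline point; a baseline point counts as improved if it reaches
# the intervention range.
improved_tx = sum(x > max(baseline) for x in intervention) / len(intervention)
improved_bl = sum(x >= min(intervention) for x in baseline) / len(baseline)

ird = improved_tx - improved_bl
print(f"IRD = {ird:.2f}")
```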

  20. A Research Communication Brief: Gluten Analysis in Beef Samples Collected Using a Rigorous, Nationally Representative Sampling Protocol Confirms That Grain-Finished Beef Is Naturally Gluten-Free.

    PubMed

    McNeill, Shalene H; Cifelli, Amy M; Roseland, Janet M; Belk, Keith E; Woerner, Dale R; Gehring, Kerri B; Savell, Jeffrey W; Brooks, J Chance; Thompson, Leslie D

    2017-08-25

    Knowing whether or not a food contains gluten is vital for the growing number of individuals with celiac disease and non-celiac gluten sensitivity. Questions have recently been raised about whether beef from conventionally-raised, grain-finished cattle may contain gluten. To date, basic principles of ruminant digestion have been cited in support of the prevailing expert opinion that beef is inherently gluten-free. For this study, gluten analysis was conducted in beef samples collected using a rigorous nationally representative sampling protocol to determine whether gluten was present. The findings of our research uphold the understanding of the principles of gluten digestion in beef cattle and corroborate recommendations that recognize beef as a naturally gluten-free food.

  1. Critical Appraisal of Emergency Medicine Education Research: The Best Publications of 2014.

    PubMed

    Yarris, Lalena M; Juve, Amy Miller; Coates, Wendy C; Fisher, Jonathan; Heitz, Corey; Shayne, Philip; Farrell, Susan E

    2015-11-01

    The objective was to critically appraise and highlight rigorous education research study articles published in 2014 whose outcomes advance the science of emergency medicine (EM) education. A search of the English language literature in 2014 querying Education Resources Information Center (ERIC), PsycINFO, PubMed, and Scopus identified 243 EM-related articles using either quantitative (hypothesis-testing or observational investigations of educational interventions) or qualitative (exploring important phenomena in EM education) methods. Two reviewers independently screened all of the publications using previously established exclusion criteria. Six reviewers then independently scored the 25 selected publications using either a qualitative or a quantitative scoring system. Each scoring system consisted of nine criteria. Selected criteria were based on accepted educational review literature and chosen a priori. Both scoring systems use parallel scoring metrics and have been used previously within this annual review. Twenty-five medical education research papers (22 quantitative, three qualitative) met the criteria for inclusion and were reviewed. Five quantitative and two qualitative studies were ranked most highly by the reviewers as exemplary and are summarized in this article. This annual critical appraisal series highlights seven excellent EM education research studies, meeting a priori criteria and published in 2014. Methodologic strengths in the 2014 papers are noted, and current trends in medical education research in EM are discussed. © 2015 by the Society for Academic Emergency Medicine.

  2. Scientific Data Analysis Toolkit: A Versatile Add-in to Microsoft Excel for Windows

    ERIC Educational Resources Information Center

    Halpern, Arthur M.; Frye, Stephen L.; Marzzacco, Charles J.

    2018-01-01

    Scientific Data Analysis Toolkit (SDAT) is a rigorous, versatile, and user-friendly data analysis add-in application for Microsoft Excel for Windows (PC). SDAT uses the familiar Excel environment to carry out most of the analytical tasks used in data analysis. It has been designed for student use in manipulating and analyzing data encountered in…

  3. A narrative literature review to direct spinal cord injury patient education programming.

    PubMed

    van Wyk, Kim; Backwell, Amber; Townson, Andrea

    2015-01-01

    To summarize the evidence on SCI-related education literature, while looking at potential barriers, solutions, benefits, and patient preferences regarding SCI patient education. A literature review was conducted using 5 electronic databases. Quality appraisal instruments were designed to determine the methodological rigor of the quantitative and qualitative studies found. Selected articles were read in their entirety and themes were abstracted. Fourteen articles met the inclusion criteria for this narrative literature review, all of which were based on research studies. Seven of these 14 were quantitative studies, 3 were qualitative studies, and 4 were mixed-methods studies. To improve SCI education during rehabilitation, programs should maximize the receptiveness of newly injured patients to SCI-related information, optimize the delivery of SCI education, increase the number of opportunities for learning, promote and support lifelong learning, and include patient and program evaluation. How these strategies are specifically implemented needs to be determined by program management in consultation with various stakeholders, whilst considering the unique characteristics of the rehabilitation facility.

  4. Mspire-Simulator: LC-MS shotgun proteomic simulator for creating realistic gold standard data.

    PubMed

    Noyce, Andrew B; Smith, Rob; Dalgleish, James; Taylor, Ryan M; Erb, K C; Okuda, Nozomu; Prince, John T

    2013-12-06

    The most important step in any quantitative proteomic pipeline is feature detection (aka peak picking). However, generating quality hand-annotated data sets to validate the algorithms, especially for lower abundance peaks, is nearly impossible. An alternative for creating gold standard data is to simulate it with features closely mimicking real data. We present Mspire-Simulator, a free, open-source shotgun proteomic simulator that goes beyond previous simulation attempts by generating LC-MS features with realistic m/z and intensity variance along with other noise components. It also includes machine-learned models for retention time and peak intensity prediction and a genetic algorithm to custom fit model parameters for experimental data sets. We show that these methods are applicable to data from three different mass spectrometers, including two fundamentally different types, and show visually and analytically that simulated peaks are nearly indistinguishable from actual data. Researchers can use simulated data to rigorously test quantitation software, and proteomic researchers may benefit from overlaying simulated data on actual data sets.
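
    A toy illustration of the kind of feature such a simulator produces: one LC-MS feature with a Gaussian elution profile, a crude isotope envelope, and random m/z and intensity variance. This is not Mspire-Simulator's model (which adds machine-learned retention time and intensity prediction, among other components); all values are invented.

```python
# Toy sketch of simulating one LC-MS feature: Gaussian elution, simple
# isotope envelope, and noise. Not the Mspire-Simulator model; values invented.

import numpy as np

rng = np.random.default_rng(7)

rt = np.linspace(30.0, 34.0, 60)                     # retention times [min]
elution = np.exp(-0.5 * ((rt - 32.0) / 0.3) ** 2)    # Gaussian elution profile

mz0, charge = 820.4, 2
isotope_mz = mz0 + np.arange(4) * (1.00335 / charge)   # isotope spacing
isotope_ab = np.array([1.0, 0.8, 0.4, 0.15])           # crude isotope envelope

# Scan-by-scan intensity matrix (scans x isotopes) with multiplicative noise
feature = np.outer(elution, isotope_ab) * 1e6
feature *= rng.normal(1.0, 0.05, feature.shape)                 # intensity variance
mz_jitter = isotope_mz + rng.normal(0, 2e-4, size=(len(rt), 4)) # m/z variance

print("apex scan at RT:", rt[feature.sum(axis=1).argmax()], "min")
print("mean observed monoisotopic m/z:", round(mz_jitter[:, 0].mean(), 4))
```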

  5. AN EARLY STAGE IN THE PLANT RECOLONIZATION OF A NUCLEAR TARGET AREA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rickard, W.H.; Shields, L.M.

    1963-01-01

    Vegetational analyses were conducted three years postdetonation in a nuclear target area in a Grayia spinosa-Lycium andersonii community in Yucca Flat, Nevada. Annual plants dominated the early stage of recolonization and were quantitatively more abundant in the disturbed areas than in an adjacent undisturbed shrub community. Mentzelia albicaulis and Chaenactis stevioides occurred in both disturbed and undisturbed areas; however, Mentzelia was more abundant in disturbed areas while Chaenactis was more abundant in the undisturbed community. Salsola kali was confined to disturbed areas while Phacelia vallismortae was more often encountered in the undisturbed community. The plant recolonization of a mechanically disturbed area was quantitatively and qualitatively more like that of the interior zone of the nuclear target area than less disturbed habitats. These data support a conclusion that soil displacement presents a more rigorous habitat for plant recolonization than disturbances created by the wider ranging destructive components of a nuclear detonation. (auth)

  6. A Synthesis Of Knowledge About Caregiver Decision Making Finds Gaps In Support For Those Who Care For Aging Loved Ones.

    PubMed

    Garvelink, Mirjam M; Ngangue, Patrice A G; Adekpedjou, Rheda; Diouf, Ndeye T; Goh, Larissa; Blair, Louisa; Légaré, France

    2016-04-01

    We conducted a mixed-methods knowledge synthesis to assess the effectiveness of interventions to improve caregivers' involvement in decision making with seniors, and to describe caregivers' experiences of decision making in the absence of interventions. We analyzed forty-nine qualitative, fourteen quantitative, and three mixed-methods studies. The qualitative studies indicated that caregivers had unmet needs for information, discussions of values and needs, and decision support, which led to negative sentiments after decision making. Our results indicate that there have been insufficient quantitative evaluations of interventions to involve caregivers in decision making with seniors and that the evaluations that do exist found few clinically significant effects. Elements of usual care that received positive evaluations were the availability of a decision coach and a supportive decision-making environment. Additional rigorously evaluated interventions are needed to help caregivers be more involved in decision making with seniors. Project HOPE—The People-to-People Health Foundation, Inc.

  7. Unified quantitative characterization of epithelial tissue development

    PubMed Central

    Guirao, Boris; Rigaud, Stéphane U; Bosveld, Floris; Bailles, Anaïs; López-Gay, Jesús; Ishihara, Shuji; Sugimura, Kaoru

    2015-01-01

    Understanding the mechanisms regulating development requires a quantitative characterization of cell divisions, rearrangements, cell size and shape changes, and apoptoses. We developed a multiscale formalism that relates the characterizations of each cell process to tissue growth and morphogenesis. Having validated the formalism on computer simulations, we quantified separately all morphogenetic events in the Drosophila dorsal thorax and wing pupal epithelia to obtain comprehensive statistical maps linking cell and tissue scale dynamics. While globally cell shape changes, rearrangements and divisions all significantly participate in tissue morphogenesis, locally, their relative participations display major variations in space and time. By blocking division we analyzed the impact of division on rearrangements, cell shape changes and tissue morphogenesis. Finally, by combining the formalism with mechanical stress measurement, we evidenced unexpected interplays between patterns of tissue elongation, cell division and stress. Our formalism provides a novel and rigorous approach to uncover mechanisms governing tissue development. DOI: http://dx.doi.org/10.7554/eLife.08519.001 PMID:26653285
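
    As a schematic of what such a multiscale balance expresses (the symbols below are illustrative, not the paper's notation), the coarse-grained tissue deformation rate is written as a sum of contributions from the individual cell processes that were quantified:

    ```latex
    \underbrace{\mathbf{G}}_{\text{tissue deformation rate}}
      \;\approx\;
      \underbrace{\mathbf{S}}_{\text{cell shape changes}}
      + \underbrace{\mathbf{D}}_{\text{divisions}}
      + \underbrace{\mathbf{R}}_{\text{rearrangements}}
      + \underbrace{\mathbf{A}}_{\text{apoptoses/delaminations}}
    ```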

  8. On the Tracy-Widom β Distribution for β=6

    NASA Astrophysics Data System (ADS)

    Grava, Tamara; Its, Alexander; Kapaev, Andrei; Mezzadri, Francesco

    2016-11-01

    We study the Tracy-Widom distribution function for Dyson's β-ensemble with β = 6. The starting point of our analysis is the recent work of I. Rumanov, where he produces a Lax-pair representation for the Bloemendal-Virág equation. The latter is a linear PDE which describes the Tracy-Widom functions corresponding to general values of β. Using his Lax pair, Rumanov derives an explicit formula for the Tracy-Widom β=6 function in terms of the second Painlevé transcendent and the solution of an auxiliary ODE. Rumanov also shows that this formula allows him to derive, formally, the asymptotic expansion of the Tracy-Widom function. Our goal is to make Rumanov's approach, and hence the asymptotic analysis it provides, rigorous. In this paper, the first in a series, we show that Rumanov's Lax pair can be interpreted as a certain gauge transformation of the standard Lax pair for the second Painlevé equation. This gauge transformation, however, contains functional parameters defined via an auxiliary nonlinear ODE which is equivalent to the auxiliary ODE of Rumanov's formula. The gauge interpretation of Rumanov's Lax pair allows us to highlight the steps of Rumanov's original method that need rigorous justification in order to make the method complete. We provide a rigorous justification of one of these steps. Namely, we prove that the Painlevé function involved in Rumanov's formula is indeed, as suggested by Rumanov, the Hastings-McLeod solution of the second Painlevé equation. The key issue, which we also discuss and which remains open, is the integrability of the auxiliary ODE in Rumanov's formula. We note that this question is crucial for the rigorous asymptotic analysis of the Tracy-Widom function. We also note that our work is a partial answer to one of the problems related to β-ensembles formulated by Percy Deift during the June 2015 Montreal Conference on integrable systems.
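
    For reference, the second Painlevé equation and the Hastings-McLeod boundary behavior invoked above are, in the standard normalization (the paper's own normalization may differ by rescalings):

    ```latex
    q''(x) = 2\,q(x)^3 + x\,q(x),
    \qquad
    q(x) \sim \operatorname{Ai}(x) \ \ (x \to +\infty),
    \qquad
    q(x) \sim \sqrt{-x/2} \ \ (x \to -\infty).
    ```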

  9. Using qualitative methods to improve questionnaires for Spanish speakers: assessing face validity of a food behavior checklist.

    PubMed

    Banna, Jinan C; Vera Becerra, Luz E; Kaiser, Lucia L; Townsend, Marilyn S

    2010-01-01

    Development of outcome measures relevant to health nutrition behaviors requires a rigorous process of testing and revision. Whereas researchers often report performance of quantitative data collection to assess questionnaire validity and reliability, qualitative testing procedures are often overlooked. This report outlines a procedure for assessing face validity of a Spanish-language dietary assessment tool. Reviewing the literature produced no rigorously validated Spanish-language food behavior assessment tools for the US Department of Agriculture's food assistance and education programs. In response to this need, this study evaluated the face validity of a Spanish-language food behavior checklist adapted from a 16-item English version of a food behavior checklist shown to be valid and reliable for limited-resource English speakers. The English version was translated using rigorous methods involving initial translation by one party and creation of five possible versions. Photos were modified based on client input and new photos were taken as necessary. A sample of low-income, Spanish-speaking women completed cognitive interviews (n=20). Spanish translation experts (n=7) fluent in both languages and familiar with both cultures made minor modifications but essentially approved client preferences. The resulting checklist generated a readability score of 93, indicating low reading difficulty. The Spanish-language checklist has adequate face validity in the target population and is ready for further validation using convergent measures. At the conclusion of testing, this instrument may be used to evaluate nutrition education interventions in California. These qualitative procedures provide a framework for designing evaluation tools for low-literate audiences participating in the US Department of Agriculture food assistance and education programs. Copyright 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  10. Using Qualitative Methods to Improve Questionnaires for Spanish Speakers: Assessing Face Validity of a Food Behavior Checklist

    PubMed Central

    BANNA, JINAN C.; VERA BECERRA, LUZ E.; KAISER, LUCIA L.; TOWNSEND, MARILYN S.

    2015-01-01

    Development of outcome measures relevant to health nutrition behaviors requires a rigorous process of testing and revision. Whereas researchers often report performance of quantitative data collection to assess questionnaire validity and reliability, qualitative testing procedures are often overlooked. This report outlines a procedure for assessing face validity of a Spanish-language dietary assessment tool. Reviewing the literature produced no rigorously validated Spanish-language food behavior assessment tools for the US Department of Agriculture’s food assistance and education programs. In response to this need, this study evaluated the face validity of a Spanish-language food behavior checklist adapted from a 16-item English version of a food behavior checklist shown to be valid and reliable for limited-resource English speakers. The English version was translated using rigorous methods involving initial translation by one party and creation of five possible versions. Photos were modified based on client input and new photos were taken as necessary. A sample of low-income, Spanish-speaking women completed cognitive interviews (n=20). Spanish translation experts (n=7) fluent in both languages and familiar with both cultures made minor modifications but essentially approved client preferences. The resulting checklist generated a readability score of 93, indicating low reading difficulty. The Spanish-language checklist has adequate face validity in the target population and is ready for further validation using convergent measures. At the conclusion of testing, this instrument may be used to evaluate nutrition education interventions in California. These qualitative procedures provide a framework for designing evaluation tools for low-literate audiences participating in the US Department of Agriculture food assistance and education programs. PMID:20102831

  11. Cost-Effectiveness Analysis of Early Reading Programs: A Demonstration with Recommendations for Future Research

    ERIC Educational Resources Information Center

    Hollands, Fiona M.; Kieffer, Michael J.; Shand, Robert; Pan, Yilin; Cheng, Henan; Levin, Henry M.

    2016-01-01

    We review the value of cost-effectiveness analysis for evaluation and decision making with respect to educational programs and discuss its application to early reading interventions. We describe the conditions for a rigorous cost-effectiveness analysis and illustrate the challenges of applying the method in practice, providing examples of programs…
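
    The core quantities in such an analysis are the cost-effectiveness ratio of a program and the incremental ratio used to compare two programs (standard definitions, not specific to this article):

    ```latex
    \mathrm{CER}_i = \frac{C_i}{E_i},
    \qquad
    \mathrm{ICER}_{A\ \mathrm{vs}\ B} = \frac{C_A - C_B}{E_A - E_B},
    ```

    where C_i is the total cost of delivering program i and E_i is its measured effect (e.g., gain in reading achievement).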

  12. Feeding Problems and Nutrient Intake in Children with Autism Spectrum Disorders: A Meta-Analysis and Comprehensive Review of the Literature

    ERIC Educational Resources Information Center

    Sharp, William G.; Berry, Rashelle C.; McCracken, Courtney; Nuhu, Nadrat N.; Marvel, Elizabeth; Saulnier, Celine A.; Klin, Ami; Jones, Warren; Jaquess, David L.

    2013-01-01

    We conducted a comprehensive review and meta-analysis of research regarding feeding problems and nutrient status among children with autism spectrum disorders (ASD). The systematic search yielded 17 prospective studies involving a comparison group. Using rigorous meta-analysis techniques, we calculated the standardized mean difference (SMD) with…
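
    For reference, the standardized mean difference is conventionally computed as below (the review's exact estimator, e.g., whether a Hedges small-sample correction was applied, is not stated in this record):

    ```latex
    \mathrm{SMD} \;=\; \frac{\bar{X}_{\mathrm{ASD}} - \bar{X}_{\mathrm{control}}}{s_{\mathrm{pooled}}},
    \qquad
    s_{\mathrm{pooled}} \;=\; \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}.
    ```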

  13. Can power-law scaling and neuronal avalanches arise from stochastic dynamics?

    PubMed

    Touboul, Jonathan; Destexhe, Alain

    2010-02-11

    The presence of self-organized criticality in biology is often evidenced by a power-law scaling of event size distributions, which can be measured by linear regression on logarithmic axes. We show here that such a procedure does not necessarily mean that the system exhibits self-organized criticality. We first provide an analysis of multisite local field potential (LFP) recordings of brain activity and show that event size distributions defined as negative LFP peaks can be close to power-law distributions. However, this result is not robust to change in detection threshold, or when tested using more rigorous statistical analyses such as the Kolmogorov-Smirnov test. Similar power-law scaling is observed for surrogate signals, suggesting that power-law scaling may be a generic property of thresholded stochastic processes. We next investigate this problem analytically, and show that, indeed, stochastic processes can produce spurious power-law scaling without the presence of underlying self-organized criticality. However, this power-law is only apparent in logarithmic representations, and does not survive more rigorous analysis such as the Kolmogorov-Smirnov test. The same analysis was also performed on an artificial network known to display self-organized criticality. In this case, both the graphical representations and the rigorous statistical analysis reveal with no ambiguity that the avalanche size is distributed as a power-law. We conclude that logarithmic representations can lead to spurious power-law scaling induced by the stochastic nature of the phenomenon. This apparent power-law scaling does not constitute a proof of self-organized criticality, which should be demonstrated by more stringent statistical tests.
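
    The contrast between the two procedures can be sketched in a few lines of Python (a generic illustration with surrogate data, not the authors' analysis pipeline): a log-log regression on a thresholded lognormal sample often produces a plausible straight line, while a Kolmogorov-Smirnov goodness-of-fit test against the fitted power law does not support it.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Surrogate "event sizes" from a thresholded lognormal, i.e. a stochastic
    # process with no underlying criticality.
    sizes = rng.lognormal(mean=1.0, sigma=1.2, size=20000)
    sizes = sizes[sizes >= 1.0]

    # Naive estimate: linear regression on a log-log histogram can look like a power law.
    counts, edges = np.histogram(sizes, bins=np.logspace(0, np.log10(sizes.max()), 30))
    centers = np.sqrt(edges[:-1] * edges[1:])
    mask = counts > 0
    slope, _ = np.polyfit(np.log10(centers[mask]), np.log10(counts[mask]), 1)
    print(f"log-log regression slope (apparent exponent): {slope:.2f}")

    # More rigorous check: fit a Pareto (power-law) distribution and test it with KS.
    # (Strictly, the p-value should be calibrated by bootstrap when the parameters
    # are estimated from the same data.)
    b, loc, scale = stats.pareto.fit(sizes, floc=0)
    ks_stat, p_value = stats.kstest(sizes, "pareto", args=(b, loc, scale))
    print(f"KS statistic {ks_stat:.3f}, p-value {p_value:.2g} (small p rejects the power law)")
    ```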

  14. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Treesearch

    Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  15. Driven and No Regrets: A Qualitative Analysis of Students Earning Baccalaureate Degrees in Three Years

    ERIC Educational Resources Information Center

    Firmin, Michael W.; Gilson, Krista Merrick

    2007-01-01

    Using rigorous qualitative research methodology, twenty-four college students receiving their undergraduate degrees in three years were interviewed. Following analysis of the semi-structured interview transcripts and coding, themes emerged, indicating that these students possessed self-discipline, self-motivation, and drive. Overall, the results…

  16. Gender, Discourse, and "Gender and Discourse."

    ERIC Educational Resources Information Center

    Davis, Hayley

    1997-01-01

    A critic of Deborah Tannen's book "Gender and Discourse" responds to comments made about her critique, arguing that the book's analysis of the relationship of gender and discourse tends to seek, and perhaps force, explanations only in those terms. Another linguist's analysis of similar phenomena is found to be more rigorous. (MSE)

  17. Exploration of the Maximum Entropy/Optimal Projection Approach to Control Design Synthesis for Large Space Structures.

    DTIC Science & Technology

    1985-02-01

    Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems, its present stage of development embodies a ... Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical ...

  18. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    PubMed

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis in rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increasing temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  19. Rigorous derivation of the effective model describing a non-isothermal fluid flow in a vertical pipe filled with porous medium

    NASA Astrophysics Data System (ADS)

    Beneš, Michal; Pažanin, Igor

    2018-03-01

    This paper reports an analytical investigation of non-isothermal fluid flow in a thin (or long) vertical pipe filled with porous medium via asymptotic analysis. We assume that the fluid inside the pipe is cooled (or heated) by the surrounding medium and that the flow is governed by the prescribed pressure drop between pipe's ends. Starting from the dimensionless Darcy-Brinkman-Boussinesq system, we formally derive a macroscopic model describing the effective flow at small Brinkman-Darcy number. The asymptotic approximation is given by the explicit formulae for the velocity, pressure and temperature clearly acknowledging the effects of the cooling (heating) and porous structure. The theoretical error analysis is carried out to indicate the order of accuracy and to provide a rigorous justification of the effective model.

  20. A Research Communication Brief: Gluten Analysis in Beef Samples Collected Using a Rigorous, Nationally Representative Sampling Protocol Confirms That Grain-Finished Beef Is Naturally Gluten-Free

    PubMed Central

    McNeill, Shalene H.; Cifelli, Amy M.; Roseland, Janet M.; Belk, Keith E.; Gehring, Kerri B.; Brooks, J. Chance; Thompson, Leslie D.

    2017-01-01

    Knowing whether or not a food contains gluten is vital for the growing number of individuals with celiac disease and non-celiac gluten sensitivity. Questions have recently been raised about whether beef from conventionally-raised, grain-finished cattle may contain gluten. To date, basic principles of ruminant digestion have been cited in support of the prevailing expert opinion that beef is inherently gluten-free. For this study, gluten analysis was conducted in beef samples collected using a rigorous nationally representative sampling protocol to determine whether gluten was present. The findings of our research uphold the understanding of the principles of gluten digestion in beef cattle and corroborate recommendations that recognize beef as a naturally gluten-free food. PMID:28841165

  1. Treetrimmer: a method for phylogenetic dataset size reduction.

    PubMed

    Maruyama, Shinichiro; Eveleigh, Robert J M; Archibald, John M

    2013-04-12

    With rapid advances in genome sequencing and bioinformatics, it is now possible to generate phylogenetic trees containing thousands of operational taxonomic units (OTUs) from a wide range of organisms. However, use of rigorous tree-building methods on such large datasets is prohibitive and manual 'pruning' of sequence alignments is time consuming and raises concerns over reproducibility. There is a need for bioinformatic tools with which to objectively carry out such pruning procedures. Here we present 'TreeTrimmer', a bioinformatics procedure that removes unnecessary redundancy in large phylogenetic datasets, alleviating the size effect on more rigorous downstream analyses. The method identifies and removes user-defined 'redundant' sequences, e.g., orthologous sequences from closely related organisms and 'recently' evolved lineage-specific paralogs. Representative OTUs are retained for more rigorous re-analysis. TreeTrimmer reduces the OTU density of phylogenetic trees without sacrificing taxonomic diversity while retaining the original tree topology, thereby speeding up downstream computer-intensive analyses, e.g., Bayesian and maximum likelihood tree reconstructions, in a reproducible fashion.

  2. Academic Rigor in the College Classroom: Two Federal Commissions Strive to Define Rigor in the Past 70 Years

    ERIC Educational Resources Information Center

    Francis, Clay

    2018-01-01

    Historic notions of academic rigor usually follow from critiques of the system--we often define our goals for academically rigorous work through the lens of our shortcomings. This chapter discusses how the Truman Commission in 1947 and the Spellings Commission in 2006 shaped the way we think about academic rigor in today's context.

  3. Correlation of X-ray computed tomography with quantitative nuclear magnetic resonance methods for pre-clinical measurement of adipose and lean tissues in living mice.

    PubMed

    Metzinger, Matthew N; Miramontes, Bernadette; Zhou, Peng; Liu, Yueying; Chapman, Sarah; Sun, Lucy; Sasser, Todd A; Duffield, Giles E; Stack, M Sharon; Leevy, W Matthew

    2014-10-08

    Numerous obesity studies have coupled murine models with non-invasive methods to quantify body composition in longitudinal experiments, including X-ray computed tomography (CT) or quantitative nuclear magnetic resonance (QMR). Both microCT and QMR have been separately validated with invasive techniques of adipose tissue quantification, like post-mortem fat extraction and measurement. Here we report a head-to-head study of both protocols using oil phantoms and mouse populations to determine the parameters that best align CT data with that from QMR. First, an in vitro analysis of oil/water mixtures was used to calibrate and assess the overall accuracy of microCT vs. QMR data. Next, experiments were conducted with two cohorts of living mice (either homogenous or heterogeneous by sex, age and genetic backgrounds) to assess the microCT imaging technique for adipose tissue segmentation and quantification relative to QMR. Adipose mass values were obtained from microCT data with three different resolutions, after which the data were analyzed with different filter and segmentation settings. Strong linearity was noted between the adipose mass values obtained with microCT and QMR, with optimal parameters and scan conditions reported herein. Lean tissue (muscle, internal organs) was also segmented and quantified using the microCT method relative to the analogous QMR values. Overall, the rigorous calibration and validation of the microCT method for murine body composition, relative to QMR, ensures its validity for segmentation, quantification and visualization of both adipose and lean tissues.
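
    The alignment step described above amounts to a linear calibration of CT-derived fat mass against QMR. A minimal sketch with hypothetical paired measurements in grams (real values would come from the segmentation and the instrument):

    ```python
    import numpy as np
    from scipy import stats

    qmr_fat = np.array([2.1, 3.4, 5.0, 6.8, 8.2, 10.5, 12.9, 15.3])  # hypothetical
    ct_fat = np.array([1.8, 3.1, 4.6, 6.5, 7.9, 10.1, 12.2, 14.6])   # hypothetical

    # Linear calibration: slope/intercept map CT-derived mass onto the QMR scale;
    # r^2 quantifies the linearity between the two modalities.
    res = stats.linregress(ct_fat, qmr_fat)
    print(f"QMR ~ {res.slope:.3f} * CT + {res.intercept:.3f}, r^2 = {res.rvalue**2:.3f}")
    ```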

  4. Musculoskeletal physiotherapists' use of psychological interventions: a systematic review of therapists' perceptions and practice.

    PubMed

    Alexanders, Jenny; Anderson, Anna; Henderson, Sarah

    2015-06-01

    Research has demonstrated that incorporating psychological interventions within physiotherapy practice has numerous potential benefits. Despite this, physiotherapists have reported feeling inadequately trained to confidently use such interventions in their day-to-day practice. To systematically review musculoskeletal physiotherapists' perceptions regarding the use of psychological interventions within physiotherapy practice. Eligible studies were identified through a rigorous search of AMED, CINAHL, EMBASE, MEDLINE and PsychINFO from January 2002 until August 2013. Full-text qualitative, quantitative and mixed-methodology studies published in the English language investigating musculoskeletal physiotherapists' perceptions regarding their use of psychological interventions within physiotherapy practice were included. Included studies were appraised for risk of bias using the Critical Appraisal Skills Programme qualitative checklist. Meta-analysis was not possible due to study heterogeneity. Six studies, all with a low risk of bias, met the inclusion criteria. These studies highlighted that physiotherapists appreciate the importance of using psychological interventions within their practice, but report inadequate understanding and consequent underutilisation of these interventions. These results should be noted with some degree of caution due to various limitations associated with the included studies and with this review, including the use of a qualitative appraisal tool for mixed-methodology/quantitative studies. These findings suggest that musculoskeletal physiotherapists are aware of the potential benefits of incorporating psychological interventions within their practice but feel insufficiently trained to optimise their use of such interventions, highlighting a need for further research in this area and a review of physiotherapist training. Copyright © 2014 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.

  5. Systematic review of empowerment measures in health promotion.

    PubMed

    Cyril, Sheila; Smith, Ben J; Renzaho, Andre M N

    2016-12-01

    Empowerment, a multi-level construct comprising individual, community and organizational domains, is a fundamental value and goal in health promotion. While a range of scales have been developed for the measurement of empowerment, the qualities of these have not been rigorously assessed. The aim of this study was to evaluate the measurement properties of quantitative empowerment scales and their applicability in health promotion programs. A systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines was done to evaluate empowerment scales across three dimensions: item development, reliability and validity. This was followed by assessment of measurement properties using a ratings scale with criteria addressing an a priori explicit theoretical framework, assessment of content validity, internal consistency and factor analysis to test structural validity. Of the 20 studies included in this review, only 8 (40%) used literature reviews, expert panels and empirical studies to develop scale items and 9 (45%) of studies fulfilled ≥5 criteria on the ratings scale. Two studies (10%) measured community empowerment and one study measured organizational empowerment, the rest (85%) measured individual empowerment. This review highlights important gaps in the measurement of community and organizational domains of empowerment using quantitative scales. A priority for future empowerment research is to investigate and explore approaches such as mixed methods to enable adequate measurement of empowerment across all three domains. This would help health promotion practitioners to effectively measure empowerment as a driver of change and an outcome in health promotion programs. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Satellite SAR geocoding with refined RPC model

    NASA Astrophysics Data System (ADS)

    Zhang, Lu; Balz, Timo; Liao, Mingsheng

    2012-04-01

    Recent studies have proved that the Rational Polynomial Camera (RPC) model is able to act as a reliable replacement of the rigorous Range-Doppler (RD) model for the geometric processing of satellite SAR datasets. However, its capability in absolute geolocation of SAR images has not been evaluated quantitatively. Therefore, in this article the problems of error analysis and refinement of the SAR RPC model are investigated with the aim of improving the absolute accuracy of SAR geolocation. Range propagation delay and azimuth timing error are identified as the two major error sources for SAR geolocation. An approach based on SAR image simulation and real-to-simulated image matching is developed to estimate and correct these two errors. Afterwards, a refined RPC model can be built from the error-corrected RD model and then used in satellite SAR geocoding. Three experiments with different settings are designed and conducted to comprehensively evaluate the accuracy of SAR geolocation with both ordinary and refined RPC models. All the experimental results demonstrate that with RPC model refinement the absolute location accuracy of geocoded SAR images can be improved significantly, particularly in the easting direction. In another experiment the computational efficiencies of SAR geocoding with both RD and RPC models are compared quantitatively. The results show that by using the RPC model such efficiency can be improved by a factor of at least 16. In addition, the problem of DEM data selection for SAR image simulation in RPC model refinement is studied in a comparative experiment. The results reveal that the best choice is to use DEM datasets with spatial resolution comparable to that of the SAR images.

  7. Modified ion source triple quadrupole mass spectrometer gas chromatograph for polycyclic aromatic hydrocarbon analyses

    PubMed Central

    Anderson, Kim A.; Szelewski, Michael J.; Wilson, Glenn; Quimby, Bruce D.; Hoffman, Peter D.

    2015-01-01

    We describe modified gas chromatography electron-impact/triple-quadrupole mass spectrometry (GC–EI/MS/MS) utilizing a newly developed hydrogen-injected self-cleaning ion source and a modified 9 mm extractor lens. This instrument, with optimized parameters, achieves quantitative separation of 62 polycyclic aromatic hydrocarbons (PAHs). Existing methods have historically limited rigorous identification and quantification to a small subset, such as the 16 PAHs the US EPA has defined as priority pollutants. Without the critical source and extractor lens modifications, the off-the-shelf GC–EI/MS/MS system was unsuitable for complex PAH analysis. Separations were enhanced by increased gas flow, a complex GC temperature profile incorporating multiple isothermal periods and specific ramp rates, and a PAH-optimized column. Typical determinations with our refined GC–EI/MS/MS have a large linear range of 1–10,000 pg μl⁻¹ and detection limits of <2 pg μl⁻¹. Among the 62 PAHs, multiple-reaction-monitoring (MRM) mode enabled GC–EI/MS/MS identification and quantitation of several of the MW 302 PAH isomers. Using calibration standards, values determined were within 5% of true values over many months. Standard curve r² values were typically >0.998, exceptional for compounds which are archetypally difficult. With this method, benzo[a]fluorene, benzo[b]fluorene, and benzo[c]fluorene were fully separated, as were benzo[b]fluoranthene, benzo[k]fluoranthene, and benzo[j]fluoranthene. Chrysene and triphenylene were sufficiently separated to allow accurate quantitation. Mean limits of detection (LODs) across all PAHs were 1.02 ± 0.84 pg μl⁻¹, with indeno[1,2,3-c,d]pyrene having the lowest LOD at 0.26 pg μl⁻¹ and only two analytes above 2.0 pg μl⁻¹: acenaphthalene (2.33 pg μl⁻¹) and dibenzo[a,e]pyrene (6.44 pg μl⁻¹). PMID:26454790

  8. Multivariate Qst–Fst Comparisons: A Neutrality Test for the Evolution of the G Matrix in Structured Populations

    PubMed Central

    Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme

    2008-01-01

    Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
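
    The headline neutral prediction, D = [2Fst/(1 − Fst)] G, can be checked numerically once D, G, and Fst have been estimated. A minimal sketch with hypothetical matrices (the paper's full test additionally uses Flury's common principal-component framework and a Bartlett adjustment):

    ```python
    import numpy as np

    def neutral_rho(fst):
        """Neutral proportionality coefficient between D and G: 2*Fst/(1 - Fst)."""
        return 2.0 * fst / (1.0 - fst)

    def proportionality_coefficient(D, G):
        """Least-squares estimate of rho in D ~ rho * G (Frobenius inner product)."""
        return np.sum(D * G) / np.sum(G * G)

    # Hypothetical 3-trait example.
    G = np.array([[1.0, 0.3, 0.1],
                  [0.3, 0.8, 0.2],
                  [0.1, 0.2, 0.5]])
    fst = 0.15
    D = neutral_rho(fst) * G + np.random.default_rng(1).normal(0.0, 0.01, (3, 3))
    D = (D + D.T) / 2.0  # keep the among-population matrix symmetric

    print(f"estimated rho = {proportionality_coefficient(D, G):.3f}, "
          f"neutral expectation = {neutral_rho(fst):.3f}")
    ```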

  9. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    PubMed

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor as muscle stiffening occurring in the moment of death (or cardiac arrest) can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of securing a surgical airway quickly on the occurrence of trismus. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Deriving quantitative dynamics information for proteins and RNAs using ROTDIF with a graphical user interface.

    PubMed

    Berlin, Konstantin; Longhini, Andrew; Dayie, T Kwaku; Fushman, David

    2013-12-01

    To facilitate rigorous analysis of molecular motions in proteins, DNA, and RNA, we present a new version of ROTDIF, a program for determining the overall rotational diffusion tensor from single- or multiple-field nuclear magnetic resonance relaxation data. We introduce four major features that expand the program's versatility and usability. The first feature is the ability to analyze, separately or together, (13)C and/or (15)N relaxation data collected at a single or multiple fields. A significant improvement in the accuracy compared to direct analysis of R2/R1 ratios, especially critical for analysis of (13)C relaxation data, is achieved by subtracting high-frequency contributions to relaxation rates. The second new feature is an improved method for computing the rotational diffusion tensor in the presence of biased errors, such as large conformational exchange contributions, that significantly enhances the accuracy of the computation. The third new feature is the integration of the domain alignment and docking module for relaxation-based structure determination of multi-domain systems. Finally, to improve accessibility to all the program features, we introduced a graphical user interface that simplifies and speeds up the analysis of the data. Written in Java, the new ROTDIF can run on virtually any computer platform. In addition, the new ROTDIF achieves an order of magnitude speedup over the previous version by implementing a more efficient deterministic minimization algorithm. We not only demonstrate the improvement in accuracy and speed of the new algorithm for synthetic and experimental (13)C and (15)N relaxation data for several proteins and nucleic acids, but also show that careful analysis required especially for characterizing RNA dynamics allowed us to uncover subtle conformational changes in RNA as a function of temperature that were opaque to previous analysis.

  11. "Diagnosis by behavioral observation" home-videosomnography - a rigorous ethnographic approach to sleep of children with neurodevelopmental conditions.

    PubMed

    Ipsiroglu, Osman S; Hung, Yi-Hsuan Amy; Chan, Forson; Ross, Michelle L; Veer, Dorothee; Soo, Sonja; Ho, Gloria; Berger, Mai; McAllister, Graham; Garn, Heinrich; Kloesch, Gerhard; Barbosa, Adriano Vilela; Stockler, Sylvia; McKellin, William; Vatikiotis-Bateson, Eric

    2015-01-01

    Advanced video technology is available for sleep laboratories. However, low-cost equipment for screening in the home setting has not been identified and tested, nor has a methodology for the analysis of video recordings been suggested. We investigated different combinations of hardware/software for home-videosomnography (HVS) and established a process for qualitative and quantitative analysis of HVS recordings. A case vignette (HVS analysis for a 5.5-year-old girl with major insomnia and several co-morbidities) demonstrates how methodological considerations were addressed and how HVS added value to clinical assessment. We suggest an "ideal set of hardware/software" that is reliable, affordable (∼$500), and portable (≈2.8 kg) for conducting non-invasive HVS and that allows time-lapse analyses. The equipment consists of a netbook, a camera with infrared optics, and a video capture device. (1) We present an HVS-analysis protocol consisting of three steps of analysis at varying replay speeds: (a) basic overview and classification at 16× normal speed; (b) a second viewing with detailed descriptions at 4-8× normal speed; and (c) viewing, listening, and in-depth descriptions at real-time speed. (2) We also present a custom software program that facilitates video analysis and note-taking (Annotator©), and Optical Flow software that automatically quantifies movement for internal quality control of the HVS recording. The case vignette demonstrates how the HVS recordings revealed the dimension of insomnia caused by restless legs syndrome, and illustrated the cascade of symptoms, challenging behaviors, and resulting medications. The strategy of using HVS, although requiring validation and reliability testing, opens the floor for a new "observational sleep medicine," which has been useful in describing discomfort-related behavioral movement patterns in patients with communication difficulties presenting with challenging/disruptive sleep/wake behaviors.
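
    The movement quantification used for internal quality control can be approximated with standard dense optical flow. The sketch below is a generic stand-in, not the study's own Optical Flow software, and the file name is hypothetical:

    ```python
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("night_recording.avi")  # hypothetical HVS file
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    movement = []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense Farneback optical flow between consecutive frames.
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        movement.append(float(np.mean(np.linalg.norm(flow, axis=2))))  # mean flow magnitude
        prev_gray = gray

    cap.release()
    if movement:
        print(f"{len(movement)} frame pairs, peak movement index {max(movement):.3f}")
    ```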

  12. Using the Origin and Pawn, Positive Affect, CASPM, and Cognitive Anxiety Content Analysis Scales in Counseling Research

    ERIC Educational Resources Information Center

    Viney, Linda L.; Caputi, Peter

    2005-01-01

    Content analysis scales apply rigorous measurement to verbal communications and make possible the quantification of text in counseling research. The limitations of the Origin and Pawn Scales (M. T. Westbrook & L. L. Viney, 1980), the Positive Affect Scale (M. T. Westbrook, 1976), the Content Analysis Scales of Psychosocial Maturity (CASPM; L.…

  13. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    PubMed

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at the time of processing on quality traits (color, texture, sensory properties, and microbiological counts) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.001) contraction (-7.9% ± 0.9%) along the caudal-cranial axis. No significant differences in instrumental color (a*, b*, C*, or h*), texture (hardness), or sensory traits (aroma, color, taste, and texture) were observed between pre- and post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had significantly (P<0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P<0.01) lighter (L* = 39.7 ± 1.0) than post-rigor fillets (37.8 ± 0.8) and had a significantly lower (P<0.05) aerobic plate count (APC), 1.4 ± 0.4 log CFU/g versus 2.6 ± 0.6 log CFU/g, and psychrotrophic count (PC), 2.1 ± 0.2 log CFU/g versus 3.0 ± 0.5 log CFU/g, than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques. © 2010 Institute of Food Technologists®

  14. The impact of chief executive officer personality on top management team dynamics:one mechanism by which leadership affects organizational performance.

    PubMed

    Peterson, Randall S; Smith, D Brent; Martorana, Paul V; Owens, Pamela D

    2003-10-01

    This article explores 1 mechanism by which leader personality affects organizational performance. The authors hypothesized and tested the effects of leader personality on the group dynamics of the top management team (TMT) and of TMT dynamics on organizational performance. To test their hypotheses, the authors used the group dynamics q-sort method, which is designed to permit rigorous, quantitative comparisons of data derived from qualitative sources. Results from independent observations of chief executive officer (CEO) personality and TMT dynamics for 17 CEOs supported the authors' hypothesized relationships both between CEO personality and TMT group dynamics and between TMT dynamics and organizational performance.

  15. DOE Theory Graduate Student Fellowship: Gustavo Marques Tavares

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmaltz, Martin

    2015-12-30

    Marques Tavares was awarded a fellowship for his proposal "The ttbar asymmetry and beyond" starting in September 2012. This is the final report summarizing the research activities and accomplishments achieved with this grant support. With support from the DOE graduate fellowship, Marques Tavares, Katz and Xu at BU have investigated a new technique for obtaining quantitative results in strongly coupled field theories with broken conformal invariance. Such theories are especially interesting as they may be candidates for physics beyond the standard model, with possible applications to strongly coupled electroweak symmetry breaking. However, because of the strong coupling, even qualitative results about the spectrum of such theories are not rigorously understood.

  16. Statistical physics of self-replication.

    PubMed

    England, Jeremy L

    2013-09-28

    Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids.

  17. Food insecurity, sexual risk behavior, and adherence to antiretroviral therapy among women living with HIV: A systematic review.

    PubMed

    Chop, Elisabeth; Duggaraju, Avani; Malley, Angela; Burke, Virginia; Caldas, Stephanie; Yeh, Ping Teresa; Narasimhan, Manjulaa; Amin, Avni; Kennedy, Caitlin E

    2017-09-01

    Gender inequalities shape the experience of food insecurity among women living with HIV (WLHIV). We systematically reviewed the impact of food insecurity on sexual risk behaviors and antiretroviral therapy (ART) adherence among WLHIV. We included qualitative or quantitative peer-reviewed articles, extracted data in duplicate, and assessed rigor. Seven studies, from sub-Saharan Africa, North America, and Europe, met inclusion criteria. Food insecurity was associated with increased sexual risk through transactional sex and inability to negotiate safer sex. Hunger and food insecurity were barriers to ART initiation/adherence. Multidimensional programming and policies should simultaneously address poverty, gender inequality, food insecurity, and HIV.

  18. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge, and current practice relies heavily on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately. PMID:25574159
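
    The entropy weight step is straightforward to compute: indices whose values vary more across samples carry more information and receive larger weights. Below is a minimal sketch with hypothetical inspection scores (the full method then combines these weights with the extension-theory dependence functions):

    ```python
    import numpy as np

    def entropy_weights(X):
        """Entropy weight method: rows are samples, columns are evaluation indices."""
        X = np.asarray(X, dtype=float)
        P = X / X.sum(axis=0)                      # normalize each index (column)
        k = 1.0 / np.log(X.shape[0])
        with np.errstate(divide="ignore", invalid="ignore"):
            plogp = np.where(P > 0, P * np.log(P), 0.0)
        e = -k * plogp.sum(axis=0)                 # entropy of each index
        d = 1.0 - e                                # degree of diversification
        return d / d.sum()

    # Hypothetical scores for 4 bogie samples across 3 risk indices.
    scores = np.array([[0.72, 0.40, 0.90],
                       [0.65, 0.55, 0.85],
                       [0.80, 0.35, 0.60],
                       [0.60, 0.50, 0.95]])
    print("index weights:", np.round(entropy_weights(scores), 3))
    ```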

  19. Risk evaluation of bogie system based on extension theory and entropy weight method.

    PubMed

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge, and current practice relies heavily on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately.

  20. Hearing the Silenced Voices of Underserved Women -The Role of Qualitative Research in Gynecologic and Reproductive Care

    PubMed Central

    Lawson, Angela K.; Marsh, Erica E.

    2017-01-01

    In order to provide effective evidence-based health care to women, rigorous research that examines women’s lived experiences in their own voices is needed. However, clinical health research has often excluded the experiences of women and minority patient populations. Further, clinical research has often relied on quantitative research strategies; this provides an interesting but limited understanding of women’s health experiences and hinders the provision of effective patient-centered care. In this review, we define qualitative research and its unique contributions to research, and provide examples of how qualitative research has given insights into the reproductive health perspectives and behaviors of underserved women. PMID:28160888

  1. Assessment of Gait Characteristics in Total Knee Arthroplasty Patients Using a Hierarchical Partial Least Squares Method.

    PubMed

    Wang, Wei; Ackland, David C; McClelland, Jodie A; Webster, Kate E; Halgamuge, Saman

    2018-01-01

    Quantitative gait analysis is an important tool in the objective assessment and management of total knee arthroplasty (TKA) patients. Studies evaluating gait patterns in TKA patients have tended to focus on discrete data such as spatiotemporal information, joint range of motion and peak values of kinematics and kinetics, or to consider selected principal components of gait waveforms for analysis. These strategies may not have the capacity to capture small variations in gait patterns associated with each joint across an entire gait cycle, and may ultimately limit the accuracy of gait classification. The aim of this study was to develop an automatic feature extraction method to analyse patterns from high-dimensional autocorrelated gait waveforms. A general linear feature extraction framework was proposed and a hierarchical partial least squares method derived for discriminant analysis of multiple gait waveforms. The effectiveness of this strategy was verified using a dataset of joint angle and ground reaction force waveforms from 43 patients after TKA surgery and 31 healthy control subjects. Compared with principal component analysis and partial least squares methods, the hierarchical partial least squares method achieved generally better classification performance on all possible combinations of waveforms, with the highest classification accuracy overall. The novel hierarchical partial least squares method proposed is capable of capturing virtually all significant differences between TKA patients and the controls, and provides new insights into data visualization. The proposed framework presents a foundation for more rigorous classification of gait, and may ultimately be used to evaluate the effects of interventions such as surgery and rehabilitation.
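
    The general idea of discriminating groups from whole gait waveforms can be sketched with ordinary PLS used as a discriminant model. This illustrates the approach only; it is not the authors' hierarchical variant, and the data below are random placeholders shaped like concatenated waveforms:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(74, 6 * 101))               # 74 subjects x (6 waveforms * 101 time points)
    y = np.concatenate([np.ones(43), np.zeros(31)])  # 1 = TKA patient, 0 = control

    pls = make_pipeline(StandardScaler(), PLSRegression(n_components=5))
    # Classify by thresholding the continuous PLS prediction at 0.5.
    accuracy = cross_val_score(
        pls, X, y, cv=5,
        scoring=lambda est, X_, y_: np.mean((est.predict(X_).ravel() > 0.5) == y_))
    print("cross-validated accuracy:", round(accuracy.mean(), 3))
    ```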

  2. Interventions to Increase Attendance at Psychotherapy: A Meta-Analysis of Randomized Controlled Trials

    ERIC Educational Resources Information Center

    Oldham, Mary; Kellett, Stephen; Miles, Eleanor; Sheeran, Paschal

    2012-01-01

    Objective: Rates of nonattendance for psychotherapy hinder the effective delivery of evidence-based treatments. Although many strategies have been developed to increase attendance, the effectiveness of these strategies has not been quantified. Our aim in the present study was to undertake a meta-analysis of rigorously controlled studies to…

  3. A Comparative Study of Definitions on Limit and Continuity of Functions

    ERIC Educational Resources Information Center

    Shipman, Barbara A.

    2012-01-01

    Differences in definitions of limit and continuity of functions as treated in courses on calculus and in rigorous undergraduate analysis yield contradictory outcomes and unexpected language. There are results about limits in calculus that are false by the definitions of analysis, functions not continuous by one definition and continuous by…
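
    For reference, the rigorous (analysis-course) definition at issue is the epsilon-delta formulation of the limit:

    ```latex
    \lim_{x \to a} f(x) = L
    \iff
    \forall \varepsilon > 0 \;\exists \delta > 0 \;\forall x:\;
    0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon,
    ```

    and f is continuous at a precisely when the same condition holds with L = f(a) and the restriction 0 < |x - a| removed.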

  4. Tutoring Adolescents in Literacy: A Meta-Analysis

    ERIC Educational Resources Information Center

    Jun, Seung Won; Ramirez, Gloria; Cumming, Alister

    2010-01-01

    What does research reveal about tutoring adolescents in literacy? We conducted a meta-analysis, identifying 152 published studies, of which 12 met rigorous inclusion criteria. We analyzed the 12 studies for the effects of tutoring according to the type, focus, and amount of tutoring; the number, age, and language background of students; and the…

  5. Interactive visual analysis promotes exploration of long-term ecological data

    Treesearch

    T.N. Pham; J.A. Jones; R. Metoyer; F.J. Swanson; R.J. Pabst

    2013-01-01

    Long-term ecological data are crucial in helping ecologists understand ecosystem function and environmental change. Nevertheless, these kinds of data sets are difficult to analyze because they are usually large, multivariate, and spatiotemporal. Although existing analysis tools such as statistical methods and spreadsheet software permit rigorous tests of pre-conceived...

  6. Facilitating long-term changes in student approaches to learning science.

    PubMed

    Buchwitz, Brian J; Beyer, Catharine H; Peterson, Jon E; Pitre, Emile; Lalic, Nevena; Sampson, Paul D; Wakimoto, Barbara T

    2012-01-01

    Undergraduates entering science curricula differ greatly in individual starting points and learning needs. The fast pace, high enrollment, and high stakes of introductory science courses, however, limit students' opportunities to self-assess and modify learning strategies. The University of Washington's Biology Fellows Program (BFP) intervenes through a 20-session, premajors course that introduces students to the rigor expected of bioscience majors and assists their development as science learners. This study uses quantitative and qualitative approaches to assess whether the 2007-2009 BFP achieved its desired short- and long-term impacts on student learning. Adjusting for differences in students' high school grade point average and Scholastic Aptitude Test scores, we found that participation in the BFP was associated with higher grades in two subsequent gateway biology courses, across multiple quarters and instructors. Two to 4 yr after participating in the program, students attributed changes in how they approached learning science to BFP participation. They reported having learned to "think like a scientist" and to value active-learning strategies and learning communities. In addition, they reported having developed a sense of belonging in bioscience communities. The achievement of long-term impacts for a short-term instructional investment suggests a practical means to prepare diverse students for the rigors of science curricula.

  7. Predictive QSAR modeling workflow, model applicability domains, and virtual screening.

    PubMed

    Tropsha, Alexander; Golbraikh, Alexander

    2007-01-01

    Quantitative Structure Activity Relationship (QSAR) modeling has been traditionally applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered if only in hypothetical sense in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of the modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation as well as the need to define model applicability domains in the chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting brings it forward as predictive, as opposed to evaluative, modeling approach.
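
    A minimal sketch of the workflow's two key safeguards, rigorous validation and an applicability domain, is shown below (hypothetical descriptors and activities; the domain rule here is a simple k-nearest-neighbor distance criterion, one common choice among several):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import pairwise_distances
    from sklearn.model_selection import cross_val_score, train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 50))                                   # hypothetical descriptor matrix
    y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=300)    # hypothetical activity

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    # 1) Validation: cross-validated q2 on the training set, then external R2 on held-out data.
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    q2 = cross_val_score(model, X_train, y_train, cv=5, scoring="r2").mean()
    model.fit(X_train, y_train)
    r2_ext = model.score(X_test, y_test)

    # 2) Applicability domain: flag test compounds whose mean distance to the k nearest
    #    training compounds exceeds a threshold derived from the training set itself.
    k, z = 5, 2.0
    d_train = np.sort(pairwise_distances(X_train), axis=1)[:, 1:k + 1].mean(axis=1)
    threshold = d_train.mean() + z * d_train.std()
    d_test = np.sort(pairwise_distances(X_test, X_train), axis=1)[:, :k].mean(axis=1)
    inside = d_test <= threshold

    print(f"q2(CV) = {q2:.2f}, external R2 = {r2_ext:.2f}, "
          f"{inside.mean():.0%} of test compounds inside the applicability domain")
    ```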

  8. Facilitating Long-Term Changes in Student Approaches to Learning Science

    PubMed Central

    Buchwitz, Brian J.; Beyer, Catharine H.; Peterson, Jon E.; Pitre, Emile; Lalic, Nevena; Sampson, Paul D.; Wakimoto, Barbara T.

    2012-01-01

    Undergraduates entering science curricula differ greatly in individual starting points and learning needs. The fast pace, high enrollment, and high stakes of introductory science courses, however, limit students’ opportunities to self-assess and modify learning strategies. The University of Washington's Biology Fellows Program (BFP) intervenes through a 20-session, premajors course that introduces students to the rigor expected of bioscience majors and assists their development as science learners. This study uses quantitative and qualitative approaches to assess whether the 2007–2009 BFP achieved its desired short- and long-term impacts on student learning. Adjusting for differences in students’ high school grade point average and Scholastic Aptitude Test scores, we found that participation in the BFP was associated with higher grades in two subsequent gateway biology courses, across multiple quarters and instructors. Two to 4 yr after participating in the program, students attributed changes in how they approached learning science to BFP participation. They reported having learned to “think like a scientist” and to value active-learning strategies and learning communities. In addition, they reported having developed a sense of belonging in bioscience communities. The achievement of long-term impacts for a short-term instructional investment suggests a practical means to prepare diverse students for the rigors of science curricula. PMID:22949424

  9. The Interrelationship between Promoter Strength, Gene Expression, and Growth Rate

    PubMed Central

    Klesmith, Justin R.; Detwiler, Emily E.; Tomek, Kyle J.; Whitehead, Timothy A.

    2014-01-01

    In exponentially growing bacteria, expression of heterologous protein impedes cellular growth rates. Quantitative understanding of the relationship between expression and growth rate will advance our ability to forward engineer bacteria, which is important for metabolic engineering and synthetic biology applications. A recent study described a scaling model based on optimal allocation of ribosomes for protein translation. This model quantitatively predicts a linear relationship between microbial growth rate and heterologous protein expression with no free parameters. With the aim of validating this model, we have rigorously quantified the fitness cost of gene expression by using a library of synthetic constitutive promoters to drive expression of two separate proteins (eGFP and amiE) in E. coli in different strains and growth media. In all cases, we demonstrate that the fitness cost is consistent with the previous findings. We expand upon the previous theory by introducing a simple promoter activity model to quantitatively predict how basal promoter strength relates to growth rate and protein expression. We then estimate the amount of protein expression needed to support high flux through a heterologous metabolic pathway and predict the sizable fitness cost associated with enzyme production. This work has broad implications across the applied biological sciences because it allows prediction of the interplay between promoter strength, protein expression, and the resulting cost to microbial growth rates. PMID:25286161

  10. Cochrane Qualitative and Implementation Methods Group guidance series-paper 4: methods for assessing evidence on intervention implementation.

    PubMed

    Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane

    2018-05-01

    This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Observed and modelled effects of auroral precipitation on the thermal ionospheric plasma: comparing the MICA and Cascades2 sounding rocket events

    NASA Astrophysics Data System (ADS)

    Lynch, K. A.; Gayetsky, L.; Fernandes, P. A.; Zettergren, M. D.; Lessard, M.; Cohen, I. J.; Hampton, D. L.; Ahrns, J.; Hysell, D. L.; Powell, S.; Miceli, R. J.; Moen, J. I.; Bekkeng, T.

    2012-12-01

    Auroral precipitation can modify the ionospheric thermal plasma through a variety of processes. We examine and compare the events seen by two recent auroral sounding rockets carrying in situ thermal plasma instrumentation. The Cascades2 sounding rocket (March 2009, Poker Flat Research Range) traversed a pre-midnight poleward boundary intensification (PBI) event distinguished by a stationary Alfvenic curtain of field-aligned precipitation. The MICA sounding rocket (February 2012, Poker Flat Research Range) traveled through irregular precipitation following the passage of a strong westward-travelling surge. Previous modelling of the ionospheric effects of auroral precipitation used a one-dimensional model, TRANSCAR, which had a simplified treatment of electric fields and did not have the benefit of in situ thermal plasma data. This new study uses a new two-dimensional model which self-consistently calculates electric fields to explore both spatial and temporal effects, and compares to thermal plasma observations. A rigorous understanding of the ambient thermal plasma parameters and their effects on the local spacecraft sheath and charging is required for quantitative interpretation of in situ thermal plasma observations. To complement this TRANSCAR analysis, we therefore require a reliable means of interpreting in situ thermal plasma observations. This interpretation depends upon a rigorous plasma sheath model, since the ambient ion energy is on the order of the spacecraft's sheath energy. A self-consistent PIC model is used to model the spacecraft sheath, and a test-particle approach then predicts the detector response for a given plasma environment. The model parameters are then modified until agreement is found with the in situ data. We find that for some situations, the thermal plasma parameters are strongly driven by the precipitation at the observation time. For other situations, the previous history of the precipitation at that position can have a stronger effect.

  12. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    PubMed

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% produced no great differences in the physicochemical and textural properties attributable to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is the minimum required to ensure a definite pre-rigor salting effect on chicken breast muscle.

  13. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles

    PubMed Central

    Choi, Yun-Sang

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of a high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of a minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salted chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% produced no great differences in the physicochemical and textural properties attributable to pre-rigor salting effects (p>0.05). Therefore, our study confirmed the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with an equal NaCl concentration, and suggests that a 2% NaCl concentration is the minimum required to ensure a definite pre-rigor salting effect on chicken breast muscle. PMID:26761884

  14. A Proposed Solution to the Problem with Using Completely Random Data to Assess the Number of Factors with Parallel Analysis

    ERIC Educational Resources Information Center

    Green, Samuel B.; Levy, Roy; Thompson, Marilyn S.; Lu, Min; Lo, Wen-Juo

    2012-01-01

    A number of psychometricians have argued for the use of parallel analysis to determine the number of factors. However, parallel analysis must be viewed at best as a heuristic approach rather than a mathematically rigorous one. The authors suggest a revision to parallel analysis that could improve its accuracy. A Monte Carlo study is conducted to…
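
    For context, classical (Horn's) parallel analysis, the procedure this paper proposes to revise, can be sketched in a few lines: retain factors whose observed eigenvalues exceed those obtained from completely random data of the same dimensions. The proposed revision itself is not reproduced here.

```python
# Minimal sketch of Horn's parallel analysis: keep factors whose observed
# eigenvalues exceed the average eigenvalues from correlation matrices of
# completely random data of the same size.
import numpy as np

def parallel_analysis(data, n_sims=100, seed=0):
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand_eig = np.zeros((n_sims, p))
    for i in range(n_sims):
        sim = rng.normal(size=(n, p))
        rand_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    threshold = rand_eig.mean(axis=0)   # the 95th percentile is another common choice
    return int(np.sum(obs_eig > threshold))

X = np.random.default_rng(1).normal(size=(300, 10))
X[:, 1] += X[:, 0]                      # induce one common factor
X[:, 2] += X[:, 0]
print("factors retained:", parallel_analysis(X))
```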

  15. Devil's Slide: An evolving feature of California's coastal landscape

    NASA Astrophysics Data System (ADS)

    Thomas, M.; Loague, K.

    2013-12-01

    Coastal landslides in the United States remain a persistent threat to human life and urban development. The focus of this study is a landslide-prone section of the central California coastline, approximately 20 km south of San Francisco, known as Devil's Slide. This investigation employs an extensive aerial image inventory, digital elevation models (DEMs), and a water balance / limit-equilibrium approach to better understand the spatial and temporal characteristics of deep-seated bedrock slides at the site. Photographic surveys of the area reveal nearly three kilometers of headscarp and a complex network of slope failures that respond to hydrologic, seismic, and anthropogenic perturbations. DEM analysis suggests that, for a 145-year period (1866 to 2010), the study area experienced an average coastal retreat rate of 0.14 m yr⁻¹ and an average volumetric loss of 11,216 m³ yr⁻¹. At least 38% of the landscape evolution in the steep coastal terrain has been driven by slope failure events. A loosely coupled water balance / limit-equilibrium analysis quantitatively illustrates the precarious nature of the active landslide zone at the site. The slope is shown to be unstable for a large suite of equally-likely scenarios. The analyses presented herein suggest that future work should include a rigorous characterization of pore-water pressure development, driven by comprehensive simulations of subsurface hydrologic response, to improve our understanding of slope failure initiation at the Devil's Slide site.
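
    The headline DEM-differencing quantities, average volumetric loss and average retreat rate over the observation window, reduce to simple arithmetic; the synthetic grids and numbers below are illustrative only and do not reproduce the study's data.

```python
# Back-of-envelope sketch of the DEM-differencing quantities reported above:
# average volumetric loss and coastal retreat rate over an observation window.
# The grids, grid spacing, and retreat distance are made up for illustration.
import numpy as np

cell_area = 1.0 * 1.0            # m^2 per DEM cell (hypothetical 1 m grid)
years = 145                      # observation window, e.g. 1866 to 2010
dem_early = np.full((500, 500), 60.0)                         # synthetic elevations (m)
dem_late = dem_early - np.random.default_rng(0).uniform(0, 6, dem_early.shape)

volume_lost = np.sum(dem_early - dem_late) * cell_area        # m^3
volumetric_rate = volume_lost / years                         # m^3 per year

mean_retreat = 20.0              # hypothetical horizontal retreat over the window (m)
retreat_rate = mean_retreat / years                           # m per year

print(f"volumetric loss: {volumetric_rate:,.0f} m^3/yr, retreat: {retreat_rate:.2f} m/yr")
```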

  16. Detection, isolation and diagnosability analysis of intermittent faults in stochastic systems

    NASA Astrophysics Data System (ADS)

    Yan, Rongyi; He, Xiao; Wang, Zidong; Zhou, D. H.

    2018-02-01

    Intermittent faults (IFs) have the properties of unpredictability, non-determinacy, inconsistency and repeatability, switching systems between faulty and healthy status. In this paper, the fault detection and isolation (FDI) problem of IFs in a class of linear stochastic systems is investigated. The detection and isolation of IFs involves: (1) detecting every appearing time and disappearing time of an IF; (2) detecting each appearing (disappearing) time of the IF before the subsequent disappearing (appearing) time; and (3) determining where the IFs occur. Based on the outputs of the observers we designed, a novel set of residuals is constructed by using the sliding-time window technique, and two hypothesis tests are proposed to detect all the appearing times and disappearing times of IFs. The isolation problem of IFs is also considered. Furthermore, within a statistical framework, the definition of the diagnosability of IFs is proposed, and a sufficient condition is brought forward for the diagnosability of IFs. Quantitative performance analysis results for the false alarm rate and missing detection rate are discussed, and the influences of some key parameters of the proposed scheme on performance indices such as the false alarm rate and missing detection rate are analysed rigorously. The effectiveness of the proposed scheme is illustrated via a simulation example of an unmanned helicopter longitudinal control system.
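
    A minimal, generic sketch of the sliding-window residual test idea is given below: the mean of recent residuals is compared against a threshold derived from their fault-free statistics. This is not the observer-based scheme of the paper; all signals, fault magnitudes, and thresholds are invented.

```python
# Generic sliding-window test on residuals for an intermittently appearing
# fault: alarm whenever the window mean exceeds a bound derived from the
# nominal (fault-free) residual statistics.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
residual = rng.normal(0.0, 1.0, n)       # nominal residuals
residual[300:360] += 4.0                 # an intermittent fault appears...
residual[700:740] += 4.0                 # ...and reappears later

window = 20
sigma = 1.0
threshold = 3.0 * sigma / np.sqrt(window)   # ~99.7% two-sided bound on the window mean

alarms = []
for k in range(window, n):
    if abs(residual[k - window:k].mean()) > threshold:
        alarms.append(k)
print("first/last alarm samples:", alarms[0], alarms[-1])
```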

  17. Identification and Characterization of Molecular Bonding Structures by ab initio Quasi-Atomic Orbital Analyses.

    PubMed

    West, Aaron C; Duchimaza-Heredia, Juan J; Gordon, Mark S; Ruedenberg, Klaus

    2017-11-22

    The quasi-atomic analysis of ab initio electronic wave functions in full valence spaces, which was developed in preceding papers, yields oriented quasi-atomic orbitals in terms of which the ab initio molecular wave function and energy can be expressed. These oriented quasi-atomic orbitals are the rigorous ab initio counterparts to the conceptual bond forming atomic hybrid orbitals of qualitative chemical reasoning. In the present work, the quasi-atomic orbitals are identified as bonding orbitals, lone pair orbitals, radical orbitals, vacant orbitals and orbitals with intermediate character. A program determines the bonding characteristics of all quasi-atomic orbitals in a molecule on the basis of their occupations, bond orders, kinetic bond orders, hybridizations and local symmetries. These data are collected in a record and provide the information for a comprehensive understanding of the synergism that generates the bonding structure that holds the molecule together. Applications to a series of molecules exhibit the complete bonding structures that are embedded in their ab initio wave functions. For the strong bonds in a molecule, the quasi-atomic orbitals provide quantitative ab initio amplifications of the Lewis dot symbols. Beyond characterizing strong bonds, the quasi-atomic analysis also yields an understanding of the weak interactions, such as vicinal, hyperconjugative and radical stabilizations, which can make substantial contributions to the molecular bonding structure.

  18. Task-technology fit of video telehealth for nurses in an outpatient clinic setting.

    PubMed

    Cady, Rhonda G; Finkelstein, Stanley M

    2014-07-01

    Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task-technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task-technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time-motion study. Qualitative and quantitative results were merged and analyzed within the task-technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task-technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Telehealth must provide the right information to the right clinician at the right time. Evaluating task-technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology.

  19. Comparing the Effectiveness of Polymer Debriding Devices Using a Porcine Wound Biofilm Model.

    PubMed

    Wilkinson, Holly N; McBain, Andrew J; Stephenson, Christian; Hardman, Matthew J

    2016-11-01

    Objective: Debridement to remove necrotic and/or infected tissue and promote active healing remains a cornerstone of contemporary chronic wound management. While there has been a recent shift toward less invasive polymer-based debriding devices, their efficacy requires rigorous evaluation. Approach: This study was designed to directly compare monofilament debriding devices to traditional gauze using a wounded porcine skin biofilm model with standardized application parameters. Biofilm removal was determined using a surface viability assay, bacterial counts, histological assessment, and scanning electron microscopy (SEM). Results: Quantitative analysis revealed that monofilament debriding devices outperformed the standard gauze, resulting in up to 100-fold greater reduction in bacterial counts. Interestingly, histological and morphological analyses suggested that debridement not only removed bacteria, but also differentially disrupted the bacterially-derived extracellular polymeric substance. Finally, SEM of post-debridement monofilaments showed structural changes in attached bacteria, implying a negative impact on viability. Innovation: This is the first study to combine controlled and defined debridement application with a biologically relevant ex vivo biofilm model to directly compare monofilament debriding devices. Conclusion: These data support the use of monofilament debriding devices for the removal of established wound biofilms and suggest variable efficacy towards biofilms composed of different species of bacteria.

  20. The evolution of self-control

    PubMed Central

    MacLean, Evan L.; Hare, Brian; Nunn, Charles L.; Addessi, Elsa; Amici, Federica; Anderson, Rindy C.; Aureli, Filippo; Baker, Joseph M.; Bania, Amanda E.; Barnard, Allison M.; Boogert, Neeltje J.; Brannon, Elizabeth M.; Bray, Emily E.; Bray, Joel; Brent, Lauren J. N.; Burkart, Judith M.; Call, Josep; Cantlon, Jessica F.; Cheke, Lucy G.; Clayton, Nicola S.; Delgado, Mikel M.; DiVincenti, Louis J.; Fujita, Kazuo; Herrmann, Esther; Hiramatsu, Chihiro; Jacobs, Lucia F.; Jordan, Kerry E.; Laude, Jennifer R.; Leimgruber, Kristin L.; Messer, Emily J. E.; de A. Moura, Antonio C.; Ostojić, Ljerka; Picard, Alejandra; Platt, Michael L.; Plotnik, Joshua M.; Range, Friederike; Reader, Simon M.; Reddy, Rachna B.; Sandel, Aaron A.; Santos, Laurie R.; Schumann, Katrin; Seed, Amanda M.; Sewall, Kendra B.; Shaw, Rachael C.; Slocombe, Katie E.; Su, Yanjie; Takimoto, Ayaka; Tan, Jingzhi; Tao, Ruoting; van Schaik, Carel P.; Virányi, Zsófia; Visalberghi, Elisabetta; Wade, Jordan C.; Watanabe, Arii; Widness, Jane; Young, Julie K.; Zentall, Thomas R.; Zhao, Yini

    2014-01-01

    Cognition presents evolutionary research with one of its greatest challenges. Cognitive evolution has been explained at the proximate level by shifts in absolute and relative brain volume and at the ultimate level by differences in social and dietary complexity. However, no study has integrated the experimental and phylogenetic approach at the scale required to rigorously test these explanations. Instead, previous research has largely relied on various measures of brain size as proxies for cognitive abilities. We experimentally evaluated these major evolutionary explanations by quantitatively comparing the cognitive performance of 567 individuals representing 36 species on two problem-solving tasks measuring self-control. Phylogenetic analysis revealed that absolute brain volume best predicted performance across species and accounted for considerably more variance than brain volume controlling for body mass. This result corroborates recent advances in evolutionary neurobiology and illustrates the cognitive consequences of cortical reorganization through increases in brain volume. Within primates, dietary breadth but not social group size was a strong predictor of species differences in self-control. Our results implicate robust evolutionary relationships between dietary breadth, absolute brain volume, and self-control. These findings provide a significant first step toward quantifying the primate cognitive phenome and explaining the process of cognitive evolution. PMID:24753565

  1. Quantitative historical analysis uncovers a single dimension of complexity that structures global variation in human social organization

    PubMed Central

    Turchin, Peter; Currie, Thomas E.; Whitehouse, Harvey; François, Pieter; Feeney, Kevin; Mullins, Daniel; Hoyer, Daniel; Collins, Christina; Grohmann, Stephanie; Mendel-Gleason, Gavin; Turner, Edward; Dupeyron, Agathe; Cioni, Enrico; Reddish, Jenny; Levine, Jill; Jordan, Greine; Brandl, Eva; Williams, Alice; Cesaretti, Rudolf; Krueger, Marta; Ceccarelli, Alessandro; Figliulo-Rosswurm, Joe; Tuan, Po-Ju; Peregrine, Peter; Marciniak, Arkadiusz; Preiser-Kapeller, Johannes; Kradin, Nikolay; Korotayev, Andrey; Palmisano, Alessio; Baker, David; Bidmead, Julye; Bol, Peter; Christian, David; Cook, Connie; Covey, Alan; Feinman, Gary; Júlíusson, Árni Daníel; Kristinsson, Axel; Miksic, John; Mostern, Ruth; Petrie, Cameron; Rudiak-Gould, Peter; ter Haar, Barend; Wallace, Vesna; Mair, Victor; Xie, Liye; Baines, John; Bridges, Elizabeth; Manning, Joseph; Lockhart, Bruce; Bogaard, Amy; Spencer, Charles

    2018-01-01

    Do human societies from around the world exhibit similarities in the way that they are structured, and show commonalities in the ways that they have evolved? These are long-standing questions that have proven difficult to answer. To test between competing hypotheses, we constructed a massive repository of historical and archaeological information known as “Seshat: Global History Databank.” We systematically coded data on 414 societies from 30 regions around the world spanning the last 10,000 years. We were able to capture information on 51 variables reflecting nine characteristics of human societies, such as social scale, economy, features of governance, and information systems. Our analyses revealed that these different characteristics show strong relationships with each other and that a single principal component captures around three-quarters of the observed variation. Furthermore, we found that different characteristics of social complexity are highly predictable across different world regions. These results suggest that key aspects of social organization are functionally related and do indeed coevolve in predictable ways. Our findings highlight the power of the sciences and humanities working together to rigorously test hypotheses about general rules that may have shaped human history. PMID:29269395
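
    The core quantitative claim, that a single principal component captures roughly three-quarters of the variation across complexity characteristics, can be illustrated with a compact PCA sketch; the data below are synthetic stand-ins for the Seshat matrix, not the databank itself.

```python
# Minimal sketch of the core analysis: PCA on a societies-by-characteristics
# matrix and the share of variation captured by the first component.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_societies, n_vars = 414, 9
latent = rng.normal(size=(n_societies, 1))       # one latent "complexity" axis
X = latent @ rng.normal(size=(1, n_vars)) + 0.5 * rng.normal(size=(n_societies, n_vars))

pca = PCA().fit(StandardScaler().fit_transform(X))
print("variance explained by PC1: %.0f%%" % (100 * pca.explained_variance_ratio_[0]))
```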

  2. Quantum many-body adiabaticity, topological Thouless pump and driven impurity in a one-dimensional quantum fluid

    NASA Astrophysics Data System (ADS)

    Lychkovskiy, Oleg; Gamayun, Oleksandr; Cheianov, Vadim

    2018-02-01

    The quantum adiabatic theorem states that a driven system can be kept arbitrarily close to the instantaneous eigenstate of its Hamiltonian if the latter varies in time slowly enough. When it comes to applying the adiabatic theorem in practice, the key question to be answered is how slow slowly enough is. This question can be an intricate one, especially for many-body systems, where the limits of slow driving and large system size may not commute. Recently we have shown how the quantum adiabaticity in many-body systems is related to the generalized orthogonality catastrophe [arXiv 1611.00663, to appear in Phys. Rev. Lett.]. We have proven a rigorous inequality relating these two phenomena and applied it to establish conditions for the quantized transport in the topological Thouless pump. In the present contribution we (i) review these developments and (ii) apply the inequality to establish the conditions for adiabaticity in a one-dimensional system consisting of a quantum fluid and an impurity particle pulled through the fluid by an external force. The latter analysis is vital for the correct quantitative description of the phenomenon of quasi-Bloch oscillations in a one-dimensional translation invariant impurity-fluid system.

  3. Sexuality Research in Iran: A Focus on Methodological and Ethical Considerations.

    PubMed

    Rahmani, Azam; Merghati-Khoei, Effat; Moghaddam-Banaem, Lida; Zarei, Fatemeh; Montazeri, Ali; Hajizadeh, Ebrahim

    2015-07-01

    Research on sensitive topics, such as sexuality, can raise technical, methodological, ethical, political, and legal challenges. The aim of this paper was to outline the methodological challenges that the authors confronted while conducting sexuality research with a young population in Iranian culture. This was an exploratory mixed-methods study conducted in 2013-14. We interviewed 63 young women aged 18-34 yr in the qualitative phase and 265 young women in the quantitative phase in (university and non-university) dormitories and in an Adolescent Friendly Center. Data were collected using focus group discussions and individual interviews in the qualitative phase. We employed conventional content analysis to analyze the data. To enhance the rigor of the data, multiple data collection methods, maximum variation sampling, and peer checks were applied. Five main themes emerged from the data: interaction with the opposite sex, sexual risk, sexual protection, sex education, and sexual vulnerability. Challenges encountered while conducting sex research are discussed. These challenges included the assumption of promiscuity, the language of silence and privacy concerns, and the sex segregation policy. We describe the strategies applied in our study and the rationales for each strategy. Strategies applied in the present study can be employed in contexts with similar methodological and moral concerns.

  4. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    PubMed

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to check for the possible presence of factors that could modify its development.

  5. Performance of automated scoring of ER, PR, HER2, CK5/6 and EGFR in breast cancer tissue microarrays in the Breast Cancer Association Consortium

    PubMed Central

    Howat, William J; Blows, Fiona M; Provenzano, Elena; Brook, Mark N; Morris, Lorna; Gazinska, Patrycja; Johnson, Nicola; McDuffus, Leigh‐Anne; Miller, Jodi; Sawyer, Elinor J; Pinder, Sarah; van Deurzen, Carolien H M; Jones, Louise; Sironen, Reijo; Visscher, Daniel; Caldas, Carlos; Daley, Frances; Coulson, Penny; Broeks, Annegien; Sanders, Joyce; Wesseling, Jelle; Nevanlinna, Heli; Fagerholm, Rainer; Blomqvist, Carl; Heikkilä, Päivi; Ali, H Raza; Dawson, Sarah‐Jane; Figueroa, Jonine; Lissowska, Jolanta; Brinton, Louise; Mannermaa, Arto; Kataja, Vesa; Kosma, Veli‐Matti; Cox, Angela; Brock, Ian W; Cross, Simon S; Reed, Malcolm W; Couch, Fergus J; Olson, Janet E; Devillee, Peter; Mesker, Wilma E; Seyaneve, Caroline M; Hollestelle, Antoinette; Benitez, Javier; Perez, Jose Ignacio Arias; Menéndez, Primitiva; Bolla, Manjeet K; Easton, Douglas F; Schmidt, Marjanka K; Pharoah, Paul D; Sherman, Mark E

    2014-01-01

    Abstract Breast cancer risk factors and clinical outcomes vary by tumour marker expression. However, individual studies often lack the power required to assess these relationships, and large‐scale analyses are limited by the need for high throughput, standardized scoring methods. To address these limitations, we assessed whether automated image analysis of immunohistochemically stained tissue microarrays can permit rapid, standardized scoring of tumour markers from multiple studies. Tissue microarray sections prepared in nine studies containing 20 263 cores from 8267 breast cancers stained for two nuclear (oestrogen receptor, progesterone receptor), two membranous (human epidermal growth factor receptor 2 and epidermal growth factor receptor) and one cytoplasmic (cytokeratin 5/6) marker were scanned as digital images. Automated algorithms were used to score markers in tumour cells using the Ariol system. We compared automated scores against visual reads, and their associations with breast cancer survival. Approximately 65–70% of tissue microarray cores were satisfactory for scoring. Among satisfactory cores, agreement between dichotomous automated and visual scores was highest for oestrogen receptor (Kappa = 0.76), followed by human epidermal growth factor receptor 2 (Kappa = 0.69) and progesterone receptor (Kappa = 0.67). Automated quantitative scores for these markers were associated with hazard ratios for breast cancer mortality in a dose‐response manner. Considering visual scores of epidermal growth factor receptor or cytokeratin 5/6 as the reference, automated scoring achieved excellent negative predictive value (96–98%), but yielded many false positives (positive predictive value = 30–32%). For all markers, we observed substantial heterogeneity in automated scoring performance across tissue microarrays. Automated analysis is a potentially useful tool for large‐scale, quantitative scoring of immunohistochemically stained tissue microarrays available in consortia. However, continued optimization, rigorous marker‐specific quality control measures and standardization of tissue microarray designs, staining and scoring protocols is needed to enhance results. PMID:27499890
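
    The agreement statistic reported above is a kappa between dichotomous automated and visual scores of the same cores (for two raters and binary calls this is typically Cohen's kappa); a minimal sketch with invented score vectors is:

```python
# Cohen's kappa between dichotomous automated and visual scores of the same
# tissue microarray cores. The score vectors are invented; sklearn's
# implementation is one common choice for computing kappa.
from sklearn.metrics import cohen_kappa_score

visual    = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1]   # pathologist reads (positive = 1)
automated = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1]   # automated dichotomous calls

print("kappa = %.2f" % cohen_kappa_score(visual, automated))
```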

  6. Compactness and robustness: Applications in the solution of integral equations for chemical kinetics and electromagnetic scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Yajun

    This thesis employs the topological concept of compactness to deduce robust solutions to two integral equations arising from chemistry and physics: the inverse Laplace problem in chemical kinetics and the vector wave scattering problem in dielectric optics. The inverse Laplace problem occurs in the quantitative understanding of biological processes that exhibit complex kinetic behavior: different subpopulations of transition events from the "reactant" state to the "product" state follow distinct reaction rate constants, which results in a weighted superposition of exponential decay modes. Reconstruction of the rate constant distribution from kinetic data is often critical for mechanistic understandings of chemical reactions related to biological macromolecules. We devise a "phase function approach" to recover the probability distribution of rate constants from decay data in the time domain. The robustness (numerical stability) of this reconstruction algorithm builds upon the continuity of the transformations connecting the relevant function spaces that are compact metric spaces. The robust "phase function approach" not only is useful for the analysis of heterogeneous subpopulations of exponential decays within a single transition step, but also is generalizable to the kinetic analysis of complex chemical reactions that involve multiple intermediate steps. A quantitative characterization of light scattering is central to many meteorological, optical, and medical applications. We give a rigorous treatment of electromagnetic scattering on arbitrarily shaped dielectric media via the Born equation: an integral equation with a strongly singular convolution kernel that corresponds to a non-compact Green operator. By constructing a quadratic polynomial of the Green operator that cancels out the kernel singularity and satisfies the compactness criterion, we reveal the universality of a real resonance mode in dielectric optics. Meanwhile, exploiting the properties of compact operators, we outline the geometric and physical conditions that guarantee a robust solution to the light scattering problem, and devise an asymptotic solution to the Born equation of electromagnetic scattering for arbitrarily shaped dielectrics in a non-perturbative manner.
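
    As a hedged illustration of the inverse Laplace problem described above, the sketch below recovers a distribution of decay rates from a noisy multi-exponential decay using non-negative least squares on a fixed rate grid, a standard but naive approach; the thesis's "phase function approach" is a different construction and is not reproduced here.

```python
# Recover a rate-constant distribution from a weighted superposition of
# exponential decays via non-negative least squares on a fixed rate grid.
# All data are synthetic; this illustrates the problem, not the thesis method.
import numpy as np
from scipy.optimize import nnls

t = np.linspace(0.01, 10, 200)
true_rates, true_weights = np.array([0.5, 3.0]), np.array([0.6, 0.4])
decay = (true_weights[None, :] * np.exp(-np.outer(t, true_rates))).sum(axis=1)
decay += np.random.default_rng(0).normal(0, 1e-3, t.size)     # measurement noise

rate_grid = np.logspace(-1, 1, 60)                            # candidate rate constants
kernel = np.exp(-np.outer(t, rate_grid))
weights, _ = nnls(kernel, decay)                               # recovered distribution
print("dominant recovered rates:", rate_grid[weights > 0.05])
```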

  7. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    PubMed

    Meltzer, S J; Auer, J

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions, nearly equimolecular to "physiological" solutions of sodium chloride, are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  8. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT

    PubMed Central

    Meltzer, S. J.; Auer, John

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions, nearly equimolecular to "physiological" solutions of sodium chloride, are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle. PMID:19867124

  9. Attitude stability of spinning satellites

    NASA Technical Reports Server (NTRS)

    Caughey, T. K.

    1980-01-01

    Some problems of attitude stability of spinning satellites are treated in a rigorous manner. With certain restrictions, linearized stability analysis correctly predicts the attitude stability of spinning satellites, even in the critical cases of the Liapunov-Poincare stability theory.

  10. Analysis of Perfluorinated Chemicals and Their Fluorinated Precursors in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...

  11. Perspective: Optical measurement of feature dimensions and shapes by scatterometry

    NASA Astrophysics Data System (ADS)

    Diebold, Alain C.; Antonelli, Andy; Keller, Nick

    2018-05-01

    The use of optical scattering to measure feature shape and dimensions, scatterometry, is now routine during semiconductor manufacturing. Scatterometry iteratively improves an optical model structure using simulations that are compared to experimental data from an ellipsometer. These simulations are done using the rigorous coupled wave analysis for solving Maxwell's equations. In this article, we describe the Mueller matrix spectroscopic ellipsometry based scatterometry. Next, the rigorous coupled wave analysis for Maxwell's equations is presented. Following this, several example measurements are described as they apply to specific process steps in the fabrication of gate-all-around (GAA) transistor structures. First, simulations of measurement sensitivity for the inner spacer etch back step of horizontal GAA transistor processing are described. Next, the simulated metrology sensitivity for sacrificial (dummy) amorphous silicon etch back step of vertical GAA transistor processing is discussed. Finally, we present the application of plasmonically active test structures for improving the sensitivity of the measurement of metal linewidths.

  12. Long persistence of rigor mortis at constant low temperature.

    PubMed

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

    We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 degrees C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation; in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" we found that the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. Therefore, this finding must be considered when a corpse is found under such environmental conditions, so that the estimation of the time of death is not misled by the long persistence of rigor mortis.

  13. Computer analysis of lighting style in fine art: steps towards inter-artist studies

    NASA Astrophysics Data System (ADS)

    Stork, David G.

    2011-03-01

    Stylometry in visual art, the mathematical description of artists' styles, has been based on a number of properties of works, such as color, brush stroke shape, visual texture, and measures of contours' curvatures. We introduce the concept of quantitative measures of lighting, such as statistical descriptions of spatial coherence, diffuseness, and so forth, as properties of artistic style. Some artists of the high Renaissance, such as Leonardo, worked from nature and strove to render illumination "faithfully"; photorealists, such as Richard Estes, worked from photographs and duplicated the "physics based" lighting accurately. As such, each had different motivations, methodologies, stagings, and "accuracies" in rendering lighting clues. Perceptual studies show that observers are poor judges of properties of lighting in photographs such as consistency (and thus by extension in paintings as well); computer methods such as rigorous cast-shadow analysis, occluding-contour analysis and spherical harmonic based estimation of light fields can be quite accurate. For these reasons, computer lighting analysis can provide new tools for art historical studies. We review lighting analysis in paintings such as Vermeer's Girl with a pearl earring, de la Tour's Christ in the carpenter's studio, and Caravaggio's Magdalen with the smoking flame and Calling of St. Matthew, and extend our corpus to works where lighting coherence is of interest to art historians, such as Caravaggio's Adoration of the Shepherds or Nativity (1609) in the Capuchin church of Santa Maria degli Angeli. Our measure of lighting coherence may help reveal the working methods of some artists and aid diachronic studies of individual artists. We speculate on artists and art historical questions that may ultimately profit from future refinements to these new computational tools.

  14. Semiconductor Quantum Electron Wave Transport, Diffraction, and Interference: Analysis, Device, and Measurement.

    NASA Astrophysics Data System (ADS)

    Henderson, Gregory Newell

    Semiconductor device dimensions are rapidly approaching a fundamental limit where drift-diffusion equations and the depletion approximation are no longer valid. In this regime, quantum effects can dominate device response. To further increase device density and speed, new devices must be designed that use these phenomena to positive advantage. In addition, quantum effects provide opportunities for a new class of devices which can perform functions previously unattainable with "conventional" semiconductor devices. This thesis has described research in the analysis of electron wave effects in semiconductors and the development of methods for the design, fabrication, and characterization of quantum devices based on these effects. First, an exact set of quantitative analogies is presented which allows the use of well understood optical design and analysis tools for the development of electron wave semiconductor devices. Motivated by these analogies, methods are presented for modeling electron wave grating diffraction using both an exact rigorous coupled-wave analysis and approximate analyses which are useful for grating design. Example electron wave grating switch and multiplexer designs are presented. In analogy to thin-film optics, the design and analysis of electron wave Fabry-Perot interference filters are also discussed. An innovative technique has been developed for testing these (and other) electron wave structures using Ballistic Electron Emission Microscopy (BEEM). This technique uses a liquid-helium temperature scanning tunneling microscope (STM) to perform spectroscopy of the electron transmittance as a function of electron energy. Experimental results show that BEEM can resolve even weak quantum effects, such as the reflectivity of a single interface between materials. Finally, methods are discussed for incorporating asymmetric electron wave Fabry-Perot filters into optoelectronic devices. Theoretical and experimental results show that such structures could be the basis for a new type of electrically pumped mid- to far-infrared semiconductor laser.
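
    One well-known quantitative analogy of the kind referred to above treats sqrt(m*(E - V)) as the electron-wave counterpart of a refractive index, so that conservation of the transverse wavevector at an abrupt potential step yields a Snell's-law-like refraction rule. The sketch below uses illustrative energies, potentials, and effective masses, not values from the thesis.

```python
# Snell's-law analog for electron waves at an abrupt potential step:
# conserving the transverse wavevector makes sqrt(m*(E - V)) play the role
# of a refractive index. All numerical values are hypothetical.
import numpy as np

def electron_index(E_eV, V_eV, m_rel):
    """Quantity proportional to the electron 'refractive index', sqrt(m*(E - V))."""
    return np.sqrt(m_rel * (E_eV - V_eV))

E = 0.20                      # electron energy (eV), hypothetical
V1, m1 = 0.00, 0.067          # region 1: potential (eV), relative effective mass
V2, m2 = 0.10, 0.067          # region 2 (e.g. a barrier layer), hypothetical

theta1 = np.radians(20.0)     # angle of incidence
sin_theta2 = electron_index(E, V1, m1) * np.sin(theta1) / electron_index(E, V2, m2)
print(f"refraction angle ~ {np.degrees(np.arcsin(sin_theta2)):.1f} deg")
```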

  15. Fluorescence Polarization and Fluctuation Analysis Monitors Subunit Proximity, Stoichiometry, and Protein Complex Hydrodynamics

    PubMed Central

    Nguyen, Tuan A.; Sarkar, Pabak; Veetil, Jithesh V.; Koushik, Srinagesh V.; Vogel, Steven S.

    2012-01-01

    Förster resonance energy transfer (FRET) microscopy is frequently used to study protein interactions and conformational changes in living cells. The utility of FRET is limited by false positive and negative signals. To overcome these limitations we have developed Fluorescence Polarization and Fluctuation Analysis (FPFA), a hybrid single-molecule based method combining time-resolved fluorescence anisotropy (homo-FRET) and fluorescence correlation spectroscopy. Using FPFA, homo-FRET (a 1–10 nm proximity gauge), brightness (a measure of the number of fluorescent subunits in a complex), and correlation time (an attribute sensitive to the mass and shape of a protein complex) can be simultaneously measured. These measurements together rigorously constrain the interpretation of FRET signals. Venus-based control constructs were used to validate FPFA. The utility of FPFA was demonstrated by measuring in living cells the number of subunits in the α-isoform of Venus-tagged calcium-calmodulin dependent protein kinase-II (CaMKIIα) holoenzyme. Brightness analysis revealed that the holoenzyme has, on average, 11.9±1.2 subunits, but values ranged from 10–14 in individual cells. Homo-FRET analysis simultaneously detected that catalytic domains were arranged as dimers in the dodecameric holoenzyme, and this paired organization was confirmed by quantitative hetero-FRET analysis. In freshly prepared cell homogenates FPFA detected only 10.2±1.3 subunits in the holoenzyme with values ranging from 9–12. Despite the reduction in subunit number, catalytic domains were still arranged as pairs in homogenates. Thus, FPFA suggests that while the absolute number of subunits in an auto-inhibited holoenzyme might vary from cell to cell, the organization of catalytic domains into pairs is preserved. PMID:22666486
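
    The brightness logic can be illustrated with a back-of-envelope ratio: if a monomeric reference construct has brightness b_mono, the average number of fluorescent subunits per complex is roughly b_complex / b_mono. The numbers below are invented, and the fluctuation analysis that yields the brightness values is not shown.

```python
# Toy illustration of subunit counting from molecular brightness: the ratio of
# complex brightness to monomer brightness approximates the average number of
# fluorescent subunits per complex. Both values are hypothetical.
b_mono = 2.1e3        # counts/s per molecule for a monomeric reference (hypothetical)
b_complex = 2.5e4     # brightness measured for the tagged complex (hypothetical)

subunits = b_complex / b_mono
print(f"estimated fluorescent subunits per complex ~ {subunits:.1f}")
```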

  16. Rigor Made Easy: Getting Started

    ERIC Educational Resources Information Center

    Blackburn, Barbara R.

    2012-01-01

    Bestselling author and noted rigor expert Barbara Blackburn shares the secrets to getting started, maintaining momentum, and reaching your goals. Learn what rigor looks like in the classroom, understand what it means for your students, and get the keys to successful implementation. Learn how to use rigor to raise expectations, provide appropriate…

  17. Close Early Learning Gaps with Rigorous DAP

    ERIC Educational Resources Information Center

    Brown, Christopher P.; Mowry, Brian

    2015-01-01

    Rigorous DAP (developmentally appropriate practices) is a set of 11 principles of instruction intended to help close early childhood learning gaps. Academically rigorous learning environments create the conditions for children to learn at high levels. While academic rigor focuses on one dimension of education--academic--DAP considers the whole…

  18. Civil Rights Project's Response to "Re-Analysis" of Charter School Study

    ERIC Educational Resources Information Center

    Civil Rights Project / Proyecto Derechos Civiles, 2010

    2010-01-01

    The Civil Rights Project (CRP) was founded, in part, to bring rigorous social science inquiry to bear on the most pressing civil rights issues. On-going trends involving public school segregation have been a primary focus of the CRP's research, and the expanding policy emphasis on school choice prompted analysis of the much smaller--but…

  19. A Meta-Analysis of Single-Subject Research on Behavioral Momentum to Enhance Success in Students with Autism

    ERIC Educational Resources Information Center

    Cowan, Richard J.; Abel, Leah; Candel, Lindsay

    2017-01-01

    We conducted a meta-analysis of single-subject research studies investigating the effectiveness of antecedent strategies grounded in behavioral momentum for improving compliance and on-task performance for students with autism. First, we assessed the research rigor of those studies meeting our inclusionary criteria. Next, in order to apply a…

  20. A Review of the Application of Lifecycle Analysis to Renewable Energy Systems

    ERIC Educational Resources Information Center

    Lund, Chris; Biswas, Wahidul

    2008-01-01

    The lifecycle concept is a "cradle to grave" approach to thinking about products, processes, and services, recognizing that all stages have environmental and economic impacts. Any rigorous and meaningful comparison of energy supply options must be done using a lifecycle analysis approach. It has been applied to an increasing number of conventional…

  1. Considerations for the Systematic Analysis and Use of Single-Case Research

    ERIC Educational Resources Information Center

    Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith

    2012-01-01

    Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…

  2. Using High Speed Smartphone Cameras and Video Analysis Techniques to Teach Mechanical Wave Physics

    ERIC Educational Resources Information Center

    Bonato, Jacopo; Gratton, Luigi M.; Onorato, Pasquale; Oss, Stefano

    2017-01-01

    We propose the use of smartphone-based slow-motion video analysis techniques as a valuable tool for investigating physics concepts ruling mechanical wave propagation. The simple experimental activities presented here, suitable for both high school and undergraduate students, allows one to measure, in a simple yet rigorous way, the speed of pulses…
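
    The kind of measurement this activity relies on reduces to reading the pulse position off successive slow-motion frames and dividing by the elapsed time; the frame rate and positions below are illustrative values, not data from the paper.

```python
# Toy pulse-speed calculation from slow-motion video frames: elapsed time is
# the frame-index difference divided by the frame rate. Values are invented.
frame_rate = 240.0                     # frames per second of the smartphone camera
frames = [0, 24, 48, 72]               # frame indices where the pulse front was marked
positions_m = [0.00, 0.25, 0.50, 0.75] # pulse front position along the medium (m)

dt = (frames[-1] - frames[0]) / frame_rate
dx = positions_m[-1] - positions_m[0]
print(f"pulse speed ~ {dx / dt:.2f} m/s")
```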

  3. Using Framework Analysis in nursing research: a worked example.

    PubMed

    Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica

    2013-11-01

    To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.

  4. Augmented assessment as a means to augmented reality.

    PubMed

    Bergeron, Bryan

    2006-01-01

    Rigorous scientific assessment of educational technologies typically lags behind the availability of the technologies by years because of the lack of validated instruments and benchmarks. Even when the appropriate assessment instruments are available, they may not be applied because of time and monetary constraints. Work in augmented reality, instrumented mannequins, serious gaming, and similar promising educational technologies that haven't undergone timely, rigorous evaluation, highlights the need for assessment methodologies that address the limitations of traditional approaches. The most promising augmented assessment solutions incorporate elements of rapid prototyping used in the software industry, simulation-based assessment techniques modeled after methods used in bioinformatics, and object-oriented analysis methods borrowed from object oriented programming.

  5. Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation

    NASA Astrophysics Data System (ADS)

    Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe

    2018-04-01

    In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.

  6. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    PubMed

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied on rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, a complete rigor develops already 1 h post-mortem (p.m.) compared to 5 h p.m. for the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In case of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher and the absorbed energy is lower as compared with the ante-mortem electrocution cases. The results are completed by two practical observations on human electrocution cases.

  7. Burnett-Cattaneo continuum theory for shock waves.

    PubMed

    Holian, Brad Lee; Mareschal, Michel; Ravelo, Ramon

    2011-02-01

    We model strong shock-wave propagation, both in the ideal gas and in the dense Lennard-Jones fluid, using a refinement of earlier work, which accounts for the cold compression in the early stages of the shock rise by a nonlinear, Burnett-like, strain-rate dependence of the thermal conductivity, and relaxation of kinetic-temperature components on the hot, compressed side of the shock front. The relaxation of the disequilibrium among the three components of the kinetic temperature, namely, the difference between the component in the direction of a planar shock wave and those in the transverse directions, particularly in the region near the shock front, is accomplished at a much more quantitative level by a rigorous application of the Cattaneo-Maxwell relaxation equation to a reference solution, namely, the steady shock-wave solution of linear Navier-Stokes-Fourier theory, along with the nonlinear Burnett heat-flux term. Our new continuum theory is in nearly quantitative agreement with nonequilibrium molecular-dynamics simulations under strong shock-wave conditions, using relaxation parameters obtained from the reference solution. ©2011 American Physical Society
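
    For reference, the Cattaneo-Maxwell relaxation closure invoked above replaces Fourier's instantaneous heat flux with a flux that relaxes toward it over a characteristic time; a schematic one-dimensional form (in our notation, not necessarily the paper's) is

\[
\tau \,\frac{\partial q}{\partial t} + q = -\,\kappa \,\frac{\partial T}{\partial x},
\]

    where q is the heat flux, \kappa the thermal conductivity, T the temperature, and \tau the relaxation time; setting \tau = 0 recovers the Navier-Stokes-Fourier heat flux.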

  8. Stability of miRNA in human urine supports its biomarker potential

    PubMed Central

    Mall, Christine; Rocke, David M; Durbin-Johnson, Blythe; Weiss, Robert H

    2013-01-01

    Aim: miRNAs are showing utility as biomarkers in urologic disease; however, a rigorous evaluation of their stability in urine is lacking. Here, we evaluate the stability of miRNAs in urine under clinically relevant storage procedures. Materials & methods: Eight healthy individuals provided clean catch urine samples that were stored at room temperature or at 4°C for 5 days, or subjected to ten freeze–thaw cycles at -80°C. For each condition, two miRNAs, miR-16 and miR-21, were quantitated by quantitative real-time PCR. Results: All conditions demonstrated a surprising degree of stability of miRNAs in the urine: by the end of ten freeze–thaw cycles, 23–37% of the initial amount remained; over the 5-day period of storage at room temperature, 35% of the initial amount remained; and at 4°C, 42–56% of the initial amount remained. Both miRNAs also showed degradation at approximately the same rate. Conclusion: miRNAs are relatively stable in urine under a variety of storage conditions, which supports their utility as urinary biomarkers. PMID:23905899
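
    The "percent of the initial amount remaining" figures come from relative quantification of qPCR threshold cycles; a toy calculation using the standard 2^-ΔCt relation (assuming roughly 100% amplification efficiency; the Ct values are invented) is:

```python
# Relative quantity from qPCR threshold cycles via the 2^-dCt relation,
# assuming ~100% amplification efficiency. Ct values are hypothetical.
ct_baseline = 24.0        # miRNA Ct at day 0 (hypothetical)
ct_stored   = 25.5        # same sample after storage (hypothetical)

fraction_remaining = 2.0 ** (ct_baseline - ct_stored)
print(f"fraction remaining ~ {fraction_remaining:.0%}")
```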

  9. Rigorous Schools and Classrooms: Leading the Way

    ERIC Educational Resources Information Center

    Williamson, Ronald; Blackburn, Barbara R.

    2010-01-01

    Turn your school into a student-centered learning environment, where rigor is at the heart of instruction in every classroom. From the bestselling author of "Rigor is Not a Four-Letter Word," Barbara Blackburn, and award-winning educator Ronald Williamson, this comprehensive guide to establishing a schoolwide culture of rigor is for principals and…

  10. Rigor Revisited: Scaffolding College Student Learning by Incorporating Their Lived Experiences

    ERIC Educational Resources Information Center

    Castillo-Montoya, Milagros

    2018-01-01

    This chapter explores how students' lived experiences contribute to the rigor of their thinking. Insights from research indicate faculty can enhance rigor by accounting for the many ways it may surface in the classroom. However, to see this type of rigor, we must revisit the way we conceptualize it for higher education.

  11. Effect of rigor temperature, ageing and display time on the meat quality and lipid oxidative stability of hot boned beef Semimembranosus muscle.

    PubMed

    Mungure, Tanyaradzwa E; Bekhit, Alaa El-Din A; Birch, E John; Stewart, Ian

    2016-04-01

    The effects of rigor temperature (5, 15, 20 and 25°C), ageing (3, 7, 14, and 21 days) and display time on meat quality and lipid oxidative stability of hot boned beef M. Semimembranosus (SM) muscle were investigated. Ultimate pH (pHu) was rapidly attained at higher rigor temperatures. Electrical conductivity increased with rigor temperature (p<0.001). Tenderness, purge and cooking losses were not affected by rigor temperature; however, purge loss and tenderness increased with ageing (p<0.01). Lightness (L*) and redness (a*) of the SM increased as rigor temperature increased (p<0.01). Lipid oxidation was assessed using ¹H NMR, where changes in the aliphatic-to-olefinic (Rao) and diallylmethylene (Rad) proton ratios can be rapidly monitored. Rad, Rao, PUFA and TBARS were not affected by rigor temperature; however, ageing and display increased lipid oxidation (p<0.05). This study shows that rigor temperature manipulation of hot boned beef SM muscle does not have adverse effects on lipid oxidation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    NASA Astrophysics Data System (ADS)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. Such studies are appropriate for evaluation and for developing mitigation plans in landslide-prone areas. Several techniques are available for landslide hazard research at a regional scale. Generally, they can be classified into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinion and represent hazard levels in descriptive terms. Quantitative methods, on the other hand, are objective and are commonly used because of the correlation between instability factors and the locations of landslides. Within this group, statistical approaches and newer heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis for assessing landslide hazard over large regions, although they depend on the qualitative and quantitative data, scale, types of movement and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. An application was developed for El Salvador, a Central American country where earthquake-triggered landslides are common phenomena. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 13 January 2001 earthquake, calibrating the models with data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is taken as the dependent variable. The results of the landslide susceptibility analysis were checked against landslide location data and show high concordance between the landslide inventory and the estimated high-susceptibility zone, with an agreement of 95.1% for the ANN model and 89.4% for the LR model. In addition, we compared the two techniques using the Receiver Operating Characteristic (ROC) curve, a graphical plot of sensitivity vs. (1 - specificity) for a binary classifier as a function of its discrimination threshold, and calculated the Area Under the ROC curve (AUROC) for each model. Finally, the calibrated models were used to develop a new probabilistic landslide hazard map for future events, obtained by combining the expected triggering factor (calculated earthquake ground motion) for a return period of 475 years with the susceptibility map.
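
    A minimal sketch of the LR-vs-ANN comparison with AUROC described above, using scikit-learn on synthetic stand-in data; the seven predictors here are random placeholders for the study's real factors (slope gradient, elevation, aspect, precipitation, lithology, land use, terrain roughness), and the model settings are illustrative assumptions rather than the authors' configuration.

        # Compare a logistic-regression classifier with a small back-propagation
        # neural network on a synthetic binary landslide/no-landslide problem,
        # scoring both by the Area Under the ROC Curve (AUROC).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 7))  # seven hypothetical terrain/climate factors
        y = (X @ rng.normal(size=7) + rng.normal(size=2000) > 0).astype(int)  # landslide occurrence

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

        lr = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_train, y_train)

        for name, model in [("LR", lr), ("ANN", ann)]:
            auroc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
            print(f"{name}: AUROC = {auroc:.3f}")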

  13. Exploring Student Perceptions of Rigor Online: Toward a Definition of Rigorous Learning

    ERIC Educational Resources Information Center

    Duncan, Heather E.; Range, Bret; Hvidston, David

    2013-01-01

    Technological advances in the last decade have impacted delivery methods of university courses. More and more courses are offered in a variety of formats. While academic rigor is a term often used, its definition is less clear. This mixed-methods study explored graduate student conceptions of rigor in the online learning environment embedded…

  14. Revised Planning Methodology For Signalized Intersections And Operational Analysis Of Exclusive Left-Turn Lanes, Part-II: Models And Procedures (Final Report)

    DOT National Transportation Integrated Search

    1996-04-01

    This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.

  15. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prowell, Stacy J; Symons, Christopher T

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  16. Political Animals: The Paradox of Ecofeminist Politics.

    ERIC Educational Resources Information Center

    Sandilands, Catriona

    1994-01-01

    Analyzes the paradox between the careful work of rigorous political analysis and philosophy and a desire for mystery and the experience of awe and wildness that demands putting aside careful reasoning and the sensing of nature in an altogether different way. (LZ)

  17. Methodological rigor and citation frequency in patient compliance literature.

    PubMed Central

    Bruer, J T

    1982-01-01

    An exhaustive bibliography assessing the methodological rigor of the patient compliance literature and citation data from the Science Citation Index (SCI) are combined to determine whether methodologically rigorous papers are used by compliance investigators more frequently than substandard articles. There are low, but statistically significant, correlations between methodological rigor and citation indicators for 138 patient compliance papers published in SCI source journals during 1975 and 1976. The correlation is not strong enough to warrant using citation measures as indicators of rigor on a paper-by-paper basis, but the data do suggest that citation measures might be developed as crude indicators of methodological rigor. There is no evidence that randomized trials are cited more frequently than studies employing other experimental designs. PMID:7114334
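
    The correlation between rigor and citation indicators reported above can be illustrated with a simple rank-correlation check. The sketch below uses scipy on hypothetical placeholder arrays, not the 138 compliance papers from the study, and the 0-10 rigor rating is an assumed scale.

        # Minimal sketch of a rank-correlation check between methodological-rigor
        # scores and citation counts; all values are synthetic placeholders.
        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(1)
        rigor_score = rng.integers(0, 11, size=138)         # hypothetical 0-10 rigor rating per paper
        citations = rng.poisson(lam=3 + 0.4 * rigor_score)  # citation counts weakly coupled to rigor

        rho, p_value = spearmanr(rigor_score, citations)
        print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")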

  18. Respectful Modeling: Addressing Uncertainty in Dynamic System Models for Molecular Biology.

    PubMed

    Tsigkinopoulou, Areti; Baker, Syed Murtuza; Breitling, Rainer

    2017-06-01

    Although there is still some skepticism in the biological community regarding the value and significance of quantitative computational modeling, important steps are continually being taken to enhance its accessibility and predictive power. We view these developments as essential components of an emerging 'respectful modeling' framework which has two key aims: (i) respecting the models themselves and facilitating the reproduction and update of modeling results by other scientists, and (ii) respecting the predictions of the models and rigorously quantifying the confidence associated with the modeling results. This respectful attitude will guide the design of higher-quality models and facilitate the use of models in modern applications such as engineering and manipulating microbial metabolism by synthetic biology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Statistical Characterization and Classification of Edge-Localized Plasma Instabilities

    NASA Astrophysics Data System (ADS)

    Webster, A. J.; Dendy, R. O.

    2013-04-01

    The statistics of edge-localized plasma instabilities (ELMs) in toroidal magnetically confined fusion plasmas are considered. From first principles, standard experimentally motivated assumptions are shown to determine a specific probability distribution for the waiting times between ELMs: the Weibull distribution. This is confirmed empirically by a statistically rigorous comparison with a large data set from the Joint European Torus. The successful characterization of ELM waiting times enables future work to progress in various ways. Here we present a quantitative classification of ELM types, complementary to phenomenological approaches. It also informs us about the nature of ELM processes, such as whether they are random or deterministic. The methods are extremely general and can be applied to numerous other quasiperiodic intermittent phenomena.
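
    As a minimal sketch of the kind of distributional check described above, the snippet below fits a Weibull distribution to a set of inter-ELM waiting times and runs a goodness-of-fit test with scipy. The waiting times are synthetic placeholders, not Joint European Torus data, and the shape/scale values are illustrative assumptions.

        # Fit a Weibull distribution to inter-ELM waiting times and check the fit.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        waiting_times = stats.weibull_min.rvs(c=1.8, scale=0.02, size=5000, random_state=rng)  # seconds, hypothetical

        # Fit the shape (c) and scale, with the location fixed at zero for waiting times.
        shape, loc, scale = stats.weibull_min.fit(waiting_times, floc=0)

        # Kolmogorov-Smirnov test of the fitted distribution against the data.
        ks_stat, p_value = stats.kstest(waiting_times, "weibull_min", args=(shape, loc, scale))
        print(f"shape = {shape:.2f}, scale = {scale:.4f}, KS p-value = {p_value:.3f}")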

  20. Analysis of Six Reviews on the Quality of Instruments for the Evaluation of Interprofessional Education in German-Speaking Countries.

    PubMed

    Ehlers, Jan P; Kaap-Fröhlich, Sylvia; Mahler, Cornelia; Scherer, Theresa; Huber, Marion

    2017-01-01

    Background: More and more institutions, worldwide and in German-speaking countries, are developing and establishing interprofessional seminars in the undergraduate education of the health professions. In order to evaluate the different didactic approaches and the different outcomes with respect to the anticipated interprofessional competencies, appropriate instruments are needed. Cross-cultural instruments are particularly helpful for international comparability. The Interprofessional Education working group of the German Medical Association (GMA) aims to identify existing instruments for the evaluation of interprofessional education in order to make recommendations for German-speaking countries. Methods: A systematic literature search was performed on the websites of international interprofessional organisations (CAIPE, EIPEN, AIPEN) and in the PubMed and Cinahl databases, looking for reviews of quantitative instruments that evaluate competencies according to the modified Kirkpatrick levels. Psychometric properties, language/country, and the setting in which each instrument was applied were recorded. Results: Six reviews were included from 73 literature search hits. A large number of instruments were identified; however, their psychometric properties and the settings in which they were applied were very heterogeneous. The instruments can mainly be assigned to Kirkpatrick levels 1, 2a and 2b. Most instruments were developed in English, but their psychometric properties were not always reported rigorously. Only very few instruments are available in German. Conclusion: It is difficult to find appropriate instruments in German. Internationally, there are different approaches to and objectives for measuring and evaluating interprofessional competencies. The question arises whether it makes sense to translate existing instruments or to go through the lengthy process of developing new ones. The evaluation of interprofessional seminars with quantitative instruments remains mainly at Kirkpatrick levels 1 and 2; levels 3 and 4 can probably only be assessed with qualitative or mixed methods. German-language instruments are needed.
