Accelerated Colorimetric Micro-assay for Screening Mold Inhibitors
Carol A. Clausen; Vina W. Yang
2014-01-01
Rapid quantitative laboratory test methods are needed to screen potential antifungal agents. Existing laboratory test methods are relatively time-consuming, may require specialized test equipment, and rely on subjective visual ratings. A quantitative, colorimetric micro-assay has been developed that uses XTT tetrazolium salt to metabolically assess mold spore...
Tucker, George; Loh, Po-Ru; Berger, Bonnie
2013-10-04
Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
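To make the sampling framework concrete, the following is a minimal sketch of the core loop under stated assumptions: spectral counts are mapped to observation probabilities with an assumed saturating function (the authors' exact scheme may differ), binary outcomes are sampled to form an ensemble, and a placeholder binary scoring method is applied to each outcome and averaged.

```python
# Minimal sketch of the ensemble-sampling idea; the count-to-probability
# mapping and the scoring function are illustrative stand-ins, not the
# authors' exact choices.
import numpy as np

rng = np.random.default_rng(0)

def count_to_prob(counts, k=2.0):
    # Assumed saturating map: low spectral counts -> uncertain observations.
    return counts / (counts + k)

def score_binary(binary_matrix):
    # Stand-in for any existing binary PPI inference method: here, simply
    # the fraction of purifications supporting each bait-prey pair.
    return binary_matrix.mean(axis=0)

counts = np.array([[5.0, 1.0, 0.0],      # prey spectral counts per purification
                   [3.0, 0.0, 2.0],
                   [4.0, 2.0, 1.0]])
probs = count_to_prob(counts)

# Generate an ensemble of alternative binary outcomes, score each, aggregate.
ensemble = [score_binary(rng.random(probs.shape) < probs) for _ in range(1000)]
print("aggregated scores:", np.mean(ensemble, axis=0))
```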
Mapping quantitative trait loci for traits defined as ratios.
Yang, Runqing; Li, Jiahan; Xu, Shizhong
2008-03-01
Many traits are defined as ratios of two quantitative traits. Methods of QTL mapping for regular quantitative traits are not optimal when applied to ratios, owing to the lack of normality of traits defined as ratios. We develop a new method of QTL mapping for traits defined as ratios. The new method uses a special linear combination of the two component traits, and thus takes advantage of the normality of the new variable. A simulation study shows that the new method can substantially increase the statistical power of QTL detection relative to the method that treats ratios as regular quantitative traits. The new method also outperforms the method that uses the Box-Cox-transformed ratio as the phenotype. A real example of QTL mapping for relative growth rate in soybean demonstrates that the new method can detect more QTL than existing methods of QTL mapping for traits defined as ratios.
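A quick simulation illustrates the premise that motivates the method: even when both component traits are normally distributed, their ratio is markedly non-normal. The numbers below are arbitrary.

```python
# Illustration of the premise above: the ratio of two normally distributed
# component traits is itself markedly non-normal (right-skewed, heavy-tailed).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
y1 = rng.normal(10.0, 2.0, 5000)   # component trait 1
y2 = rng.normal(5.0, 1.0, 5000)    # component trait 2
ratio = y1 / y2

print("skewness of ratio:", stats.skew(ratio))
print("normality test p-value:", stats.normaltest(ratio).pvalue)  # ~0 -> reject
```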
Tan, York Kiat; Allen, John C; Lye, Weng Kit; Conaghan, Philip G; Chew, Li-Ching; Thumboo, Julian
2017-05-01
The aim of the study was to compare the responsiveness of two joint inflammation scoring systems (dichotomous scoring (DS) versus semi-quantitative scoring (SQS)) using novel individualized and existing ultrasound joint selection methods. Responsiveness, measured by the standardized response mean (SRM), was derived for the DS and SQS systems (for both the novel and existing ultrasound joint selection methods) from the baseline and 3-month total inflammatory scores of 20 rheumatoid arthritis patients. The relative SRM gain ratios (SRM-Gains) comparing the novel to the existing methods were computed for both scoring systems. Both scoring systems demonstrated substantial SRM-Gains (3.31 to 5.67 for the DS system and 1.82 to 3.26 for the SQS system). The SRMs using the novel methods ranged from 0.94 to 1.36 for the DS system and from 0.89 to 1.11 for the SQS system; the SRMs using the existing methods ranged from 0.24 to 0.32 for the DS system and from 0.34 to 0.49 for the SQS system. The DS system appears to achieve high responsiveness comparable to SQS for the novel individualized ultrasound joint selection methods.
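For reference, the standardized response mean is the mean change divided by the standard deviation of the change scores, and the SRM-Gain reported above is the ratio of the novel method's SRM to the existing method's SRM. A minimal sketch with made-up scores:

```python
# SRM = mean(change) / sd(change); SRM-Gain = SRM_novel / SRM_existing.
# The scores below are hypothetical, for illustration only.
import numpy as np

def srm(baseline, followup):
    change = np.asarray(followup) - np.asarray(baseline)
    # Sign reflects direction of change; abs() gives the magnitude reported.
    return abs(change.mean() / change.std(ddof=1))

baseline = [12, 9, 15, 11, 14]   # hypothetical total inflammatory scores
month3 = [6, 5, 8, 7, 6]
print("SRM:", srm(baseline, month3))
# SRM-Gain would divide this by the SRM computed for the existing method.
```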
Evaluation of background parenchymal enhancement on breast MRI: a systematic review
Signori, Alessio; Valdora, Francesca; Rossi, Federica; Calabrese, Massimo; Durando, Manuela; Mariscotto, Giovanna; Tagliafico, Alberto
2017-01-01
Objective: To perform a systematic review of the methods used for background parenchymal enhancement (BPE) evaluation on breast MRI. Methods: Studies dealing with BPE assessment on breast MRI were retrieved from major medical libraries independently by four reviewers up to 6 October 2015. The keywords used for database searching were “background parenchymal enhancement”, “parenchymal enhancement”, “MRI” and “breast”. Studies were included if qualitative and/or quantitative methods for BPE assessment were described. Results: Of the 420 studies identified, a total of 52 articles were included in the systematic review. 28 studies performed only a qualitative assessment of BPE, 13 studies performed only a quantitative assessment, and 11 studies performed both. Wide heterogeneity was found in the MRI sequences and in the quantitative methods used for BPE assessment. Conclusion: Wide variability exists in the quantitative evaluation of BPE on breast MRI. More studies focused on a reliable and comparable method for quantitative BPE assessment are needed. Advances in knowledge: More studies focused on a quantitative BPE assessment are needed.
The Role of Extracellular Vesicles in Metastasis
2017-10-01
quantitative characterization of each cancerous ESV subpopulation’s role in cargo transfer. Specifically, we aim to (1) optimize an existing ... the first quantitative data on which ESV subpopulations (exosomes, MVs, or oncosomes) manipulate the tumor microenvironment, the ESV cargo transferred ... without cellular contaminants or without damaging the cargo. A second hindrance has been the lack of quantitative methods for measuring very small
What Good Are Statistics that Don't Generalize?
ERIC Educational Resources Information Center
Shaffer, David Williamson; Serlin, Ronald C.
2004-01-01
Quantitative and qualitative inquiry are sometimes portrayed as distinct and incompatible paradigms for research in education. Approaches to combining qualitative and quantitative research typically "integrate" the two methods by letting them co-exist independently within a single research study. Here we describe intra-sample statistical analysis…
Code of Federal Regulations, 2010 CFR
2010-07-01
... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...
Exploring a taxonomy for aggression against women: can it aid conceptual clarity?
Cook, Sarah; Parrott, Dominic
2009-01-01
The assessment of aggression against women is demanding primarily because assessment strategies do not share a common language to describe reliably the wide range of forms of aggression women experience. The lack of a common language impairs efforts to describe these experiences, understand causes and consequences of aggression against women, and develop effective intervention and prevention efforts. This review accomplishes two goals. First, it applies a theoretically and empirically based taxonomy to behaviors assessed by existing measurement instruments. Second, it evaluates whether the taxonomy provides a common language for the field. Strengths of the taxonomy include its ability to describe and categorize all forms of aggression found in existing quantitative measures. The taxonomy also classifies numerous examples of aggression discussed in the literature but notably absent from quantitative measures. Although we use existing quantitative measures as a starting place to evaluate the taxonomy, its use is not limited to quantitative methods. Implications for theory, research, and practice are discussed.
USDA-ARS?s Scientific Manuscript database
A quantitative answer cannot exist in an analysis without a qualitative component to give enough confidence that the result meets the analytical needs for the analysis (i.e. the result relates to the analyte and not something else). Just as a quantitative method must typically undergo an empirical ...
Quantitative prediction of drug side effects based on drug-related features.
Niu, Yanqing; Zhang, Wen
2017-09-01
Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs, and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for quantitative prediction. We then consider several feature combination strategies (direct combination and average-scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average-scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make quantitative predictions directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
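A minimal sketch of the two ingredients described above, with made-up weights and predictions: the weighted-sum quantitative score, and the average-scoring ensemble over per-feature predictions.

```python
# Sketch of the quantitative scoring and average-scoring ensemble; the
# weights, profiles, and per-feature predictions are illustrative only.
import numpy as np

weights = np.array([0.9, 0.4, 0.7])   # empirical weight per side effect
profiles = np.array([[1, 0, 1],       # drug x side-effect presence matrix
                     [0, 1, 1]])
scores = profiles @ weights           # quantitative score per drug
print("quantitative scores:", scores)

# Average-scoring ensemble: mean of predictions from three feature models.
pred_substructure = np.array([1.5, 1.0])
pred_target = np.array([1.7, 0.9])
pred_indication = np.array([1.6, 1.2])
ensemble = np.mean([pred_substructure, pred_target, pred_indication], axis=0)
print("ensemble prediction:", ensemble)
```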
Mixed-methods designs in mental health services research: a review.
Palinkas, Lawrence A; Horwitz, Sarah M; Chamberlain, Patricia; Hurlburt, Michael S; Landsverk, John
2011-03-01
Despite increased calls for use of mixed-methods designs in mental health services research, how and why such methods are being used and whether there are any consistent patterns that might indicate a consensus about how such methods can and should be used are unclear. Use of mixed methods was examined in 50 peer-reviewed journal articles found by searching PubMed Central and 60 National Institutes of Health (NIH)-funded projects found by searching the CRISP database over five years (2005-2009). Studies were coded for aims and the rationale, structure, function, and process for using mixed methods. A notable increase was observed in articles published and grants funded over the study period. However, most did not provide an explicit rationale for using mixed methods, and 74% gave priority to use of quantitative methods. Mixed methods were used to accomplish five distinct types of study aims (assess needs for services, examine existing services, develop new or adapt existing services, evaluate services in randomized controlled trials, and examine service implementation), with three categories of rationale, seven structural arrangements based on timing and weighting of methods, five functions of mixed methods, and three ways of linking quantitative and qualitative data. Each study aim was associated with a specific pattern of use of mixed methods, and four common patterns were identified. These studies offer guidance for continued progress in integrating qualitative and quantitative methods in mental health services research consistent with efforts by NIH and other funding agencies to promote their use.
Quantitative Evaluation Method of Each Generation Margin for Power System Planning
NASA Astrophysics Data System (ADS)
Su, Su; Tanaka, Kazuyuki
As power system deregulation advances, competition among power companies intensifies, and they seek more efficient system planning using existing facilities. Therefore, an efficient system planning method is needed. This paper proposes a quantitative evaluation method for the (N-1) generation margin considering the overload and voltage stability restrictions. For the generation margin related to the overload, a fast solution method that avoids recalculating the (N-1) Y-matrix is proposed. For voltage stability, an efficient method to search for the stability limit is proposed. The IEEE 30-bus model system, composed of 6 generators and 14 load nodes, is employed to validate the proposed method. According to the results, the proposed method can reduce the computational cost of the generation margin related to the overload under the (N-1) condition and can specify its value quantitatively.
Quantitating Organoleptic Volatile Phenols in Smoke-Exposed Vitis vinifera Berries.
Noestheden, Matthew; Thiessen, Katelyn; Dennis, Eric G; Tiet, Ben; Zandberg, Wesley F
2017-09-27
Accurate methods for quantitating volatile phenols (i.e., guaiacol, syringol, 4-ethylphenol, etc.) in smoke-exposed Vitis vinifera berries prior to fermentation are needed to predict the likelihood of perceptible smoke taint following vinification. Reported here is a complete, cross-validated analytical workflow to accurately quantitate free and glycosidically bound volatile phenols in smoke-exposed berries using liquid-liquid extraction, acid-mediated hydrolysis, and gas chromatography-tandem mass spectrometry. The reported workflow addresses critical gaps in existing methods for volatile phenols that impact quantitative accuracy, most notably the effect of injection port temperature and the variability in acid-mediated hydrolytic procedures currently used. Addressing these deficiencies will help the wine industry make accurate, informed decisions when producing wines from smoke-exposed berries.
ERIC Educational Resources Information Center
Brown, K. M.; Elliott, S. J.; Leatherdale, S. T.; Robertson-Wilson, J.
2015-01-01
The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing…
Optofluidic time-stretch quantitative phase microscopy.
Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke
2018-03-01
Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy, an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses, using the dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves continuous acquisition of both intensity and phase images at a throughput of more than 10,000 particles or cells per second, overcoming the speed limitations of conventional quantitative phase imaging methods. The applications enabled by these capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspectives.
An overview of recent developments and current status of gluten ELISA methods
USDA-ARS?s Scientific Manuscript database
ELISA methods for detecting and quantitating allergens have been around for some time and they are continuously improved. In this context, the development of gluten methods is no exception. Around the turn of the millennium, doubts were raised whether the existing “Skerritt-ELISA” would meet the 20 ...
Probabilistic framework for product design optimization and risk management
NASA Astrophysics Data System (ADS)
Keski-Rahkonen, J. K.
2018-05-01
Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on optimizing reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes because of the well-developed methods used to predict these types of failures; however, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
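A minimal load-resistance (stress-strength) Monte Carlo sketch of the recommended process; the distributions and parameters below are illustrative assumptions, not values from the paper.

```python
# Stress-strength Monte Carlo: a component fails when the sampled load
# exceeds the sampled resistance. Distributions are assumed for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
load = rng.normal(300.0, 40.0, n)        # applied stress, e.g. MPa
resistance = rng.normal(450.0, 50.0, n)  # material strength, e.g. MPa

p_fail = np.mean(load > resistance)      # estimated failure probability
print(f"estimated failure probability: {p_fail:.2e}")
```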
Exploring the use of storytelling in quantitative research fields using a multiple case study method
NASA Astrophysics Data System (ADS)
Matthews, Lori N. Hamlet
The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather on how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data, according to the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight into why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in the existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes were: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.
Schilling, Birgit; Rardin, Matthew J; MacLean, Brendan X; Zawadzka, Anna M; Frewen, Barbara E; Cusack, Michael P; Sorensen, Dylan J; Bereman, Michael S; Jing, Enxuan; Wu, Christine C; Verdin, Eric; Kahn, C Ronald; Maccoss, Michael J; Gibson, Bradford W
2012-05-01
Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models.
NASA Astrophysics Data System (ADS)
Yuan, Zhen; Li, Xiaoqi; Xi, Lei
2014-06-01
Biomedical photoacoustic tomography (PAT), as a potential imaging modality, can visualize tissue structure and function with high spatial resolution and excellent optical contrast. It is widely recognized that the ability to quantitatively image optical absorption and scattering coefficients from photoacoustic measurements is essential before PAT can become a powerful imaging modality. Existing quantitative PAT (qPAT), while successful, has focused on recovering the absorption coefficient only, by assuming the scattering coefficient is constant. An effective method for photoacoustically recovering the optical scattering coefficient is presently not available. Here we propose and experimentally validate such a method for quantitative scattering coefficient imaging using photoacoustic data from one-wavelength illumination. The reconstruction method combines conventional PAT with the photon diffusion equation in a novel way to recover the scattering coefficient. We demonstrate the method using various objects having scattering contrast only, or both absorption and scattering contrasts, embedded in turbid media. The listening-to-light-scattering method described will be able to provide high-resolution scattering imaging for various biomedical applications ranging from breast to brain imaging.
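For orientation, the standard diffusion-approximation model that such qPAT reconstructions build on has the general form below; the paper's exact formulation (boundary conditions, source terms) may differ.

```latex
% Photon diffusion equation coupled to the PAT initial-pressure map
% (general textbook form; details vary between qPAT implementations):
\[
  -\nabla\!\cdot\!\big(D(\mathbf{r})\,\nabla\Phi(\mathbf{r})\big)
  + \mu_a(\mathbf{r})\,\Phi(\mathbf{r}) = S(\mathbf{r}),
  \qquad D = \frac{1}{3\,(\mu_a + \mu_s')},
\]
\[
  p_0(\mathbf{r}) = \Gamma\,\mu_a(\mathbf{r})\,\Phi(\mathbf{r}),
\]
% where $\Phi$ is the photon fluence, $\mu_a$ and $\mu_s'$ the absorption
% and reduced scattering coefficients, $p_0$ the PAT-reconstructed initial
% pressure, and $\Gamma$ the Grueneisen parameter. The scattering
% coefficient enters only through $D$, which is why coupling PAT with the
% diffusion equation can expose it.
```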
Watkins, Daphne C.; Wharton, Tracy; Mitchell, Jamie A.; Matusko, Niki; Kales, Helen
2016-01-01
The purpose of this study was to explore the role of non-spousal family support on mental health among older, church-going African American men. The mixed methods objective was to employ a design that used existing qualitative and quantitative data to explore the interpretive context within which social and cultural experiences occur. Qualitative data (n = 21) were used to build a conceptual model that was tested using quantitative data (n = 401). Confirmatory factor analysis indicated an inverse association between non-spousal family support and distress. The comparative fit index, Tucker-Lewis fit index, and root mean square error of approximation indicated good model fit. This study offers unique methodological approaches to using existing, complementary data sources to understand the health of African American men.
The Use of Natural Language in Counseling Psychology Research.
ERIC Educational Resources Information Center
Polkinghorne, Donald E.
Compelling reasons for diversification of research methods in counseling psychology must exist in order to overcome natural academic inertia. The two most common arguments for implementing diversity--inapplicability of quantitative research to counseling practice and a need to derive research methods from the "new paradigm" of human…
Aligning Person-Centred Methods and Young People's Conceptualizations of Diversity
ERIC Educational Resources Information Center
Waite, Sue; Boyask, Ruth; Lawson, Hazel
2010-01-01
Many existing studies of diversity are concerned with social groups identified by externally determined factors, for example, ethnicity, gender, or educational attainment, and examine, either quantitatively or qualitatively, issues delineated by these. In evaluating methods used in previous research, we consider ways in which the adoption of…
Hattotuwagama, Channa K; Guan, Pingping; Doytchinova, Irini A; Flower, Darren R
2004-11-21
Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatic disciplines. Predictive computational models of peptide-major histocompatibility complex (MHC) binding affinity, based on QSAR technology, have now become a vital component of modern computational immunovaccinology. Historically, such approaches were built around semi-qualitative classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide-protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2-D(b), H2-K(b) and H2-K(k). As we show, in terms of reliability the resulting models represent a significant advance on existing methods. They can be used for the accurate prediction of T-cell epitopes and are freely available online (http://www.jenner.ac.uk/MHCPred).
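In its simplest form, the additive method models affinity as a constant plus independent, position-wise amino acid contributions; the sketch below shows this general idea, noting that the published models also include interaction terms and that details here are reconstructed, not taken from the abstract.

```latex
% Position-wise core of the additive method (simplest form; published
% variants add pairwise side-chain interaction terms):
\[
  -\log_{10}\mathrm{IC}_{50} \;=\; c \;+\; \sum_{i=1}^{9} P_i(a_i),
\]
% where $P_i(a_i)$ is the fitted contribution of amino acid $a_i$ at
% position $i$ of a 9-mer peptide and $c$ is a constant; the coefficients
% are typically fit by partial least squares on measured affinities.
```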
ERIC Educational Resources Information Center
Sondergeld, Toni A.; Koskey, Kristin L. K.
2011-01-01
An abundance of comprehensive school reform (CSR) literature exists illustrating CSRs are effective in improving student outcomes. However, much of this research reports on top-down reforms, focuses on academic outcomes, and uses quantitative methods alone. Many educational researchers have argued for the use of mixed methods for providing a…
A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses
Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert
2011-01-01
Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. college and university systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gaps between the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs and those reported in the literature, along with a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
Development of a Computer-Based Visualised Quantitative Learning System for Playing Violin Vibrato
ERIC Educational Resources Information Center
Ho, Tracy Kwei-Liang; Lin, Huann-shyang; Chen, Ching-Kong; Tsai, Jih-Long
2015-01-01
Traditional methods of teaching music are largely subjective, with the lack of objectivity being particularly challenging for violin students learning vibrato because of the existence of conflicting theories. By using a computer-based analysis method, this study found that maintaining temporal coincidence between the intensity peak and the target…
Race and Ethnicity in Research Methods. Sage Focus Editions, Volume 157.
ERIC Educational Resources Information Center
Stanfield, John H., II, Ed.; Dennis, Rutledge M., Ed.
The contributions in this volume examine the array of methods used in quantitative, qualitative, and comparative and historical research to show how research sensitive to ethnic issues can best be conducted. Rethinking and revising traditional methodologies and applying new ones can portray racial and ethnic issues as they really exist. The…
Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.
Gao, Yi; Bouix, Sylvain
2016-05-01
Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated and, as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets of brain structures.
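As an illustration of the basic ingredient, a Poisson equation can be solved on a voxelized shape by simple Jacobi iteration, as in this toy sketch; the paper's actual equations, boundary setup, and solver will differ.

```python
# Toy sketch: solve -laplacian(u) = 1 inside a voxelized shape with u = 0
# outside (Dirichlet), by Jacobi iteration on a unit grid. Illustrative only.
import numpy as np

shape = np.zeros((40, 40, 40), dtype=bool)
shape[10:30, 10:30, 10:30] = True          # toy volumetric shape

u = np.zeros(shape.shape)
for _ in range(500):                        # Jacobi update: u = avg(nbrs) + f/6
    avg = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) +
           np.roll(u, 1, 2) + np.roll(u, -1, 2)) / 6.0
    u = np.where(shape, avg + 1.0 / 6.0, 0.0)

print("max interior value:", u.max())       # largest solution value in the shape
```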
Visualizing dispersive features in 2D image via minimum gradient method
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Yu; Wang, Yan; Shen, Zhi-Xun
2017-07-24
Here, we developed a minimum-gradient-based method to track ridge features in a 2D image plot, a typical data representation in many momentum-resolved spectroscopy experiments. Through both analytic formulation and numerical simulation, we compare this new method with existing DC (distribution curve) based and higher-order-derivative based analyses. We find that the new method has good noise resilience and enhanced contrast, especially for weak intensity features, while preserving the quantitative local maxima information from the raw image. An algorithm is proposed to extract 1D ridge dispersion from the 2D image plot, and its quantitative application to angle-resolved photoemission spectroscopy measurements on high-temperature superconductors is demonstrated.
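One plausible reading of the minimum-gradient idea, as a toy sketch: a ridge is where intensity is high and the gradient magnitude is near its minimum, so an intensity-to-gradient-magnitude ratio map peaks on the band. The published algorithm is simplified here and the scoring map is an assumption.

```python
# Toy ridge extraction: per column, pick the row where intensity is high
# and the gradient magnitude is smallest. Scoring map is an assumption.
import numpy as np

def ridge_by_min_gradient(image):
    gy, gx = np.gradient(image.astype(float))
    gmag = np.sqrt(gx**2 + gy**2) + 1e-12   # avoid division by zero
    score = image / gmag                     # large where bright and flat (ridge)
    return score.argmax(axis=0)              # ridge row index per column

# Synthetic band: a Gaussian ridge following a cosine dispersion.
rows, cols = np.mgrid[0:100, 0:200]
center = 50 + 20 * np.cos(cols[0] / 30.0)
img = np.exp(-(rows - center) ** 2 / 20.0)
print(ridge_by_min_gradient(img)[:10])       # extracted dispersion (first columns)
```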
NASA Astrophysics Data System (ADS)
Choi, Mi-Ran; Hundertmark, Dirk; Lee, Young-Ran
2017-10-01
We prove a threshold phenomenon for the existence/non-existence of energy-minimizing solitary solutions of the diffraction management equation for strictly positive and zero average diffraction. Our methods allow for a large class of nonlinearities (they are, for example, allowed to change sign) and the weakest possible condition on the local diffraction profile: it only has to be locally integrable. The solutions are found as minimizers of a nonlinear and nonlocal variational problem which is translation invariant. There exists a critical threshold λcr such that minimizers for this variational problem exist if their power is greater than λcr, and no minimizers exist with power less than the critical threshold. We also give simple criteria for the finiteness and strict positivity of the critical threshold. Our proof of the existence of minimizers is rather direct and avoids the use of Lions' concentration compactness argument. Furthermore, we give precise quantitative lower bounds on the exponential decay rate of the diffraction management solitons, which confirm the physical heuristic prediction for the asymptotic decay rate. Moreover, for ground state solutions, these bounds give a quantitative lower bound for the divergence of the exponential decay rate in the limit of vanishing average diffraction. For zero average diffraction, we prove quantitative bounds which show that the solitons decay much faster than exponentially. Our results considerably extend and strengthen the results of Hundertmark and Lee [J. Nonlinear Sci. 22, 1-38 (2012) and Commun. Math. Phys. 309(1), 1-21 (2012)].
Quantitative photoacoustic elasticity and viscosity imaging for cirrhosis detection
NASA Astrophysics Data System (ADS)
Wang, Qian; Shi, Yujiao; Yang, Fen; Yang, Sihua
2018-05-01
Elasticity and viscosity assessments are essential for understanding and characterizing the physiological and pathological states of tissue. In this work, by establishing a photoacoustic (PA) shear wave model, an approach for quantitative PA elasticity imaging based on measurement of the rise time of the thermoelastic displacement was developed. Thus, using an existing PA viscoelasticity imaging method that features a phase delay measurement, quantitative PA elasticity imaging and viscosity imaging can be obtained in a simultaneous manner. The method was tested and validated by imaging viscoelastic agar phantoms prepared at different agar concentrations, and the imaging data were in good agreement with rheometry results. Ex vivo experiments on liver pathological models demonstrated the capability for cirrhosis detection, and the results were consistent with the corresponding histological results. This method expands the scope of conventional PA imaging and has potential to become an important alternative imaging modality.
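For context, PA viscoelasticity imaging is often analyzed with a Kelvin-Voigt tissue model, in which the phase delay between the modulated excitation and the PA response ties viscosity and elasticity together; the relation below is the commonly used form, and the paper's model may differ in detail.

```latex
% Kelvin--Voigt relation commonly assumed in PA viscoelasticity imaging:
\[
  \tan\delta \;=\; \frac{\omega\,\eta}{E},
\]
% where $\delta$ is the measured phase delay, $\omega$ the modulation
% frequency, $\eta$ the viscosity, and $E$ the elasticity. Measuring
% $\delta$ (the existing phase-delay method) and recovering $E$ from the
% thermoelastic rise time (this work) then yields $\eta$ as well.
```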
Influence analysis in quantitative trait loci detection.
Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko
2014-07-01
This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single-marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics.
ERIC Educational Resources Information Center
Montagne, Lisa
2012-01-01
This mixed methods study combined quantitative statistics and qualitative inquiry to determine if any differences exist between how students in face-to-face and online college English composition courses performed on and demonstrated knowledge of the California state curriculum standards, and to explore the online learning environment in this…
[Mixed methods research in public health: issues and illustration].
Guével, Marie-Renée; Pommier, Jeanine
2012-01-01
For many years, researchers in a range of fields have combined quantitative and qualitative methods. However, the combined use of quantitative and qualitative methods has only recently been conceptualized and defined as mixed methods research. Some authors have described the emerging field as a third methodological tradition (in addition to the qualitative and quantitative traditions). Mixed methods research combines different perspectives and facilitates the study of complex interventions or programs, particularly in public health, an area where interdisciplinarity is critical. However, the existing literature is primarily in English. By contrast, the literature in French remains limited. The purpose of this paper is to present the emergence of mixed methods research for francophone public health specialists. A literature review was conducted to identify the main characteristics of mixed methods research. The results provide an overall picture of the mixed methods approach through its history, definitions, and applications, and highlight the tools developed to clarify the approach (typologies) and to implement it (integration of results and quality standards). The tools highlighted in the literature review are illustrated by a study conducted in France. Mixed methods research opens new possibilities for examining complex research questions and provides relevant and promising opportunities for addressing current public health issues in France.
Quantitation of permethylated N-glycans through multiple-reaction monitoring (MRM) LC-MS/MS.
Zhou, Shiyue; Hu, Yunli; DeSantos-Garcia, Janie L; Mechref, Yehia
2015-04-01
The important biological roles of glycans and their implications in disease development and progression have created a demand for the development of sensitive quantitative glycomics methods. Quantitation of glycans existing at low abundance is still analytically challenging. In this study, an N-linked glycan quantitation method using multiple-reaction monitoring (MRM) on a triple quadrupole instrument was developed. The optimum normalized collision energy (CE) for N-glycans that are both sialylated and fucosylated was determined to be 30%, whereas it was found to be 35% for N-glycans that are either fucosylated or sialylated. The optimum CE for mannose and complex-type N-glycans was determined to be 35%. Additionally, the use of three transitions was shown to facilitate reliable quantitation. A total of 88 N-glycan compositions in human blood serum were quantified using this MRM approach. Reliable detection and quantitation of these glycans was achieved when the equivalent of 0.005 μL of blood serum was analyzed. Accordingly, N-glycans can be reliably quantified in as little as a hundredth of a microliter of pooled human blood serum, spanning a dynamic concentration range of three orders of magnitude. MRM was also effectively utilized to quantitatively compare the expression of N-glycans derived from brain-targeting breast carcinoma cells (MDA-MB-231BR) and metastatic breast cancer cells (MDA-MB-231). Thus, the described MRM method for permethylated N-glycans enables rapid and reliable identification and quantitation of glycans derived from glycoproteins purified or present in complex biological samples.
Future Issues and Perspectives in the Evaluation of Social Development.
ERIC Educational Resources Information Center
Marsden, David; Oakley, Peter
1991-01-01
An instrumental/technocratic approach to the evaluation of social development relies primarily on quantitative methods. An interpretive approach resists the claims to legitimacy and authority of "experts" and questions existing interpretations. The latter approach is characterized by cultural relativism and subjectivity. (SK)
ERIC Educational Resources Information Center
Howard, Matt C.
2018-01-01
Scale pretests analyze the suitability of individual scale items for further analysis, whether through judging their face validity, wording concerns, and/or other aspects. The current article reviews scale pretests, separated by qualitative and quantitative methods, in order to identify the differences, similarities, and even existence of the…
REVIEW OF QUANTITATIVE STANDARDS AND GUIDELINES FOR FUNGI IN INDOOR AIR
Exposure to fungal aerosols clearly causes human disease. However, methods for assessing exposure remain poorly understood, and guidelines for interpreting data are often contradictory. The purposes of this paper are to review and compare existing guidelines for indoor airborne...
Bao, Yijun; Gaylord, Thomas K
2016-11-01
Multifilter phase imaging with partially coherent light (MFPI-PC) is a promising new quantitative phase imaging method. However, the existing MFPI-PC method is based on the paraxial approximation. In the present work, an analytical nonparaxial partially coherent phase optical transfer function is derived. This enables the MFPI-PC to be extended to the realistic nonparaxial case. Simulations over a wide range of test phase objects as well as experimental measurements on a microlens array verify higher levels of imaging accuracy compared to the paraxial method. Unlike the paraxial version, the nonparaxial MFPI-PC with obliquity factor correction exhibits no systematic error. In addition, due to its analytical expression, the increase in computation time compared to the paraxial version is negligible.
Wu, Yuhua; Wang, Yulei; Li, Jun; Li, Wei; Zhang, Li; Li, Yunjing; Li, Xiaofei; Li, Jun; Zhu, Li; Wu, Gang
2014-01-01
The Cauliflower mosaic virus (CaMV) 35S promoter (P35S) is a commonly used target for detection of genetically modified organisms (GMOs). There are currently 24 reported detection methods targeting different regions of the P35S promoter. Initial assessment revealed that, due to the absence of primer binding sites in the P35S sequence, 19 of the 24 reported methods failed to detect P35S in MON88913 cotton, and two other methods could only be applied to certain GMOs. The remaining three reported methods were not suitable for measurement of P35S in some test events, because SNPs in the binding sites of the primer/probe resulted in abnormal amplification plots and poor linear regression parameters. In this study, we discovered a conserved region in the P35S sequence through sequencing of P35S promoters from multiple transgenic events, and developed new qualitative and quantitative detection systems targeting this conserved region. The qualitative PCR could detect the P35S promoter in 23 unique GMO events with high specificity and sensitivity. The quantitative method was suitable for measurement of the P35S promoter, exhibiting good agreement between the amount of template and Ct values for each test event. This study provides a general P35S screening method with greater coverage than existing methods.
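Claims such as "good agreement between the amount of template and Ct values" rest on standard qPCR calibration math: fit Ct against the log10 template amount and derive the amplification efficiency from the slope. A sketch with hypothetical numbers:

```python
# Standard qPCR calibration: Ct is linear in log10(template); a slope of
# about -3.32 corresponds to 100% efficiency. Values below are hypothetical.
import numpy as np

log_template = np.array([5, 4, 3, 2, 1])        # log10 copies per reaction
ct = np.array([18.1, 21.4, 24.8, 28.2, 31.5])   # measured Ct values

slope, intercept = np.polyfit(log_template, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # amplification efficiency
r = np.corrcoef(log_template, ct)[0, 1]
print(f"slope={slope:.2f}, efficiency={efficiency:.1%}, R^2={r**2:.4f}")
```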
Wirtz, M A; Strohmer, J
2016-06-01
In order to develop and evaluate interventions in rehabilitation research, a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to research objects. Accordingly, qualitative methods may offer particular benefits when researchers strive to identify and organize unknown aspects of information (inductive purpose). Quantitative research methods, in particular, require a high degree of standardization and transparency of the research process, and a clear definition of efficacy and effectiveness exists (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes, and quality criteria. Hence, specific aspects have to be considered if researchers aim to select or combine these approaches in order to ensure an optimal gain in knowledge.
Measuring water and sediment discharge from a road plot with a settling basin and tipping bucket
Thomas A. Black; Charles H. Luce
2013-01-01
A simple empirical method quantifies water and sediment production from a forest road surface, and is well suited for calibration and validation of road sediment models. To apply this quantitative method, the hydrologic technician installs bordered plots on existing typical road segments and measures coarse sediment production in a settling tank. When a tipping bucket...
Levitt, Joshua; Nitenson, Adam; Koyama, Suguru; Heijmans, Lonne; Curry, James; Ross, Jason T; Kamerling, Steven; Saab, Carl Y
2018-06-23
Electroencephalography (EEG) invariably contains extra-cranial artifacts that are commonly dealt with based on qualitative and subjective criteria. Failure to account for EEG artifacts compromises data interpretation. We have developed a quantitative and automated support vector machine (SVM)-based algorithm to accurately classify artifactual EEG epochs in awake rodent, canine, and human subjects. An embodiment of this method also enables the determination of 'eyes open/closed' states in human subjects. The SVM accuracy for artifact classification in humans, Sprague Dawley rats, and beagle dogs was 94.17%, 83.68%, and 85.37%, respectively, whereas 'eyes open/closed' states in humans were labeled with 88.60% accuracy. Each of these results was significantly higher than chance. Comparison with existing methods: Other existing methods, such as those dependent on independent component analysis, have not been tested in non-human subjects and require full EEG montages, whereas this method requires only single channels. We conclude that our EEG artifact detection algorithm provides a valid and practical solution to a common problem in the quantitative analysis and assessment of EEG in preclinical research settings across evolutionary spectra.
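A minimal sketch of an SVM-based artifact classifier in the spirit of the method above, using synthetic epochs and simple amplitude features; the published features and training data are not reproduced here.

```python
# Toy SVM artifact classifier: label high-amplitude synthetic epochs as
# artifact using variance, peak-to-peak, and line-length features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)

def epoch_features(epochs):
    var = epochs.var(axis=1)                          # epoch variance
    ptp = np.ptp(epochs, axis=1)                      # peak-to-peak amplitude
    ll = np.abs(np.diff(epochs, axis=1)).sum(axis=1)  # line length
    return np.column_stack([var, ptp, ll])

clean = rng.normal(0, 1, (200, 256))      # synthetic clean epochs
artifact = rng.normal(0, 4, (200, 256))   # synthetic high-amplitude artifacts
X = epoch_features(np.vstack([clean, artifact]))
y = np.r_[np.zeros(200), np.ones(200)]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```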
Quantitative optical scanning tests of complex microcircuits
NASA Technical Reports Server (NTRS)
Erickson, J. J.
1980-01-01
An approach for the development of the optical scanner as a screening inspection instrument for microcircuits involves comparing the quantitative differences in photoresponse images and then correlating them with electrical parameter differences in test devices. The existing optical scanner was modified so that the photoresponse data could be recorded and subsequently digitized. A method was devised for applying digital image processing techniques to the digitized photoresponse data in order to quantitatively compare the data. Electrical tests were performed and photoresponse images were recorded before and following life test intervals on two groups of test devices. Correlations were made between differences or changes in the electrical parameters of the test devices.
Quantitative ESD Guidelines for Charged Spacecraft Derived from the Physics of Discharges
NASA Technical Reports Server (NTRS)
Frederickson, A. R.
1992-01-01
Quantitative guidelines are proposed for Electrostatic Discharge (ESD) pulse shape on charged spacecraft. The guidelines are based on existing ground test data, and on a physical description of the pulsed discharge process. The guidelines are designed to predict pulse shape for surface charging and internal charging on a wide variety of spacecraft structures. The pulses depend on the area of the sample, its capacitance to ground, and the strength of the electric field in the vacuum adjacent to the charged surface. By knowing the pulse shape, current vs. time, one can determine if nearby circuits are threatened by the pulse. The quantitative guidelines might be used to estimate the level of threat to an existing spacecraft, or to redesign a spacecraft to reduce its pulses to a known safe level. The experiments which provide the data and the physics that allow one to interpret the data will be discussed, culminating in examples of how to predict pulse shape/size. This method has been used, but not confirmed, on several spacecraft.
Cardiac Rehabilitation Online Pilot: Extending Reach of Cardiac Rehabilitation.
Higgins, Rosemary O; Rogerson, Michelle; Murphy, Barbara M; Navaratnam, Hema; Butler, Michael V; Barker, Lauren; Turner, Alyna; Lefkovits, Jeffrey; Jackson, Alun C
While cardiac rehabilitation (CR) is recommended for all patients after an acute cardiac event, limitations exist in its reach. The purpose of the current study was to develop and pilot a flexible online CR program, "Help Yourself Online," based on self-management principles. The program was designed as an alternative to group-based CR as well as to complement traditional CR, and was based on existing self-management resources developed previously by the Heart Research Centre. Twenty-one patients admitted to Cabrini Health for an acute cardiac event were recruited to test the program. The program was evaluated using qualitative and quantitative methods. Quantitative results demonstrated that patients believed the program would assist them in their self-management. Qualitative evaluation, using focus group and interview methods with 15 patients, showed that patients perceived the online CR approach to be a useful instrument for self-management. Broader implications of the data include the acceptability of the intervention, the timing of intervention delivery, and patients' desire for additional online community support.
Zhou, Yan; Cao, Hui
2013-01-01
We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis that is robust against component information loss. Raman spectral signals with low analyte concentration correlations were selected and used as substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using one example: a data set recorded from an experiment on analyte concentration determination using Raman spectroscopy. A 2-fold cross-validation with a Venetian blinds strategy was used to evaluate the predictive power of the proposed method. One-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and existing methods. Results indicated that the proposed method is effective at increasing the robustness of the traditional CLS model against component information loss, and its predictive power is comparable to that of PLS or PCR.
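To make the CLS/ACLS idea concrete, here is a minimal numerical sketch: spectra are modeled as concentrations times pure-component signals, and an extra surrogate column is appended to absorb an unmodeled component. The surrogate used here (a principal component of the CLS residual) is a stand-in, not the paper's RMSECV-guided signal-selection rule, and all numbers are synthetic.

```python
# Toy sketch of classical least squares (CLS) calibration and the augmentation
# idea: spectra S are modeled as concentrations C times pure-component signals K.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_channels = 30, 200
K_true = rng.random((2, n_channels))          # two pure-component "spectra"
C_known = rng.random((n_samples, 1))          # known analyte concentrations
C_unknown = rng.random((n_samples, 1))        # unmodeled component (information loss)
S = (C_known @ K_true[:1] + C_unknown @ K_true[1:]
     + 0.01 * rng.standard_normal((n_samples, n_channels)))

# Plain CLS with only the known column: biased because C_unknown is omitted.
K_cls, *_ = np.linalg.lstsq(C_known, S, rcond=None)

# ACLS-style augmentation: append a surrogate column (here, scores on the
# residual's leading principal component) to stand in for the missing component.
resid = S - C_known @ K_cls
u, sv, vt = np.linalg.svd(resid, full_matrices=False)
C_aug = np.hstack([C_known, u[:, :1] * sv[0]])
K_acls, *_ = np.linalg.lstsq(C_aug, S, rcond=None)

# Predict the analyte concentration of a new spectrum from the augmented model.
s_new = 0.6 * K_true[0] + 0.3 * K_true[1]
c_hat = s_new @ np.linalg.pinv(K_acls)
print("estimated [analyte, surrogate]:", c_hat)
```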
Remily-Wood, Elizabeth R.; Benson, Kaaron; Baz, Rachid C.; Chen, Y. Ann; Hussein, Mohamad; Hartley-Brown, Monique A.; Sprung, Robert W.; Perez, Brianna; Liu, Richard Z.; Yoder, Sean; Teer, Jamie; Eschrich, Steven A.; Koomen, John M.
2014-01-01
Purpose: Quantitative mass spectrometry assays for immunoglobulins (Igs) are compared with existing clinical methods in samples from patients with plasma cell dyscrasias, e.g. multiple myeloma. Experimental design: Using LC-MS/MS data, Ig constant region peptides and transitions were selected for liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM). Quantitative assays were used to assess Igs in serum from 83 patients. Results: LC-MRM assays quantify serum levels of Igs and their isoforms (IgG1-4, IgA1-2, IgM, IgD, and IgE, as well as kappa (κ) and lambda (λ) light chains). LC-MRM quantification has been applied to single samples from a patient cohort and a longitudinal study of an IgE patient undergoing treatment, to enable comparison with existing clinical methods. Proof-of-concept data for defining and monitoring variable region peptides are provided using the H929 multiple myeloma cell line and two MM patients. Conclusions and Clinical Relevance: LC-MRM assays targeting constant region peptides determine the type and isoform of the involved immunoglobulin and quantify its expression; the LC-MRM approach has improved sensitivity compared with the current clinical method, but slightly higher interassay variability. Detection of variable region peptides is a promising way to improve Ig quantification, which could produce a dramatic increase in sensitivity over existing methods, and could further complement current clinical techniques. PMID:24723328
Remily-Wood, Elizabeth R; Benson, Kaaron; Baz, Rachid C; Chen, Y Ann; Hussein, Mohamad; Hartley-Brown, Monique A; Sprung, Robert W; Perez, Brianna; Liu, Richard Z; Yoder, Sean J; Teer, Jamie K; Eschrich, Steven A; Koomen, John M
2014-10-01
Quantitative MS assays for Igs are compared with existing clinical methods in samples from patients with plasma cell dyscrasias, for example, multiple myeloma (MM). Using LC-MS/MS data, Ig constant region peptides, and transitions were selected for LC-MRM MS. Quantitative assays were used to assess Igs in serum from 83 patients. RNA sequencing and peptide-based LC-MRM are used to define peptides for quantification of the disease-specific Ig. LC-MRM assays quantify serum levels of Igs and their isoforms (IgG1-4, IgA1-2, IgM, IgD, and IgE, as well as kappa (κ) and lambda (λ) light chains). LC-MRM quantification has been applied to single samples from a patient cohort and a longitudinal study of an IgE patient undergoing treatment, to enable comparison with existing clinical methods. Proof-of-concept data for defining and monitoring variable region peptides are provided using the H929 MM cell line and two MM patients. LC-MRM assays targeting constant region peptides determine the type and isoform of the involved Ig and quantify its expression; the LC-MRM approach has improved sensitivity compared with the current clinical method, but slightly higher inter-assay variability. Detection of variable region peptides is a promising way to improve Ig quantification, which could produce a dramatic increase in sensitivity over existing methods, and could further complement current clinical techniques. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Reisdorph, Nichole; Armstrong, Michael; Powell, Roger; Quinn, Kevin; Legg, Kevin; Leung, Donald; Reisdorph, Rick
2018-05-01
Previous work from our laboratories utilized a novel skin taping method and mass spectrometry-based proteomics to discover clinical biomarkers of skin conditions; these included atopic dermatitis, Staphylococcus aureus colonization, and eczema herpeticum. While suitable for discovery purposes, semi-quantitative proteomics is generally time-consuming and expensive. Furthermore, depending on the method used, discovery-based proteomics can result in high variation and inadequate sensitivity to detect low-abundance peptides. Therefore, we strove to develop a rapid, sensitive, and reproducible method to quantitate disease-related proteins from skin tapings. We utilized isotopically-labeled peptides and tandem mass spectrometry to obtain absolute quantitation values on 14 peptides from 7 proteins; these proteins had shown previous importance in skin disease. The method demonstrated good reproducibility, dynamic range, and linearity (R² > 0.993) when n = 3 standards were analyzed across 0.05-2.5 pmol. The method was used to determine whether differences exist between skin proteins in a small group of atopic versus non-atopic individuals (n = 12). While only minimal differences were found, peptides were detected in all samples and exhibited good correlation between peptides for 5 of the 7 proteins (R² = 0.71-0.98). This method can be applied to larger cohorts to further establish the relationships of these proteins to skin disease. Copyright © 2017. Published by Elsevier B.V.
Quantitation of Permethylated N-Glycans through Multiple-Reaction Monitoring (MRM) LC-MS/MS
Zhou, Shiyue; Hu, Yunli; DeSantos-Garcia, Janie L.; Mechref, Yehia
2015-01-01
The important biological roles of glycans and their implications in disease development and progression have created a demand for the development of sensitive quantitative glycomics methods. Quantitation of glycans existing at low abundance is still analytically challenging. In this study, an N-linked glycan quantitation method using multiple reaction monitoring (MRM) on a triple quadrupole instrument was developed. The optimum normalized collision energy (CE) was determined to be 30% for N-glycan structures that are both sialylated and fucosylated, and 35% for structures that are either fucosylated or sialylated. The optimum CE for mannose and complex type N-glycan structures was determined to be 35%. Additionally, the use of three transitions was shown to facilitate reliable quantitation. A total of 88 N-glycan structures in human blood serum were quantified using this MRM approach. Reliable detection and quantitation of these structures was achieved when the equivalent of 0.005 μL of blood serum was analyzed. Accordingly, N-glycans can be reliably quantified in pooled human blood serum at the hundredth-of-a-microliter level, spanning a dynamic concentration range of three orders of magnitude. MRM was also effectively utilized to quantitatively compare the expression of N-glycans derived from brain-targeting breast carcinoma cells (MDA-MB-231BR) and metastatic breast cancer cells (MDA-MB-231). Thus, the described MRM method for permethylated N-glycan structures enables rapid and reliable identification and quantitation of glycans derived from glycoproteins purified from or present in complex biological samples. PMID:25698222
A PROPOSED FRAMEWORK FOR ASSESSING RISK FROM LESS-THAN-LIFETIME EXPOSURES TO CARCINOGENS
Quantitative cancer risk assessment methods have been developed for daily, lifetime human exposures, but not for exposures that are less than lifetime. Few examples for less-than-lifetime exposures exist in the published literature. To move cancer risk assessment beyond reliance ...
Development of a Patient-Centered Antipsychotic Medication Adherence Intervention
ERIC Educational Resources Information Center
Pyne, Jeffrey M.; Fischer, Ellen P.; Gilmore, LaNissa; McSweeney, Jean C.; Stewart, Katharine E.; Mittal, Dinesh; Bost, James E.; Valenstein, Marcia
2014-01-01
Objective: A substantial gap exists between patients and their mental health providers about patient's perceived barriers, facilitators, and motivators (BFMs) for taking antipsychotic medications. This article describes how we used an intervention mapping (IM) framework coupled with qualitative and quantitative item-selection methods to…
ERIC Educational Resources Information Center
Bond, Sarah
2017-01-01
The purpose of this mixed-methods study was to identify and describe what correlations, if any, exist between the composite Phelps Kindergarten Readiness Scale (PKRS) score, the visual-perceptual subtest of the PKRS, and reading achievement by the end of grade one. The quantitative data used in this study were the PKRS scores from 421 students…
Evaluation of methods for determining hardware projected life
NASA Technical Reports Server (NTRS)
1971-01-01
An investigation of existing methods of predicting hardware life is summarized by reviewing programs having long life requirements, current research efforts on long life problems, and technical papers reporting work on life prediction techniques. The results indicate that there are no accurate quantitative means to predict hardware life for system-level hardware. The effectiveness of test programs and the causes of hardware failures are also considered.
Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C
2015-05-01
Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.
Clark, A M
1998-06-01
Critiques of logical positivism form the foundation for a significant number of nursing research papers, with the philosophy being inappropriately deemed synonymous with empirical method. These critiques, which frequently propose an alternative to the methods identified with the quantitative paradigm, often rest on a poor foundation. This paper highlights an alternative philosophy to positivism which can also underpin empirical inquiry: post-positivism. Post-positivism is contrasted with positivism, which is presented as an outmoded and rejected philosophy that should cease to significantly shape inquiry. Though some acknowledgement of post-positivism has occurred in the nursing literature, it has yet to permeate mainstream nursing research, and many still base their arguments on a positivistic view of science. Through a better understanding of post-positivism, and a greater focus on explicating the philosophical assumptions underpinning all research methods, the distinctions long perceived to exist between qualitative and quantitative methodologies can be confined to the past. Rather, methods will be selected solely on the basis of the nature of the research questions.
Determining open cluster membership. A Bayesian framework for quantitative member classification
NASA Astrophysics Data System (ADS)
Stott, Jonathan J.
2018-01-01
Aims: My goal is to develop a quantitative algorithm for assessing open cluster membership probabilities. The algorithm is designed to work with single-epoch observations. In its simplest form, only one set of program images and one set of reference images are required. Methods: The algorithm is based on a two-stage joint astrometric and photometric assessment of cluster membership probabilities. The probabilities were computed within a Bayesian framework using any available prior information. Where possible, the algorithm emphasizes simplicity over mathematical sophistication. Results: The algorithm was implemented and tested against three observational fields using published survey data. M 67 and NGC 654 were selected as cluster examples while a third, cluster-free, field was used for the final test data set. The algorithm shows good quantitative agreement with the existing surveys and has a false-positive rate significantly lower than the astrometric or photometric methods used individually.
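A schematic of the two-ingredient Bayesian combination the abstract describes: astrometric and photometric likelihoods are multiplied and weighed against a field population via Bayes' rule. The Gaussian densities, their parameters, and the prior below are invented for illustration and are not taken from the paper.

```python
# Schematic joint astrometric + photometric membership probability.
# The cluster/field densities and the prior are illustrative assumptions.
from scipy.stats import norm

def membership_prob(pm, color, prior=0.3):
    # Likelihoods under the "cluster" and "field" hypotheses: a tight
    # Gaussian at the cluster locus vs. a broad field distribution.
    L_cl = norm.pdf(pm, loc=5.0, scale=0.5) * norm.pdf(color, loc=0.8, scale=0.1)
    L_fd = norm.pdf(pm, loc=0.0, scale=5.0) * norm.pdf(color, loc=0.6, scale=0.5)
    # Bayes' rule: posterior probability of cluster membership.
    return prior * L_cl / (prior * L_cl + (1 - prior) * L_fd)

print(membership_prob(pm=4.8, color=0.82))   # near the cluster locus -> high prob
print(membership_prob(pm=-2.0, color=0.3))   # field-like star        -> low prob
```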
Hattotuwagama, Channa K; Doytchinova, Irini A; Flower, Darren R
2007-01-01
Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC)-binding affinity based on QSAR technology have now become important components of modern computational immunovaccinology. Historically, such approaches have been built around semiqualitative, classification methods, but these are now giving way to quantitative regression methods. We review three methods: a 2D-QSAR additive-partial least squares (PLS) method and a 3D-QSAR comparative molecular similarity index analysis (CoMSIA) method, both of which can identify the sequence dependence of peptide-binding specificity for various class I MHC alleles from the reported binding affinities (IC50) of peptide sets, and an iterative self-consistent (ISC) PLS-based additive method, a recently developed extension of the additive method for the affinity prediction of class II peptides. The QSAR methods presented here have established themselves as immunoinformatic techniques complementary to existing methodology, useful in the quantitative prediction of binding affinity: current methods for the in silico identification of T-cell epitopes (which form the basis of many vaccines, diagnostics, and reagents) rely on the accurate computational prediction of peptide-MHC affinity. We have reviewed various human and mouse class I and class II allele models. Studied alleles comprise HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3101, HLA-A*6801, HLA-A*6802, HLA-B*3501, H2-K(k), H2-K(b), H2-D(b), HLA-DRB1*0101, HLA-DRB1*0401, HLA-DRB1*0701, I-A(b), I-A(d), I-A(k), I-A(S), I-E(d), and I-E(k). In this chapter we provide a step-by-step guide to deriving these models and assessing their reliability; the resulting models represent an advance on existing methods. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, are freely available online at http://www.jenner.ac.uk/MHCPred.
Kageyama, Shinji; Shinmura, Kazuya; Yamamoto, Hiroko; Goto, Masanori; Suzuki, Koichi; Tanioka, Fumihiko; Tsuneyoshi, Toshihiro; Sugimura, Haruhiko
2008-04-01
The PCR-based DNA fingerprinting method called the methylation-sensitive amplified fragment length polymorphism (MS-AFLP) analysis is used for genome-wide scanning of methylation status. In this study, we developed a method of fluorescence-labeled MS-AFLP (FL-MS-AFLP) analysis by applying a fluorescence-labeled primer and fluorescence-detecting electrophoresis apparatus to the existing method of MS-AFLP analysis. The FL-MS-AFLP analysis enables quantitative evaluation of more than 350 random CpG loci per run. It was shown to allow evaluation of the differences in methylation level of blood DNA of gastric cancer patients and evaluation of hypermethylation and hypomethylation in DNA from gastric cancer tissue in comparison with adjacent non-cancerous tissue.
Salvador, Arnaud; Dubreuil, Didier; Denouel, Jannick; Millerioux, L
2005-06-25
A sensitive LC-MS-MS assay for the quantitative determination of bromocriptine has been developed, validated, and is described in this work. The assay involved extraction of the analyte from 1 ml of human plasma using solid phase extraction on Oasis MCX cartridges. Chromatography was performed on a Symmetry C18 (2.1 mm x 100 mm, 3.5 microm) column using a mobile phase consisting of 25:75:0.1 acetonitrile-water-formic acid at a flow rate of 250 microl/min. The linearity was within the concentration range of 2-500 pg/ml. The lower limit of quantification was 2 pg/ml. This method has been demonstrated to be an improvement over existing methods due to its greater sensitivity and specificity.
Quantitative interpretation of Great Lakes remote sensing data
NASA Technical Reports Server (NTRS)
Shook, D. F.; Salzman, J.; Svehla, R. A.; Gedney, R. T.
1980-01-01
The paper discusses the quantitative interpretation of Great Lakes remote sensing water quality data. Remote sensing using color information must take into account (1) the existence of many different organic and inorganic species throughout the Great Lakes, (2) the occurrence of a mixture of species in most locations, and (3) spatial variations in types and concentration of species. The radiative transfer model provides a potential method for an orderly analysis of remote sensing data and a physical basis for developing quantitative algorithms. Predictions and field measurements of volume reflectances are presented which show the advantage of using a radiative transfer model. Spectral absorptance and backscattering coefficients for two inorganic sediments are reported.
Teacher Burnout: A Laughing Matter
ERIC Educational Resources Information Center
McKenzie, Kemberly Elaine
2009-01-01
This study employed a quantitative research method and examined the relationship among teacher burnout, job-related stress and humor coping styles in 306 high school teachers from the south suburbs of Chicago, Illinois. In particular, this study sought to determine if a relationship existed among the three variables by examining if teacher burnout…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Hao; Zhang, Guifu; Zhao, Kun
2016-10-20
A hybrid method of combining linear programming (LP) and physical constraints is developed to estimate specific differential phase (KDP) and to improve rain estimation. Moreover, the hybrid KDP estimator and the existing estimators of LP, least squares fitting, and a self-consistent relation of polarimetric radar variables are evaluated and compared using simulated data. Our simulation results indicate the new estimator's superiority, particularly in regions where backscattering phase (δhv) dominates. Further, a quantitative comparison between auto-weather-station rain-gauge observations and KDP-based radar rain estimates for a Meiyu event also demonstrates the superiority of the hybrid KDP estimator over existing methods.
The use of virtual environments for percentage view analysis.
Schofield, Damian; Cox, Christopher J B
2005-09-01
It is recognised that Visual Impact Assessment (VIA), unlike many other aspects of Environmental Impact Assessment (EIA), relies less upon measurement than upon experience and judgement. Hence, a more structured and consistent approach towards VIA is needed to reduce bias and subjectivity. For proposed developments, there are very few quantitative techniques for the evaluation of visibility, and these existing methods can be highly inaccurate and time consuming. Percentage view changes are one of the few quantitative techniques, and the use of computer technology can reduce the inaccuracy and the time spent evaluating the visibility of either existing or proposed developments. For over 10 years, research work undertaken by the authors at the University of Nottingham has employed Computer Graphics (CG) and Virtual Reality (VR) in civilian and industrial contexts for environmental planning, design visualisation, accident reconstruction, risk analysis, data visualisation and training simulators. This paper describes a method to quantitatively assess the visual impact of proposed developments on the landscape using CG techniques. The method allows the determination of accurate percentage view changes using a computer-generated model of the environment and specialist software developed at the University of Nottingham. The principles are easy to understand, and therefore planners, authorisation agencies and members of the public can use and understand the results. A case study demonstrates the application and capabilities of the technology.
Musante, CJ; Abernethy, DR; Allerheiligen, SR; Lauffenburger, DA
2016-01-01
Quantitative Systems Pharmacology (QSP) is experiencing increased application in the drug discovery and development process. Like its older sibling, systems biology, the QSP field comprises a mix of established disciplines and methods, from molecular biology to engineering to pharmacometrics [1]. As a result, there exist critical segments of the discipline that differ dramatically in approach, and a need to bring these groups together toward a common goal. PMID:27639191
Brown, K M; Elliott, S J; Leatherdale, S T; Robertson-Wilson, J
2015-12-01
The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing rigour in quantitative and qualitative research, there is poor consensus regarding rigour in mixed methods. Using the empirical example of school-based obesity interventions, this methodological review examined how mixed methods have been used and reported, and how rigour has been addressed. Twenty-three peer-reviewed mixed methods studies were identified through a systematic search of five databases and appraised using the guidelines for Good Reporting of a Mixed Methods Study. In general, more detailed description of data collection and analysis, integration, inferences and justifying the use of mixed methods is needed. Additionally, improved reporting of methodological rigour is required. This review calls for increased discussion of practical techniques for establishing rigour in mixed methods research, beyond those for quantitative and qualitative criteria individually. A guide for reporting mixed methods research in population health should be developed to improve the reporting quality of mixed methods studies. Through improved reporting, mixed methods can provide strong evidence to inform policy and practice. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Quantitative PCR detection of Batrachochytrium dendrobatidis DNA from sediments and water
Kirshtein, Julie D.; Anderson, Chauncey W.; Wood, J.S.; Longcore, Joyce E.; Voytek, Mary A.
2007-01-01
The fungal pathogen Batrachochytrium dendrobatidis (Bd) causes chytridiomycosis, a disease implicated in amphibian declines on 5 continents. Polymerase chain reaction (PCR) primer sets exist with which amphibians can be tested for this disease, and advances in sampling techniques allow non-invasive testing of animals. We developed filtering and PCR-based quantitative methods by modifying existing PCR assays to detect Bd DNA in water and sediments, without the need for testing amphibians; we tested the methods at 4 field sites. The SYBR-based assay using Boyle primers (SYBR/Boyle assay) and the Taqman-based assay using Wood primers performed similarly with samples generated in the laboratory (Bd-spiked filters), but the SYBR/Boyle assay detected Bd DNA in more field samples. We detected Bd DNA in water from 3 of 4 sites tested, including one pond historically negative for chytridiomycosis. Zoospore equivalents in sampled water ranged from 19 to 454 l⁻¹ (the nominal detection limit is 10 DNA copies, or about 0.06 zoospore). We did not detect DNA of Bd from sediments collected at any sites. Our filtering and amplification methods provide a new tool to investigate critical aspects of Bd in the environment. © Inter-Research 2007.
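As a worked example of the quantitative step in assays like these, the sketch below converts a qPCR quantification cycle to DNA copies via a standard curve and scales to zoospore equivalents per liter. The curve parameters, filtered volume, and extract fraction are assumptions; only the 10-copies-to-0.06-zoospore conversion comes from the abstract.

```python
# Illustrative Cq-to-copies conversion via an assumed qPCR standard curve,
# then scaling to zoospore equivalents per liter of filtered water.

# Standard curve: Cq = slope * log10(copies) + intercept (assumed values).
slope, intercept = -3.32, 38.0

def copies_from_cq(cq):
    return 10 ** ((cq - intercept) / slope)

cq_sample = 33.1              # measured quantification cycle (toy value)
vol_filtered_l = 0.5          # liters of water filtered (assumption)
frac_extract_in_pcr = 0.05    # fraction of the extract in the reaction (assumption)

copies = copies_from_cq(cq_sample)
zoospores = copies * 0.006    # 10 DNA copies ~ 0.06 zoospore, per the abstract
print("zoospore equivalents per liter:",
      zoospores / frac_extract_in_pcr / vol_filtered_l)
```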
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
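For readers unfamiliar with the underlying arithmetic, a minimal two-point LOD computation for phase-known meioses is sketched below. This is textbook linkage math on toy counts, not the S.A.G.E. LODLINK implementation.

```python
# Two-point LOD score: with k recombinants out of n informative meioses,
# LOD(theta) = log10[ theta^k (1-theta)^(n-k) / 0.5^n ].
import numpy as np

def lod(theta, n, k):
    return np.log10(theta**k * (1 - theta)**(n - k) / 0.5**n)

n, k = 20, 2                         # 2 recombinants in 20 meioses (toy data)
thetas = np.linspace(0.01, 0.4, 40)  # candidate recombination fractions
scores = lod(thetas, n, k)
best = thetas[np.argmax(scores)]
print(f"max LOD {scores.max():.2f} at theta = {best:.2f}")  # LOD > 3 suggests linkage
```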
NASA Astrophysics Data System (ADS)
Liang, Li; Takaaki, Ohkubo; Guang-hui, Li
2018-03-01
In recent years, earthquakes have occurred frequently, and the seismic performance of existing school buildings has become particularly important. The main method for improving the seismic resistance of existing buildings is reinforcement; however, few effective methods exist to evaluate the effect of that reinforcement. Ambient vibration measurements were conducted before and after seismic retrofitting using a wireless measurement system, and the changes in vibration characteristics were compared. The changes in acceleration response spectra, natural periods, and vibration modes indicate that the wireless vibration measurement system can be effectively applied to evaluate the effect of seismic retrofitting. Although the method can evaluate the retrofitting effect qualitatively, quantitative evaluation remains difficult at this stage.
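A minimal sketch of the evaluation idea, on synthetic acceleration records: the natural period is read off as the FFT peak of ambient vibration measured before and after retrofitting, with added stiffness showing up as a shorter period. Sampling rate, record length, and frequencies are placeholders, not measured values.

```python
# Estimate the natural period from ambient vibration via the FFT peak,
# before and after retrofit. Records here are synthetic stand-ins.
import numpy as np

fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(5)

def record(f_nat):
    """Toy ambient record: structural response at f_nat plus noise."""
    return np.sin(2 * np.pi * f_nat * t) + 0.5 * rng.standard_normal(t.size)

def natural_period(acc):
    f = np.fft.rfftfreq(acc.size, 1 / fs)
    p = np.abs(np.fft.rfft(acc))
    p[f < 0.2] = 0.0                      # ignore drift below 0.2 Hz
    return 1.0 / f[np.argmax(p)]

print("before retrofit:", natural_period(record(1.2)), "s")
print("after retrofit: ", natural_period(record(1.5)), "s")  # stiffer -> shorter period
```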
Fuzzy method of recognition of high molecular substances in evidence-based biology
NASA Astrophysics Data System (ADS)
Olevskyi, V. I.; Smetanin, V. T.; Olevska, Yu. B.
2017-10-01
Nowadays, modern requirements for reliable results and high-quality research put mathematical methods of data analysis at the forefront. Because of this, evidence-based methods of processing experimental data have become increasingly popular in the biological sciences and medicine. Their basis is meta-analysis, a method of quantitative generalization of a large number of randomized trials addressing the same specific problem, which are often contradictory and performed by different authors. It allows identifying the most important trends and quantitative indicators of the data, verification of advanced hypotheses, and discovery of new effects in the population genotype. The existing methods of recognizing high-molecular substances by gel electrophoresis of proteins under denaturing conditions are based on approximate comparisons of the contrast of electrophoregrams with a standard solution of known substances. We propose a fuzzy method for modeling experimental data to increase the accuracy and validity of findings in the detection of new proteins.
Identifying core competencies for public health epidemiologists.
Bondy, Susan J; Johnson, Ian; Cole, Donald C; Bercovitz, Kim
2008-01-01
Public health authorities have prioritized the identification of competencies, yet little empirical data exist to support decisions on competency selection among particular disciplines. We sought perspectives on important competencies among epidemiologists familiar with or practicing in public health settings (local to national). Using a sequential, qualitative-quantitative mixed method design, we conducted key informant interviews with 12 public health practitioners familiar with front-line epidemiologists' practice, followed by a web-based survey of members of a provincial association of public health epidemiologists (90 respondents of 155 eligible) and a consensus workshop. Competency statements were drawn from existing core competency lists and those identified by key informants, and ranked by extent of agreement in importance for entry-level practitioners. Competencies in quantitative methods and analysis, critical appraisal of scientific evidence and knowledge transfer of scientific data to other members of the public health team were all regarded as very important for public health epidemiologists. Epidemiologist competencies focused on the provision, interpretation and 'translation' of evidence to inform decision-making by other public health professionals. Considerable tension existed around some potential competency items, particularly in the areas of more advanced database and data-analytic skills. Empirical data can inform discussions of discipline-specific competencies as one input to decisions about competencies appropriate for epidemiologists in the public health workforce.
León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.
2013-01-01
The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921
Hong, Quan Nha; Pluye, Pierre; Bujold, Mathieu; Wassef, Maggy
2017-03-23
Systematic reviews of qualitative and quantitative evidence can provide a rich understanding of complex phenomena. This type of review is increasingly popular, has been used to provide a landscape of existing knowledge, and addresses the types of questions not usually covered in reviews relying solely on either quantitative or qualitative evidence. Although several typologies of synthesis designs have been developed, none have been tested on a large sample of reviews. The aim of this review of reviews was to identify and develop a typology of synthesis designs and methods that have been used and to propose strategies for synthesizing qualitative and quantitative evidence. A review of systematic reviews combining qualitative and quantitative evidence was performed. Six databases were searched from inception to December 2014. Reviews were included if they were systematic reviews combining qualitative and quantitative evidence. The included reviews were analyzed according to three concepts of synthesis processes: (a) synthesis methods, (b) sequence of data synthesis, and (c) integration of data and synthesis results. A total of 459 reviews were included. The analysis of this literature highlighted a lack of transparency in reporting how evidence was synthesized and a lack of consistency in the terminology used. Two main types of synthesis designs were identified: convergent and sequential synthesis designs. Within the convergent synthesis design, three subtypes were found: (a) data-based convergent synthesis design, where qualitative and quantitative evidence is analyzed together using the same synthesis method, (b) results-based convergent synthesis design, where qualitative and quantitative evidence is analyzed separately using different synthesis methods and results of both syntheses are integrated during a final synthesis, and (c) parallel-results convergent synthesis design consisting of independent syntheses of qualitative and quantitative evidence and an interpretation of the results in the discussion. Performing systematic reviews of qualitative and quantitative evidence is challenging because of the multiple synthesis options. The findings provide guidance on how to combine qualitative and quantitative evidence. Also, recommendations are made to improve the conducting and reporting of this type of review.
Biurrun Manresa, José A.; Arguissain, Federico G.; Medina Redondo, David E.; Mørch, Carsten D.; Andersen, Ole K.
2015-01-01
The agreement between humans and algorithms on whether an event-related potential (ERP) is present or not and the level of variation in the estimated values of its relevant features are largely unknown. Thus, the aim of this study was to determine the categorical and quantitative agreement between manual and automated methods for single-trial detection and estimation of ERP features. To this end, ERPs were elicited in sixteen healthy volunteers using electrical stimulation at graded intensities below and above the nociceptive withdrawal reflex threshold. Presence/absence of an ERP peak (categorical outcome) and its amplitude and latency (quantitative outcome) in each single-trial were evaluated independently by two human observers and two automated algorithms taken from existing literature. Categorical agreement was assessed using percentage positive and negative agreement and Cohen’s κ, whereas quantitative agreement was evaluated using Bland-Altman analysis and the coefficient of variation. Typical values for the categorical agreement between manual and automated methods were derived, as well as reference values for the average and maximum differences that can be expected if one method is used instead of the others. Results showed that the human observers presented the highest categorical and quantitative agreement, and there were significantly large differences between detection and estimation of quantitative features among methods. In conclusion, substantial care should be taken in the selection of the detection/estimation approach, since factors like stimulation intensity and expected number of trials with/without response can play a significant role in the outcome of a study. PMID:26258532
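The two agreement layers described (categorical and quantitative) correspond to standard computations, sketched here on synthetic ratings: Cohen's kappa for presence/absence calls and Bland-Altman bias with 95% limits of agreement for amplitude estimates. The data and effect sizes below are placeholders.

```python
# Cohen's kappa for categorical calls and Bland-Altman limits of agreement
# for quantitative estimates, on synthetic observer/algorithm data.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(3)
human = rng.integers(0, 2, 100)                           # observer present/absent calls
algo = np.where(rng.random(100) < 0.9, human, 1 - human)  # algorithm agrees ~90%
print("Cohen's kappa:", cohen_kappa_score(human, algo))

amp_h = rng.normal(20, 5, 100)                  # human amplitude estimates (uV)
amp_a = amp_h + rng.normal(1.0, 2.0, 100)       # algorithm: small bias plus noise
diff = amp_a - amp_h
bias, sd = diff.mean(), diff.std(ddof=1)
print(f"Bland-Altman bias {bias:.2f} uV, "
      f"limits {bias - 1.96 * sd:.2f} to {bias + 1.96 * sd:.2f}")
```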
Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Shin, Min-Ki; Moon, Gui-Im; Hong, Jin-Hwan; Kim, Hae-Yeong
2017-07-01
One novel standard reference plasmid, namely pUC-RICE5, was constructed as a positive control and calibrator for event-specific qualitative and quantitative detection of genetically modified (GM) rice (Bt63, Kemingdao1, Kefeng6, Kefeng8, and LLRice62). pUC-RICE5 contained fragments of a rice-specific endogenous reference gene (sucrose phosphate synthase) as well as the five GM rice events. An existing qualitative PCR assay approach was modified using pUC-RICE5 to create a quantitative method with limits of detection correlating to approximately 1-10 copies of rice haploid genomes. In this quantitative PCR assay, the square regression coefficients ranged from 0.993 to 1.000. The standard deviation and relative standard deviation values for repeatability ranged from 0.02 to 0.22 and 0.10% to 0.67%, respectively. The Ministry of Food and Drug Safety (Korea) validated the method and the results suggest it could be used routinely to identify five GM rice events. Copyright © 2017 Elsevier Ltd. All rights reserved.
The Financial Burden of Attending University in Georgia: Implications for Rural Students
ERIC Educational Resources Information Center
Chankseliani, Maia
2013-01-01
By evaluating the impact of policies to financially support university students in Georgia, this article demonstrates the systematic spatial disparities that exist in a context of formally equal competition. The author uses a mixed-methods design, combining quantitative evidence on the entire population of Georgian university applicants in…
A Systematic Review of Research on the Flipped Learning Method in Engineering Education
ERIC Educational Resources Information Center
Karabulut-Ilgu, Aliye; Jaramillo Cherrez, Nadia; Jahren, Charles T.
2018-01-01
The purpose of this article is to describe the current state of knowledge and practice in the flipped learning approach in engineering education and to provide guidance for practitioners by critically appraising and summarizing existing research. This article is a qualitative synthesis of quantitative and qualitative research investigating the…
ERIC Educational Resources Information Center
Roberts, Kelly D.; Park, Hye Jin; Brown, Steven; Cook, Bryan
2011-01-01
Universal Design for Instruction (UDI) in postsecondary education is a relatively new concept/framework that has generated significant support. The purpose of this literature review was to examine existing empirical research, including qualitative, quantitative, and mixed methods, on the use of UDI (and related terms) in postsecondary education.…
Projecting Enrollment in Rural Schools: A Study of Three Vermont School Districts
ERIC Educational Resources Information Center
Grip, Richard S.
2004-01-01
Large numbers of rural districts have experienced sharp declines in enrollment, unlike their suburban counterparts. Accurate enrollment projections are required, whether a district needs to build new schools or consolidate existing ones. For school districts having more than 600 students, a quantitative method such as the Cohort-Survival Ratio…
USDA-ARS?s Scientific Manuscript database
MicroRNAs (miRNAs) ubiquitously exist in microorganisms, plants and animals, and appear to modulate a wide range of critical biological processes. However, no definitive conclusion has been reached regarding the uptake of exogenous dietary small RNAs into mammalian circulation and organs and cross-k...
Validation of the Quantitative Diagnostic Thinking Inventory for Athletic Training: A Pilot Study
ERIC Educational Resources Information Center
Kicklighter, Taz; Barnum, Mary; Geisler, Paul R.; Martin, Malissa
2016-01-01
Context: The cognitive process of making a clinical decision lies somewhere on a continuum between novices using hypothetico-deductive reasoning and experts relying more on case pattern recognition. Although several methods exist for measuring facets of clinical reasoning in specific situations, none have been experimentally applied, as of yet, to…
Wang, Zhenyu; Li, Shiming; Ferguson, Stephen; Goodnow, Robert; Ho, Chi-Tang
2008-01-01
Polymethoxyflavones (PMFs), which exist exclusively in the citrus genus, have biological activities including anti-inflammatory, anticarcinogenic, and antiatherogenic properties. A validated RPLC method was developed for quantitative analysis of six major PMFs, namely nobiletin, tangeretin, sinensetin, 5,6,7,4'-tetramethoxyflavone, 3,5,6,7,3',4'-hexamethoxyflavone, and 3,5,6,7,8,3',4'-heptamethoxyflavone. The polar embedded LC stationary phase was able to fully resolve the six analogues. The developed method was fully validated in terms of linearity, accuracy, precision, sensitivity, and system suitability. The LOD of the method was calculated as 0.15 microg/mL and the recovery rate was between 97.0 and 105.1%. This analytical method was successfully applied to quantify the individual PMFs in four commercially available citrus peel extracts (CPEs). Each extract shows significant difference in the PMF composition and concentration. This method may provide a simple, rapid, and reliable tool to help reveal the correlation between the bioactivity of the PMF extracts and the individual PMF content.
Smith, Eric G.
2015-01-01
Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
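Under the abstract's own logic, the core estimate is simple arithmetic, sketched below with invented numbers. Reading "degree of confounding amplification" as the amplification increment is an interpretive assumption, as is setting the introduced variable's direct outcome association to zero.

```python
# Worked arithmetic for the method's core step, with assumed numbers:
# residual confounding = (change in effect estimate between nested propensity
# models) / (estimated amplification increment), assuming the introduced
# exposure predictor has no direct association with the outcome.
effect_base = 1.50        # treatment effect estimate, base propensity model
effect_amplified = 1.62   # estimate after adding a strong exposure predictor
amplification = 1.30      # estimated amplification factor (assumption)

residual_confounding = (effect_amplified - effect_base) / (amplification - 1.0)
print(f"estimated residual confounding: {residual_confounding:+.2f}")
```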
Experimental Demonstration of In-Place Calibration for Time Domain Microwave Imaging System
NASA Astrophysics Data System (ADS)
Kwon, S.; Son, S.; Lee, K.
2018-04-01
In this study, an experimental demonstration of in-place calibration was conducted using a time domain measurement system developed by the authors. Experiments were conducted using three calibration methods: the proposed in-place calibration and two existing calibrations, namely array rotation and differential calibration. The in-place calibration uses dual receivers located at an equal distance from the transmitter. The signals received at the dual receivers contain similar unwanted components, that is, the directly received signal and antenna coupling. In contrast to simulations, the antennas are not perfectly matched and there may be unexpected environmental errors; thus, the developed measurement system was used to demonstrate the proposed method experimentally. The potential problems of low signal-to-noise ratio and clock jitter, which may exist in time domain systems, were mitigated by averaging repeatedly measured signals. According to the experimental results, the tumor was successfully detected using all three calibration methods. For a quantitative comparison between the existing rotation calibration and the proposed in-place calibration, the cross correlation was calculated against the reconstructed image of the ideal differential calibration. The mean cross correlation between the in-place calibration and the ideal differential calibration was 0.80, whereas the mean cross correlation for the rotation calibration was 0.55. Furthermore, the simulation results were compared with the experimental results to verify the in-place calibration method; a quantitative analysis showed that the experimental results follow a tendency similar to the simulations.
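A plausible form of the quantitative comparison is a zero-mean normalized cross correlation between reconstructed images, sketched below on placeholder arrays; the actual microwave image reconstruction is not reproduced.

```python
# Zero-mean normalized cross correlation between two images, a common
# similarity measure; the image arrays here are synthetic stand-ins.
import numpy as np

def normalized_cross_correlation(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum()))

rng = np.random.default_rng(4)
ideal = rng.random((64, 64))                   # stand-in "ideal differential" image
inplace = ideal + 0.3 * rng.random((64, 64))   # closer to ideal
rotation = ideal + 0.9 * rng.random((64, 64))  # farther from ideal
print(normalized_cross_correlation(inplace, ideal))   # higher, like the 0.80 reported
print(normalized_cross_correlation(rotation, ideal))  # lower, like the 0.55 reported
```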
Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane
2018-05-01
This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
Dokukin, M; Sokolov, I
2015-07-28
Dynamic mechanical spectroscopy (DMS), which allows measuring frequency-dependent viscoelastic properties, is important for studying soft materials, tissues, biomaterials, and polymers. However, the existing DMS techniques (nanoindentation) have limited resolution when used on soft materials, preventing them from being used to study mechanics at the nanoscale; nanoindenters are not capable of measuring cells or the nanointerfaces of composite materials. Here we present a highly accurate DMS modality, which is a combination of three different methods: quantitative nanoindentation (nanoDMA), the gentle force and fast response of atomic force microscopy (AFM), and Fourier transform (FT) spectroscopy. This new spectroscopy (which we suggest calling FT-nanoDMA) is fast and sensitive enough to allow DMS imaging of nanointerfaces and single cells, while attaining about 100x improvements on polymers in both spatial (to 10-70 nm) and temporal resolution (to 0.7 s/pixel) compared to the current art. Multiple frequencies are measured simultaneously; the use of 10 frequencies is demonstrated here (up to 300 Hz, a highly relevant range for biological materials and polymers, in both ambient conditions and liquid). The method is quantitatively verified on known polymers and demonstrated on cells and polymer blends. Analysis shows that FT-nanoDMA is highly quantitative. FT-nanoDMA spectroscopy can easily be implemented in existing AFMs.
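A schematic of the multi-frequency measurement idea, on a synthetic drive and response: excite with a sum of sinusoids and read the amplitude ratio and phase lag at each tone from a single FFT, which is the information a storage/loss split requires. This is illustrative signal processing, not the instrument code.

```python
# Read per-tone amplitude ratio and phase lag (hence tan(delta)) from one FFT
# of a multi-frequency drive/response pair. Signals are synthetic.
import numpy as np

fs, T = 4096, 2.0
t = np.arange(0, T, 1 / fs)
freqs = [10, 30, 90, 270]                         # simultaneous drive tones (Hz)
lags = {10: 0.05, 30: 0.10, 90: 0.20, 270: 0.35}  # material phase lags (rad, toy)

drive = sum(np.sin(2 * np.pi * f * t) for f in freqs)
response = sum(0.8 * np.sin(2 * np.pi * f * t - lags[f]) for f in freqs)

spec_d, spec_r = np.fft.rfft(drive), np.fft.rfft(response)
for f in freqs:
    b = int(f * T)                                # FFT bin of this tone
    phase = np.angle(spec_d[b]) - np.angle(spec_r[b])
    print(f"{f:>3} Hz: amplitude ratio {abs(spec_r[b] / spec_d[b]):.2f}, "
          f"tan(delta) {np.tan(phase):.2f}")
```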
Image-Based Quantification of Plant Immunity and Disease.
Laflamme, Bradley; Middleton, Maggie; Lo, Timothy; Desveaux, Darrell; Guttman, David S
2016-12-01
Measuring the extent and severity of disease is a critical component of plant pathology research and crop breeding. Unfortunately, existing visual scoring systems are qualitative, subjective, and the results are difficult to transfer between research groups, while existing quantitative methods can be quite laborious. Here, we present plant immunity and disease image-based quantification (PIDIQ), a quantitative, semi-automated system to rapidly and objectively measure disease symptoms in a biologically relevant context. PIDIQ applies an ImageJ-based macro to plant photos in order to distinguish healthy tissue from tissue that has yellowed due to disease. It can process a directory of images in an automated manner and report the relative ratios of healthy to diseased leaf area, thereby providing a quantitative measure of plant health that can be statistically compared with appropriate controls. We used the Arabidopsis thaliana-Pseudomonas syringae model system to show that PIDIQ is able to identify both enhanced plant health associated with effector-triggered immunity as well as elevated disease symptoms associated with effector-triggered susceptibility. Finally, we show that the quantitative results provided by PIDIQ correspond to those obtained via traditional in planta pathogen growth assays. PIDIQ provides a simple and effective means to nondestructively quantify disease from whole plants and we believe it will be equally effective for monitoring disease on excised leaves and stems.
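The pixel-ratio idea behind PIDIQ can be sketched in a few lines, though the actual tool is an ImageJ macro; the NumPy stand-in below uses ad hoc RGB thresholds (all invented) to separate green from yellowed tissue and report their ratio.

```python
# Toy pixel-ratio sketch in the spirit of PIDIQ: classify plant pixels as
# healthy (green) or diseased (yellow) and report their ratio. The masks and
# thresholds are ad hoc assumptions, not the published macro.
import numpy as np

def health_ratio(rgb):
    """rgb: H x W x 3 float array with values in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    plant = (g > 0.2) & (b < 0.6)             # crude plant-vs-background mask
    yellow = plant & (r > 0.8 * g)            # yellowed tissue: red approaches green
    healthy = plant & ~yellow
    return healthy.sum() / max(yellow.sum(), 1)

img = np.zeros((10, 10, 3))
img[:5] = [0.1, 0.6, 0.1]                     # healthy green half
img[5:] = [0.7, 0.7, 0.1]                     # yellowed half
print(health_ratio(img))                      # ~1.0 for this half-and-half toy image
```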
[A novel quantitative approach to study dynamic anaerobic process at micro scale].
Zhang, Zhong-Liang; Wu, Jing; Jiang, Jian-Kai; Jiang, Jie; Li, Huai-Zhi
2012-11-01
Anaerobic digestion is attracting more and more interest because of its advantages, such as low cost and recovery of clean energy. To overcome the drawbacks of existing methods for studying the dynamic anaerobic process, a novel microscopic quantitative approach at the granule level was developed, combining microdevice and quantitative image analysis techniques. This experiment displayed the process and characteristics of gas production at static state for the first time, and the results indicated that the method has satisfactory repeatability. The gas production process at static state could be divided into three stages: a rapid linear increasing stage, a decelerated increasing stage, and a slow linear increasing stage. The rapid linear increasing stage was long and the biogas rate was high under a high initial organic loading rate. The results showed that it is feasible to carry out the anaerobic process in the microdevice; furthermore, this novel method is reliable and can clearly display the dynamic process of the anaerobic reaction at the micro scale. The results are helpful for understanding the anaerobic process.
Quantitative analysis of periodontal pathogens by ELISA and real-time polymerase chain reaction.
Hamlet, Stephen M
2010-01-01
The development of analytical methods enabling the accurate identification and enumeration of bacterial species colonizing the oral cavity has led to the identification of a small number of bacterial pathogens that are major factors in the etiology of periodontal disease. Further, these methods also underpin more recent epidemiological analyses of the impact of periodontal disease on general health. Given the complex milieu of over 700 species of microorganisms known to exist within the complex biofilms found in the oral cavity, the identification and enumeration of oral periodontopathogens has not been an easy task. In recent years, however, some of the intrinsic limitations of the more traditional microbiological analyses previously used have been overcome with the advent of immunological and molecular analytical methods. Of the plethora of methodologies reported in the literature, the enzyme-linked immunosorbent assay (ELISA), which combines the specificity of antibody with the sensitivity of simple enzyme assays, and the polymerase chain reaction (PCR) have been widely utilized in both laboratory and clinical applications. Although conventional PCR does not allow quantitation of the target organism, real-time PCR (rtPCR) has the ability to detect amplicons as they accumulate in "real time", allowing subsequent quantitation. These methods enable the accurate quantitation of as few as 10² (using rtPCR) to 10⁴ (using ELISA) periodontopathogens in dental plaque samples.
Hybrid statistics-simulations based method for atom-counting from ADF STEM images.
De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra
2017-06-01
A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.
Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel
2014-01-01
Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207
Liang, Xiaoping; Zhang, Qizhi; Jiang, Huabei
2006-11-10
We show that a two-step reconstruction method can be adapted to improve the quantitative accuracy of the refractive index reconstruction in phase-contrast diffuse optical tomography (PCDOT). We also describe the possibility of imaging tissue glucose concentration with PCDOT. In this two-step method, we first use our existing finite-element reconstruction algorithm to recover the position and shape of a target. We then use the position and size of the target as a priori information to reconstruct a single value of the refractive index within the target and background regions using a region reconstruction method. Due to the extremely low contrast available in the refractive index reconstruction, we incorporate a data normalization scheme into the two-step reconstruction to combat the associated low signal-to-noise ratio. Through a series of phantom experiments we find that this two-step reconstruction method can considerably improve the quantitative accuracy of the refractive index reconstruction. The results show that the relative error of the reconstructed refractive index is reduced from 20% to within 1.5%. We also demonstrate the possibility of PCDOT for recovering glucose concentration using these phantom experiments.
Bailey, Ajay; Hutter, Inge
2008-10-01
With 3.1 million people estimated to be living with HIV/AIDS in India and 39.5 million globally, the epidemic has challenged academics to identify behaviours and their underlying beliefs in the effort to reduce the risk of HIV transmission. The Health Belief Model (HBM) is frequently used to identify risk behaviours and adherence behaviour in the field of HIV/AIDS. Risk behaviour studies that apply the HBM have been largely quantitative, and use of qualitative methodology is rare. The marriage of qualitative and quantitative methods has never been easy. The challenge is in triangulating the methods. Method triangulation has been largely used to combine insights from qualitative and quantitative methods, but not to link the two. In this paper we suggest a linked trajectory of method triangulation (LTMT). The linked trajectory aims first to gather individual-level information through in-depth interviews and then to present that information as vignettes in focus group discussions. We thus validate information obtained from in-depth interviews and gather emic concepts that arise from the interaction, capturing both the interpretation and the interaction angles of the qualitative method. Further, using the qualitative information gained, a survey is designed. In doing so, the survey questions are grounded and contextualized. We employed this linked trajectory of method triangulation in a study on the risk assessment of HIV/AIDS among migrant and mobile men. Fieldwork was carried out in Goa, India. Data come from two waves of studies: first, an explorative qualitative study (2003); second, a larger study (2004-2005) including in-depth interviews (25), focus group discussions (21) and a survey (n=1259). By employing the qualitative-to-quantitative LTMT we can not only contextualize the existing concepts of the HBM, but also validate new concepts and identify new risk groups.
Knapp, Julika; Allesch, Astrid; Müller, Wolfgang; Bockreis, Anke
2017-11-01
Recycling of waste materials is desirable to reduce the consumption of limited primary resources, but it also includes the risk of recycling unwanted, hazardous substances. In Austria, the legal framework demands that secondary products must not present a higher risk than comparable products derived from primary resources. However, the act provides no definition of how to assess this risk potential. This paper describes the development of different quantitative and qualitative methods to estimate the transfer of contaminants in recycling processes. The quantitative methods comprise the comparison of concentrations of harmful substances in recycling products to those in corresponding primary products and to existing limit values. The evaluation matrix developed here, which considers further aspects, allows for the assessment of the qualitative risk potential. The results show that, depending on the assessed waste fraction, particular contaminants can be critical: their concentrations were higher than in comparable primary materials and did not comply with existing limit values. On the other hand, the results show that a long-term, well-established quality control system can assure compliance with the limit values. The results of the qualitative assessment obtained with the evaluation matrix support the results of the quantitative assessment. Therefore, the evaluation matrix can be suitable for quickly screening waste streams used for recycling to estimate their potential environmental and health risks. To prevent the transfer of contaminants into product cycles, improved data on relevant substances in secondary resources are necessary. In addition, regulations for material recycling are required to assure adequate quality control measures, including limit values. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Xuesong; Northrop, William F.
2016-04-01
This paper describes a quantitative approach to approximating multiple scattering through an isotropic turbid slab based on Markov chain theory. There is an increasing need to utilize multiple scattering for optical diagnostic purposes; however, existing methods are either inaccurate or computationally expensive. Here, we develop a novel Markov chain approximation approach to solve for the multiple scattering angular distribution (AD) that can accurately calculate the AD while significantly reducing computational cost compared to Monte Carlo simulation. We expect this work to stimulate ongoing multiple scattering research and deterministic reconstruction algorithm development with AD measurements.
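The paper's kernel and chain construction are not given in the abstract; the sketch below shows the general shape of such an approximation under stated assumptions: a Henyey-Greenstein single-scattering kernel discretized over angle bins, circular convolution as the Markov transition step, and a Poisson-distributed number of scattering events. It is a crude 1-D stand-in, not the published algorithm.

```python
import numpy as np
from scipy.stats import poisson

def single_scatter_kernel(n_bins=180, g=0.9):
    # Henyey-Greenstein phase function over polar-angle bins (assumed kernel).
    theta = np.linspace(0, np.pi, n_bins)
    p = (1 - g**2) / (1 + g**2 - 2 * g * np.cos(theta)) ** 1.5
    return p / p.sum()

def angular_distribution(mean_events, n_bins=180, max_order=40):
    """Weight the n-fold-scattered distributions (repeated application of
    the transition kernel) by a Poisson number of scattering events."""
    kernel = single_scatter_kernel(n_bins)
    ad = np.zeros(n_bins)
    ad[0] = 1.0                   # unscattered beam: all flux in forward bin
    total = poisson.pmf(0, mean_events) * ad
    for n in range(1, max_order):
        # One Markov step: circular convolution with the scattering kernel.
        ad = np.real(np.fft.ifft(np.fft.fft(ad) * np.fft.fft(kernel)))
        ad = np.clip(ad, 0, None)
        ad /= ad.sum()
        total += poisson.pmf(n, mean_events) * ad
    return total / total.sum()
```

Unlike a Monte Carlo run, each scattering order here costs one FFT-based convolution, which is where the computational saving over photon-by-photon simulation comes from.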
NASA Astrophysics Data System (ADS)
Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun
2017-12-01
For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. As widely observed, baseline drift is generated by fluctuations in laser energy, inhomogeneity of sample surfaces and background noise, and it has aroused the interest of many researchers. Most of the prevalent algorithms need to preset some key parameters, such as a suitable spline function and the fitting order, and thus lack adaptability. Based on the characteristics of LIBS, such as the sparsity of spectral peaks and the low-pass filtered feature of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The improved technique uses a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is used to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process to ensure convergence. To validate the proposed method, the concentration analysis of Chromium (Cr), Manganese (Mn) and Nickel (Ni) contained in 23 certified high-alloy steel samples is assessed by using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because no prior knowledge of sample composition or mathematical hypothesis is required, the method proposed in this paper has better accuracy in quantitative analysis than other methods, and fully reflects its adaptive ability.
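The paper's exact convex model is not spelled out in the abstract. A widely used convex scheme in the same spirit, with an asymmetric penalty that makes the baseline hug the spectrum from below while ignoring sparse emission peaks, is Eilers-style asymmetric least squares; the sketch below is that analogue, not the authors' algorithm, and the parameter values are typical defaults rather than published ones.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: minimize a smoothness-penalized
    least squares fit where points above the baseline (peaks) get small
    weight p and points below get weight 1 - p."""
    n = len(y)
    D = sparse.diags([1.0, -2.0, 1.0], [0, -1, -2], shape=(n, n - 2))
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.spdiags(w, 0, n, n)
        z = spsolve((W + lam * (D @ D.T)).tocsc(), w * y)
        w = p * (y > z) + (1 - p) * (y < z)
    return z  # subtract from y to obtain the baseline-corrected spectrum
```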
Flemming, Kate
2010-01-01
This paper is a report of a Critical Interpretive Synthesis to synthesize quantitative research, in the form of an effectiveness review and a guideline, with qualitative research to examine the use of morphine to treat cancer-related pain. Critical Interpretive Synthesis is a new method of reviewing, developed from meta-ethnography, which integrates systematic review methodology with a qualitative tradition of enquiry. It has not previously been used specifically to synthesize effectiveness and qualitative literature. Data sources. An existing systematic review of quantitative research and a guideline examining the effectiveness of oral morphine to treat cancer pain were identified. Electronic searches of Medline, CINAHL, Embase, PsychINFO, Health Management Information Consortium database and the Social Science Citation Index to identify qualitative research were carried out in May 2008. Qualitative research papers reporting on the use of morphine to treat cancer pain were identified. The findings of the effectiveness research were used as a framework to guide the translation of findings from qualitative research using an integrative grid. A secondary translation of findings from the qualitative research, not specifically mapped to the effectiveness literature, was guided by the framework. Nineteen qualitative papers were synthesized with the quantitative effectiveness literature, producing 14 synthetic constructs. These were developed into four synthesizing arguments which drew on patients', carers' and healthcare professionals' interpretations of the meaning and context of the use of morphine to treat cancer pain. Critical Interpretive Synthesis can be adapted to synthesize reviews of quantitative research into effectiveness with qualitative research and fits into an existing typology of approaches to synthesizing qualitative and quantitative research.
Kwei, Johnny; Halstead, Fenella D; Dretzke, Janine; Oppenheim, Beryl A; Moiemen, Naiem S
2015-11-06
Sepsis from burn injuries can result from colonisation of burn wounds, especially in large surface area burns. Reducing bacterial infection will reduce morbidity and mortality; mortality for severe burns can be as high as 15%. There are various quantitative and semi-quantitative techniques to monitor bacterial load on wounds. In the UK, burn wounds are typically monitored for the presence or absence of bacteria through the collection and culture of swabs, but no absolute count is obtained. Quantitative burn wound culture provides a measure of bacterial count and is gaining increased popularity in some countries. It is, however, more resource intensive, and evidence for its utility appears to be inconsistent. This systematic review therefore aims to assess the evidence on the utility and reliability of different quantitative microbiology techniques in terms of diagnosing or predicting clinical outcomes. Standard systematic review methods aimed at minimising bias will be employed for study identification, selection and data extraction. Bibliographic databases and ongoing trial registers will be searched and conference abstracts screened. Studies will be eligible if they are prospective studies or systematic reviews of burn patients (any age) for whom quantitative microbiology has been performed, whether or not it is compared to another method. Quality assessment will be based on quality assessment tools for diagnostic and prognostic studies and tailored to the review as necessary. Synthesis is likely to be primarily narrative, but meta-analysis may be considered where clinical and methodological homogeneity exists. Given the increasing use of quantitative methods, this is a timely systematic review, which will attempt to clarify the evidence base. As far as the authors are aware, it will be the first to address this topic. PROSPERO, CRD42015023903.
Boundary cooled rocket engines for space storable propellants
NASA Technical Reports Server (NTRS)
Kesselring, R. C.; Mcfarland, B. L.; Knight, R. M.; Gurnitz, R. N.
1972-01-01
An evaluation of an existing analytical heat transfer model was made to extend the technology of boundary film/conduction-cooled rocket thrust chambers to the space storable propellant combination oxygen difluoride/diborane. Critical design parameters were identified and their importance determined. Test reduction methods were developed to enable data obtained from short duration hot firings with a thin-walled (calorimeter) chamber to be used to quantitatively evaluate the heat absorbing capability of the vapor film. The modification of the existing like-doublet injector was based on the results obtained from the calorimeter firings.
Seeing and Reading Red: Hue and Color-word Correlation in Images and Attendant Text on the WWW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newsam, S
2004-07-12
This work represents an initial investigation into determining whether correlations actually exist between metadata and content descriptors in multimedia datasets. We provide a quantitative method for evaluating whether the hue of images on the WWW is correlated with the occurrence of color-words in metadata such as URLs, image names, and attendant text. It turns out that such a correlation does exist: the likelihood that a particular color appears in an image whose URL, name, and/or attendant text contains the corresponding color-word is generally at least twice the likelihood that the color appears in a randomly chosen image on the WWW. While this finding might not be significant in and of itself, it represents an initial step towards quantitatively establishing that other, perhaps more useful correlations exist. These correlations form the basis for exciting novel approaches that leverage semi-supervised datasets, such as the WWW, to overcome the semantic gap that has hampered progress in multimedia information retrieval for some time now.
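As a minimal sketch of the kind of co-occurrence statistic the paper reports (the likelihood ratio of a color appearing given its color-word appears in the metadata), assuming records have already been reduced to a set of detected hue labels per image plus the attendant text:

```python
from collections import Counter

COLOR_WORDS = {"red", "green", "blue", "yellow", "orange", "purple"}

def cooccurrence_lift(records):
    """records: iterable of (colors_in_image, metadata_text) pairs.
    Returns, per color, P(color in image | color-word in text)
    divided by P(color in image) over all records."""
    base, joint, word_hits = Counter(), Counter(), Counter()
    n = 0
    for colors, text in records:
        n += 1
        words = set(text.lower().split())
        for c in COLOR_WORDS:
            if c in colors:
                base[c] += 1
            if c in words:
                word_hits[c] += 1
                if c in colors:
                    joint[c] += 1
    return {c: (joint[c] / word_hits[c]) / (base[c] / n)
            for c in COLOR_WORDS if word_hits[c] and base[c]}
```

A lift of 2 or more for a color would correspond to the paper's finding that the conditional likelihood is generally at least twice the base rate.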
Impact of immersion oils and mounting media on the confocal imaging of dendritic spines
Peterson, Brittni M.; Mermelstein, Paul G.; Meisel, Robert L.
2015-01-01
Background Structural plasticity, such as changes in dendritic spine morphology and density, reflects changes in synaptic connectivity and circuitry. Procedural variables used in different methods for labeling dendritic spines have been quantitatively evaluated for their impact on the ability to resolve individual spines in confocal microscopic analyses. In contrast, there have been discussions, though no quantitative analyses, of the potential effects of choosing specific mounting media and immersion oils on dendritic spine resolution. New Method Here we provide quantitative data measuring the impact of these variables on resolving dendritic spines in 3D confocal analyses. Medium spiny neurons from the rat striatum and nucleus accumbens are used as examples. Results Both the choice of mounting medium and immersion oil affected the visualization of dendritic spines, with the choice of immersion oil being the more critical. These biological data are supported by quantitative measures of the 3D diffraction pattern (i.e. point spread function) of a point source of light under the same mounting medium and immersion oil combinations. Comparison with Existing Method Although not a new method, this manuscript provides quantitative data demonstrating that different mounting media and immersion oils can impact the ability to resolve dendritic spines. These findings highlight the importance of reporting which mounting medium and immersion oil are used in preparations for confocal analyses, especially when comparing published results from different laboratories. Conclusion Collectively, these data suggest that choosing the appropriate immersion oil and mounting medium is critical for obtaining the best resolution, and consequently more accurate measures of dendritic spine densities. PMID:25601477
Spatially Regularized Machine Learning for Task and Resting-state fMRI
Song, Xiaomu; Panych, Lawrence P.; Chen, Nan-kuei
2015-01-01
Background Reliable mapping of brain function across sessions and/or subjects in task- and resting-state has been a critical challenge for quantitative fMRI studies although it has been intensively addressed in the past decades. New Method A spatially regularized support vector machine (SVM) technique was developed for the reliable brain mapping in task- and resting-state. Unlike most existing SVM-based brain mapping techniques, which implement supervised classifications of specific brain functional states or disorders, the proposed method performs a semi-supervised classification for the general brain function mapping where spatial correlation of fMRI is integrated into the SVM learning. The method can adapt to intra- and inter-subject variations induced by fMRI nonstationarity, and identify a true boundary between active and inactive voxels, or between functionally connected and unconnected voxels in a feature space. Results The method was evaluated using synthetic and experimental data at the individual and group level. Multiple features were evaluated in terms of their contributions to the spatially regularized SVM learning. Reliable mapping results in both task- and resting-state were obtained from individual subjects and at the group level. Comparison with Existing Methods A comparison study was performed with independent component analysis, general linear model, and correlation analysis methods. Experimental results indicate that the proposed method can provide a better or comparable mapping performance at the individual and group level. Conclusions The proposed method can provide accurate and reliable mapping of brain function in task- and resting-state, and is applicable to a variety of quantitative fMRI studies. PMID:26470627
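The abstract does not give the form of the spatial regularization; one simple way to fold spatial correlation into an SVM, offered here only as an assumed stand-in for the published technique, is to augment each voxel's feature vector with the mean features of its spatial neighbors before training:

```python
import numpy as np
from sklearn.svm import SVC

def neighbor_augment(features, coords, radius=1.5):
    """Append to each voxel's features the mean features of voxels
    within `radius` (in voxel units), so the classifier sees local
    spatial context as well as the voxel's own time-course features."""
    features = np.asarray(features, dtype=float)
    coords = np.asarray(coords, dtype=float)
    aug = np.empty_like(features)
    for i, c in enumerate(coords):
        near = np.linalg.norm(coords - c, axis=1) <= radius
        aug[i] = features[near].mean(axis=0)
    return np.hstack([features, aug])

# Hypothetical use; a semi-supervised scheme would refine `labels`.
# X_aug = neighbor_augment(X, voxel_coords)
# clf = SVC(kernel="rbf").fit(X_aug[train_idx], labels[train_idx])
```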
The Relationship between Agriculture Knowledge Bases for Teaching and Sources of Knowledge
ERIC Educational Resources Information Center
Rice, Amber H.; Kitchel, Tracy
2015-01-01
The purpose of this study was to describe the agriculture knowledge bases for teaching of agriculture teachers and to see if a relationship existed between years of teaching experience, sources of knowledge, and development of pedagogical content knowledge (PCK), using quantitative methods. A model of PCK from mathematics was utilized as a…
Application of Diversity Indices to Quantify Early Life-History Diversity for Chinook Salmon
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Gary E.; Sather, Nichole K.; Skalski, John R.
2014-03-01
We developed an index of early life history diversity (ELHD) for Pacific salmon (Oncorhynchus spp.). Early life history diversity is the variation in morphological and behavioral traits expressed within and among populations by individual juvenile salmon during their downstream migration. A standard quantitative method does not exist for this prominent concept in salmon biology.
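A natural quantitative starting point for such an index, sketched here as an assumption rather than the authors' final formulation, is the Shannon entropy of the life-history category frequencies, expressed as an effective number of life histories:

```python
import numpy as np

def elhd_index(counts):
    """Effective number of juvenile life-history categories, exp(H),
    where H is the Shannon entropy of the category frequencies."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(np.exp(-(p * np.log(p)).sum()))

# Hypothetical counts of juveniles by migration timing/size category.
print(elhd_index([120, 80, 40, 10]))  # about 3.1 effective life histories
```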
ERIC Educational Resources Information Center
Sezer, Adem; Inel, Yusuf; Seçkin, Ahmet Çagdas; Uluçinar, Ufuk
2017-01-01
This study aimed to detect any relationship that may exist between classroom teacher candidates' class participation and their attention levels. The research method was a convergent parallel design, mixing quantitative and qualitative research techniques, and the study group was composed of 21 freshmen studying in the Classroom Teaching Department…
Quantitative PCR Method for Diagnosis of Citrus Bacterial Canker†
Cubero, J.; Graham, J. H.; Gottwald, T. R.
2001-01-01
For diagnosis of citrus bacterial canker by PCR, an internal standard is employed to ensure the quality of the DNA extraction and that proper requisites exist for the amplification reaction. The ratio of PCR products from the internal standard and bacterial target is used to estimate the initial bacterial concentration in citrus tissues with lesions. PMID:11375206
Generation 1.5 Written Error Patterns: A Comparative Study
ERIC Educational Resources Information Center
Doolan, Stephen M.; Miller, Donald
2012-01-01
In an attempt to contribute to existing research on Generation 1.5 students, the current study uses quantitative and qualitative methods to compare error patterns in a corpus of Generation 1.5, L1, and L2 community college student writing. This error analysis provides one important way to determine if error patterns in Generation 1.5 student…
Exploring the Relationship between Academic Dishonesty and Moral Development in Law School Students
ERIC Educational Resources Information Center
Edmondson, Macey Lynd
2013-01-01
This mixed methods study explored whether a relationship existed between moral development and dishonest academic behaviors in law students. The quantitative portion of the study utilized a survey adapted from James Rest's Defining Issues Test and Donald McCabe's Academic Integrity Survey. Law students were solicited by email from two public…
NASA Astrophysics Data System (ADS)
Min, Junwei; Yao, Baoli; Ketelhut, Steffi; Kemper, Björn
2017-02-01
The modular combination of optical microscopes with digital holographic microscopy (DHM) has been proven to be a powerful tool for quantitative live cell imaging. The introduction of a condenser and different microscope objectives (MO) simplifies the usage of the technique and makes it easier to measure different kinds of specimens at different magnifications. However, the high flexibility of illumination and imaging also causes variable phase aberrations that need to be eliminated for high resolution quantitative phase imaging. Existing phase aberration compensation methods either require additional elements in the reference arm or need specimen-free reference areas or separate reference holograms to build suitable digital phase masks. These inherent requirements make them impractical for use with highly variable illumination and imaging systems and prevent on-line monitoring of living cells. In this paper, we present a simple numerical method for phase aberration compensation based on the analysis of holograms in the spatial frequency domain, with capabilities for on-line quantitative phase imaging. From a single-shot off-axis hologram, the whole phase aberration can be eliminated automatically without numerical fitting or prior knowledge of the setup. The capabilities and robustness for quantitative phase imaging of living cancer cells are demonstrated.
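A heavily simplified sketch of spatial-frequency-domain compensation, under the assumption that re-centering the +1 order removes the dominant tilt term (the published method also removes higher-order aberrations automatically, which this sketch does not attempt):

```python
import numpy as np

def tilt_compensated_phase(hologram):
    """Isolate the +1 order of an off-axis hologram in the Fourier
    domain, shift its carrier peak to the spectrum center (removing the
    linear phase tilt), and return the reconstructed phase."""
    F = np.fft.fftshift(np.fft.fft2(hologram))
    n, m = F.shape
    mag = np.abs(F).copy()
    mag[n // 2 - n // 8: n // 2 + n // 8, :] = 0   # crude zero-order block
    mag[:, : m // 2] = 0                           # keep one sideband only
    i, j = np.unravel_index(np.argmax(mag), mag.shape)
    # Roll the carrier peak to the center: equivalent to demodulation.
    F1 = np.roll(np.roll(F, n // 2 - i, axis=0), m // 2 - j, axis=1)
    return np.angle(np.fft.ifft2(np.fft.ifftshift(F1)))
```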
Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.
Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C
2016-01-01
Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.
Developing an Engineering Design Process Assessment using Mixed Methods.
Wind, Stefanie A; Alemdar, Meltem; Lingle, Jeremy A; Gale, Jessica D; Moore, Roxanne A
Recent reforms in science education worldwide include an emphasis on engineering design as a key component of student proficiency in the Science, Technology, Engineering, and Mathematics disciplines. However, relatively little attention has been directed to the development of psychometrically sound assessments for engineering. This study demonstrates the use of mixed methods to guide the development and revision of K-12 Engineering Design Process (EDP) assessment items. Using results from a middle-school EDP assessment, this study illustrates the combination of quantitative and qualitative techniques to inform item development and revisions. Overall conclusions suggest that the combination of quantitative and qualitative evidence provides an in-depth picture of item quality that can be used to inform the revision and development of EDP assessment items. Researchers and practitioners can use the methods illustrated here to gather validity evidence to support the interpretation and use of new and existing assessments.
Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes
Zhang, Hong; Pei, Yun
2016-01-01
Quantitative prediction of construction noise is crucial to evaluate construction plans to help make decisions to address noise levels. Considering limitations of existing methods for measuring or predicting the construction noise and particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes to provision of a simulation methodology to quantitatively predict the equivalent continuous noise of construction by considering the relevant uncertainties, dynamics and interactions. PMID:27529266
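For reference, the equivalent continuous level that the simulation predicts is the energy-mean of the time-varying level: Leq = 10*log10(mean of 10^(L/10)). A minimal sketch with invented sample values:

```python
import numpy as np

def leq(levels_db):
    """Equivalent continuous sound level for levels sampled at equal
    time intervals: the decibel value of the mean squared pressure."""
    levels_db = np.asarray(levels_db, dtype=float)
    return 10 * np.log10(np.mean(10 ** (levels_db / 10)))

# e.g. simulated per-interval levels from overlapping equipment events
print(round(leq([72, 85, 90, 78, 65]), 1))  # 84.5 dB
```

Note how the 90 dB interval dominates: the energy average sits well above the arithmetic mean of the listed levels (78 dB).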
NASA Astrophysics Data System (ADS)
Woolfrey, John R.; Avery, Mitchell A.; Doweyko, Arthur M.
1998-03-01
Two three-dimensional quantitative structure-activity relationship (3D-QSAR) methods, comparative molecular field analysis (CoMFA) and hypothetical active site lattice (HASL), were compared with respect to the analysis of a training set of 154 artemisinin analogues. Five models were created, including a complete HASL and two trimmed versions, as well as two CoMFA models (leave-one-out standard CoMFA and the guided-region selection protocol). Similar r^2 and q^2 values were obtained by each method, although some striking differences existed between CoMFA contour maps and the HASL output. Each of the four predictive models exhibited a similar ability to predict the activity of a test set of 23 artemisinin analogues, although some differences were noted as to which compounds were described well by either model.
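Both methods are judged by the cross-validated q^2 reported above; a minimal sketch of leave-one-out q^2 = 1 - PRESS/SS for any fit/predict pair (the callables are placeholders, not either published model):

```python
import numpy as np

def loo_q2(X, y, fit, predict):
    """Leave-one-out cross-validated q^2 = 1 - PRESS / SS."""
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        model = fit(X[mask], y[mask])
        press += (y[i] - predict(model, X[i:i + 1])[0]) ** 2
    ss = float(((y - y.mean()) ** 2).sum())
    return 1 - press / ss

# e.g. with scikit-learn style estimators (from sklearn.linear_model import Ridge):
# q2 = loo_q2(X, y, fit=lambda X, y: Ridge().fit(X, y),
#             predict=lambda m, X: m.predict(X))
```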
Hagen, C K; Diemoz, P C; Endrizzi, M; Rigon, L; Dreossi, D; Arfelli, F; Lopez, F C M; Longo, R; Olivo, A
2014-04-07
X-ray phase contrast imaging (XPCi) methods are sensitive to phase in addition to attenuation effects and, therefore, can achieve improved image contrast for weakly attenuating materials, such as often encountered in biomedical applications. Several XPCi methods exist, most of which have already been implemented in computed tomographic (CT) modality, thus allowing volumetric imaging. The Edge Illumination (EI) XPCi method had, until now, not been implemented as a CT modality. This article provides indications that quantitative 3D maps of an object's phase and attenuation can be reconstructed from EI XPCi measurements. Moreover, a theory for the reconstruction of combined phase and attenuation maps is presented. Both reconstruction strategies find applications in tissue characterisation and the identification of faint, weakly attenuating details. Experimental results for wires of known materials and for a biological object validate the theory and confirm the superiority of the phase over conventional, attenuation-based image contrast.
[Recurrence plot analysis of HRV for brain ischemia and asphyxia].
Chen, Xiaoming; Qiu, Yihong; Zhu, Yisheng
2008-02-01
Heart rate variability (HRV) is the small variability existing in the cycles of the heart beat, which reflects the balance between the sympathetic and vagus nerves. Since the nonlinear character of HRV has been confirmed, the recurrence plot method, a nonlinear dynamic analysis method based on complexity, can be used to analyze HRV. The results showed that the recurrence plot structures and some quantitative indices (L-Mean, L-Entr) during asphyxia insult vary significantly compared to those in normal conditions, offering a new method to monitor brain asphyxia injury.
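A compact sketch of the quantities involved, assuming a simple 1-D embedding and a fixed recurrence threshold (the paper's embedding and parameter choices are not given in the abstract): the recurrence plot is the thresholded distance matrix, and L-Mean and L-Entr summarize its diagonal line lengths.

```python
import numpy as np

def rqa_diagonal_stats(x, eps=None, lmin=2):
    """Return (L-Mean, L-Entr): mean diagonal line length and Shannon
    entropy of the diagonal length distribution of a recurrence plot."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])     # 1-D embedding for brevity
    eps = 0.1 * d.max() if eps is None else eps
    R = d <= eps
    lengths = []
    for k in range(1, len(x)):              # scan the upper-triangle diagonals
        run = 0
        for v in np.append(np.diagonal(R, offset=k), False):
            if v:
                run += 1
            else:
                if run >= lmin:
                    lengths.append(run)
                run = 0
    if not lengths:
        return 0.0, 0.0
    lengths = np.asarray(lengths)
    counts = np.bincount(lengths)[lmin:]
    p = counts[counts > 0] / counts.sum()
    return float(lengths.mean()), float(-(p * np.log(p)).sum())
```

Applied to RR-interval series, a significant shift in these indices relative to baseline would flag the kind of change the authors report during asphyxia.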
The Role of Introductory Geosciences in Students' Quantitative Literacy
NASA Astrophysics Data System (ADS)
Wenner, J. M.; Manduca, C.; Baer, E. M.
2006-12-01
Quantitative literacy is more than mathematics; it is about reasoning with data. Colleges and universities have begun to recognize the distinction between mathematics and quantitative literacy, modifying curricula to reflect the need for numerate citizens. Although students may view geology as 'rocks for jocks', the geosciences are truthfully rife with data, making introductory geoscience topics excellent context for developing the quantitative literacy of students with diverse backgrounds. In addition, many news items that deal with quantitative skills, such as the global warming phenomenon, have their basis in the Earth sciences and can serve as timely examples of the importance of quantitative literacy for all students in introductory geology classrooms. Participants at a workshop held in 2006, 'Infusing Quantitative Literacy into Introductory Geoscience Courses,' discussed and explored the challenges and opportunities associated with the inclusion of quantitative material and brainstormed about effective practices for imparting quantitative literacy to students with diverse backgrounds. The tangible results of this workshop add to the growing collection of quantitative materials available through the DLESE- and NSF-supported Teaching Quantitative Skills in the Geosciences website, housed at SERC. There, faculty can find a collection of pages devoted to the successful incorporation of quantitative literacy in introductory geoscience. The resources on the website are designed to help faculty to increase their comfort with presenting quantitative ideas to students with diverse mathematical abilities. A methods section on "Teaching Quantitative Literacy" (http://serc.carleton.edu/quantskills/methods/quantlit/index.html) focuses on connecting quantitative concepts with geoscience context and provides tips, trouble-shooting advice and examples of quantitative activities. The goal in this section is to provide faculty with material that can be readily incorporated into existing introductory geoscience courses. In addition, participants at the workshop (http://serc.carleton.edu/quantskills/workshop06/index.html) submitted and modified more than 20 activities and model courses (with syllabi) designed to use best practices for helping introductory geoscience students to become quantitatively literate. We present insights from the workshop and other sources for a framework that can aid in increasing quantitative literacy of students from a variety of backgrounds in the introductory geoscience classroom.
Ndabarora, Eléazar; Mchunu, Gugu
2014-01-01
Various studies have reported that university students, who are mostly young people, rarely use existing HIV/AIDS preventive methods. Although studies have shown that young university students have a high degree of knowledge about HIV/AIDS and HIV modes of transmission, they are still not utilising the existing HIV prevention methods and still engage in risky sexual practices favourable to HIV. Some variables, such as awareness of existing HIV/AIDS prevention methods, have been associated with utilisation of such methods. The study aimed to explore factors that influence use of existing HIV/AIDS prevention methods among university students residing in a selected campus, using the Health Belief Model (HBM) as a theoretical framework. A quantitative research approach and an exploratory-descriptive design were used to describe perceived factors that influence utilisation by university students of HIV/AIDS prevention methods. A total of 335 students completed online and manual questionnaires. Study findings showed that the factors which influenced utilisation of HIV/AIDS prevention methods were mainly determined by awareness of the existing university-based HIV/AIDS prevention strategies. Most utilised prevention methods were voluntary counselling and testing services and free condoms. Perceived susceptibility and perceived threat of HIV/AIDS score was also found to correlate with HIV risk index score. Perceived susceptibility and perceived threat of HIV/AIDS showed correlation with self-efficacy on condoms and their utilisation. Most HBM variables were not predictors of utilisation of HIV/AIDS prevention methods among students. Intervention aiming to improve the utilisation of HIV/AIDS prevention methods among students at the selected university should focus on removing identified barriers, promoting HIV/AIDS prevention services and providing appropriate resources to implement such programmes.
IWGT report on quantitative approaches to genotoxicity risk ...
This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose-response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, and the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clastogenic activity.
Yap, John Stephen; Fan, Jianqing; Wu, Rongling
2009-12-01
Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L2 penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
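A minimal numpy sketch of the modified-Cholesky idea with a plain ridge (L2) penalty standing in for the paper's penalized-likelihood estimator (the fixed penalty weight and the absence of the mixture framework are simplifications):

```python
import numpy as np

def cholesky_ridge_cov(Y, lam=1.0):
    """Covariance estimate for longitudinal data Y (subjects x times):
    regress each measurement on its predecessors with a ridge penalty,
    collect coefficients in unit lower-triangular T and residual
    variances in diagonal D, and return Sigma = T^-1 D T^-T."""
    Yc = Y - Y.mean(axis=0)
    n, T = Yc.shape
    Tmat = np.eye(T)
    d = np.empty(T)
    d[0] = Yc[:, 0].var()
    for t in range(1, T):
        X, y = Yc[:, :t], Yc[:, t]
        phi = np.linalg.solve(X.T @ X + lam * np.eye(t), X.T @ y)
        Tmat[t, :t] = -phi
        d[t] = float(((y - X @ phi) ** 2).mean())
    Tinv = np.linalg.inv(Tmat)
    return Tinv @ np.diag(d) @ Tinv.T
```

Because the diagonal entries of D stay positive, the decomposition yields a positive-definite estimate regardless of the penalty, which is what makes the sequential-regression view attractive for functional mapping.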
OptShrink LR + S: accelerated fMRI reconstruction using non-convex optimal singular value shrinkage.
Aggarwal, Priya; Shrivastava, Parth; Kabra, Tanay; Gupta, Anubha
2017-03-01
This paper presents a new accelerated fMRI reconstruction method, namely, the OptShrink LR + S method, which reconstructs undersampled fMRI data using a linear combination of low-rank and sparse components. The low-rank component has been estimated using a non-convex optimal singular value shrinkage algorithm, while the sparse component has been estimated using convex l1-norm minimization. The performance of the proposed method is compared with existing state-of-the-art algorithms on a real fMRI dataset. The proposed OptShrink LR + S method yields good qualitative and quantitative results.
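A crude alternating sketch of a low-rank-plus-sparse split using convex shrinkage in both steps; the paper's non-convex OptShrink estimator for the low-rank part is replaced here by plain singular-value soft thresholding, and the threshold choices are heuristics, not the published ones.

```python
import numpy as np

def lr_plus_s(Y, tau=None, lam=None, n_iter=50):
    """Split a (voxels x frames) matrix into low-rank L plus sparse S by
    alternating singular-value soft thresholding and elementwise soft
    thresholding on the residual."""
    m, n = Y.shape
    tau = 0.1 * np.linalg.norm(Y, 2) if tau is None else tau
    lam = 1.0 / np.sqrt(max(m, n)) if lam is None else lam
    L = np.zeros_like(Y)
    S = np.zeros_like(Y)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Y - S, full_matrices=False)
        L = (U * np.maximum(s - tau, 0.0)) @ Vt      # shrink singular values
        R = Y - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam * tau, 0.0)
    return L, S
```

In the undersampled-fMRI setting a data-consistency step (enforcing agreement with the measured k-space samples) would be interleaved with these updates; it is omitted here for brevity.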
Xu, Chao; Fang, Jian; Shen, Hui; Wang, Yu-Ping; Deng, Hong-Wen
2018-01-25
Extreme phenotype sampling (EPS) is a broadly used design to identify candidate genetic factors contributing to the variation of quantitative traits. By enriching the signals in extreme phenotypic samples, EPS can boost the association power compared to random sampling. Most existing statistical methods for EPS examine the genetic factors individually, although many quantitative traits have multiple genetic factors underlying their variation. It is desirable to model the joint effects of genetic factors, which may increase the power and identify novel quantitative trait loci under EPS. The joint analysis of genetic data in high-dimensional situations requires specialized techniques, e.g., the least absolute shrinkage and selection operator (LASSO). Although there is extensive research and application related to LASSO, the statistical inference and testing for the sparse model under EPS remain unknown. We propose a novel sparse model (EPS-LASSO) with a hypothesis test for high-dimensional regression under EPS based on a decorrelated score function. Comprehensive simulation shows EPS-LASSO outperforms existing methods with stable type I error and FDR control. EPS-LASSO can provide consistent power for both low- and high-dimensional situations compared with the other methods dealing with high-dimensional situations. The power of EPS-LASSO is close to other low-dimensional methods when the causal effect sizes are small and is superior when the effects are large. Applying EPS-LASSO to a transcriptome-wide gene expression study for obesity reveals 10 significant body mass index associated genes. Our results indicate that EPS-LASSO is an effective method for EPS data analysis, which can account for correlated predictors. The source code is available at https://github.com/xu1912/EPSLASSO. hdeng2@tulane.edu. Supplementary data are available at Bioinformatics online. © The Author (2018). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
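The selection step under EPS is straightforward to sketch (the paper's contribution, the decorrelated-score inference for the selected model, is not reproduced here); all data below are simulated and the alpha value is arbitrary:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 2000, 500
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 0.5                      # five true signals
y = X @ beta + rng.standard_normal(n)

# Extreme phenotype sampling: keep only the lower and upper deciles.
keep = (y <= np.quantile(y, 0.1)) | (y >= np.quantile(y, 0.9))

model = Lasso(alpha=0.05).fit(X[keep], y[keep])
print(np.flatnonzero(model.coef_))  # candidate predictors under EPS
```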
NASA Astrophysics Data System (ADS)
Gibergans-Báguena, J.; Llasat, M. C.
2007-12-01
The objective of this paper is to present an improvement in the quantitative forecasting of daily rainfall in Catalonia (NE Spain) using an analogues technique that takes into account synoptic and local data. The method is based on an analogue sorting technique: meteorological situations similar to the current one, in terms of the 700 and 1000 hPa geopotential fields at 00 UTC, are complemented with the inclusion of thermodynamic parameters extracted from a historical data file. Thermodynamic analysis acts as a highly discriminating feature for situations in which the synoptic situation fails to explain either the atmospheric phenomena or the rainfall distribution. This is the case in heavy rainfall situations, where the existence of instability and high water vapor content is essential. To include these vertical thermodynamic features, information provided by the Palma de Mallorca radiosounding (Spain) has been used. First, a selection of the thermodynamic parameters most discriminating for daily rainfall was made, and then the analogues technique was applied to them. Finally, three analog forecasting methods were applied to quantitative daily rainfall forecasting in Catalonia: the first is based on analogies of the geopotential fields at synoptic scale; the second is based exclusively on the search for similarity in local thermodynamic information; and the third combines the other two. The results show that this last method provides a substantial improvement in quantitative rainfall estimation.
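A minimal sketch of the analog step common to all three methods, assuming standardized predictor vectors (flattened geopotential fields, optionally concatenated with the selected thermodynamic indices) and a k-nearest-analog average as the quantitative forecast:

```python
import numpy as np

def analog_forecast(current, archive_fields, archive_rain, k=10):
    """Forecast rainfall as the mean over the k archived days whose
    standardized predictor vectors are closest to today's."""
    mu = archive_fields.mean(axis=0)
    sd = archive_fields.std(axis=0) + 1e-12
    Z = (archive_fields - mu) / sd
    z0 = (current - mu) / sd
    dist = np.linalg.norm(Z - z0, axis=1)
    idx = np.argsort(dist)[:k]
    return archive_rain[idx].mean(), idx   # forecast and analog days
```

Method one would build the predictor vector from the geopotential fields alone, method two from the thermodynamic parameters alone, and the combined method from both.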
Wu, Qingqing; Xiang, Shengnan; Wang, Wenjun; Zhao, Jinyan; Xia, Jinhua; Zhen, Yueran; Liu, Bang
2018-05-01
Various detection methods have been developed to date for the identification of animal species. New techniques based on the PCR approach have raised the hope of developing better identification methods that can overcome the limitations of existing methods. PCR-based methods have used mitochondrial DNA (mtDNA) as well as nuclear DNA sequences. In this study, by targeting nuclear DNA, multiplex PCR and real-time PCR methods were developed to assist with qualitative and quantitative analysis. The multiplex PCR was found to simultaneously and effectively distinguish the ingredients of four species (fox, dog, mink, and rabbit) by electrophoretic bands of different sizes: 480, 317, 220, and 209 bp. The amplification profiles and standard curves of the real-time fluorescent PCR showed good quantitative measurement responses and linearity, as indicated by good repeatability and a coefficient of determination R^2 > 0.99. The quantitative results for quaternary DNA mixtures including mink, fox, dog, and rabbit DNA were in line with expectations: R.D. (relative deviation) varied between 1.98 and 12.23% and R.S.D. (relative standard deviation) varied between 3.06 and 11.51%, both well within the acceptance criterion of ≤ 25%. Combining the two methods is suitable for the rapid identification and accurate quantification of fox-, dog-, mink-, and rabbit-derived ingredients in animal products.
Baldwin, Aleta; Dodge, Brian; Schick, Vanessa; Herbenick, Debra; Sanders, Stephanie A; Dhoot, Roshni; Fortenberry, J Dennis
2017-11-01
Disclosure of sexual identity among sexual minority women is related to better outcomes and improved quality of care. The existing literature on sexual minority women's experiences of identity disclosure and related interactions with healthcare providers draws little distinction between different groups of sexual minority women, despite the different barriers, stigma and health outcomes that exist between them. This paper presents factors influencing identity disclosure and describes the characteristics of interactions that sexual minority women have with their healthcare providers around sexual identity and health. Using a mixed-methods approach, both qualitative and quantitative data were gathered using an online survey. The sample included lesbian, bisexual, queer and pansexual women from across the USA. Qualitative and quantitative data were analysed concurrently, and qualitative themes were quantified and integrated into quantitative analyses. Identity disclosure, reasons for disclosing identity and characteristics of interactions with providers varied by identity, but often overlapped. Bisexual and pansexual participants were significantly less likely to disclose identity than lesbian participants. There were no significant differences related to age or ethnicity. Findings from this study have the potential to inform ethical medical practices and improve healthcare quality among sexual minority women.
cp-R, an interface to the R programming language for clinical laboratory method comparisons.
Holmes, Daniel T
2015-02-01
Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open source tool is available. The purpose of this work was to develop an entirely open-source, GUI-driven program for bioanalytical method comparisons capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in python and PyQt4, with R scripts performing the regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in any of jpg, png, tiff or bmp format at any desired resolution, or in ps and pdf vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. Correctness of regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and the results cut and pasted into spreadsheet or word processing applications. We present a simple and intuitive open source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
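cp-R drives these computations through R; as a language-neutral illustration of what the Deming step computes, here is the closed-form slope and intercept in Python (lam is the assumed ratio of y- to x-error variances, 1.0 for equal analytical imprecision):

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression slope/intercept allowing error in both methods."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = (syy - lam * sxx
             + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, y.mean() - slope * x.mean()

# e.g. comparing two analyzers' results for the same specimens:
# slope, intercept = deming(method_a_values, method_b_values)
```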
ERIC Educational Resources Information Center
Alvarez-Nunez, Tanya Mae
2012-01-01
Scope and Method of Study: This quantitative, non-experimental study sought to determine if a statistically significant difference existed in student achievement on the PSE exam in Belizean primary schools for students who have teachers with varying levels of self-efficacy (high, medium and low). The Teacher Efficacy Scale (TES), which captures…
2011-09-22
Organophosphates (OPs) are a group of pesticides that inhibit enzymes such as acetylcholinesterase. Numerous OP structural variants exist, and experimental toxicity data can be sparse; quantitative structure-activity relationship (QSAR) models built from physiochemical descriptors and regression techniques, linked to OP physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) models, can support animal and human toxicity studies, especially for OPs lacking experimental data.
A Quantitative Experimental Study of the Effectiveness of Systems to Identify Network Attackers
ERIC Educational Resources Information Center
Handorf, C. Russell
2016-01-01
This study analyzed the meta-data collected from a honeypot that was run by the Federal Bureau of Investigation for a period of 5 years. This analysis compared the use of existing industry methods and tools, such as Intrusion Detection System alerts, network traffic flow and system log traffic, within the Open Source Security Information Manager…
ERIC Educational Resources Information Center
Van Zyl, Douglas G.
2011-01-01
Purpose of the study. The purpose of this study was to examine single-gender groupings for sixth grade mathematics classes as a strategy to improve student achievement. The method of research was quantitative, with MAP mathematics test data being used to determine if any relationship exists between the strategy and student achievement. Findings.…
Assessing crown fire potential by linking models of surface and crown fire behavior
Joe H. Scott; Elizabeth D. Reinhardt
2001-01-01
Fire managers are increasingly concerned about the threat of crown fires, yet only now are quantitative methods for assessing crown fire hazard being developed. Links among existing mathematical models of fire behavior are used to develop two indices of crown fire hazard: the Torching Index and the Crowning Index. These indices can be used to ordinate different forest...
Estimating root biomass and distribution after fire in a Great Basin woodland using cores and pits
Benjamin M. Rau; Dale W. Johnson; Jeanne C. Chambers; Robert R. Blank; Annmarie Lucchesi
2009-01-01
Quantifying root biomass is critical to an estimation and understanding of ecosystem net primary production, biomass partitioning, and belowground competition. We compared 2 methods for determining root biomass: a new soil-coring technique and traditional excavation of quantitative pits. We conducted the study in an existing Joint Fire Sciences demonstration area in...
ERIC Educational Resources Information Center
Jiang, Chunjiao; Li, Song
2008-01-01
Based on the quantitative research and comparative study method, this paper attempts to make a systematic study and analysis of regional differences which have existed since 1949 in higher education among East, Middle and West China. The study is intended to explore the causes, regional differences, social changes, and their co-related…
The Relationship of Grade Span in 9th Grade to Math Achievement in High School
ERIC Educational Resources Information Center
West, John; Miller, Mary Lou; Myers, Jim; Norton, Timothy
2015-01-01
Purpose, Scope, and Method of Study: The purpose of this study was to determine if a correlation exists between grade span for ninth grade and gains in math achievement test scores in 10th grade and 12th grade. A quantitative, longitudinal, correlational research design was employed to investigate the research questions. The population was high…
ERIC Educational Resources Information Center
Horne, Lela M.; Rachal, John R.; Shelley, Kyna
2012-01-01
A mixed methods framework utilized quantitative and qualitative data to determine whether statistically significant differences existed between high school and GED[R] student perceptions of credential value. An exploratory factor analysis (n=326) extracted four factors and then a MANOVA procedure was performed with a stratified quota sample…
Surface colour photometry of galaxies with Schmidt telescopes.
NASA Technical Reports Server (NTRS)
Wray, J. D.
1972-01-01
A method is described that owes its practicality to the capability of Schmidt telescopes to record a number of galaxy images on a single plate, and to the existence of high-speed, computer-controlled, area-scanning precision microdensitometers such as the Photometric Data Systems model 1010. The analysis yields quantitative color-index information, displayed in a manner that allows any user to effectively study the morphological properties of the color-index distribution in galaxies.
Musante, C J; Abernethy, D R; Allerheiligen, S R; Lauffenburger, D A; Zager, M G
2016-09-01
Quantitative Systems Pharmacology (QSP) is experiencing increased application in the drug discovery and development process. Like its older sibling, systems biology, the QSP field comprises a mix of established disciplines and methods, from molecular biology to engineering to pharmacometrics. As a result, critical segments of the discipline differ dramatically in approach, and there is a need to bring these groups together toward a common goal. © 2016 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
NASA Astrophysics Data System (ADS)
Egan, James; McMillan, Norman; Denieffe, David
2011-08-01
Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the field of chemical metrology to telecommunications. Currie introduced recommendations for defining the limits of measurement in chemical metrology and identified three fundamental limits: the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental limits over existing methods are discussed.
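For readers unfamiliar with Currie's limits, a minimal sketch follows, assuming Gaussian blank noise, alpha = beta = 0.05, and a 10% relative-standard-deviation target for the determination limit; it illustrates Currie's chemical-metrology definitions rather than the paper's telecommunications adaptation.

```python
import numpy as np

def currie_limits(blanks, k=1.645, rsd_target=0.10):
    """Currie's three limits from replicate blank readings
    (Gaussian noise, alpha = beta = 0.05 -> k = 1.645)."""
    sigma0 = np.std(blanks, ddof=1)
    critical = k * sigma0                 # L_C: decision threshold ("detected?")
    detection = 2 * k * sigma0            # L_D ~ 3.29*sigma0: reliably detectable
    determination = sigma0 / rsd_target   # L_Q = 10*sigma0: quantifiable at 10% RSD
    return critical, detection, determination

print(currie_limits(np.array([0.8, 1.1, 0.9, 1.0, 1.2, 0.7])))
```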
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Fei; Parkinson, B. A.; Divan, Ralu
Interdigitated array (IDA) electrodes have been applied to study EC' reactions (an electron transfer reaction followed by a catalytic reaction), and a new method for the quantitative analysis of IDA results was developed. In this new method, the currents at the IDA generator and collector electrodes for an EC' mechanism are derived from the number of redox cycles and the contribution of the non-catalytic current. The fractions of bipotential recycling species and catalytically active species are then calculated, which helps in understanding the catalytic reaction mechanism. The homogeneous hydrogen evolution reaction catalyzed by the [Ni(PPh2NBn2)2]2+ electrocatalyst (where PPh2NBn2 is 1,5-dibenzyl-3,7-diphenyl-1,5-diaza-3,7-diphosphacyclooctane) was examined and analyzed with IDA electrodes. In addition, the existence of reaction intermediates in the catalytic cycle is inferred from the electrochemical behavior of glassy carbon disk electrodes and carbon IDA electrodes. This quantitative analysis of IDA electrode cyclic voltammetry currents can be used as a simple and straightforward method for determining the reaction mechanism in other catalytic systems as well.
NASA Astrophysics Data System (ADS)
Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph
2016-12-01
Variability of crop yields is detrimental for food security. Under climate change its amplitude is likely to increase, so it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool to project future changes in crop yields under climate change. A systematic overview of drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and facilitate improved understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes prioritizing mechanisms for implementation in models challenging. This paper therefore takes a novel approach to systematically mine and organize existing knowledge from the literature. The aim is to identify important mechanisms lacking in models, which can help to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network of complex interactions between growing conditions, plant physiology and crop yield. We use the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular on the dominant role of temperature and precipitation, but they also highlight other important drivers of YV. More importantly, our method allows us to identify the physiological processes that transmit variability in growing conditions to variability in yield, and so to identify explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of literature. The method can easily be applied to many other research fields.
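A toy sketch of the network idea, with hypothetical nodes, edges and study counts (the paper's network holds roughly 350 interactions): encode findings as a weighted directed graph from growing conditions through physiological processes to yield, then score each driver by the evidence along its paths. The bottleneck-sum score below is an assumption, not the authors' metric.

```python
import networkx as nx

g = nx.DiGraph()
g.add_weighted_edges_from([
    ("temperature", "photosynthesis", 12), ("temperature", "phenology", 8),
    ("precipitation", "soil_water", 15), ("soil_water", "photosynthesis", 9),
    ("photosynthesis", "yield", 20), ("phenology", "yield", 6),
])  # weight = number of supporting studies (hypothetical counts)

for driver in ("temperature", "precipitation"):
    # score each path by its weakest (least-supported) link, then sum paths
    score = sum(min(g[u][v]["weight"] for u, v in zip(p, p[1:]))
                for p in nx.all_simple_paths(g, driver, "yield"))
    print(driver, score)
```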
Kruid, Jan; Fogel, Ronen; Limson, Janice Leigh
2017-05-01
Identifying the most efficient oxidation process to achieve maximum removal of a target pollutant compound forms the subject of much research. There exists a need to develop rapid screening tools to support research in this area. In this work we report on the development of a quantitative assay as a means for identifying catalysts capable of decolourising methylene blue through the generation of oxidising species from hydrogen peroxide. Here, a previously described methylene blue test strip method was repurposed as a quantitative, aqueous-based spectrophotometric assay. From amongst a selection of metal salts and metallophthalocyanine complexes, monitoring of the decolourisation of the cationic dye methylene blue (via Fenton-like and non-Fenton oxidation reactions) by the assay identified the following to be suitable oxidation catalysts: CuSO4 (a Fenton-like catalyst), iron(II) phthalocyanine (a non-Fenton oxidation catalyst), as well as manganese(II) phthalocyanine. The applicability of the method was examined for the removal of bisphenol A (BPA), as measured by HPLC, during parallel oxidation experiments. The order of catalytic activity was identified as FePc > MnPc > CuSO4 for both BPA and MB. The quantitative MB decolourisation assay may offer a rapid method for screening a wide range of potential catalysts for oxidation processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
Forman, Jane; Creswell, John W; Damschroder, Laura; Kowalski, Christine P; Krein, Sarah L
2008-12-01
Infection control professionals and hospital epidemiologists are accustomed to using quantitative research. Although quantitative studies are extremely important in the field of infection control and prevention, often they cannot help us explain why certain factors affect the use of infection control practices or identify the underlying mechanisms through which they do so. Qualitative research methods, which use open-ended techniques, such as interviews, to collect data and nonstatistical techniques to analyze them, provide detailed, diverse insights from individuals, useful quotes that bring realism to applied research, and information about how different health care settings operate. Qualitative research can illuminate the processes underlying statistical correlations, inform the development of interventions, and show how interventions work to produce observed outcomes. This article describes the key features of qualitative research and the advantages that such features add to existing quantitative research approaches in the study of infection control. We address the goal of qualitative research, the nature of the research process, sampling, data collection and analysis, validity, generalizability of findings, and presentation of findings. Health services researchers are increasingly using qualitative methods to address practical problems by uncovering interacting influences in complex health care environments. Qualitative research methods, applied with expertise and rigor, can contribute important insights to infection prevention efforts.
Rastogi, L.; Dash, K.; Arunachalam, J.
2013-01-01
The quantitative analysis of glutathione (GSH) is important in different fields such as medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometric sulfur content in purified GSH offers an approach for its quantitation, and calibration through an appropriately characterized reference material (CRM) for sulfur provides a methodology for the certification of GSH quantity that is traceable to the SI (International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the "high performance" methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U) expressed at the 95% confidence interval varied from 0.1% to 0.3% for ICP-OES analyses, and between 0.2% and 1.2% for IC. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. PMID:29403814
Gregory, T Ryan; Nathwani, Paula; Bonnett, Tiffany R; Huber, Dezene P W
2013-09-01
A study was undertaken to evaluate both a pre-existing method and a newly proposed approach for the estimation of nuclear genome sizes in arthropods. First, concerns regarding the reliability of the well-established method of flow cytometry, relating to impacts of rearing conditions on genome size estimates, were examined. Contrary to previous reports, a more carefully controlled test found negligible environmental effects on genome size estimates in the fly Drosophila melanogaster. Second, a more recently touted method based on quantitative real-time PCR (qPCR) was examined in terms of ease of use, efficiency, and (most importantly) accuracy using four test species: the flies Drosophila melanogaster and Musca domestica and the beetles Tribolium castaneum and Dendroctonus ponderosae. The results of this analysis demonstrated that qPCR tends to produce genome size estimates that differ substantially from those of other established techniques, while also being far less efficient than existing methods.
Intensity non-uniformity correction in MRI: existing methods and their validation.
Belaroussi, Boubakeur; Milles, Julien; Carme, Sabin; Zhu, Yue Min; Benoit-Cattin, Hugues
2006-04-01
Magnetic resonance imaging is a popular and powerful non-invasive imaging technique. Automated analysis has become mandatory to efficiently cope with the large amount of data generated using this modality. However, several artifacts, such as intensity non-uniformity, can degrade the quality of acquired data. Intensity non-uniformity consists of anatomically irrelevant intensity variation throughout the data. It can be induced by the choice of the radio-frequency coil, the acquisition pulse sequence, and the nature and geometry of the sample itself. Numerous methods have been proposed to correct this artifact. In this paper, we present an overview of existing methods. We first sort them according to their location in the acquisition/processing pipeline. Sorting is then refined based on the assumptions those methods rely on. Next, we present the validation protocols used to evaluate these different correction schemes from both a qualitative and a quantitative point of view. Finally, the availability and usability of the presented methods are discussed.
Wutich, Amber; Budds, Jessica; Eichelberger, Laura; Geere, Jo; Harris, Leila; Horney, Jennifer; Jepson, Wendy; Norman, Emma; O'Reilly, Kathleen; Pearson, Amber; Shah, Sameer; Shinn, Jamie; Simpson, Karen; Staddon, Chad; Stoler, Justin; Teodoro, Manuel P; Young, Sera
2017-11-01
Household water insecurity has serious implications for the health, livelihoods and wellbeing of people around the world. Existing methods to assess the state of household water insecurity focus largely on water quality, quantity or adequacy, source or reliability, and affordability. These methods have significant advantages in terms of their simplicity and comparability, but are widely recognized to oversimplify and underestimate the global burden of household water insecurity. In contrast, a broader definition of household water insecurity should include entitlements and human capabilities, sociocultural dynamics, and political institutions and processes. This paper proposes a mix of qualitative and quantitative methods that can be widely adopted across cultural, geographic, and demographic contexts to assess hard-to-measure dimensions of household water insecurity. In doing so, it critically evaluates existing methods for assessing household water insecurity and suggests ways in which methodological innovations advance a broader definition of household water insecurity.
Current issues with standards in the measurement and documentation of human skeletal anatomy.
Magee, Justin; McClelland, Brian; Winder, John
2012-09-01
Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18-65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. Published quantitative data does not follow existing international published standards relating to engineering drawing and visual communication. Large variations are also evident in standards or guidelines used for global coordinate systems across biomechanics, ergonomics, software systems and 3D software applications. This paper identifies where established good practice exists and suggests additional recommendations, informing an improved communication protocol, to assist reconstruction of skeletal anatomy using 3D digital modeling. © 2012 The Authors. Journal of Anatomy © 2012 Anatomical Society.
Miller, C.; Waddell, K.; Tang, N.
2010-01-01
RP-122 Peptide quantitation using Multiple Reaction Monitoring (MRM) has been established as an important methodology for biomarker verification and validation. This requires high throughput combined with high sensitivity to analyze potentially thousands of target peptides in each sample. Dynamic MRM allows the system to acquire the required MRMs of a peptide only during a retention window corresponding to when that peptide is eluting. This reduces the number of concurrent MRMs and therefore improves quantitation and sensitivity. MRM Selector allows the user to generate an MRM transition list with retention time information from discovery data obtained on a QTOF MS system. This list can be directly imported into the triple quadrupole acquisition software. However, situations can exist where (a) the list contains more MRM transitions than are allowable under the ideal acquisition conditions chosen (allowing for cycle time and chromatography conditions), or (b) too many transitions fall in a certain retention time region, which would result in an unacceptably low dwell time and cycle time. A new tool, MRM Viewer, has been developed to help users automatically generate multiple dynamic MRM methods from a single MRM list. In this study, a list of 3293 MRM transitions from a human plasma sample was compiled. A single dynamic MRM method with 3293 transitions results in a minimum dwell time of 2.18 ms. Using MRM Viewer, we can generate three dynamic MRM methods with a minimum dwell time of 20 ms, which gives better-quality MRM quantitation. This tool facilitates both high throughput and high sensitivity for MRM quantitation.
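A minimal sketch of the dwell-time arithmetic behind such a tool: estimate the worst-case number of concurrent transitions from the retention-time windows, infer the minimum dwell time for a given cycle time, and split the list into enough methods to clear a dwell-time floor. The round-robin split is an assumption; it is not the MRM Viewer algorithm, which also rebalances windows.

```python
import math

def max_concurrency(windows):
    """windows: list of (rt_start, rt_end) retention-time windows, minutes."""
    events = sorted((t, d) for s, e in windows for t, d in ((s, 1), (e, -1)))
    load = peak = 0
    for _, d in events:
        load += d
        peak = max(peak, load)
    return peak

def split_methods(windows, cycle_time_s=1.0, min_dwell_s=0.020):
    peak = max_concurrency(windows)
    dwell = cycle_time_s / peak                 # worst-case dwell in one method
    n = max(1, math.ceil(min_dwell_s / dwell))  # methods needed to hit the floor
    return [windows[i::n] for i in range(n)]    # simple round-robin split
```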
A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime.
Fitterer, Jessica L; Nelson, Trisalyn A
2015-01-01
Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data of which 48 used cross-sectional datasets. Regression was the prominent modelling choice (n = 78) though dependent on data many variations existed. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and shift toward prospective (forecast) modelling over small areas (e.g., blocks).
Curreli, Francesca; Robles, Monica A; Friedman-Kien, Alvin E; Flore, Ornella
2003-02-01
Kaposi's sarcoma-associated herpesvirus (KSHV) is a novel herpesvirus linked to AIDS-related neoplasms. Currently it is difficult to evaluate the number of virions in viral preparations or in samples obtained from patients with Kaposi's sarcoma (KS), since no protocol exists for determining the plaque-forming units of KSHV. We constructed a fragment of a different size from the target viral DNA to carry out a competitive-quantitative PCR. Both fragment and viral DNA were added to a single PCR reaction to compete for the same set of primers. By knowing the amount of the competitor added to the reaction, we could determine the number of viral DNA molecules. We used this assay successfully to detect and quantify KSHV genomes from KS skin biopsies and pleural effusion lymphoma, and from different viral preparations. To date, this is the most convenient and economical method that allows accurate and fast viral detection/quantitation with a single PCR.
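The read-out logic of competitive-quantitative PCR can be sketched as follows, with hypothetical titration numbers: the log of the target/competitor product ratio is regressed on the log of the known competitor input, and the target copy number is the competitor input at which the ratio equals one.

```python
import numpy as np

competitor_copies = np.array([1e3, 1e4, 1e5, 1e6])   # known competitor input
band_ratio = np.array([8.1, 1.9, 0.22, 0.019])       # target / competitor signal

slope, intercept = np.polyfit(np.log10(competitor_copies),
                              np.log10(band_ratio), 1)
log_eq = -intercept / slope      # equivalence point: log10(ratio) = 0
print(f"estimated target copies ~ {10 ** log_eq:.2e}")
```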
A Novel Method for Tracking Individuals of Fruit Fly Swarms Flying in a Laboratory Flight Arena.
Cheng, Xi En; Qian, Zhi-Ming; Wang, Shuo Hong; Jiang, Nan; Guo, Aike; Chen, Yan Qiu
2015-01-01
The growing interest in studying social behaviours of swarming fruit flies, Drosophila melanogaster, has heightened the need for tools that provide quantitative motion data. To achieve such a goal, multi-camera three-dimensional tracking technology is the key experimental gateway. We have developed a novel tracking system for tracking hundreds of fruit flies flying in a confined cubic flight arena. In addition to the proposed tracking algorithm, this work offers contributions in three further aspects: body detection, orientation estimation, and data validation. To demonstrate the opportunities that the proposed system offers for generating high-throughput quantitative motion data, we conducted experiments under five experimental configurations. We also performed quantitative analysis of the kinematics, spatial structure, and motion patterns of fruit fly swarms. We found that there exists an asymptotic distance between fruit flies in swarms as the population density increases. Further, we discovered evidence of a repulsive response when the distance between fruit flies approached the asymptotic distance. Overall, the proposed tracking system presents a powerful method for studying flight behaviours of fruit flies in a three-dimensional environment.
A novel image-based quantitative method for the characterization of NETosis
Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.
2015-01-01
NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624
[Modeling continuous scaling of NDVI based on fractal theory].
Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng
2013-07-01
Scale effect is one of the most important scientific problems in remote sensing. The scale effect in quantitative remote sensing can be used to study the relationship between retrievals from images of different resolutions, and its study has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe the scale-changing features of retrievals across an entire series of scales; meanwhile, they face serious parameter-correction issues because imaging parameters (geometrical, spectral, etc.) vary between sensors. Utilizing a single-sensor image, a fractal methodology was applied to solve these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model the continuous scaling of retrievals. The experimental results indicated that: (1) for NDVI, a scale effect exists, and it can be described by a fractal model of continuous scaling; (2) the fractal method is suitable for the validation of NDVI. These results demonstrate that fractal analysis is an effective methodology for studying the scaling of quantitative remote sensing.
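A minimal sketch of the continuous-scaling experiment, using synthetic random bands in place of an ETM+ scene: aggregate the band radiances to coarser pixels, recompute NDVI (which is nonlinear in the bands, so its scene statistic shifts with scale), and fit the log-log fractal model.

```python
import numpy as np

rng = np.random.default_rng(1)
red = rng.uniform(0.05, 0.30, (512, 512))   # synthetic red-band radiance
nir = rng.uniform(0.30, 0.60, (512, 512))   # synthetic NIR-band radiance

def upscale(a, f):
    """Mean-aggregate an image by an integer factor f."""
    n, m = a.shape[0] // f, a.shape[1] // f
    return a[:n * f, :m * f].reshape(n, f, m, f).mean(axis=(1, 3))

scales, vals = [1, 2, 4, 8, 16, 32], []
for f in scales:
    r, n = upscale(red, f), upscale(nir, f)
    vals.append(((n - r) / (n + r)).mean())   # NDVI recomputed at each scale

k, c = np.polyfit(np.log(scales), np.log(vals), 1)
print(f"scaling exponent ~ {k:.4f}")   # near zero for white noise; real scenes
                                       # show stronger, structure-driven effects
```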
NASA Astrophysics Data System (ADS)
Cravey, Kristopher J.
Notable performance differences exist between nuclear and fossil power generation plants in areas such as safety, outage duration efficiency, and capacity factor. This study explored the relationship of organizational culture and implicit leadership theory to these performance differences. A mixed methods approach consisting of quantitative instruments, namely the Organizational Culture Assessment Instrument and the GLOBE Leadership Scales, and qualitative interviews was used in this study. Subjects were operations middle managers in a U.S. energy company that serves nuclear or fossil power plants. Results from the quantitative instruments revealed no differences between the nuclear and fossil groups with regard to organizational culture types and implicit leadership theories. However, the qualitative results did reveal divergence between the two groups with regard to what is valued in the organization and how that drives behaviors and decision making. These organizational phenomenological differences seem to explain why performance differences exist between nuclear and fossil plants because, ultimately, they affect how the organization functions.
Chen, Jianrong; Mei, Rongwu; Shen, Liguo; Ding, Linxian; He, Yiming; Lin, Hongjun; Hong, Huachang
2015-03-01
The interfacial interactions between a foulant particle and a rough membrane surface in a submerged membrane bioreactor (MBR) were quantitatively assessed using a newly developed method. It was found that the profile of total interaction versus separation distance was complicated: there were an energy barrier and two negative energy ranges in the profile. Further analysis showed that the roughness scale significantly affected the strength and properties of the interfacial interactions. It was revealed that there exists a critical range of roughness scale within which the total energy at separation distances from 0 to several nanometers is continually repulsive. Decreasing the foulant size increased the strength of the specific interaction energy but did not change the existence of a critical roughness-scale range. These findings suggest the possibility of "tailoring" membrane surface morphology for membrane fouling mitigation, and thus have significant implications for membrane selection and fabrication in MBRs. Copyright © 2014 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahi-Anwar, M; Lo, P; Kim, H
Purpose: The use of Quantitative Imaging (QI) methods in clinical trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully automated CT phantom QA system to perform these functions and facilitate the use of QI methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols, totaling 84 phantoms across 3 phantom types, acquired with various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier, corresponding to a phantom type, contains a template slice, which is compared to the input scan on a slice-by-slice basis, yielding slice-wise similarity metric values. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the best local mean similarity, whose neighboring slices also meet the threshold requirement, is chosen as the classifier's matched slice (if one exists). The classifier whose matched slice has the best local mean similarity is then chosen as the ensemble's best match. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier are used to extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully automated CT phantom QA system consistent with manual QA performance. Further work will include a parallel component to automatically verify image acquisition parameters and adherence to specifications. Institutional research agreement, Siemens Healthcare; Past recipient, research grant support, Siemens Healthcare; Consultant, Toshiba America Medical Systems; Consultant, Samsung Electronics; NIH Grant support from: U01 CA181156.
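A minimal sketch of the matching step described above; the normalized cross-correlation metric and the window width are assumptions, since the abstract does not name the similarity measure.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two 2D slices."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def match_slice(volume, template, threshold, half_win=2):
    """Score every slice against the template; return the index whose local
    neighbourhood clears the pre-trained threshold with the best mean."""
    scores = np.array([ncc(s, template) for s in volume])
    best, best_mean = None, -np.inf
    for i in range(half_win, len(scores) - half_win):
        local = scores[i - half_win:i + half_win + 1]
        if local.min() >= threshold and local.mean() > best_mean:
            best, best_mean = i, local.mean()
    return best, best_mean      # best is None if no slice qualifies
```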
Capillary density: An important parameter in nailfold capillaroscopy.
Emrani, Zahra; Karbalaie, Abdolamir; Fatemi, Alimohammad; Etehadtavakol, Mahnaz; Erlandsson, Björn-Erik
2017-01-01
Nailfold capillaroscopy is one of various noninvasive bioengineering methods used to investigate skin microcirculation. It is an effective examination for assessing microvascular changes in the peripheral circulation; hence it has a significant role in the diagnosis of systemic sclerosis, with its classic changes of giant capillaries and a decline in capillary density with capillary dropout. The decline in capillary density is one of the microangiopathic features of connective tissue disease; it is detectable with nailfold capillaroscopy and is assessed by quantitative measurement. In this article, we review a common method for calculating capillary density, and the relationship between the number of capillaries and the existence of digital ulcers, pulmonary arterial hypertension, autoantibodies, scleroderma patterns and different scoring systems. Copyright © 2016 Elsevier Inc. All rights reserved.
Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data
NASA Astrophysics Data System (ADS)
Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai
2017-11-01
Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
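As a minimal sketch of replacing a visual check with a quantified error and confidence level, one might score a load model's simulated response against field measurements and bootstrap the metric; the NRMSE metric and the arrays are illustrative assumptions, not the paper's framework.

```python
import numpy as np

def nrmse(measured, simulated):
    """Range-normalized root-mean-square error between field and model."""
    measured, simulated = np.asarray(measured), np.asarray(simulated)
    return np.sqrt(np.mean((measured - simulated) ** 2)) / np.ptp(measured)

def nrmse_ci(measured, simulated, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the NRMSE."""
    rng = np.random.default_rng(seed)
    m, s = np.asarray(measured), np.asarray(simulated)
    stats = [nrmse(m[i], s[i])
             for i in rng.integers(0, len(m), (n_boot, len(m)))]
    return np.percentile(stats, [100 * alpha / 2, 100 - 100 * alpha / 2])
```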
Towards machine ecoregionalization of Earth's landmass using pattern segmentation method
NASA Astrophysics Data System (ADS)
Nowosad, Jakub; Stepinski, Tomasz F.
2018-07-01
We present and evaluate a quantitative method for delineation of ecophysiographic regions throughout the entire terrestrial landmass. The method uses a new pattern-based segmentation technique which attempts to emulate, in computer code, the qualitative, weight-of-evidence approach to delineating ecoregions. An ecophysiographic region is characterized by homogeneous physiography defined by the cohesiveness of patterns of four variables: land cover, soils, landforms, and climate. Homogeneous physiography is a necessary but not sufficient condition for a region to be an ecoregion, so machine delineation of ecophysiographic regions is the first, important step toward global ecoregionalization. In this paper, we focus on a first-order approximation of the proposed method: delineation on the basis of land cover patterns alone. We justify this approximation by the existence of significant spatial associations between the physiographic variables. The resulting ecophysiographic regionalization (ECOR) is shown to be more physiographically homogeneous than existing global ecoregionalizations (Terrestrial Ecoregions of the World (TEW) and Bailey's Ecoregions of the Continents (BEC)). The presented quantitative method has the advantage of being transparent and objective. It can be verified, easily updated, modified and customized for specific applications. Each region in ECOR contains detailed, SQL-searchable information about the physiographic patterns within it, along with a computer-generated label. To give a sense of how ECOR compares to TEW and, in the U.S., to EPA Level III ecoregions, we contrast these delineations at two specific sites. We conclude that ECOR yields a regionalization somewhat similar to EPA Level III ecoregions, but for the entire world and by automatic means.
3D Slicer as an Image Computing Platform for the Quantitative Imaging Network
Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron
2012-01-01
Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of the clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690
Helmerhorst, Hendrik J F; Brage, Søren; Warren, Janet; Besson, Herve; Ekelund, Ulf
2012-08-31
Physical inactivity is one of the four leading risk factors for global mortality. Accurate measurement of physical activity (PA), in particular by physical activity questionnaires (PAQs), remains a challenge. The aim of this paper is to provide an updated systematic review of the reliability and validity characteristics of existing and more recently developed PAQs and to quantitatively compare the performance between existing and newly developed PAQs. A literature search of electronic databases was performed for studies assessing reliability and validity data of PAQs using an objective criterion measurement of PA between January 1997 and December 2011. Articles meeting the inclusion criteria were screened and data were extracted to provide a systematic overview of measurement properties. Due to differences in reported outcomes and criterion methods a quantitative meta-analysis was not possible. In total, 31 studies testing 34 newly developed PAQs, and 65 studies examining 96 existing PAQs were included. Very few PAQs showed good results on both reliability and validity. Median reliability correlation coefficients were 0.62-0.71 for existing, and 0.74-0.76 for new PAQs. Median validity coefficients ranged from 0.30-0.39 for existing, and from 0.25-0.41 for new PAQs. Although the majority of PAQs appear to have acceptable reliability, the validity is moderate at best. Newly developed PAQs do not appear to perform substantially better than existing PAQs in terms of reliability and validity. Future PAQ studies should include measures of absolute validity and the error structure of the instrument.
NASA Astrophysics Data System (ADS)
Carbonneau, Patrice; Fonstad, Mark A.; Marcus, W. Andrew; Dugdale, Stephen J.
2012-01-01
The structure and function of rivers have long been characterized either by: (1) qualitative models such as the River Continuum Concept or Serial Discontinuity Concept which paint broad descriptive portraits of how river habitats and communities vary, or (2) quantitative models, such as downstream hydraulic geometry, which rely on a limited number of measurements spread widely throughout a river basin. In contrast, authors such as Fausch et al. (2002) and Wiens (2002) proposed applying existing quantitative, spatially comprehensive ecology and landscape ecology methods to rivers. This new framework for river sciences which preserves variability and spatial relationships is called a riverine landscape or a 'riverscape'. Application of this riverscape concept requires information on the spatial distribution of organism-scale habitats throughout entire river systems. This article examines the ways in which recent technical and methodological developments can allow us to quantitatively implement and realize the riverscape concept. Using 3-cm true color aerial photos and 5-m resolution elevation data from the River Tromie, Scotland, we apply the newly developed Fluvial Information System which integrates a suite of cutting edge, high resolution, remote sensing methods in a spatially explicit framework. This new integrated approach allows for the extraction of primary fluvial variables such as width, depth, particle size, and elevation. From these first-order variables, we derive second-order geomorphic and hydraulic variables including velocity, stream power, Froude number and shear stress. Channel slope can be approximated from available topographic data. Based on these first and second-order variables, we produce riverscape metrics that begin to explore how geomorphic structures may influence river habitats, including connectivity, patchiness of habitat, and habitat distributions. The results show a complex interplay of geomorphic variable and habitat patchiness that is not predicted by existing fluvial theory. Riverscapes, thus, challenge the existing understanding of how rivers structure themselves and will force development of new paradigms.
Geuna, S
2000-11-20
Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterize the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.
General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.
de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael
2016-11-01
Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMMs can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
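The latent-to-observed-scale logic can be sketched by Monte Carlo integration for a Poisson trait with a log link; the parameter values are hypothetical, and the QGglmm package implements the exact expressions that this simulation approximates.

```python
import numpy as np

mu, var_a, var_resid = 0.5, 0.2, 0.3        # latent intercept and variances
rng = np.random.default_rng(0)
latent = rng.normal(mu, np.sqrt(var_a + var_resid), 1_000_000)
lam = np.exp(latent)                        # inverse link: expected counts
y = rng.poisson(lam)                        # adds Poisson "distribution" noise

mean_obs = lam.mean()                       # population mean, observed scale
var_obs = y.var()                           # total observed-scale variance
psi = lam.mean()                            # E[d exp(l)/dl]; equals mean_obs
                                            # only because the link is log
h2_obs = var_a * psi ** 2 / var_obs         # observed-scale heritability
print(f"observed-scale mean {mean_obs:.3f}, heritability {h2_obs:.3f}")
```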
The analysis of morphometric data on Rocky Mountain wolves and arctic wolves using statistical methods
NASA Astrophysics Data System (ADS)
Ammar Shafi, Muhammad; Saifullah Rusiman, Mohd; Hamzah, Nor Shamsidah Amir; Nor, Maria Elena; Ahmad, Noor’ani; Azia Hazida Mohamad Azmi, Nur; Latip, Muhammad Faez Ab; Hilmi Azman, Ahmad
2018-04-01
Morphometrics is the quantitative analysis of the shape and size of specimens. Morphometric quantitative analyses are commonly used to analyse the fossil record, the shape and size of specimens, and related questions. The aim of this study was to find the differences between Rocky Mountain wolves and arctic wolves based on gender. The sample utilised secondary data which included seven independent variables and two dependent variables. Statistical modelling was used in the analysis, namely the analysis of variance (ANOVA) and multivariate analysis of variance (MANOVA). The results showed that differences exist between arctic wolves and Rocky Mountain wolves based on the independent factors and gender.
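A minimal sketch of the MANOVA step in Python with statsmodels; the file name and column names are hypothetical stand-ins for the study's seven predictors and two responses.

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical layout: one row per wolf, with species, sex, and measurements.
df = pd.read_csv("wolves.csv")   # columns: species, sex, skull_len, skull_wid, ...

# Two dependent morphometric variables modelled jointly against the factors.
fit = MANOVA.from_formula("skull_len + skull_wid ~ species * sex", data=df)
print(fit.mv_test())             # Wilks' lambda, Pillai's trace, etc.
```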
Safety evaluation methodology for advanced coal extraction systems
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.
1981-01-01
Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.
Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists.
XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.
Ching, Daniel J; Gürsoy, Doğa
2017-03-01
The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.
Note: Measuring instrument of singlet oxygen quantum yield in photodynamic effects
NASA Astrophysics Data System (ADS)
Li, Zhongwei; Zhang, Pengwei; Zang, Lixin; Qin, Feng; Zhang, Zhiguo; Zhang, Hongli
2017-06-01
Using diphenylisobenzofuran (C20H14O) as a singlet oxygen (1O2) reporter, a comparison method that can quantitatively measure the singlet oxygen quantum yield (ΦΔ) of a photosensitizer is presented in this paper. Based on this method, an automatic instrument for measuring singlet oxygen quantum yield is developed. The singlet oxygen quantum yields of the photosensitizers hermimether and aloe-emodin are measured, and the results agree with existing values, verifying the validity of the instrument.
Hanson, Jeffery A; Yang, Haw
2008-11-06
The statistical properties of the cross correlation between two time series have been studied. An analytical expression for the variance of the cross correlation function has been derived. On the basis of these results, a statistically robust method has been proposed to detect the existence and determine the direction of cross correlation between two time series. The proposed method has been characterized by computer simulations. Applications to single-molecule fluorescence spectroscopy are discussed. The results may also find immediate applications in fluorescence correlation spectroscopy (FCS) and its variants.
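To illustrate the flavor of such an analysis (an editorial sketch on simulated data, not the authors' derivation), the snippet below detects cross correlation between two traces by comparing the observed zero-lag correlation against a null distribution built from circular shifts, which preserve each series' autocorrelation while destroying any cross correlation:

    import numpy as np

    rng = np.random.default_rng(0)

    def norm_xcorr(x, y):
        # Normalized cross correlation at zero lag (Pearson r).
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        return float(np.mean(x * y))

    # Two synthetic intensity traces sharing a common signal.
    common = rng.normal(size=2000)
    x = common + rng.normal(size=2000)
    y = common + rng.normal(size=2000)
    r_obs = norm_xcorr(x, y)

    # Null distribution from circularly shifted copies of y.
    null = np.array([norm_xcorr(x, np.roll(y, s))
                     for s in rng.integers(50, 1950, size=999)])
    p = (np.sum(np.abs(null) >= abs(r_obs)) + 1) / (len(null) + 1)
    print(f"r = {r_obs:.3f}, permutation p = {p:.3f}")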
Causality and Causal Inference in Social Work: Quantitative and Qualitative Perspectives
Palinkas, Lawrence A.
2015-01-01
Achieving the goals of social work requires matching a specific solution to a specific problem. Understanding why the problem exists and why the solution should work requires a consideration of cause and effect. However, it is unclear whether it is desirable for social workers to identify cause and effect, whether it is possible for social workers to identify cause and effect, and, if so, what is the best means for doing so. These questions are central to determining the possibility of developing a science of social work and how we go about doing it. This article has four aims: (1) provide an overview of the nature of causality; (2) examine how causality is treated in social work research and practice; (3) highlight the role of quantitative and qualitative methods in the search for causality; and (4) demonstrate how both methods can be employed to support a “science” of social work. PMID:25821393
Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul
2016-12-01
Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to detect the isotopically labeled peptides simultaneously in a single instrument run, guaranteeing relative quantitation for a large number of peptides without introducing variation caused by separate experiments. However, few approaches are available for assessing protein ratios, and none of the existing algorithms pays particular attention to proteins having only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summaries using classification-based methodologies, such as Gaussian mixture models with EM algorithms and their Bayesian counterparts, as well as K-means clustering. In addition, a new approach is developed that combines a Gaussian mixture model with a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or being stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. The developed approach is applicable regardless of the number of peptide hits per protein, rescuing many proteins that would otherwise be discarded. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
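As a rough illustration of the classification idea (an editorial sketch on simulated log-ratios, not the authors' code; scikit-learn's EM fitting with random restarts stands in for the PSO-based global search described in the abstract):

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    # Simulated protein-level log2(heavy/light) ratios: most proteins
    # unchanged (centered at 0), a minority up- or down-regulated.
    ratios = np.concatenate([rng.normal(0.0, 0.3, 900),
                             rng.normal(2.0, 0.4, 50),
                             rng.normal(-2.0, 0.4, 50)]).reshape(-1, 1)

    # EM fitting with many random restarts; the paper instead uses PSO
    # to reduce the risk of stopping in a poor local optimum.
    gmm = GaussianMixture(n_components=3, n_init=10, random_state=1).fit(ratios)
    labels = gmm.predict(ratios)
    for k in range(3):
        print(f"component {k}: mean {gmm.means_[k, 0]:+.2f}, "
              f"weight {gmm.weights_[k]:.3f}, n = {np.sum(labels == k)}")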
Informatics methods to enable sharing of quantitative imaging research data.
Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L
2012-11-01
The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.
Comparative analysis of quantitative efficiency evaluation methods for transportation networks
He, Yuxin; Qin, Jin; Hong, Jian
2017-01-01
An effective evaluation of transportation network efficiency could offer guidance for the optimal control of urban traffic. Based on the introduction and related mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information measured by them, including network structure, traffic demand, travel choice behavior and other factors which affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on transportation network efficiency. In addition, the transportation network efficiency measured by this method and Braess's Paradox can be explained in terms of each other, indicating a better evaluation of the real operating condition of a transportation network. Analysis of the network efficiency calculated by the Q-H method also shows that a specific appropriate demand level exists for a given transportation network. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure yielding the largest transportation network efficiency can be identified. PMID:28399165
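For readers wanting a concrete starting point, one widely used quantitative measure of this kind, the Latora-Marchiori global efficiency, can be computed with networkx; this is an editorial sketch on a toy graph, not the Q-H method evaluated in the paper:

    import networkx as nx

    # Toy road network; nodes are intersections, edges are links.
    G = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])
    print("global efficiency:", round(nx.global_efficiency(G), 3))

    # Removing a link shows how structure alone changes the measure.
    G.remove_edge(0, 2)
    print("after removing (0, 2):", round(nx.global_efficiency(G), 3))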
Chen, Cheng; Wang, Wei; Ozolek, John A.; Rohde, Gustavo K.
2013-01-01
We describe a new supervised learning-based template matching approach for segmenting cell nuclei from microscopy images. The method uses examples selected by a user to build a statistical model that captures the texture and shape variations of the nuclear structures from a given dataset to be segmented. Segmentation of subsequent, unlabeled, images is then performed by finding the model instance that best matches (in the normalized cross correlation sense) the local neighborhood in the input image. We demonstrate the application of our method to segmenting nuclei from a variety of imaging modalities, and quantitatively compare our results to several other methods. Quantitative results using both simulated and real image data show that, while certain methods may work well for certain imaging modalities, our software is able to obtain high accuracy across the several imaging modalities studied. Results also demonstrate that, relative to several existing methods, the template-based method we propose presents increased robustness in the sense of better handling variations in illumination and variations in texture from different imaging modalities, providing smoother and more accurate segmentation borders, and better handling cluttered nuclei. PMID:23568787
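A minimal sketch of the normalized cross correlation matching step on synthetic data (an editorial illustration, not the authors' software):

    import numpy as np
    from skimage.feature import match_template

    rng = np.random.default_rng(2)
    image = rng.normal(size=(128, 128))

    # Plant a Gaussian blob standing in for a nucleus, then search for it.
    yy, xx = np.mgrid[-7:8, -7:8]
    blob = np.exp(-(xx**2 + yy**2) / 18.0)
    image[40:55, 60:75] += 3 * blob

    # match_template slides the template over the image and returns the
    # normalized cross correlation surface; its peak locates the match.
    ncc = match_template(image, blob, pad_input=True)
    peak = np.unravel_index(np.argmax(ncc), ncc.shape)
    print("best match at (row, col):", peak, "NCC =", round(float(ncc[peak]), 3))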
Lucas, Patricia J; Baird, Janis; Arai, Lisa; Law, Catherine; Roberts, Helen M
2007-01-01
Background The inclusion of qualitative studies in systematic reviews poses methodological challenges. This paper presents worked examples of two methods of data synthesis (textual narrative and thematic), used in relation to one review, with the aim of enabling researchers to consider the strength of different approaches. Methods A systematic review of lay perspectives of infant size and growth was conducted, locating 19 studies (including both qualitative and quantitative). The data extracted from these were synthesised using both a textual narrative and a thematic synthesis. Results The processes of both methods are presented, showing a stepwise progression to the final synthesis. Both methods led us to similar conclusions about lay views toward infant size and growth. Differences between methods lie in the way they dealt with study quality and heterogeneity. Conclusion On the basis of the work reported here, we consider textual narrative and thematic synthesis have strengths and weaknesses in relation to different research questions. Thematic synthesis holds most potential for hypothesis generation, but may obscure heterogeneity and quality appraisal. Textual narrative synthesis is better able to describe the scope of existing research and account for the strength of evidence, but is less good at identifying commonality. PMID:17224044
Mapping of epistatic quantitative trait loci in four-way crosses.
He, Xiao-Hong; Qin, Hongde; Hu, Zhongli; Zhang, Tianzhen; Zhang, Yuan-Ming
2011-01-01
Four-way crosses (4WC) involving four different inbred lines often appear in plant and animal commercial breeding programs. Direct mapping of quantitative trait loci (QTL) in these commercial populations is both economical and practical. However, the existing statistical methods for mapping QTL in a 4WC population are built on the single-QTL genetic model. This simple genetic model fails to take into account QTL interactions, which play an important role in the genetic architecture of complex traits. In this paper, therefore, we attempted to develop a statistical method to detect epistatic QTL in 4WC populations. Conditional probabilities of QTL genotypes, computed by the multi-point single locus method, were used to sample the genotypes of all putative QTL in the entire genome. The sampled genotypes were used to construct the design matrix for QTL effects. All QTL effects, including main and epistatic effects, were simultaneously estimated by the penalized maximum likelihood method. The proposed method was confirmed by a series of Monte Carlo simulation studies and by real data analysis of cotton. The new method will provide novel tools for the genetic dissection of complex traits, construction of QTL networks, and analysis of heterosis.
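To convey the idea of simultaneously estimating many main and epistatic effects under shrinkage, here is an editorial sketch using a simple L2 (ridge) penalty on simulated genotypes; the paper's penalized maximum likelihood with conditional QTL genotype probabilities is more elaborate:

    import numpy as np

    rng = np.random.default_rng(3)
    n, m = 200, 8                                   # individuals, loci
    geno = rng.integers(0, 2, size=(n, m)).astype(float)

    # Design matrix: main effects plus all pairwise (epistatic) products.
    pairs = [(i, j) for i in range(m) for j in range(i + 1, m)]
    X = np.hstack([geno] + [geno[:, [i]] * geno[:, [j]] for i, j in pairs])

    beta = np.zeros(X.shape[1])
    beta[1] = 1.0                                   # one main effect
    beta[m + 3] = 0.8                               # one epistatic effect
    y = X @ beta + rng.normal(scale=0.5, size=n)

    # Ridge penalty keeps the many correlated effect estimates stable.
    lam = 1.0
    beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    top = np.argsort(-np.abs(beta_hat))[:3]
    print("largest estimated effects at columns:", top, beta_hat[top].round(2))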
Ostrinskaya, Alla; Kunz, Roderick R; Clark, Michelle; Kingsborough, Richard P; Ong, Ta-Hsuan; Deneault, Sandra
2018-05-24
A flow-injection analysis tandem mass spectrometry (FIA MSMS) method was developed for rapid quantitative analysis of 10 different inorganic and organic explosives. Performance is optimized by tailoring the ionization method (APCI/ESI), de-clustering potentials, and collision energies for each specific analyte. In doing so, a single instrument can be used to detect urea nitrate, potassium chlorate, 2,4,6-trinitrotoluene, 2,4,6-trinitrophenylmethylnitramine, triacetone triperoxide, hexamethylene triperoxide diamine, pentaerythritol tetranitrate, 1,3,5-trinitroperhydro-1,3,5-triazine, nitroglycerin, and octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine with sensitivities all in the picogram per milliliter range. In conclusion, FIA APCI/ESI MSMS is a fast (<1 min/sample), sensitive (~pg/mL LOQ), and precise (intraday RSD < 10%) method for trace explosive detection that can play an important role in criminal and attributional forensics, counterterrorism, and environmental protection areas, and has the potential to augment or replace several of the existing explosive detection methods. © 2018 American Academy of Forensic Sciences.
Probability Density Functions of Observed Rainfall in Montana
NASA Technical Reports Server (NTRS)
Larsen, Scott D.; Johnson, L. Ronald; Smith, Paul L.
1995-01-01
The question of whether a rain rate probability density function (PDF) can vary uniformly between precipitation events is examined. Image analysis on large samples of radar echoes is possible because of advances in technology. The data provided by such an analysis easily allow development of radar reflectivity factor (and, by extension, rain rate) distributions. Finding a PDF becomes a matter of finding a function that describes the curve approximating the resulting distributions. Ideally, one PDF would exist for all cases, or many PDFs would share the same functional form with only systematic variations in parameters (such as size or shape). Satisfying either of these cases would validate the theoretical basis of the Area Time Integral (ATI). Using the method of moments and Elderton's curve selection criteria, the Pearson Type 1 equation was identified as a potential fit for 89% of the observed distributions. Further analysis indicates that the Type 1 curve approximates the shape of the distributions but quantitatively does not produce a great fit.
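The Pearson Type 1 curve is a (generalized) beta distribution, and a method-of-moments fit is straightforward; the sketch below uses simulated rain rates and is an editorial illustration, not the study's analysis:

    import numpy as np
    from scipy import stats

    # Simulated bounded "rain rates" standing in for radar-derived data.
    rates = stats.beta(a=1.5, b=6.0, scale=50.0).rvs(5000, random_state=4)

    x = rates / rates.max()                  # rescale support to [0, 1]
    m, v = x.mean(), x.var()
    common = m * (1 - m) / v - 1             # method-of-moments factor
    a_hat, b_hat = m * common, (1 - m) * common
    print(f"fitted Pearson Type 1 (beta): alpha = {a_hat:.2f}, beta = {b_hat:.2f}")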
Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong
2013-01-01
Land-use planning has triggered debates on social and environmental values, in which two key questions arise: one is how to see different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning and to find a way to close the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presents a theoretical framework for land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, with freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, emphasizing quantitative process, effectively complemented the SA method, emphasizing qualitative process; this realized an organic combination of qualitative and quantitative land-use planning work and provided a new approach for land-use planning and the sustainable management of land resources.
Tataw, David Besong; Ekúndayò, Olúgbémiga T
2017-01-01
This article reports on the use of a sequential and integrated mixed-methods approach in a focused population and small-area analysis. The study framework integrates focus groups, survey research, and community engagement strategies in a search for evidence related to prostate cancer screening services utilization as a component of cancer prevention planning in a marginalized African American community in the United States. Research and data analysis methods are synthesized by aggregation, configuration, and interpretive analysis. The results of synthesis show that qualitative and quantitative data validate and complement each other in advancing our knowledge of population characteristics, variable associations, the complex context in which variables exist, and the best options for prevention and service planning. Synthesis of findings and interpretive analysis provided two important explanations which seemed inexplicable in regression outputs: (a) Focus group data on the limitations of the church as an educational source explain the negative association between preferred educational channels and screening behavior found in quantitative analysis. (b) Focus group data on unwelcoming provider environments explain the inconsistent relationship between knowledge of local sites and screening services utilization found in quantitative analysis. The findings suggest that planners, evaluators, and scientists should grow their planning and evaluation evidence from the community they serve.
Mathematical modelling and quantitative methods.
Edler, L; Poirier, K; Dourson, M; Kleiner, J; Mileson, B; Nordmann, H; Renwick, A; Slob, W; Walton, K; Würtzen, G
2002-01-01
The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.
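As a concrete example of the dose-response modelling the review advocates (an editorial sketch with illustrative numbers, not drawn from any regulatory dataset), a four-parameter log-logistic curve can be fitted by nonlinear least squares:

    import numpy as np
    from scipy.optimize import curve_fit

    def loglogistic(dose, bottom, top, ec50, hill):
        # Four-parameter log-logistic dose-response curve.
        return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** hill)

    dose = np.array([0.1, 0.3, 1, 3, 10, 30, 100.0])
    resp = np.array([3, 5, 12, 35, 70, 88, 95.0])   # % response, illustrative

    params, cov = curve_fit(loglogistic, dose, resp, p0=[0, 100, 5, 1])
    se = np.sqrt(np.diag(cov))
    for name, v, s in zip(["bottom", "top", "EC50", "hill"], params, se):
        print(f"{name:6s} = {v:7.2f} +/- {s:.2f}")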
Myocardium tracking via matching distributions.
Ben Ayed, Ismail; Li, Shuo; Ross, Ian; Islam, Ali
2009-01-01
The goal of this study is to investigate automatic myocardium tracking in cardiac Magnetic Resonance (MR) sequences using global distribution matching via level-set curve evolution. Rather than relying on pixelwise information as in existing approaches, distribution matching compares intensity distributions, and consequently, is well suited to the myocardium tracking problem. Starting from a manual segmentation of the first frame, two curves are evolved in order to recover the endocardium (inner myocardium boundary) and the epicardium (outer myocardium boundary) in all the frames. For each curve, the evolution equation is sought following the maximization of a functional containing two terms: (1) a distribution matching term measuring the similarity between the non-parametric intensity distributions sampled from inside and outside the curve and the model distributions of the corresponding regions estimated from the previous frame; (2) a gradient term for smoothing the curve and biasing it toward high intensity gradients. The Bhattacharyya coefficient is used as a similarity measure between distributions. The functional maximization is obtained by the Euler-Lagrange ascent equation of curve evolution, and efficiently implemented via level-sets. The performance of the proposed distribution matching was quantitatively evaluated by comparisons with independent manual segmentations approved by an experienced cardiologist. The method was applied to ten 2D mid-cavity MR sequences corresponding to ten different subjects. Although neither shape prior knowledge nor curve coupling were used, quantitative evaluation demonstrated that the results were consistent with manual segmentations. The proposed method compares well with existing methods. The algorithm also yields satisfying reproducibility. Distribution matching leads to myocardium tracking that is more flexible and applicable than existing methods because the algorithm uses only the current data, i.e., does not require training, and consequently the solution is not bound to shape/intensity prior information learned from a finite training set.
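The Bhattacharyya coefficient at the heart of the matching term is simple to compute; a minimal editorial sketch on synthetic intensity histograms:

    import numpy as np

    def bhattacharyya(p, q):
        # Bhattacharyya coefficient between two discrete distributions;
        # 1.0 means identical, values near 0 mean little overlap.
        p = p / p.sum()
        q = q / q.sum()
        return float(np.sum(np.sqrt(p * q)))

    rng = np.random.default_rng(5)
    inside = np.histogram(rng.normal(100, 15, 5000), bins=64, range=(0, 255))[0]
    model = np.histogram(rng.normal(105, 15, 5000), bins=64, range=(0, 255))[0]
    print("BC =", round(bhattacharyya(inside.astype(float), model.astype(float)), 4))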
The Water-Energy-Food Nexus: A systematic review of methods for nexus assessment
NASA Astrophysics Data System (ADS)
Albrecht, Tamee R.; Crootof, Arica; Scott, Christopher A.
2018-04-01
The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex resource and development challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, while the WEF nexus offers a promising conceptual approach, the use of WEF nexus methods to systematically evaluate water, energy, and food interlinkages or support development of socially and politically-relevant resource policies has been limited. This paper reviews WEF nexus methods to provide a knowledge base of existing approaches and promote further development of analytical methods that align with nexus thinking. The systematic review of 245 journal articles and book chapters reveals that (a) use of specific and reproducible methods for nexus assessment is uncommon (less than one-third); (b) nexus methods frequently fall short of capturing interactions among water, energy, and food—the very linkages they conceptually purport to address; (c) assessments strongly favor quantitative approaches (nearly three-quarters); (d) use of social science methods is limited (approximately one-quarter); and (e) many nexus methods are confined to disciplinary silos—only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. To help overcome these limitations, we derive four key features of nexus analytical tools and methods—innovation, context, collaboration, and implementation—from the literature that reflect WEF nexus thinking. By evaluating existing nexus analytical approaches based on these features, we highlight 18 studies that demonstrate promising advances to guide future research. This paper finds that to address complex resource and development challenges, mixed-methods and transdisciplinary approaches are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and decision-makers.
A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.
Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao
2015-06-15
ChIP-seq is a powerful technology to measure protein binding or histone modification strength on the whole-genome scale. Although a number of methods are available for single ChIP-seq data analysis (e.g. 'peak detection'), rigorous statistical methods for quantitative comparison of multiple ChIP-seq datasets that account for data from control experiments, signal-to-noise ratios, biological variation and multiple-factor experimental designs are under-developed. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then union them to form a single set of candidate regions. The read counts from the IP experiment at the candidate regions are assumed to follow a Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through a hypothesis testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results compared with existing ones. An R software package ChIPComp is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
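A stripped-down version of the count-modelling idea (an editorial sketch with simulated counts, not the ChIPComp implementation): Poisson regression with a sequencing-depth offset tests for differential binding at one candidate region:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    # IP read counts at one candidate region, 3 replicates per condition.
    counts = np.concatenate([rng.poisson(40, 3), rng.poisson(80, 3)])
    condition = np.array([0, 0, 0, 1, 1, 1])
    depth = np.array([1.0, 1.1, 0.9, 1.0, 1.2, 0.95]) * 1e6  # library sizes

    # Poisson regression with a log-depth offset; the slope on 'condition'
    # estimates the log fold-change in binding strength.
    X = sm.add_constant(condition)
    res = sm.GLM(counts, X, family=sm.families.Poisson(),
                 offset=np.log(depth)).fit()
    print("log fold-change:", round(float(res.params[1]), 3),
          "p =", round(float(res.pvalues[1]), 4))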
Harris, Kamelia; Band, Rebecca J; Cooper, Hazel; Macintyre, Vanessa G; Mejia, Anilena; Wearden, Alison J
2016-11-01
The objective of this study was to systematically review existing empirical research assessing levels and correlates of distress in significant others of patients with chronic fatigue syndrome/myalgic encephalomyelitis (CFS/ME). Systematic searches in CINAHL, Web of Science and PsycINFO were conducted in August 2014. The search was repeated in January 2015 to check for newly published articles. Studies published in English with quantitative, qualitative, or mixed designs exploring distress, poor subjective health, poor mental health, reduced quality of life and well-being, and symptoms of depression and anxiety in significant others (>18 years) of children and adults with CFS/ME were included. Quality appraisal of included studies was carried out. Quantitative and qualitative studies were summarized separately. Six articles met eligibility criteria. Two quantitative studies with significant others of adult patients, and one quantitative and two mixed-methods studies with significant others of child patients showed moderate to high levels of distress. One qualitative study (adult patients) found minimal evidence of distress and that acceptance of CFS/ME was related to better adjustment. In the quantitative and mixed-methods studies, significant others who attributed some level of responsibility for symptoms to the patient, or who were female, or whose partners had poorer mental health, had higher levels of distress. The small number of studies to date, the contrary evidence from a qualitative study, and the limited data available on levels of distress in significant others of patients with CFS/ME mean that our conclusion that distress levels are elevated is provisional. We recommend that future qualitative studies focus on this particular topic. Further longitudinal studies exploring correlates of distress within the context of a predictive theoretical model would be helpful. Statement of contribution What is already known on this subject? Chronic fatigue syndrome (CFS/ME) entails considerable economic, social, and personal costs. Uncertainties exist around diagnosis and management. This may lead to particular difficulties for significant others trying to support patients. What does this study add? Few studies have examined distress and its correlates in significant others of people with CFS/ME. Significant others report elevated levels of distress on quantitative measures. © 2016 The British Psychological Society.
ERIC Educational Resources Information Center
Slayter, Elspeth M.
2017-01-01
Existing research suggests a majority of faculty include social justice content in research courses but not through the use of existing quantitative data for in-class activities that foster mastery of data analysis and interpretation and curiosity about social justice-related topics. By modeling data-driven dialogue and the deconstruction of…
Storlazzi, Curt; Dartnell, Peter; Hatcher, Gerry; Gibbs, Ann E.
2016-01-01
The rugosity or complexity of the seafloor has been shown to be an important ecological parameter for fish, algae, and corals. Historically, rugosity has been measured either using simple and subjective manual methods such as ‘chain-and-tape’ or complicated and expensive geophysical methods. Here, we demonstrate the application of structure-from-motion (SfM) photogrammetry to generate high-resolution, three-dimensional bathymetric models of a fringing reef from existing underwater video collected to characterize the seafloor. SfM techniques are capable of achieving spatial resolution that can be orders of magnitude greater than large-scale lidar and sonar mapping of coral reef ecosystems. The resulting data provide finer-scale measurements of bathymetry and rugosity that are more applicable to ecological studies of coral reefs than provided by the more expensive and time-consuming geophysical methods. Utilizing SfM techniques for characterizing the benthic habitat proved to be more effective and quantitatively powerful than conventional methods and thus might portend the end of the ‘chain-and-tape’ method for measuring benthic complexity.
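Once SfM yields a gridded bathymetric model, rugosity can be computed directly as the ratio of 3-D surface area to planar area; a minimal editorial sketch on a synthetic surface (cell size and roughness values are illustrative):

    import numpy as np

    def rugosity(dem, cell):
        # Surface-area rugosity: 3-D surface area / planar area of a grid.
        dzdy, dzdx = np.gradient(dem, cell)
        area3d = np.sum(np.sqrt(1.0 + dzdx**2 + dzdy**2)) * cell**2
        area2d = dem.size * cell**2
        return area3d / area2d

    rng = np.random.default_rng(7)
    flat = np.zeros((200, 200))
    rough = 0.05 * rng.normal(size=(200, 200))   # synthetic rough surface
    print("flat :", rugosity(flat, cell=0.01))   # exactly 1.0
    print("rough:", round(rugosity(rough, cell=0.01), 3))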
Leaner and greener analysis of cannabinoids.
Mudge, Elizabeth M; Murch, Susan J; Brown, Paula N
2017-05-01
There is an explosion in the number of labs analyzing cannabinoids in marijuana (Cannabis sativa L., Cannabaceae) but existing methods are inefficient, require expert analysts, and use large volumes of potentially environmentally damaging solvents. The objective of this work was to develop and validate an accurate method for analyzing cannabinoids in cannabis raw materials and finished products that is more efficient and uses fewer toxic solvents. An HPLC-DAD method was developed for eight cannabinoids in cannabis flowers and oils using a statistically guided optimization plan based on the principles of green chemistry. A single-laboratory validation determined the linearity, selectivity, accuracy, repeatability, intermediate precision, limit of detection, and limit of quantitation of the method. Amounts of individual cannabinoids above the limit of quantitation in the flowers ranged from 0.02 to 14.9% w/w, with repeatability ranging from 0.78 to 10.08% relative standard deviation. The intermediate precision determined using HorRat ratios ranged from 0.3 to 2.0. The LOQs for individual cannabinoids in flowers ranged from 0.02 to 0.17% w/w. This is a significant improvement over previous methods and is suitable for a wide range of applications including regulatory compliance, clinical studies, direct patient medical services, and commercial suppliers.
NASA Astrophysics Data System (ADS)
Lim, Hongki; Fessler, Jeffrey A.; Wilderman, Scott J.; Brooks, Allen F.; Dewaraja, Yuni K.
2018-06-01
While the yield of positrons used in Y-90 PET is independent of tissue media, Y-90 SPECT imaging is complicated by the tissue dependence of bremsstrahlung photon generation. The probability of bremsstrahlung production is proportional to the square of the atomic number of the medium. Hence, the same amount of activity in different tissue regions of the body will produce different numbers of bremsstrahlung photons. Existing reconstruction methods disregard this tissue-dependency, potentially impacting both qualitative and quantitative imaging of heterogeneous regions of the body such as bone with marrow cavities. In this proof-of-concept study, we propose a new maximum-likelihood method that incorporates bremsstrahlung generation probabilities into the system matrix, enabling images of the desired Y-90 distribution to be reconstructed instead of the ‘bremsstrahlung distribution’ that is obtained with existing methods. The tissue-dependent probabilities are generated by Monte Carlo simulation while bone volume fractions for each SPECT voxel are obtained from co-registered CT. First, we demonstrate the tissue dependency in a SPECT/CT imaging experiment with Y-90 in bone equivalent solution and water. Visually, the proposed reconstruction approach better matched the true image and the Y-90 PET image than the standard bremsstrahlung reconstruction approach. An XCAT phantom simulation including bone and marrow regions also demonstrated better agreement with the true image using the proposed reconstruction method. Quantitatively, compared with the standard reconstruction, the new method improved estimation of the liquid bone:water activity concentration ratio by 40% in the SPECT measurement and the cortical bone:marrow activity concentration ratio by 58% in the XCAT simulation.
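The essence of the proposed approach, folding voxel-wise bremsstrahlung yields into the system matrix of an MLEM reconstruction, can be sketched in a toy 1-D example (an editorial illustration with made-up numbers, not the authors' Monte Carlo-derived probabilities):

    import numpy as np

    rng = np.random.default_rng(8)
    nvox, ndet = 32, 48
    A_geom = rng.uniform(0, 1, (ndet, nvox))     # toy geometric system matrix

    # Per-voxel bremsstrahlung yield: higher where "bone" (larger Z) sits.
    yield_w = np.ones(nvox)
    yield_w[10:16] = 2.5
    A = A_geom * yield_w                         # fold the yield into the model

    x_true = np.zeros(nvox)
    x_true[8:20] = 1.0                           # uniform activity, bone + soft tissue
    y = rng.poisson(A @ x_true * 50) / 50.0      # noisy projection data

    # Standard MLEM iterations; because the yield is inside A, the image
    # converges toward activity rather than the bremsstrahlung distribution.
    x = np.ones(nvox)
    sens = A.T @ np.ones(ndet)
    for _ in range(200):
        x *= (A.T @ (y / np.maximum(A @ x, 1e-12))) / sens
    print("bone/soft activity ratio:",
          round(float(x[10:16].mean() / x[16:20].mean()), 2))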
Impact of HIV type 1 subtype variation on viral RNA quantitation.
Parekh, B; Phillips, S; Granade, T C; Baggs, J; Hu, D J; Respess, R
1999-01-20
We evaluated the performance of three HIV-1 RNA quantitation methods (Amplicor HIV-1 MONITOR-1.0, NASBA, and Quantiplex HIV RNA 2.0 [branched DNA (bDNA)]) using plasma specimens (N = 60) from individuals from Asia and Africa infected with one of three HIV-1 subtypes (A, Thai B [B'] or E; N = 20 each). Our results demonstrate that of the 20 subtype A specimens, 19 were quantifiable by the bDNA assay compared with 15 by the MONITOR-1.0 and 13 by NASBA. Of those quantifiable, the mean log10 difference was 0.93 between bDNA and MONITOR-1.0 and 0.46 between bDNA and NASBA. For subtype B' specimens, the correlation among methods was better with only 2 specimens missed by NASBA and 3 by the bDNA assay. However the missed specimens had viral burden near the lower limit (1000 copies/ml) for these assays. For the 20 subtype E specimens, MONITOR-1.0 and NASBA quantified RNA in 17 and 14 specimens, respectively, as compared with 19 specimens quantified by the bDNA assay. The correlation among different assays, especially between bDNA/NASBA and MONITOR-1.0/NASBA, was poor, although the mean log10 difference for subtype E specimens was 0.4 between bDNA and MONITOR-1.0 and only 0.08 between bDNA and NASBA. The addition of a new primer set, designed for non-B HIV-1 subtypes, to the existing MONITOR assay (MONITOR-1.0+) resulted in RNA detection in all 60 specimens and significantly improved the efficiency of quantitation for subtypes A and E. Our data indicate that HIV-1 subtype variation can have a major influence on viral load quantitation by different methods. Periodic evaluation and modification of these quantitative methods may be necessary to ensure reliable quantification of divergent viruses.
The FY 1980 Department of Defense Program for Research, Development, and Acquisition
1979-02-01
materiel. Up to a point, superior performance is an offset to this quantitative disadvantage. Lanchester's theory of warfare derived simplified relations...intermediate ranges. Underground Test. The next scheduled underground test (UGT), MINERS IRON, in FY 1980, will provide engineering and design data on...methods of discriminating between UGTs and earthquakes, and address U.S. capabilities to monitor both the existing Threshold Test Ban Treaty and the
Zhu, Zhi; Zhang, Wenhua; Leng, Xuefei; Zhang, Mingxia; Guan, Zhichao; Lu, Jiangquan; Yang, Chaoyong James
2012-10-21
Genetic alterations can serve as highly specific biomarkers to distinguish fatal bacteria or cancer cells from their normal counterparts. However, these mutations normally exist in very small amounts in the presence of a large excess of non-mutated analogs. Taking the notorious pathogen E. coli O157:H7 as the target analyte, we have developed an agarose droplet-based microfluidic ePCR method for highly sensitive, specific and quantitative detection of rare pathogens against a high background of normal bacteria. Massively parallel singleplex and multiplex PCR at the single-cell level in agarose droplets have been successfully established. Moreover, we challenged the system with rare pathogen detection and realized sensitive and quantitative analysis of a single E. coli O157:H7 cell in a background of 100,000-fold excess normal K12 cells. For the first time, we demonstrated rare pathogen detection through agarose droplet microfluidic ePCR. Such a multiplex single-cell agarose droplet amplification method enables ultra-high throughput and multi-parameter genetic analysis of large populations of cells at the single-cell level to uncover the stochastic variations in biological systems.
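Random encapsulation of cells into droplets follows Poisson loading statistics, which determine how rare multiplets are at a given cell concentration; a short editorial sketch (the loading density is illustrative, not the paper's condition):

    from scipy.stats import poisson

    # lam is the average number of cells per droplet (assumed value).
    lam = 0.1
    print("empty droplets :", round(poisson.pmf(0, lam), 4))
    print("single cell    :", round(poisson.pmf(1, lam), 4))
    print("multiplets     :", round(1 - poisson.cdf(1, lam), 4))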
Manlove, Kezia; Stanley, Laura; Peck, Alyssa
2015-10-01
Quantitative evaluation of vehicle occupant protection programs is critical for ensuring efficient government resource allocation, but few methods exist for conducting evaluation across multiple programs simultaneously. Here we present an analysis of occupant protection efficacy in the state of Montana. This approach relies on seat belt compliance rates as measured by the National Occupant Protection Usage Survey (NOPUS). A hierarchical logistic regression model is used to estimate the impacts of four Montana Department of Transportation (MDT)-funded occupant protection programs used in the state of Montana, following adjustment for a suite of potential confounders. Activity from two programs, Buckle Up coalitions and media campaigns, are associated with increased seat belt use in Montana, whereas the impact of another program, Selective Traffic Enforcement, is potentially masked by other program activity. A final program, Driver's Education, is not associated with any shift in seat belt use. This method allows for a preliminary quantitative estimation of program impacts without requiring states to obtain any new seat belt use data. This approach provides states a preliminary look at program impacts, and a means for carefully planning future program allocation and investigation. Copyright © 2015 Elsevier Ltd. All rights reserved.
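A simplified, non-hierarchical version of the modelling idea (an editorial sketch on simulated data; the study's model additionally includes random effects and survey-design adjustments):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(9)
    n = 4000
    df = pd.DataFrame({
        "media": rng.integers(0, 2, n),        # media campaign active
        "coalition": rng.integers(0, 2, n),    # Buckle Up coalition active
        "rural": rng.integers(0, 2, n),        # confounder
    })
    logit_p = -0.2 + 0.35 * df.media + 0.25 * df.coalition - 0.4 * df.rural
    df["belted"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    fit = smf.logit("belted ~ media + coalition + rural", data=df).fit(disp=0)
    print(np.exp(fit.params).round(2))         # odds ratios per program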
A Novel Method for Tracking Individuals of Fruit Fly Swarms Flying in a Laboratory Flight Arena
Cheng, Xi En; Qian, Zhi-Ming; Wang, Shuo Hong; Jiang, Nan; Guo, Aike; Chen, Yan Qiu
2015-01-01
The growing interest in studying social behaviours of swarming fruit flies, Drosophila melanogaster, has heightened the need for developing tools that provide quantitative motion data. To achieve such a goal, multi-camera three-dimensional tracking technology is the key experimental gateway. We have developed a novel tracking system for tracking hundreds of fruit flies flying in a confined cubic flight arena. In addition to the proposed tracking algorithm, this work offers additional contributions in three aspects: body detection, orientation estimation, and data validation. To demonstrate the opportunities that the proposed system offers for generating high-throughput quantitative motion data, we conducted experiments on five experimental configurations. We also performed quantitative analysis on the kinematics and the spatial structure and the motion patterns of fruit fly swarms. We found that there exists an asymptotic distance between fruit flies in swarms as the population density increases. Further, we discovered the evidence for repulsive response when the distance between fruit flies approached the asymptotic distance. Overall, the proposed tracking system presents a powerful method for studying flight behaviours of fruit flies in a three-dimensional environment. PMID:26083385
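The nearest-neighbour distance analysis underlying the asymptotic-distance finding can be approximated with a k-d tree; an editorial sketch on random synthetic positions, not the tracked data:

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(12)
    for n in (50, 200, 800):                    # increasing swarm density
        pts = rng.uniform(0, 0.3, size=(n, 3))  # positions in a 0.3 m cube
        d, _ = cKDTree(pts).query(pts, k=2)     # k=2: self plus nearest neighbour
        print(f"n = {n:4d}  mean NN distance = {d[:, 1].mean() * 100:.1f} cm")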
Charpentier, R.R.; Gautier, D.L.
2011-01-01
The USGS has assessed undiscovered petroleum resources in the Arctic through geological mapping, basin analysis and quantitative assessment. The new map compilation provided the base from which geologists subdivided the Arctic for burial history modelling and quantitative assessment. The CARA was a probabilistic, geologically based study that used existing USGS methodology, modified somewhat for the circumstances of the Arctic. The assessment relied heavily on analogue modelling, with numerical input as lognormal distributions of sizes and numbers of undiscovered accumulations. Probabilistic results for individual assessment units were statistically aggregated taking geological dependencies into account. Fourteen papers in this Geological Society volume present summaries of various aspects of the CARA. © 2011 The Geological Society of London.
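The probabilistic aggregation of lognormal size and number distributions can be mimicked by simple Monte Carlo simulation; an editorial sketch with illustrative parameter values, not CARA inputs:

    import numpy as np

    rng = np.random.default_rng(10)
    ntrial = 10_000

    def simulate_unit(mu_n, sigma_n, mu_s, sigma_s):
        # Number of undiscovered accumulations and their sizes are both
        # drawn from lognormal distributions, then summed per trial.
        totals = np.empty(ntrial)
        for t in range(ntrial):
            n = int(rng.lognormal(mu_n, sigma_n))
            totals[t] = rng.lognormal(mu_s, sigma_s, n).sum() if n else 0.0
        return totals

    unit = simulate_unit(mu_n=1.5, sigma_n=0.6, mu_s=4.0, sigma_s=1.0)
    for f in (95, 50, 5):        # F95 = amount with a 95% chance of at least
        print(f"F{f}: {np.percentile(unit, 100 - f):,.0f}")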
Methodology for determining the investment attractiveness of construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana
2018-03-01
The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment for a particular project in the evaluation of the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of construction of high-rise facilities that take into account the features of investment in construction and enable quantitative evaluation of investment effectiveness in high-rise construction.
Creating objects and object categories for studying perception and perceptual learning.
Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay
2012-11-02
In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties. Many innovative and useful methods currently exist for creating novel objects and object categories (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection. Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper. We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have. Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.
Comparing acoustic measurement data in eastern margin of Sea of Japan and Umitaka Spur
NASA Astrophysics Data System (ADS)
Kondo, M., IV; Aoyama, C.
2017-12-01
Recently, methane hydrate has gained remarkable attention for its abundance in the seas around Japan and for its low carbon dioxide emissions when combusted. Two types of methane hydrate are known to exist, depending on the setting. Sand-layer-type methane hydrate contains sand particles and is extracted mainly in the Pacific Ocean, around 100 to 400 m below the seafloor. Shallow-type methane hydrate is found around 100 m below the seafloor in the Sea of Japan, in clumps, veins, and particles. One exploration method for the shallow-type methane hydrate observed in the Sea of Japan is a seabed exploration system using acoustic equipment, for which Chiharu Aoyama holds a patent. Methane plumes often exist in the vicinity of shallow-type methane hydrate, and these seeping plumes are visualized as images on acoustic instruments such as the quantitative echo sounder and the multi-beam echo sounder (hereinafter referred to as MBES). The images look like rising beams on the monitors and are called gas plumes. Methane hydrate can be explored efficiently in an area using this method, but it is understood that even when the same plume is examined, the backscattering strength data (hereinafter referred to as SV) observed with the quantitative echo sounder and the MBES will not be the same. In June 2017, acoustic data were measured using a multi-beam sonar (EM122) and a quantitative echo sounder (EA600) onboard the DAIICHI KAIYOMARU (KAIYO ENGINEERING CO., LTD) at the eastern margin of the Sea of Japan and at Umitaka Spur. In this study, the authors compare these acoustic data.
Ultrasonics Equipped Crimp Tool: A New Technology for Aircraft Wiring Safety
NASA Technical Reports Server (NTRS)
Yost, William T.; Perey, Daniel F.; Cramer, Elliott
2006-01-01
We report on the development of a new measurement technique to quantitatively assess the condition of wire crimp connections. This ultrasonic (UT) method transmits high frequency sound waves through the joint under inspection. The wire-crimp region filters and scatters the ultrasonic energy as it passes through the crimp and wire. The resulting output (both time and frequency domains) provides a quantitative measure of the joint quality that is independent and unaffected by current. Crimps of poor mechanical and electrical quality will result in low temporal output and will distort the spectrum into unique and predictable patterns, depending on crimp "quality". This inexpensive, real-time measurement system can provide certification of crimps as they are made and recertification of existing wire crimps currently in service. The measurements for re-certification do not require that the wire be disconnected from its circuit. No other technology exists to measure in-situ the condition of wire joints (no electrical currents through the crimp are used in this analytical technique). We discuss the signals obtained from this instrument, and correlate these signals with destructive wire pull tests.
Compound analysis via graph kernels incorporating chirality.
Brown, J B; Urata, Takashi; Tamura, Takeyuki; Arai, Midori A; Kawabata, Takeo; Akutsu, Tatsuya
2010-12-01
High accuracy is paramount when predicting biochemical characteristics using Quantitative Structural-Property Relationships (QSPRs). Although existing graph-theoretic kernel methods combined with machine learning techniques are efficient for QSPR model construction, they cannot distinguish topologically identical chiral compounds which often exhibit different biological characteristics. In this paper, we propose a new method that extends the recently developed tree pattern graph kernel to accommodate stereoisomers. We show that Support Vector Regression (SVR) with a chiral graph kernel is useful for target property prediction by demonstrating its application to a set of human vitamin D receptor ligands currently under consideration for their potential anti-cancer effects.
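Support Vector Regression with any precomputed (positive semi-definite) kernel matrix, such as a graph kernel over compounds, follows the same pattern; an editorial sketch with a synthetic kernel standing in for the chiral tree-pattern kernel:

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(11)
    # Synthetic PSD similarity matrix standing in for a graph-kernel
    # matrix over compounds (K = F F^T is always positive semi-definite).
    F = rng.normal(size=(60, 5))
    K = F @ F.T
    y = 2.0 * F[:, 0] + rng.normal(scale=0.1, size=60)   # target property

    train, test = np.arange(45), np.arange(45, 60)
    svr = SVR(kernel="precomputed").fit(K[np.ix_(train, train)], y[train])
    pred = svr.predict(K[np.ix_(test, train)])           # rows: test, cols: train
    rmse = float(np.sqrt(np.mean((pred - y[test]) ** 2)))
    print("test RMSE:", round(rmse, 3))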
Walker, Jean T; Martin, Tina M; Haynie, Lisa; Norwood, Anne; White, Jill; Grant, LaVerne
2007-01-01
Accelerated baccalaureate nursing programs are in great demand in the United States. Currently there are 197 such programs, but little research has been conducted on student characteristics and program outcomes. This quantitative study explores preferences of second-degree students and traditional generic students with regard to teaching methods and relationships with faculty. The results indicate that statistically significant differences exist between the two groups of students. Three areas of significance are ability for self-directed learning, expectations of faculty and classroom structure, and obtaining a grade that really matters.
A framework for organizing and selecting quantitative approaches for benefit-harm assessment
2012-01-01
Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches. PMID:23163976
Efficient Bayesian mixed model analysis increases association power in large cohorts
Loh, Po-Ru; Tucker, George; Bulik-Sullivan, Brendan K; Vilhjálmsson, Bjarni J; Finucane, Hilary K; Salem, Rany M; Chasman, Daniel I; Ridker, Paul M; Neale, Benjamin M; Berger, Bonnie; Patterson, Nick; Price, Alkes L
2014-01-01
Linear mixed models are a powerful statistical tool for identifying genetic associations and avoiding confounding. However, existing methods are computationally intractable in large cohorts, and may not optimize power. All existing methods require time cost O(MN²) (where N = #samples and M = #SNPs) and implicitly assume an infinitesimal genetic architecture in which effect sizes are normally distributed, which can limit power. Here, we present a far more efficient mixed model association method, BOLT-LMM, which requires only a small number of O(MN)-time iterations and increases power by modeling more realistic, non-infinitesimal genetic architectures via a Bayesian mixture prior on marker effect sizes. We applied BOLT-LMM to nine quantitative traits in 23,294 samples from the Women’s Genome Health Study (WGHS) and observed significant increases in power, consistent with simulations. Theory and simulations show that the boost in power increases with cohort size, making BOLT-LMM appealing for GWAS in large cohorts. PMID:25642633
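For readers unfamiliar with the non-infinitesimal model, a schematic of the Bayesian mixture prior described above, in our own notation (the paper's exact parameterization may differ):

```latex
% Schematic of the non-infinitesimal model (notation ours, not the paper's):
% phenotype y, genotype matrix X over M markers, residual e.
\[
  y = X\beta + e, \qquad
  \beta_m \sim p\,\mathcal{N}(0,\sigma_1^2) + (1-p)\,\mathcal{N}(0,\sigma_2^2),
  \qquad e \sim \mathcal{N}(0,\sigma_e^2 I),
\]
% with \sigma_1^2 \gg \sigma_2^2, so a fraction p of markers carry large
% effects; p = 1 recovers the standard infinitesimal (single-Gaussian) model.
```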
Stability basin estimates fall risk from observed kinematics, demonstrated on the Sit-to-Stand task.
Shia, Victor; Moore, Talia Yuki; Holmes, Patrick; Bajcsy, Ruzena; Vasudevan, Ram
2018-04-27
The ability to quantitatively measure stability is essential to ensuring the safety of locomoting systems. While the response to perturbation directly reflects the stability of a motion, this experimental method puts human subjects at risk. Unfortunately, existing indirect methods for estimating stability from unperturbed motion have been shown to have limited predictive power. This paper leverages recent advances in dynamical systems theory to accurately estimate the stability of human motion without requiring perturbation. This approach relies on kinematic observations of a nominal Sit-to-Stand motion to construct an individual-specific dynamic model, input bounds, and feedback control that are then used to compute the set of perturbations from which the model can recover. This set, referred to as the stability basin, was computed for 14 individuals, and was able to successfully differentiate between less and more stable Sit-to-Stand strategies for each individual with greater accuracy than existing methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue
Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron
2015-01-01
Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
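As an illustration of the two-step point counting, a minimal sketch with made-up counts, assuming step 1 follows the standard Cavalieri-type volume estimator (section spacing times grid area per point times total points); none of the numbers below are study data.

```python
# Step 1: Cavalieri-style volume estimate from systematic parallel sections.
section_spacing_mm = 0.5                 # distance between sampled sections
area_per_point_mm2 = 0.01                # grid area represented by one point
points_on_defect = [120, 98, 110, 85, 132, 90, 105]  # points hitting the defect

volume_mm3 = section_spacing_mm * area_per_point_mm2 * sum(points_on_defect)

# Step 2: tissue composition as point fractions per morphological category.
category_counts = {"hyaline": 300, "fibrocartilage": 220, "fibrous": 150,
                   "bone": 40, "scaffold": 20, "other": 10}
total = sum(category_counts.values())
fractions = {k: v / total for k, v in category_counts.items()}
print(f"defect volume ~ {volume_mm3:.2f} mm^3", fractions)
```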
NASA Astrophysics Data System (ADS)
Singh, Manpreet; Alabanza, Anginelle; Gonzalez, Lorelis E.; Wang, Weiwei; Reeves, W. Brian; Hahm, Jong-In
2016-02-01
Determining ultratrace amounts of protein biomarkers in patient samples in a straightforward and quantitative manner is extremely important for early disease diagnosis and treatment. Here, we successfully demonstrate the novel use of zinc oxide nanorods (ZnO NRs) in the ultrasensitive and quantitative detection of two acute kidney injury (AKI)-related protein biomarkers, tumor necrosis factor (TNF)-α and interleukin (IL)-8, directly from patient samples. We first validate the ZnO NRs-based IL-8 results via comparison with those obtained from using a conventional enzyme-linked immunosorbent method in samples from 38 individuals. We further assess the full detection capability of the ZnO NRs-based technique by quantifying TNF-α, whose levels in human urine are often below the detection limits of conventional methods. Using the ZnO NR platforms, we determine the TNF-α concentrations of all 46 patient samples tested, down to the fg per mL level. Subsequently, we screen for TNF-α levels in approximately 50 additional samples collected from different patient groups in order to demonstrate a potential use of the ZnO NRs-based assay in assessing cytokine levels useful for further clinical monitoring. Our research efforts demonstrate that ZnO NRs can be straightforwardly employed in the rapid, ultrasensitive, quantitative, and simultaneous detection of multiple AKI-related biomarkers directly in patient urine samples, providing an unparalleled detection capability beyond those of conventional analysis methods. Additional key advantages of the ZnO NRs-based approach include a fast detection speed, low-volume assay condition, multiplexing ability, and easy automation/integration capability to existing fluorescence instrumentation. Therefore, we anticipate that our ZnO NRs-based detection method will be highly beneficial for overcoming the frequent challenges in early biomarker development and treatment assessment, pertaining to the facile and ultrasensitive quantification of hard-to-trace biomolecules. Electronic supplementary information (ESI) available: Typical SEM images of the ZnO NRs used in the biomarker assays are provided in Fig. S1. See DOI: 10.1039/c5nr08706f
NASA Astrophysics Data System (ADS)
Beach, Daniel G.
2017-08-01
Paralytic shellfish toxins (PSTs) are neurotoxins produced by dinoflagellates and cyanobacteria that cause paralytic shellfish poisoning in humans. PST quantitation by LC-MS is challenging because of their high polarity, lability as gas-phase ions, and large number of potentially interfering analogues. Differential mobility spectrometry (DMS) has the potential to improve the performance of LC-MS methods for PSTs in terms of selectivity and limits of detection. This work describes a comprehensive investigation of the separation of 16 regulated PSTs by DMS and the development of highly selective LC-DMS-MS methods for PST quantitation. The effects of all DMS parameters on the separation of PSTs from one another were first investigated in detail. The labile nature of 11α-gonyautoxin epimers gave unique insight into fragmentation of labile analytes before, during, and after the DMS analyzer. Two sets of DMS parameters were identified that either optimized the resolution of PSTs from one another or transmitted them at a limited number of compensation voltage (CV) values corresponding to structural subclasses. These were used to develop multidimensional LC-DMS-MS/MS methods using existing HILIC-MS/MS parameters. In both cases, improved selectivity was observed when using DMS, and the quantitative capabilities of a rapid UPLC-DMS-MS/MS method were evaluated. Limits of detection of the developed method were similar to those without DMS, and differences were highly analyte-dependent. Analysis of shellfish matrix reference materials showed good agreement with established methods. The developed methods will be useful in cases where specific matrix interferences are encountered in the LC-MS/MS analysis of PSTs in complex biological samples.
Chen, Lixun; Jiang, Ling; Shen, Aizong; Wei, Wei
2016-09-01
The frequently low quality of submitted spontaneous reports is of increasing concern; to our knowledge, no validated instrument exists for assessing case reports' quality comprehensively enough. This work was conducted to develop such a quality instrument for assessing the spontaneous reports of adverse drug reaction (ADR)/adverse drug event (ADE) in China. Initial evaluation indicators were generated using systematic and literature data analysis. Final indicators and their weights were identified using the Delphi method. The final quality instrument was developed by adopting the synthetic scoring method. A consensus was reached after four rounds of Delphi survey. The developed quality instrument consisted of 6 first-rank indicators, 18 second-rank indicators, and 115 third-rank indicators, and each rank indicator has been weighted. It evaluates the quality of spontaneous reports of ADR/ADE comprehensively and quantitatively on six parameters: authenticity, duplication, regulatory, completeness, vigilance level, and reporting time frame. The developed instrument was tested with good reliability and validity, and can be used to comprehensively and quantitatively assess the submitted spontaneous reports of ADR/ADE in China.
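A minimal sketch of how such a weighted synthetic score could be computed; the weights and per-report indicator scores below are hypothetical placeholders, not the instrument's published values.

```python
# Hypothetical weighted synthetic score over the six first-rank parameters.
weights = {"authenticity": 0.25, "duplication": 0.10, "regulatory": 0.15,
           "completeness": 0.25, "vigilance": 0.15, "timeframe": 0.10}
report_scores = {"authenticity": 1.0, "duplication": 1.0, "regulatory": 0.8,
                 "completeness": 0.6, "vigilance": 0.5, "timeframe": 1.0}

# Overall quality is the weight-averaged indicator score on a 0-1 scale.
quality = sum(weights[k] * report_scores[k] for k in weights)
print(f"overall report quality: {quality:.2f}")
```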
What can acute medicine learn from qualitative methods?
Heasman, Brett; Reader, Tom W
2015-10-01
The contribution of qualitative methods to evidence-based medicine is growing, with qualitative studies increasingly used to examine patient experience and unsafe organizational cultures. The present review considers qualitative research recently conducted on teamwork and organizational culture in the ICU and in other acute domains. Qualitative studies have highlighted the importance of interpersonal and social aspects of healthcare on managing and responding to patient care needs. Clear/consistent communication, compassion, and trust underpin successful patient-physician interactions, with improved patient experiences linked to patient safety and clinical effectiveness across a wide range of measures and outcomes. Across multidisciplinary teams, good communication facilitates shared understanding, decision-making and coordinated action, reducing patient risk in the process. Qualitative methods highlight the complex nature of risk management in hospital wards, which is highly contextualized to the demands and resources available, and influenced by multilayered social contexts. In addition to augmenting quantitative research, qualitative investigations enable the investigation of questions on social behaviour that are beyond the scope of quantitative assessment alone. To develop improved patient-centred care, health professionals should therefore consider integrating qualitative procedures into their existing assessments of patient/staff satisfaction.
Proposed Objective Odor Control Test Methodology for Waste Containment
NASA Technical Reports Server (NTRS)
Vos, Gordon
2010-01-01
The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented human odor detection threshold of 0.025 PPM, and a limit of quantitation of 15 PPB.
Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations
Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.
2013-01-01
Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359
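As one concrete example of the peptide-to-protein rollup problem discussed above, a sketch of the common "top-N" aggregation strategy; this is shown for illustration only, since the paper evaluates several such post-processing methods rather than prescribing this one.

```python
# "Top-N" rollup: estimate protein abundance from the N most intense peptides.
from collections import defaultdict

def protein_abundance(peptide_rows, n=3):
    """peptide_rows: iterable of (protein_id, peptide_seq, ms1_intensity)."""
    by_protein = defaultdict(list)
    for protein, _pep, intensity in peptide_rows:
        by_protein[protein].append(intensity)
    # Average the N most intense peptides per protein (fewer if unavailable).
    return {p: sum(sorted(v, reverse=True)[:n]) / min(n, len(v))
            for p, v in by_protein.items()}

rows = [("P1", "AAK", 1e7), ("P1", "LLR", 4e6), ("P1", "GFK", 9e6),
        ("P1", "TTR", 1e6), ("P2", "MMK", 2e6)]
print(protein_abundance(rows))
```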
Rodenacker, K; Aubele, M; Hutzler, P; Adiga, P S
1997-01-01
In molecular pathology, numerical chromosome aberrations have been found to be decisive for the prognosis of malignancy in tumours. The existence of such aberrations can be detected by interphase fluorescence in situ hybridization (FISH). The gain or loss of certain base sequences in the deoxyribonucleic acid (DNA) can be estimated by counting the number of FISH signals per cell nucleus. The quantitative evaluation of such events is a necessary condition for a prospective use in diagnostic pathology. To avoid occlusions of signals, the cell nucleus has to be analyzed in three dimensions. Confocal laser scanning microscopy is the means to obtain series of optical thin sections from fluorescence stained or marked material to fulfill the conditions mentioned above. A graphical user interface (GUI) to a software package for the display, inspection, counting, and (semi-)automatic analysis of 3-D images by pathologists is outlined, including the underlying methods developed for 3-D image interaction and segmentation. The preparative methods are briefly described. Main emphasis is given to the methodical questions of computer-aided analysis of large 3-D image data sets for pathologists. Several automated analysis steps can be performed for segmentation and succeeding quantification. However, in contrast to isolated or cultured cells, tumour material is difficult even for visual inspection. At present, fully automated digital image analysis of 3-D data is not in sight. A semi-automatic segmentation method is therefore presented here.
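For flavor, a toy sketch of the core quantification task, counting FISH signals in a 3-D stack via thresholding and connected-component labeling; this is an illustration of the general idea, not the outlined software package, and the size cutoff is an assumption.

```python
# Count FISH signals in a 3-D confocal stack (toy illustration).
import numpy as np
from scipy import ndimage

def count_fish_signals(stack, threshold):
    """stack: 3-D array (z, y, x) of fluorescence intensities for one nucleus."""
    mask = stack > threshold                 # binarize signal voxels
    labels, n_signals = ndimage.label(mask)  # 3-D connected components
    sizes = ndimage.sum(mask, labels, range(1, n_signals + 1))
    # Discard tiny components that are likely noise (cutoff assumed, not given).
    return int((sizes >= 5).sum())

stack = np.zeros((8, 32, 32))
stack[2:4, 5:8, 5:8] = 1.0                   # two synthetic spots
stack[6:8, 20:23, 20:23] = 1.0
print(count_fish_signals(stack, 0.5))        # -> 2
```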
Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katherine
2013-01-01
Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
Cloud cover determination in polar regions from satellite imagery
NASA Technical Reports Server (NTRS)
Barry, R. G.; Maslanik, J. A.; Key, J. R.
1987-01-01
The spectral and spatial characteristics of clouds and surface conditions in the polar regions are defined, and calibrated, geometrically correct data sets suitable for quantitative analysis are created. Ways are explored in which this information can be applied to cloud classifications as new methods or as extensions to existing classification schemes. A methodology is developed that uses automated techniques to merge Advanced Very High Resolution Radiometer (AVHRR) and Scanning Multichannel Microwave Radiometer (SMMR) data, and to apply first-order calibration and zenith angle corrections to the AVHRR imagery. Cloud cover and surface types are manually interpreted, and manual methods are used to define relatively pure training areas to describe the textural and multispectral characteristics of clouds over several surface conditions. The effects of viewing angle and bidirectional reflectance differences are studied for several classes, and the effectiveness of some key components of existing classification schemes is tested.
Stakeholder prioritization of zoonoses in Japan with analytic hierarchy process method.
Kadohira, M; Hill, G; Yoshizaki, R; Ota, S; Yoshikawa, Y
2015-05-01
There exists an urgent need to develop iterative risk assessment strategies of zoonotic diseases. The aim of this study is to develop a method of prioritizing 98 zoonoses derived from animal pathogens in Japan and to involve four major groups of stakeholders: researchers, physicians, public health officials, and citizens. We used a combination of risk profiling and analytic hierarchy process (AHP). Profiling risk was accomplished with semi-quantitative analysis of existing public health data. AHP data collection was performed by administering questionnaires to the four stakeholder groups. Results showed that researchers and public health officials focused on case fatality as the chief important factor, while physicians and citizens placed more weight on diagnosis and prevention, respectively. Most of the six top-ranked diseases were similar among all stakeholders. Transmissible spongiform encephalopathy, severe acute respiratory syndrome, and Ebola fever were ranked first, second, and third, respectively.
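A brief sketch of the AHP weighting step referenced above: priority weights are taken from the principal eigenvector of a pairwise-comparison matrix, with Saaty's consistency check. The comparison matrix below is hypothetical, not the study's survey data.

```python
# AHP priority weights from a pairwise-comparison matrix (hypothetical data).
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # e.g. case fatality vs diagnosis vs prevention
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)        # principal eigenvalue lambda_max
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                       # priority weights, summing to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
cr = ci / 0.58                         # random index RI = 0.58 for n = 3
print(w, f"consistency ratio = {cr:.3f}")  # CR < 0.1 is conventionally acceptable
```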
NASA Astrophysics Data System (ADS)
Basten, Van; Latief, Yusuf; Berawi, Mohammed Ali; Budiman, Rachmat; Riswanto
2017-03-01
The total value of completed building construction in Indonesia increased 116% from 2009 to 2011. Energy consumption in Indonesia rose 11% over the last three years, with 70% of that energy going to the electricity needs of commercial buildings. In addition, the limited application of the green building concept in Indonesia has allowed greenhouse gas (CO2) emissions to increase by 25%. The costs of constructing, operating, and maintaining buildings are considered relatively high. The evaluation in this research is used to improve building performance through several green-concept alternatives. The research methodology combines qualitative and quantitative approaches through interviews and a case study. The success of optimization measures in an existing green building is assessed over the operational and maintenance phase using the Life Cycle Assessment (LCA) method. The optimization result is the greatest efficiency and effectiveness over the building life cycle.
Combinational Reasoning of Quantitative Fuzzy Topological Relations for Simple Fuzzy Regions
Liu, Bo; Li, Dajun; Xia, Yuanping; Ruan, Jian; Xu, Lili; Wu, Huanyi
2015-01-01
In recent years, formalization and reasoning of topological relations have become a hot topic as a means to generate knowledge about the relations between spatial objects at the conceptual and geometrical levels. These mechanisms have been widely used in spatial data query, spatial data mining, evaluation of equivalence and similarity in a spatial scene, as well as for consistency assessment of the topological relations of multi-resolution spatial databases. The concept of computational fuzzy topological space is applied to simple fuzzy regions to efficiently and more accurately solve fuzzy topological relations. Thus, extending the existing research and improving upon the previous work, this paper presents a new method to describe fuzzy topological relations between simple spatial regions in Geographic Information Sciences (GIS) and Artificial Intelligence (AI). Firstly, we propose a new definition for simple fuzzy line segments and simple fuzzy regions based on the computational fuzzy topology. Then, based on the new definitions, we propose a new combinational reasoning method to compute the topological relations between simple fuzzy regions. Moreover, this study finds that there are (1) 23 different topological relations between a simple crisp region and a simple fuzzy region and (2) 152 different topological relations between two simple fuzzy regions. Finally, we discuss several examples to demonstrate the validity of the new method; through comparisons with existing fuzzy models, we show that the proposed method is more expressive and can compute more relations than the existing models. PMID:25775452
NASA Astrophysics Data System (ADS)
Cao, Qian; Thawait, Gaurav; Gang, Grace J.; Zbijewski, Wojciech; Reigel, Thomas; Brown, Tyler; Corner, Brian; Demehri, Shadpour; Siewerdsen, Jeffrey H.
2015-02-01
Joint space morphology can be indicative of the risk, presence, progression, and/or treatment response of disease or trauma. We describe a novel methodology of characterizing joint space morphology in high-resolution 3D images (e.g. cone-beam CT (CBCT)) using a model based on elementary electrostatics that overcomes a variety of basic limitations of existing 2D and 3D methods. The method models each surface of a joint as a conductor at fixed electrostatic potential and characterizes the intra-articular space in terms of the electric field lines resulting from the solution of Gauss’ Law and the Laplace equation. As a test case, the method was applied to discrimination of healthy and osteoarthritic subjects (N = 39) in 3D images of the knee acquired on an extremity CBCT system. The method demonstrated improved diagnostic performance (area under the receiver operating characteristic curve, AUC > 0.98) compared to simpler methods of quantitative measurement and qualitative image-based assessment by three expert musculoskeletal radiologists (AUC = 0.87, p-value = 0.007). The method is applicable to simple (e.g. the knee or elbow) or multi-axial joints (e.g. the wrist or ankle) and may provide a useful means of quantitatively assessing a variety of joint pathologies.
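A toy 2-D analogue of the electrostatic model (a parallel-plate stand-in, not the authors' 3-D implementation): the two articular surfaces are held at fixed potentials, Laplace's equation is relaxed in the intra-articular gap, and field lines of the resulting field would then sample the joint space width.

```python
# Jacobi relaxation of Laplace's equation between two "surfaces" at fixed
# potential (1 and 0); geometry is a synthetic stand-in for a joint gap.
import numpy as np

phi = np.zeros((64, 64))
top, bottom = 10, 53                   # row indices of the two conductor surfaces
phi[top, :], phi[bottom, :] = 1.0, 0.0
interior = np.zeros_like(phi, dtype=bool)
interior[top + 1:bottom, :] = True     # only the gap between surfaces is relaxed

for _ in range(2000):                  # Jacobi iterations (periodic in x via roll)
    nbr = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
           np.roll(phi, 1, 1) + np.roll(phi, -1, 1)) / 4.0
    phi = np.where(interior, nbr, phi)

Ey, Ex = np.gradient(-phi)             # field components; tracing streamlines of
print(phi[32, 32])                     # (Ex, Ey) would measure joint-space width
```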
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-01
... emissions, (applicants are encouraged to provide quantitative information regarding expected reductions in...). Applicants are encouraged to provide quantitative information that validates the existence of substantial... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...
Visualization techniques to aid in the analysis of multispectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, E. W.; Domik, Gitta O.; Ayres, T. R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists. Twenty-one examples of the use of visualization for astrophysical data are included with this report. Sixteen publications related to efforts performed during or initiated through work on this project are listed at the end of this report.
Reporting Qualitative Research: Standards, Challenges, and Implications for Health Design.
Peditto, Kathryn
2018-04-01
This Methods column describes the existing reporting standards for qualitative research, their application to health design research, and the challenges to implementation. Intended for both researchers and practitioners, this article provides multiple perspectives on both reporting and evaluating high-quality qualitative research. Two popular reporting standards exist for reporting qualitative research: the Consolidated Criteria for Reporting Qualitative Research (COREQ) and the Standards for Reporting Qualitative Research (SRQR). Though compiled using similar procedures, they differ in their criteria and the methods to which they apply. Creating and applying reporting criteria is inherently difficult due to the undefined and fluctuating nature of qualitative research when compared to quantitative studies. Qualitative research is expansive and occasionally controversial, spanning many different methods of inquiry and epistemological approaches. A "one-size-fits-all" standard for reporting qualitative research can be restrictive, but COREQ and SRQR both serve as valuable tools for developing responsible qualitative research proposals, effectively communicating research decisions, and evaluating submissions. Ultimately, tailoring a set of standards specific to health design research and its frequently used methods would ensure quality research and aid reviewers in their evaluations.
2016-01-01
Family Policy’s SECO program, which reviewed existing SECO metrics and data sources, as well as analytic methods of previous research, to determine ...process that requires an iterative cycle of assessment of collected data (typically, but not solely, quantitative data) to determine whether SECO...RAND suggests five steps to develop and implement the SECO internal monitoring system: Step 1. Describe the logic or theory of how activities are
ERIC Educational Resources Information Center
Saalu, L. C.; Abraham A. A.; Aina, W. O.
2010-01-01
Problem-based learning (PBL) is a method of teaching that uses hypothetical clinical cases, individual investigation and group process. In recent years, in medical education, problem-based learning (PBL) has increasingly been adopted as the preferred pedagogy in many countries around the world. Controversy, however, still exists as the potential…
Comprehensive Understanding of the Zipingpu Reservoir to the Ms8.0 Wenchuan Earthquake
NASA Astrophysics Data System (ADS)
Cheng, H.; Pang, Y. J.; Zhang, H.; Shi, Y.
2014-12-01
After the Wenchuan earthquake occurred, the question of whether this large earthquake was triggered by impoundment of the Zipingpu Reservoir attracted wide attention in the international academic community. In addition to qualitative discussion, many scholars have adopted quantitative methods to calculate the stress changes, but because their results differ, they draw very different conclusions. Here, we take the dispute among different teams over the quantitative calculation for the Zipingpu Reservoir as a starting point. To identify the key factors influencing the quantitative calculation and to understand the sources of uncertainty in the numerical simulations, we analyze the factors that may cause the differences. The preliminary results show that the calculation method (analytical or numerical), the dimension of the model (2-D or 3-D), the diffusion model, the diffusion coefficient, and the focal mechanism are the main factors responsible for the differences, especially the diffusion coefficient of the fractured rock mass. The change in Coulomb failure stress at the epicenter of the Wenchuan earthquake obtained from a 2-D model is about 3 times that of a 3-D model. It is not reasonable to consider only fault permeability (assuming infinite permeability of the rock mass) or only homogeneous, isotropic rock-mass permeability (ignoring fault permeability). Different focal mechanisms can also dramatically affect the computed change in Coulomb failure stress at the epicenter, with differences of 2-7 times, and selecting different diffusion coefficients can change the computed Coulomb failure stress by several hundred times. Given existing research indicating that the magnitude of the Coulomb failure stress change is on the order of several kPa, we cannot rule out the possibility that the Zipingpu Reservoir triggered the 2008 Wenchuan earthquake. However, because the background stress is not well known and the Coulomb failure stress change is small, we also cannot be sure that a connection between the reservoir and the earthquake exists. In future work, we should build on field surveys and laboratory experiments, improve the model, and develop high-performance simulations.
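For reference, the standard form of the Coulomb failure stress change used in such reservoir-triggering calculations (our notation; the abstract itself does not state the formula):

```latex
% Coulomb failure stress change with pore-pressure contribution:
\[
  \Delta \mathrm{CFS} = \Delta\tau + \mu\,(\Delta\sigma_n + \Delta P),
\]
% where \Delta\tau is the shear stress change on the receiver fault,
% \Delta\sigma_n the normal stress change (tension positive), \Delta P the
% pore-pressure change from diffusion, and \mu the friction coefficient.
```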
Accurate sparse-projection image reconstruction via nonlocal TV regularization.
Zhang, Yi; Zhang, Weihua; Zhou, Jiliu
2014-01-01
Sparse-projection image reconstruction is a useful approach to lower the radiation dose; however, the incompleteness of projection data will cause degeneration of imaging quality. As a typical compressive sensing method, total variation has attracted great attention for this problem. Owing to its theoretical limitations, total variation produces blocky effects on smooth regions and blurs edges. To overcome this problem, in this paper, we introduce the nonlocal total variation into sparse-projection image reconstruction and formulate the minimization problem with a new nonlocal total variation norm. The qualitative and quantitative analyses of numerical as well as clinical results demonstrate the validity of the proposed method. Compared with other existing methods, our method more efficiently suppresses artifacts caused by low-rank reconstruction and better preserves structural information.
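For concreteness, a typical form of the nonlocal-TV-regularized reconstruction problem, in our notation (the paper's exact formulation may differ):

```latex
% Nonlocal TV regularized reconstruction (notation ours):
\[
  \min_u \; \sum_{x} \sqrt{\sum_{y \in \Omega(x)} w(x,y)\,\bigl(u(y)-u(x)\bigr)^2}
  \quad \text{s.t.} \quad \|Au - b\|_2^2 \le \varepsilon,
\]
% where A is the projection (system) matrix, b the sparse projection data, and
% w(x,y) nonlocal weights derived from patch similarity, as in nonlocal means.
```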
The state of rehabilitation research: art or science?
Tate, Denise G
2006-02-01
Rehabilitation research has been criticized as not standing up enough to the rigors of scientific method to be called "science." The field has been portrayed as slow to promote its scientific achievements and to include them under the rubric of evidence-based rehabilitation. Following in the footsteps of psychology, rehabilitation as a broad-based discipline has faced many similar obstacles in achieving scientific status. Controversy exists about what exactly constitutes rehabilitation science versus its art and its respective multidisciplinary domains. The conception of these domains is directly related to current methods available to assess the state of the discipline and its research accomplishments. I used quantitative methods, such as randomized clinical and/or controlled trials (RCTs) and systematic reviews, to assess the status of rehabilitation research. Findings suggest that, as a field, rehabilitation makes significant contributions to science, measurable by the number and quality of RCTs and systematic reviews conducted so far on topics of critical importance for clinical care. In "artful" complement, qualitative approaches can be used as research tools to aid investigators in seeking knowledge beyond that obtained by quantitative methods, assessing many complexities associated with the various contexts of rehabilitation research. Other requirements to develop a common vision of rehabilitation science are also discussed.
Analysis of swimming performance: perceptions and practices of US-based swimming coaches.
Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid
2016-01-01
In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.
Geisler, B P; Widerberg, K F; Berghöfer, A; Willich, S N
2010-01-01
This paper aims to identify existing, and develop new, concepts of organization, management, and leadership at a large European university hospital, and to evaluate whether mixed qualitative-quantitative methods with both internal and external input can provide helpful views of the possible future of large health care providers. Using the Delphi method in semi-structured, semi-quantitative interviews, with managers and employees as experts, the authors performed a vertical and a horizontal internal analysis. In addition, input from innovative faculties in other countries was obtained through structured power questions. These two sources were used to create three final scenarios, which were evaluated using traditional strategic planning methods. The result is a collaboration scenario in which faculty and hospital are separated; a split scenario which divides the organization into three independent hospitals; and a corporation scenario in which corporate activities are bundled in three separate entities. In complex mergers of knowledge-driven organizations, the organization's own employees (in addition to external consultants) might be tapped as a knowledge resource for successful future business models. The paper uses a real-world consulting case to present a new set of methods for strategic planning in large health care provider organizations.
2017-01-01
Mapping gene expression as a quantitative trait using whole genome-sequencing and transcriptome analysis makes it possible to discover the functional consequences of genetic variation. We developed a novel method and ultra-fast software, Findr, for highly accurate causal inference between gene expression traits using cis-regulatory DNA variations as causal anchors; it improves on current methods by taking hidden confounders and weak regulations into consideration. Findr outperformed existing methods on the DREAM5 Systems Genetics challenge and on the prediction of microRNA and transcription factor targets in human lymphoblastoid cells, while being nearly a million times faster. Findr is publicly available at https://github.com/lingfeiwang/findr. PMID:28821014
1, 2, 3, 4: infusing quantitative literacy into introductory biology.
Speth, Elena Bray; Momsen, Jennifer L; Moyerbrailean, Gregory A; Ebert-May, Diane; Long, Tammy M; Wyse, Sara; Linton, Debra
2010-01-01
Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-31
... quantitative information regarding expected reductions in emissions of CO 2 or fuel consumption as a result of... encouraged to provide quantitative information that validates the existence of substantial transportation... quantitative and qualitative measures. Therefore, applicants for TIGER Discretionary Grants are generally...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-01
... provide quantitative information regarding expected reductions in emissions of CO 2 or fuel consumption as... provide quantitative information that validates the existence of substantial transportation-related costs... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...
Intrinsic and Extrinsic Evaluation of Spatiotemporal Text Representations in Twitter Streams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, Lawrence A.; Shaffer, Kyle J.; Arendt, Dustin L.
Language in social media is a dynamic system, constantly evolving and adapting, with words and concepts rapidly emerging, disappearing, and changing their meaning. These changes can be estimated using word representations in context, over time and across locations. A number of methods have been proposed to track these spatiotemporal changes but no general method exists to evaluate the quality of these representations. Previous work largely focused on qualitative evaluation, which we improve by proposing a set of visualizations that highlight changes in text representation over both space and time. We demonstrate the usefulness of novel spatiotemporal representations to explore and characterize specific aspects of the corpus of tweets collected from European countries over a two-week period centered around the terrorist attacks in Brussels in March 2016. In addition, we quantitatively evaluate spatiotemporal representations by feeding them into a downstream classification task – event type prediction. Thus, our work is the first to provide both intrinsic (qualitative) and extrinsic (quantitative) evaluation of text representations for spatiotemporal trends.
Incorporating temporal and clinical reasoning in a new measure of continuity of care.
Spooner, S. A.
1994-01-01
Previously described quantitative methods for measuring continuity of care have assumed that perfect continuity exists when a patient sees only one provider, regardless of the temporal pattern and clinical context of the visits. This paper describes an implementation of a new operational model of continuity--the Temporal Continuity Index--that takes into account time intervals between well visits in a pediatric residency continuity clinic. Ideal continuity in this model is achieved when intervals between visits are appropriate based on the age of the patient and clinical context of the encounters. The fundamental concept in this model is the expectation interval, which contains the length of the maximum ideal follow-up interval for a visit and the maximum follow-up interval. This paper describes an initial implementation of the TCI model and compares TCI calculations to previous quantitative methods and proposes its use as part of the assessment of resident education in outpatient settings. PMID:7950019
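A hypothetical illustration of how an expectation-interval-based continuity score might be computed; the paper's exact TCI formula is not reproduced here, and the windows and credit scheme below are assumptions for illustration only.

```python
# Hypothetical TCI-like score: share of follow-up gaps falling within their
# age-appropriate expectation interval (ideal_max, hard_max), in days.
def continuity_score(gaps_days, windows):
    """gaps_days[i]: observed interval after visit i;
    windows[i]: (ideal_max, hard_max) expectation interval for that visit."""
    credit = 0.0
    for g, (ideal, hard) in zip(gaps_days, windows):
        if g <= ideal:
            credit += 1.0          # follow-up within the ideal window
        elif g <= hard:
            credit += 0.5          # late, but within the maximum window
    return credit / len(gaps_days)

gaps = [35, 70, 200]                        # days between successive well visits
windows = [(45, 90), (60, 120), (180, 365)]
print(continuity_score(gaps, windows))      # 1.0 would indicate ideal continuity
```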
Garvelink, Mirjam M; Ngangue, Patrice A G; Adekpedjou, Rheda; Diouf, Ndeye T; Goh, Larissa; Blair, Louisa; Légaré, France
2016-04-01
We conducted a mixed-methods knowledge synthesis to assess the effectiveness of interventions to improve caregivers' involvement in decision making with seniors, and to describe caregivers' experiences of decision making in the absence of interventions. We analyzed forty-nine qualitative, fourteen quantitative, and three mixed-methods studies. The qualitative studies indicated that caregivers had unmet needs for information, discussions of values and needs, and decision support, which led to negative sentiments after decision making. Our results indicate that there have been insufficient quantitative evaluations of interventions to involve caregivers in decision making with seniors and that the evaluations that do exist found few clinically significant effects. Elements of usual care that received positive evaluations were the availability of a decision coach and a supportive decision-making environment. Additional rigorously evaluated interventions are needed to help caregivers be more involved in decision making with seniors. Project HOPE—The People-to-People Health Foundation, Inc.
Quantitative neuroanatomy for connectomics in Drosophila
Schneider-Mizell, Casey M; Gerhard, Stephan; Longair, Mark; Kazimiers, Tom; Li, Feng; Zwart, Maarten F; Champion, Andrew; Midgley, Frank M; Fetter, Richard D; Saalfeld, Stephan; Cardona, Albert
2016-01-01
Neuronal circuit mapping using electron microscopy demands laborious proofreading or reconciliation of multiple independent reconstructions. Here, we describe new methods to apply quantitative arbor and network context to iteratively proofread and reconstruct circuits and create anatomically enriched wiring diagrams. We measured the morphological underpinnings of connectivity in new and existing reconstructions of Drosophila sensorimotor (larva) and visual (adult) systems. Synaptic inputs were preferentially located on numerous small, microtubule-free 'twigs' which branch off a single microtubule-containing 'backbone'. Omission of individual twigs accounted for 96% of errors. However, the synapses of highly connected neurons were distributed across multiple twigs. Thus, the robustness of a strong connection to detailed twig anatomy was associated with robustness to reconstruction error. By comparing iterative reconstruction to the consensus of multiple reconstructions, we show that our method overcomes the need for redundant effort through the discovery and application of relationships between cellular neuroanatomy and synaptic connectivity. DOI: http://dx.doi.org/10.7554/eLife.12059.001 PMID:26990779
Cao, Mingyan; Mo, Wenjun David; Shannon, Anthony; Wei, Ziping; Washabaugh, Michael; Cash, Patricia
Aspartate (Asp) isomerization is a common post-translational modification of recombinant therapeutic proteins that can occur during manufacturing, storage, or administration. Asp isomerization in the complementarity-determining regions of a monoclonal antibody may affect the target binding and thus a sufficiently robust quality control method for routine monitoring is desirable. In this work, we utilized a liquid chromatography-mass spectrometry (LC/MS)-based approach to identify the Asp isomerization in the complementarity-determining regions of a therapeutic monoclonal antibody. To quantitate the site-specific Asp isomerization of the monoclonal antibody, a UV detection-based quantitation assay utilizing the same LC platform was developed. The assay was qualified and implemented for routine monitoring of this product-specific modification. Compared with existing methods, this analytical paradigm is applicable to identifying Asp isomerization (or other modifications) and subsequently developing a rapid, sufficiently robust quality control method for routine site-specific monitoring and quantitation to ensure product quality. This approach first identifies and locates a product-related impurity (a critical quality attribute) caused by isomerization, deamidation, oxidation, or other post-translational modifications, and then utilizes synthetic peptides and MS to assist the development of an LC-UV-based chromatographic method that separates and quantifies the product-related impurities by UV peaks. The established LC-UV method has acceptable peak specificity, precision, linearity, and accuracy; it can be validated and used in a good manufacturing practice environment for lot release and stability testing. Aspartate isomerization is a common post-translational modification of recombinant proteins during the manufacturing process and storage. Isomerization in the complementarity-determining regions (CDRs) of a monoclonal antibody A (mAb-A) has been detected and has been shown to have an impact on the binding affinity to the antigen. In this work, we utilized a mass spectrometry-based peptide mapping approach to detect and quantitate the Asp isomerization in the CDRs of mAb-A. To routinely monitor the CDR isomerization of mAb-A, a focused peptide mapping method utilizing reversed phase chromatographic separation and UV detection has been developed and qualified. This approach is generally applicable to monitor isomerization and other post-translational modifications of proteins in a specific and high-throughput mode to ensure product quality. © PDA, Inc. 2016.
Vass, Caroline M; Payne, Katherine
2017-09-01
There is emerging interest in the use of discrete choice experiments as a means of quantifying the perceived balance between benefits and risks (quantitative benefit-risk assessment) of new healthcare interventions, such as medicines, under assessment by regulatory agencies. For stated preference data on benefit-risk assessment to be used in regulatory decision making, the methods to generate these data must be valid, reliable and capable of producing meaningful estimates understood by decision makers. Some reporting guidelines exist for discrete choice experiments, and for related methods such as conjoint analysis. However, existing guidelines focus on reporting standards, are general in focus and do not consider the requirements for using discrete choice experiments specifically for quantifying benefit-risk assessments in the context of regulatory decision making. This opinion piece outlines the current state of play in using discrete choice experiments for benefit-risk assessment and proposes key areas needing to be addressed to demonstrate that discrete choice experiments are an appropriate and valid stated preference elicitation method in this context. Methodological research is required to establish: how robust the results of discrete choice experiments are to formats and methods of risk communication; how information in the discrete choice experiment can be presented effectually to respondents; whose preferences should be elicited; the correct underlying utility function and analytical model; the impact of heterogeneity in preferences; and the generalisability of the results. We believe these methodological issues should be addressed, alongside developing a 'reference case', before agencies can safely and confidently use discrete choice experiments for quantitative benefit-risk assessment in the context of regulatory decision making for new medicines and healthcare products.
Goujon, Nicolas; Devine, Alexandra; Baker, Sally M; Sprunt, Beth; Edmonds, Tanya J; Booth, Jennifer K; Keeffe, Jill E
2014-01-01
A review of existing measurement instruments was conducted to examine their suitability for measuring disability prevalence and assessing quality of life, protection of disability rights and community participation by people with disabilities, specifically within the context of development programs in low- and middle-income countries. From a search of PubMed and the grey literature, potentially relevant measurement instruments were identified and examined for their content and psychometric properties, where possible. Criteria for inclusion were: based on the WHO's International Classification of Functioning, Disability and Health (ICF), used quantitative methods, suitable for population-based studies of disability-inclusive development, in English and published after 1990. Characteristics of existing instruments were analysed according to components of the ICF and quality of life domains. Ten instruments were identified and reviewed according to the criteria listed above. Each version of the instruments was analysed separately. Only three instruments included a component on quality of life. Domains from the ICF that were addressed by some but not all instruments included the environment, technology and communication. The measurement instruments reviewed covered the range of elements required to measure disability inclusion within development contexts. However, no single measurement instrument has the capacity to measure both disability prevalence and changes in quality of life according to contemporary disability paradigms. The review of measurement instruments supports the need for developing an instrument specifically intended to measure disability-inclusive practice within development programs. Implications for Rehabilitation: Surveys and tools are needed to plan disability-inclusive development. Existing measurement tools to determine prevalence of disability, wellbeing, rights and access to the community were reviewed. No single validated tool exists that is suitable for population-based studies, uses quantitative methods and the components of the ICF to measure prevalence of disability, well-being of people with disability and their access to their communities. A measurement tool that reflects the UNCRPD and addresses all components of the ICF is needed to assist in disability-inclusive development, especially in low- and middle-resource countries.
Validation studies and proficiency testing.
Ankilam, Elke; Heinze, Petra; Kay, Simon; Van den Eede, Guy; Popping, Bert
2002-01-01
Genetically modified organisms (GMOs) entered the European food market in 1996. Current legislation demands the labeling of food products if they contain >1% GMO, as assessed for each ingredient of the product. To create confidence in the testing methods and to complement enforcement requirements, there is an urgent need for internationally validated methods that could serve as reference methods. To date, several methods have been submitted to validation trials at an international level; approaches now exist that can be used in different circumstances and for different food matrixes. Moreover, the requirement for the formal validation of methods is clearly accepted; several national and international bodies are active in organizing studies. Further validation studies, especially on quantitative polymerase chain reaction methods, need to be performed to cover the rising demand for new extraction methods and other background matrixes, as well as for novel GMO constructs.
Pan, Duohai; Crull, George; Yin, Shawn; Grosso, John
2014-02-01
Avalide®, a medication used for the treatment of hypertension, is a combination of Irbesartan and Hydrochlorothiazide. Irbesartan, one of the active pharmaceutical ingredients (APIs) in Avalide products, exists in two neat crystalline forms: Form A and Form B. Irbesartan Form A is the API form used in a wet granulation for the preparation of Avalide tablets. The presence of the less soluble Irbesartan Form B in Avalide tablets may result in slower dissolution. In this paper, we present our work on the method development, verification and challenges of quantitatively detecting, via NIR and ssNMR, very small amounts of Irbesartan Form B in Avalide tablets. As part of the NIR method development and qualification, the limit of detection, linearity and accuracy were examined. In addition, a limited study of the robustness of the method was conducted, and a bias in the level of Form B was correlated to the ambient humidity. ssNMR, a primary method for the determination of polymorphic composition, was successfully used as an orthogonal technique to verify the accuracy of the NIR method and added to the confidence in the NIR method. The speed and efficiency of the NIR method make it a suitable and convenient tool for routine analysis of Avalide tablets for Form B in a QC environment. Copyright © 2013 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Bergren, David Alexander
2014-01-01
This quantitative study was designed to be an analysis of the relationship between school climate and student achievement through the creation of an index of climate-factors (SES, discipline, attendance, and school size) for which publicly available data existed. The index that was formed served as a proxy measure of climate; it was analyzed…
Image dehazing based on non-local saturation
NASA Astrophysics Data System (ADS)
Wang, Linlin; Zhang, Qian; Yang, Deyun; Hou, Yingkun; He, Xiaoting
2018-04-01
In this paper, a method based on a non-local saturation algorithm is proposed to avoid block and halo effects in single-image dehazing with the dark channel prior. First, we convert the original image from the RGB color space into the HSV color space following the non-local approach. Image saturation is weighted uniformly over a fixed window whose size is set according to the image resolution. Second, we use the saturation to estimate the atmospheric light value and the transmission rate. Then, from the saturation and transmission functions, the haze-free image is recovered via the atmospheric scattering model. Compared with existing methods, our method restores image color and enhances contrast. We validate the proposed method with both quantitative and qualitative evaluations. Experiments show better visual results with high efficiency.
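Since the abstract does not give the exact weighting, the following is a minimal sketch of the pipeline it describes (HSV conversion, window-averaged saturation, saturation-derived atmospheric light and transmission, inversion of the scattering model); the omega weight, window size, and saturation-to-transmission mapping are illustrative assumptions rather than the paper's formulas.

```python
import cv2
import numpy as np

def dehaze(img_bgr, omega=0.9, win=15, t_min=0.1):
    # Work in [0, 1] floats; keep the BGR image for the scattering model.
    img = img_bgr.astype(np.float64) / 255.0
    hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
    sat = hsv[:, :, 1].astype(np.float64) / 255.0
    val = hsv[:, :, 2].astype(np.float64) / 255.0

    # Equal weighting of saturation over a fixed window (box filter),
    # echoing the paper's windowed, resolution-dependent averaging.
    sat_avg = cv2.blur(sat, (win, win))

    # Atmospheric light A: mean color of the brightest low-saturation
    # pixels (an assumed stand-in for the saturation-based estimator).
    score = val * (1.0 - sat_avg)
    idx = np.unravel_index(np.argsort(score, axis=None)[-100:], score.shape)
    A = img[idx].mean(axis=0)

    # Transmission from saturation: hazier regions are less saturated.
    t = np.clip(omega * sat_avg + (1.0 - omega), t_min, 1.0)

    # Invert the atmospheric scattering model I = J*t + A*(1 - t).
    J = (img - A[None, None, :]) / t[:, :, None] + A[None, None, :]
    return np.clip(J * 255.0, 0, 255).astype(np.uint8)
```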
Hennegan, Julie; Dolan, Catherine; Wu, Maryalice; Scott, Linda; Montgomery, Paul
2016-12-07
Governments, multinational organisations, and charities have commenced the distribution of sanitary products to address current deficits in girls' menstrual management. The few effectiveness studies conducted have focused on health and education outcomes but have failed to provide quantitative assessment of girls' preferences, experiences of absorbents, and comfort. The objectives of the study were, first, to quantitatively describe girls' experiences with, and ratings of the reliability and acceptability of, different menstrual absorbents; second, to compare ratings of freely provided reusable pads (AFRIpads) to other existing methods of menstrual management; and finally, to assess differences in self-reported freedom of activity during menses according to menstrual absorbent. A cross-sectional, secondary analysis of data from the final survey of a controlled trial of reusable sanitary pad and puberty education provision was undertaken. Participants were 205 menstruating schoolgirls from eight schools in rural Uganda. 72 girls who reported using the intervention-provided reusable pads were compared to those using existing improvised methods (predominantly new or old cloth). Schoolgirls using reusable pads provided significantly higher ratings of perceived absorbent reliability across activities, fewer difficulties changing absorbents, and less disgust with cleaning absorbents. There were no significant differences in reports of outside garment soiling (OR 1.00, 95% CI 0.51-1.99) or odour (OR 0.84, 95% CI 0.40-1.74) during the last menstrual period. When girls were asked if menstruation caused them to miss daily activities, there were no differences between those using reusable pads and those using other existing methods. However, when asked about activities avoided during menstruation, those using reusable pads participated less in physical sports, working in the field, fetching water, and cooking. Reusable pads were rated favourably. This translated into some benefits for self-reported involvement in daily activities, although reports of actual soiling and missing activities due to menstruation did not differ. More research is needed comparing the impact of menstrual absorbents on girls' daily activities, and validating outcome measures for menstrual management research.
Relating interesting quantitative time series patterns with text events and text features
NASA Astrophysics Data System (ADS)
Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.
2013-12-01
In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis work flow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.
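As a toy illustration of the second analysis step, the sketch below treats each interesting time interval as a "transaction" containing the detected time-series pattern label plus the labels of co-occurring news, and mines frequently co-occurring label sets with an Apriori-style pass. The transactions, labels, and support threshold are invented for illustration.

```python
from itertools import combinations

# One "transaction" per interesting interval: the detected time-series
# pattern plus labels of news published inside the interval (invented).
transactions = [
    {"price_spike", "merger_news"},
    {"price_spike", "merger_news", "high_volume"},
    {"price_drop", "lawsuit_news"},
    {"price_spike", "merger_news", "high_volume"},
]

def apriori(transactions, min_support=2):
    items = {i for t in transactions for i in t}
    frequent = {}
    k, current = 1, [frozenset([i]) for i in items]
    while current:
        counts = {c: sum(c <= t for t in transactions) for c in current}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # Candidate generation: unions of frequent k-sets of size k + 1.
        k += 1
        current = list({a | b for a, b in combinations(level, 2)
                        if len(a | b) == k})
    return frequent

for itemset, support in apriori(transactions).items():
    print(sorted(itemset), "support =", support)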
Multiple normalized solutions for a planar gauged nonlinear Schrödinger equation
NASA Astrophysics Data System (ADS)
Luo, Xiao
2018-06-01
We study the existence, multiplicity, quantitative property and asymptotic behavior of normalized solutions for a gauged nonlinear Schrödinger equation arising from Chern-Simons theory, $-\Delta u + \omega u + |x|^2 u + \lambda \left( \frac{h^2(|x|)}{|x|^2} + \int_{|x|}^{+\infty} \frac{h(s)}{s}\, u^2(s)\, ds \right) u = |u|^{p-2} u$, $x \in \mathbb{R}^2$, where $\omega \in \mathbb{R}$, $\lambda > 0$, $p > 4$ and $h(s) = \frac{1}{2} \int_0^s r\, u^2(r)\, dr$. Combining the constrained minimization method with a minimax principle, we prove that the problem possesses at least two normalized solutions: one is a ground state and the other is an excited state. Furthermore, the asymptotic behavior and quantitative property of the ground state are analyzed.
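For readers unfamiliar with the terminology, the following hedged LaTeX sketch spells out the standard variational framing of "normalized solutions"; the precise form of the energy functional is paraphrased from the equation above, not quoted from the paper, and the constant in front of the Chern-Simons term is an assumption of this sketch.

```latex
% Normalized solutions: critical points of an energy functional E
% restricted to an L^2 sphere, with \omega arising as a Lagrange
% multiplier (paraphrased framing, not quoted from the paper).
\[
  S(c) = \Big\{\, u \in H^1(\mathbb{R}^2) : \int_{\mathbb{R}^2} |u|^2 \, dx = c \,\Big\},
\]
\[
  E(u) = \frac{1}{2}\int_{\mathbb{R}^2} \big( |\nabla u|^2 + |x|^2 u^2 \big)\, dx
       + \frac{\lambda}{2}\int_{\mathbb{R}^2} \frac{h^2(|x|)}{|x|^2}\, u^2 \, dx
       - \frac{1}{p}\int_{\mathbb{R}^2} |u|^p \, dx .
\]
```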
ERIC Educational Resources Information Center
Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.
2011-01-01
Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…
75 FR 81632 - Australia Beef Imports Approved for the Electronic Certification System (eCERT)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
..., 2011, the export certification requirement for imports of beef from Australia subject to quantitative...: Background There are existing quantitative restraints on beef from Australia pursuant to U.S. Note 3... quantitative quota restrictions beginning January 3, 2011. Such imports that are entered, or withdrawn from...
Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim
2009-01-01
Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...
A method based on IHS cylindrical transform model for quality assessment of image fusion
NASA Astrophysics Data System (ADS)
Zhu, Xiaokun; Jia, Yonghong
2005-10-01
Image fusion techniques have been widely applied to remote sensing image analysis and processing, and methods for the quality assessment of image fusion in remote sensing have likewise become a research issue at home and abroad. Traditional assessment methods combine the calculation of quantitative indexes with visual interpretation to compare fused images quantitatively and qualitatively. However, the existing assessment methods have two defects: on one hand, most indexes lack theoretical support for comparing different fusion methods. On the other hand, there is no uniform preference among most quantitative assessment indexes when they are applied to estimate fusion effects. That is, spatial resolution and spectral features cannot be analyzed synchronously by these indexes, and there is no general method to unify spatial and spectral feature assessment. So in this paper, on the basis of an approximate general model of four traditional fusion methods, including Intensity Hue Saturation (IHS) triangle transform fusion, High Pass Filter (HPF) fusion, Principal Component Analysis (PCA) fusion and Wavelet Transform (WT) fusion, a correlation coefficient assessment method based on the IHS cylindrical transform is proposed. Experiments show that this method can not only produce evaluation results for spatial and spectral features on the basis of a uniform preference, but can also compare fusion image sources with fused images and reveal differences among fusion methods. Compared with traditional assessment methods, the new method is more intuitive and accords better with subjective evaluation.
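The abstract does not reproduce the transform itself, so the sketch below illustrates the general idea with a generic cylindrical IHS conversion (assumed for illustration): intensity correlation against the panchromatic image scores spatial fidelity, while hue and saturation correlations against the original multispectral image score spectral fidelity.

```python
import numpy as np

def rgb_to_ihs_cyl(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i = (r + g + b) / 3.0
    # Cylindrical coordinates of the color vector around the gray axis.
    v1 = (2.0 * b - r - g) / np.sqrt(6.0)
    v2 = (r - g) / np.sqrt(2.0)
    h = np.arctan2(v2, v1)   # hue angle
    s = np.hypot(v1, v2)     # saturation radius
    return i, h, s

def corr(a, b):
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

def assess(fused_rgb, ms_rgb, pan):
    fi, fh, fs = rgb_to_ihs_cyl(fused_rgb.astype(np.float64))
    mi, mh, ms_ = rgb_to_ihs_cyl(ms_rgb.astype(np.float64))
    return {
        "spatial (I vs pan)": corr(fi, pan.astype(np.float64)),
        "spectral hue": corr(fh, mh),
        "spectral saturation": corr(fs, ms_),
    }
```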
NASA Astrophysics Data System (ADS)
McDaniel, Mark A.; Stoen, Siera M.; Frey, Regina F.; Markow, Zachary E.; Hynes, K. Mairin; Zhao, Jiuqing; Cahill, Michael J.
2016-12-01
The existing literature indicates that interactive-engagement (IE) based general physics classes improve conceptual learning relative to more traditional lecture-oriented classrooms. Very little research, however, has examined quantitative problem-solving outcomes from IE-based relative to traditional lecture-based physics classes. The present study included both pre- and post-course conceptual-learning assessments and a new quantitative physics problem-solving assessment that included three representative conservation-of-energy problems from a first-semester calculus-based college physics course. Scores for problem translation, plan coherence, solution execution, and evaluation of solution plausibility were extracted for each problem. Over 450 students in three IE-based sections and two traditional lecture sections taught at the same university during the same semester participated. As expected, the IE-based course produced more robust gains on a Force Concept Inventory than did the lecture course. By contrast, when the full sample was considered, gains in quantitative problem solving were significantly greater for lecture than for IE-based physics; when students were matched on pre-test scores, there was still no advantage for IE-based physics on gains in quantitative problem solving. Further, the association between performance on the concept inventory and quantitative problem solving was minimal. These results highlight that improved conceptual understanding does not necessarily support improved quantitative physics problem solving, and that the instructional method appears to have less bearing on gains in quantitative problem solving than do the kinds of problems emphasized in the courses and homework and the overlap of those problems with the ones on the assessment.
Bateman, Randall J.; Munsell, Ling Y.; Chen, Xianghong; Holtzman, David M.; Yarasheski, Kevin E.
2007-01-01
In all biological systems, protein amount is a function of the rate of production and clearance. The speed of a response to a disturbance in protein homeostasis is determined by turnover rate. Quantifying alterations in protein synthesis and clearance rates is vital to understanding disease pathogenesis (e.g., aging, inflammation). No methods exist for quantifying production and clearance rates of low-abundance (femtomole) proteins in vivo. We describe a novel, mass spectrometry-based method for quantitating low-abundance protein synthesis and clearance rates in vitro and in vivo in animals and humans. The utility of this method is demonstrated with amyloid-beta (Aβ), an important low-abundance protein involved in Alzheimer's disease pathogenesis. We used in vivo stable isotope labeling, immunoprecipitation of Aβ from cerebrospinal fluid, and quantitative liquid chromatography electrospray-ionization tandem mass spectrometry (LC-ESI-tandem MS) to quantify human Aβ protein production and clearance rates. The method is sensitive and specific for stable isotope labeled amino acid incorporation into the CNS (±1% accuracy). This in vivo method can be used to identify pathophysiologic changes in protein metabolism and may serve as a biomarker for monitoring disease risk, progression, or response to novel therapeutic agents. The technique is adaptable to other macromolecules, such as carbohydrates or lipids. PMID:17383190
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drukker, Karen, E-mail: kdrukker@uchicago.edu; Giger, Maryellen L.; Li, Hui
2014-03-15
Purpose: To investigate whether biologic image composition of mammographic lesions can improve upon existing mammographic quantitative image analysis (QIA) in estimating the probability of malignancy. Methods: The study population consisted of 45 breast lesions imaged with dual-energy mammography prior to breast biopsy with final diagnosis resulting in 10 invasive ductal carcinomas, 5 ductal carcinoma in situ, 11 fibroadenomas, and 19 other benign diagnoses. Analysis was threefold: (1) the raw low-energy mammographic images were analyzed with an established in-house QIA method, “QIA alone,” (2) the three-compartment breast (3CB) composition measures, derived from the dual-energy mammography, of water, lipid, and protein thickness were assessed, “3CB alone,” and (3) information from QIA and 3CB was combined, “QIA + 3CB.” Analysis was initiated from radiologist-indicated lesion centers and was otherwise fully automated. Steps of the QIA and 3CB methods were lesion segmentation, characterization, and subsequent classification for malignancy in leave-one-case-out cross-validation. Performance assessment included box plots, Bland–Altman plots, and receiver operating characteristic (ROC) analysis. Results: The area under the ROC curve (AUC) for distinguishing between benign and malignant lesions (invasive and DCIS) was 0.81 (standard error 0.07) for the “QIA alone” method, 0.72 (0.07) for the “3CB alone” method, and 0.86 (0.04) for “QIA + 3CB” combined. The difference in AUC was 0.043 between “QIA + 3CB” and “QIA alone” but failed to reach statistical significance (95% confidence interval [–0.17 to +0.26]). Conclusions: In this pilot study analyzing the new 3CB imaging modality, knowledge of the composition of breast lesions and their periphery appeared additive in combination with existing mammographic QIA methods for the distinction between different benign and malignant lesion types.
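A hedged sketch of the comparison logic follows: three feature sets classified in leave-one-case-out cross-validation and compared by ROC AUC. The mock feature arrays and the logistic-regression classifier are illustrative stand-ins, not the study's QIA pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import roc_auc_score

def loo_auc(X, y):
    # Out-of-fold malignancy scores, one held-out case at a time.
    scores = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                               cv=LeaveOneOut(), method="predict_proba")[:, 1]
    return roc_auc_score(y, scores)

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 45)                    # 45 lesions, benign/malignant
X_qia = rng.normal(y[:, None], 1.0, (45, 5))  # mock QIA features
X_3cb = rng.normal(y[:, None], 2.0, (45, 3))  # mock water/lipid/protein features

for name, X in [("QIA alone", X_qia), ("3CB alone", X_3cb),
                ("QIA + 3CB", np.hstack([X_qia, X_3cb]))]:
    print(name, round(loo_auc(X, y), 3))
```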
75 FR 68468 - List of Fisheries for 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-08
...-existent; therefore, quantitative data on the frequency of incidental mortality and serious injury is... currently available for most of these marine mammals on the high seas, and quantitative comparison of...
Wang, Chengjian; Zhang, Ping; Jin, Wanjun; Li, Lingmei; Qiang, Shan; Zhang, Ying; Huang, Linjuan; Wang, Zhongfu
2017-01-06
Rapid, simple and versatile methods for quantitative analysis of glycoprotein O-glycans are urgently required for current studies on protein O-glycosylation patterns and the search for disease O-glycan biomarkers. Relative quantitation of O-glycans using stable isotope labeling followed by mass spectrometric analysis represents an ideal and promising technique. However, it is hindered by the shortage of reliable nonreductive O-glycan release methods as well as the too large or too small, inconstant mass difference between the light and heavy isotope forms of O-glycan derivatives, which results in difficulties during the recognition and quantitative analysis of O-glycans by mass spectrometry. Herein we report a facile and versatile O-glycan relative quantification strategy, based on an improved one-pot method that can quantitatively achieve nonreductive release and in situ chromophoric labeling of intact mucin-type O-glycans in one step. In this study, the one-pot method is optimized and applied for quantitative O-glycan release and tagging with either non-deuterated (d0-) or deuterated (d5-) 1-phenyl-3-methyl-5-pyrazolone (PMP). The obtained O-glycan derivatives feature a permanent 10-Da mass difference between the d0- and d5-PMP forms, allowing complete discrimination and comparative quantification of these isotopically labeled O-glycans by mass spectrometric techniques. Moreover, the d0- and d5-PMP derivatives of O-glycans also have a relatively high hydrophobicity as well as a strong UV absorption, especially suitable for high-resolution separation and high-sensitivity detection by RP-HPLC-UV. We have refined the conditions for the one-pot reaction as well as the corresponding sample purification approach. The good quantitation feasibility, reliability and linearity of this strategy have been verified using bovine fetuin and porcine stomach mucin as model O-glycoproteins. Additionally, we have also successfully applied this method to the quantitative O-glycomic comparison between perch and salmon eggs by ESI-MS, MS/MS and online RP-HPLC-UV-ESI-MS/MS, demonstrating its excellent applicability to various complex biological samples. O-Linked glycoproteins, generated via a widely existing glycosylation modification process on serine (Ser) or threonine (Thr) residues of nascent proteins, play essential roles in a series of biological processes. As a type of informational molecule, the O-glycans of these glycoproteins participate directly in these biological mechanisms. Thus, the characteristic differences or changes of O-glycans in expression level usually relate to pathologies of many diseases and represent an important opportunity to uncover the functional mechanisms of various glycoprotein O-glycans. The novel strategy introduced here provides a simple and versatile analytical method for the precise quantitation of glycoprotein O-glycans by mass spectrometry, enabling rapid evaluation of the differences or changes of O-glycans in expression level. It is attractive for the field of quantitative/comparative O-glycomics, which has great significance for exploring the complex structure-function relationship of O-glycans, as well as for the search for O-glycan biomarkers of some major diseases and O-glycan-related targets of some drugs. Copyright © 2016 Elsevier B.V. All rights reserved.
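As a toy illustration of the quantitation step, the sketch below pairs d0- and d5-PMP peaks by the constant 10-Da mass shift and reports light/heavy intensity ratios; the peak lists are invented and singly charged ions are assumed.

```python
# Constant mass shift: two PMP labels per glycan x 5 Da each (d5 vs d0),
# assuming singly charged ions so the m/z shift equals the mass shift.
DELTA, TOL = 10.0, 0.02  # Da

light_peaks = [(534.2, 1.0e6), (696.3, 4.2e5)]  # (m/z, intensity), d0-PMP
heavy_peaks = [(544.2, 9.5e5), (706.3, 4.4e5)]  # d5-PMP-labeled sample

def pair_and_ratio(light, heavy, delta=DELTA, tol=TOL):
    ratios = []
    for mz_l, int_l in light:
        for mz_h, int_h in heavy:
            if abs((mz_h - mz_l) - delta) <= tol:
                ratios.append((mz_l, int_l / int_h))
    return ratios

for mz, ratio in pair_and_ratio(light_peaks, heavy_peaks):
    print(f"glycan at m/z {mz}: light/heavy = {ratio:.2f}")
```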
Health impact assessment – A survey on quantifying tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org
Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences made with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.
2015-05-01
challenging component of assessing human health risks associated with contaminated soil and groundwater since the late 1990s, during which time...and analysis. 1.3 REGULATORY DRIVERS Regulatory guidance for assessment and management of risks associated with VI has been issued by at least 27...requirements to assess potential human health risks , and this possibility exists where VOCs are present in the subsurface near occupied buildings
Hypercuboidal renormalization in spin foam quantum gravity
NASA Astrophysics Data System (ADS)
Bahr, Benjamin; Steinhaus, Sebastian
2017-06-01
In this article, we apply background-independent renormalization group methods to spin foam quantum gravity. It is aimed at extending and elucidating the analysis of a companion paper, in which the existence of a fixed point in the truncated renormalization group flow for the model was reported. Here, we repeat the analysis with various modifications and find that both qualitative and quantitative features of the fixed point are robust in this setting. We also go into details about the various approximation schemes employed in the analysis.
MO-C-BRCD-03: The Role of Informatics in Medical Physics and Vice Versa.
Andriole, K
2012-06-01
Like Medical Physics, Imaging Informatics encompasses concepts touching every aspect of the imaging chain from image creation, acquisition, management and archival, to image processing, analysis, display and interpretation. The two disciplines are in fact quite complementary, with similar goals: to improve the quality of care provided to patients using an evidence-based approach, to assure safety in the clinical and research environments, to facilitate efficiency in the workplace, and to accelerate knowledge discovery. Use cases describing several areas of informatics activity will be given to illustrate current limitations that would benefit from medical physicist participation, and conversely areas in which informaticists may contribute to the solution. Topics to be discussed include radiation dose monitoring, process management and quality control, display technologies, business analytics techniques, and quantitative imaging. Quantitative imaging is increasingly becoming an essential part of biomedical research as well as being incorporated into clinical diagnostic activities. Referring clinicians are asking for more objective information to be gleaned from the imaging tests that they order so that they may make the best clinical management decisions for their patients. Medical Physicists may be called upon to identify existing issues as well as to develop, validate and implement new approaches and technologies to help move the field further toward quantitative imaging methods for the future. Biomedical imaging informatics tools and techniques such as standards, integration, data mining, cloud computing and new systems architectures, ontologies and lexicons, data visualization and navigation tools, and business analytics applications can be used to overcome some of the existing limitations. Learning objectives: 1. Describe what is meant by Medical Imaging Informatics and understand why the medical physicist should care. 2. Identify existing limitations in information technologies with respect to Medical Physics, and conversely see how Informatics may assist the medical physicist in filling some of the current gaps in their activities. 3. Understand general informatics concepts and areas of investigation including imaging and workflow standards, systems integration, computing architectures, ontologies, data mining and business analytics, data visualization and human-computer interface tools, and the importance of quantitative imaging for the future of Medical Physics and Imaging Informatics. 4. Become familiar with on-going efforts to address current challenges facing future research into and clinical implementation of quantitative imaging applications. © 2012 American Association of Physicists in Medicine.
Gender Differences in Drinking Practices in Middle Aged and Older Russians
Bobrova, Natalia; West, Robert; Malyutina, Darya; Malyutina, Sofia; Bobak, Martin
2010-01-01
Aims: The study investigated gender differences in drinking patterns and the reasons behind them among men and women in the Russian city of Novosibirsk. Methods: A mixed-methods study, combining quantitative and qualitative data, was conducted based on the Health, Alcohol and Psychosocial factors In Eastern Europe cohort study. The quantitative study included 4268 men and 5094 women aged 45–69 years; of those, 20 men and 24 women completed an in-depth interview. Results: The quantitative data revealed a large gap between the genders in drinking patterns in general. Women drank less often and in much smaller quantities than men. For example, 19% of men, vs. 1% of women, were classified as problem drinkers (two or more positive answers on the CAGE questionnaire). These differences were not explained by socioeconomic factors. The qualitative data showed that gender roles and a traditional culture around women's and men's drinking were the main reasons for the reported drinking behaviour, whereby women were consistently expected to drink much less than men in terms of preference for strong beverages, drinking frequency and quantity of alcohol consumed. Conclusion: The study confirmed that large differences exist between Russian men's and women's drinking; these differences may be largely explained by gender roles. PMID:21075855
NASA Astrophysics Data System (ADS)
Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.
2017-11-01
The problems of existing methods for determining which technologically interlinked construction processes and activities can be combined are considered under the modern conditions of constructing various facilities. The necessity of identifying common parameters that characterize the nature of interaction among all technologically related construction and installation processes and activities is shown. The technologies of construction and installation processes for buildings and structures are studied with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The result of this research is a quantitative evaluation of the interaction of construction and installation processes and activities, namely the minimum technologically necessary volume of a preceding process that allows one to plan and organize the execution of a subsequent, technologically interconnected process. This quantitative evaluation is used as the basis for calculating the optimum range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that are key to its wide use in forming organizational decisions. The article also describes the practical significance of the developed technique.
Experimental design and quantitative analysis of microbial community multiomics.
Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis
2017-11-30
Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.
TUBEs-Mass Spectrometry for Identification and Analysis of the Ubiquitin-Proteome.
Azkargorta, Mikel; Escobes, Iraide; Elortza, Felix; Matthiesen, Rune; Rodríguez, Manuel S
2016-01-01
Mass spectrometry (MS) has become the method of choice for the large-scale analysis of protein ubiquitylation. There exist a number of proposed methods for mapping ubiquitin sites, each with different pros and cons. We present here a protocol for the MS analysis of the ubiquitin-proteome captured by TUBEs and subsequent data analysis. Using dedicated software and algorithms, specific information on the presence of ubiquitylated peptides can be obtained from the MS search results. In addition, a quantitative and functional analysis of the ubiquitylated proteins and their interacting partners helps to unravel the biological and molecular processes they are involved in.
Creating Objects and Object Categories for Studying Perception and Perceptual Learning
Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay
2012-01-01
In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties [1]. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties [2]. Many innovative and useful methods currently exist for creating novel objects and object categories [3-6] (also see refs. [7,8]). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter [5,9,10], and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects [11-13]. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis [14]. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection [9,12,13]. Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics [15,16]. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects [9,13]. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper. We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have. Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis. PMID:23149420
Comparison of three commercially available fit-test methods.
Janssen, Larry L; Luinenburg, D Michael; Mullins, Haskell E; Nelson, Thomas J
2002-01-01
American National Standards Institute (ANSI) standard Z88.10, Respirator Fit Testing Methods, includes criteria to evaluate new fit-tests. The standard allows generated aerosol, particle counting, or controlled negative pressure quantitative fit-tests to be used as the reference method to determine acceptability of a new test. This study examined (1) comparability of three Occupational Safety and Health Administration-accepted fit-test methods, all of which were validated using generated aerosol as the reference method; and (2) the effect of the reference method on the apparent performance of a fit-test method under evaluation. Sequential fit-tests were performed using the controlled negative pressure and particle counting quantitative fit-tests and the bitter aerosol qualitative fit-test. Of 75 fit-tests conducted with each method, the controlled negative pressure method identified 24 failures; bitter aerosol identified 22 failures; and the particle counting method identified 15 failures. The sensitivity of each method, that is, agreement with the reference method in identifying unacceptable fits, was calculated using each of the other two methods as the reference. None of the test methods met the ANSI sensitivity criterion of 0.95 or greater when compared with either of the other two methods. These results demonstrate that (1) the apparent performance of any fit-test depends on the reference method used, and (2) the fit-tests evaluated use different criteria to identify inadequately fitting respirators. Although "acceptable fit" cannot be defined in absolute terms at this time, the ability of existing fit-test methods to reject poor fits can be inferred from workplace protection factor studies.
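The sensitivity calculation described above reduces to a simple agreement ratio, sketched below with invented pass/fail vectors; it also shows how the apparent result changes with the choice of reference method.

```python
# Sensitivity: fraction of inadequate fits flagged by the reference
# method that the candidate method also flags. Vectors are invented.
def sensitivity(candidate_fail, reference_fail):
    flags = [c for c, r in zip(candidate_fail, reference_fail) if r]
    return sum(flags) / len(flags) if flags else float("nan")

# 1 = fit rejected, 0 = fit accepted, one entry per test subject.
cnp  = [1, 1, 0, 1, 0, 0, 1, 0]   # controlled negative pressure
pcnt = [1, 0, 0, 1, 0, 0, 0, 0]   # particle counting

# Apparent performance depends on which method is taken as reference:
print(sensitivity(pcnt, cnp))   # particle counting vs CNP reference
print(sensitivity(cnp, pcnt))   # CNP vs particle-counting reference
```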
ERIC Educational Resources Information Center
Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa
2014-01-01
Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students' apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course…
Code of Federal Regulations, 2013 CFR
2013-01-01
... its review. In the first stage of the process, the Council will apply six uniform quantitative... quantitative thresholds using information available through existing public and regulatory sources, nonbank..., rather than applying a broadly applicable quantitative metric. The Council believes that the threat a...
Code of Federal Regulations, 2014 CFR
2014-01-01
... its review. In the first stage of the process, the Council will apply six uniform quantitative... quantitative thresholds using information available through existing public and regulatory sources, nonbank..., rather than applying a broadly applicable quantitative metric. The Council believes that the threat a...
1, 2, 3, 4: Infusing Quantitative Literacy into Introductory Biology
Momsen, Jennifer L.; Moyerbrailean, Gregory A.; Ebert-May, Diane; Long, Tammy M.; Wyse, Sara; Linton, Debra
2010-01-01
Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills. PMID:20810965
Analysis of cell division patterns in the Arabidopsis shoot apical meristem
Shapiro, Bruce E.; Tobin, Cory; Mjolsness, Eric; ...
2015-03-30
The stereotypic pattern of cell shapes in the Arabidopsis shoot apical meristem (SAM) suggests that strict rules govern the placement of new walls during cell division. When a cell in the SAM divides, a new wall is built that connects existing walls and divides the cytoplasm of the daughter cells. Because features that are determined by the placement of new walls, such as cell size, shape, and number of neighbors, are highly regular, rules must exist for maintaining such order. Here we present a quantitative model of these rules that incorporates different observed features of cell division. Each feature is incorporated into a "potential function" that contributes a single term to a total analog of potential energy. New cell walls are predicted to occur at locations where the potential function is minimized. Quantitative terms that represent the well-known historical rules of plant cell division, such as those given by Hofmeister, Errera, and Sachs, are developed and evaluated against observed cell divisions in the epidermal layer (L1) of the Arabidopsis thaliana SAM. The method is general enough to allow additional terms for nongeometric properties such as internal concentration gradients and mechanical tensile forces.
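A minimal two-dimensional sketch of the potential-function idea follows, using the shapely geometry library: candidate walls through the cell centroid are scored by a weighted sum of terms, and the minimum-potential wall is predicted. The two terms used here (an Errera-like wall-length term and an area-asymmetry term) are illustrative; the paper's model combines more observed features, and the weights are assumptions.

```python
import numpy as np
from shapely.geometry import Polygon, LineString
from shapely.ops import split

def candidate_lines(cell, n=180):
    # Candidate walls: chords through the centroid at many orientations.
    c = np.array(cell.centroid.coords[0])
    d = cell.length  # long enough to span the whole cell
    for theta in np.linspace(0.0, np.pi, n, endpoint=False):
        u = np.array([np.cos(theta), np.sin(theta)])
        yield LineString([c - d * u, c + d * u])

def potential(cell, line, w_len=1.0, w_sym=1.0):
    pieces = split(cell, line)
    if len(pieces.geoms) != 2:          # wall must split the cell in two
        return float("inf")
    a1, a2 = (g.area for g in pieces.geoms)
    wall_len = line.intersection(cell).length
    # Term 1: short walls preferred (Errera-like); term 2: equal areas.
    return (w_len * wall_len / np.sqrt(cell.area)
            + w_sym * abs(a1 - a2) / cell.area)

cell = Polygon([(0, 0), (2, 0), (2.4, 1), (1, 1.6), (-0.2, 1)])
best = min(candidate_lines(cell), key=lambda l: potential(cell, l))
print("predicted wall:", list(best.intersection(cell).coords))
```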
Koskinen, Heli I
2010-01-01
The Faculty of Veterinary Medicine at the University of Helsinki recognized the lack of systems to measure the quality of education. At the department level, this meant a lack of systems to measure the quality of students' outcomes. The aim of this article was to compare the quality of outcomes of a final examination in veterinary radiology by calculating the correlations between traditional (quantitative scores traditionally given by veterinary teachers) and nontraditional (qualitative Structure of the Observed Learning Outcome, or SOLO, method) grading results. An evaluation of the quality of the questions is also included. The results indicate that SOLO offers criteria for quality evaluation, especially for questions. A correlation of 0.60 (p<0.01) existed between the qualitative and quantitative estimations, and a correlation of 0.79 (p<0.01) existed between evaluators, both using traditional scores. Two suggestions are made for a better system to evaluate quality in the future: first, development of problem-solving skills during the learning process should also be assessed; second, both the scoring of the factual correctness of answers (knowledge) and the grammatical structure of an answer and the quality of presentation should be included in the quality evaluation process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, X; Arbique, G; Guild, J
Purpose: To evaluate the quantitative image quality of spectral reconstructions of phantom data from a spectral CT scanner. Methods: The spectral CT scanner (IQon Spectral CT, Philips Healthcare) is equipped with a dual-layer detector and generates conventional 80-140 kVp images and a variety of spectral reconstructions, e.g., virtual monochromatic (VM) images, virtual non-contrast (VNC) images, iodine maps, and effective atomic number (Z) images. A cylindrical solid water phantom (Gammex 472, 33 cm diameter and 5 cm thick) with iodine (2.0-20.0 mg I/ml) and calcium (50-600 mg/ml) rod inserts was scanned at 120 kVp and 27 mGy CTDIvol. Spectral reconstructions were evaluated by comparing image measurements with theoretical values calculated from nominal rod compositions provided by the phantom manufacturer. The theoretical VNC was calculated using water and iodine basis material decomposition, and the theoretical Z was calculated using two common methods, the chemical formula method (Z1) and the dual-energy ratio method (Z2). Results: Beam-hardening-like artifacts between high-attenuation calcium rods (≥300 mg/ml, >800 HU) influenced quantitative measurements, so the quantitative analysis was only performed on iodine rods using the images from the scan with all the calcium rods removed. The CT numbers of the iodine rods in the VM images (50∼150 keV) were close to theoretical values, with an average difference of 2.4±6.9 HU. Compared with theoretical values, the average differences for iodine concentration, VNC CT number, and effective Z of the iodine rods were −0.10±0.38 mg/ml, −0.1±8.2 HU, 0.25±0.06 (Z1) and −0.23±0.07 (Z2). Conclusion: The results indicate that the spectral CT scanner generates quantitatively accurate spectral reconstructions at clinically relevant iodine concentrations. Beam-hardening-like artifacts still exist when high-attenuation objects are present and their impact on patient images needs further investigation. YY is an employee of Philips Healthcare.
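A short sketch of the accuracy analysis: measured values are compared rod by rod against nominal values and summarized as mean difference ± standard deviation. The numbers below are invented placeholders, not data from the study.

```python
import numpy as np

nominal  = np.array([2.0, 5.0, 10.0, 15.0, 20.0])  # mg I/ml, rod inserts
measured = np.array([1.9, 5.1,  9.8, 14.9, 19.7])  # iodine-map readings

diff = measured - nominal
print(f"iodine bias: {diff.mean():+.2f} +/- {diff.std(ddof=1):.2f} mg I/ml")
```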
In vivo THz sensing of the cornea of the eye
NASA Astrophysics Data System (ADS)
Ozheredov, Ilya; Prokopchuk, Mikhail; Mischenko, Mikhail; Safonova, Tatiana; Solyankin, Petr; Larichev, Andrey; Angeluts, Andrey; Balakin, Alexei; Shkurinov, Alexander
2018-05-01
Measurement of the absolute value of the humidity of the cornea of the human eye and its dynamics is of paramount importance for the preservation of eyesight. In the present paper we demonstrate that terahertz technologies can be applied in practice for quantitative measurement of the physiological dynamics of the tear film and for sensing corneal tissue hydration. We propose equipment suitable for clinical application and a method for absolute calibration of the measured values. The proposed method differs fundamentally from existing and currently available methods of ophthalmological diagnosis. This suggests that the developed technique may have high diagnostic significance and can be used in the study and treatment of several diseases of the ocular surface.
Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.
Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao
2015-08-01
Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.
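A compact sketch of the functional-linear-model idea follows: genotypes across a region are reduced to a few smooth basis scores over genomic position, and a likelihood-ratio test compares the fitted model against a covariates-only null. A polynomial basis stands in for the B-spline or Fourier bases used in practice, and the data are simulated.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
n, p, k = 500, 30, 5                       # subjects, variants, basis size
pos = np.sort(rng.uniform(0, 1, p))        # normalized variant positions
G = rng.binomial(2, 0.3, (n, p)).astype(float)
y = G[:, :5].sum(axis=1) * 0.2 + rng.normal(0, 1, n)

# Basis expansion over position: each subject's genotype "function" is
# reduced to k smooth scores (polynomial basis for illustration).
B = np.column_stack([pos ** j for j in range(k)])   # p x k
X = G @ B                                           # n x k

def rss(M):
    beta, *_ = np.linalg.lstsq(M, y, rcond=None)
    r = y - M @ beta
    return float(r @ r)

ones = np.ones((n, 1))
lrt = n * np.log(rss(ones) / rss(np.hstack([ones, X])))
print("LRT p-value:", chi2.sf(lrt, df=k))  # ~ chi2(k) under the null
```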
Remans, Tony; Keunen, Els; Bex, Geert Jan; Smeets, Karen; Vangronsveld, Jaco; Cuypers, Ann
2014-10-01
Reverse transcription-quantitative PCR (RT-qPCR) has been widely adopted to measure differences in mRNA levels; however, biological and technical variation strongly affects the accuracy of the reported differences. RT-qPCR specialists have warned that, unless researchers minimize this variability, they may report inaccurate differences and draw incorrect biological conclusions. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines describe procedures for conducting and reporting RT-qPCR experiments. The MIQE guidelines enable others to judge the reliability of reported results; however, a recent literature survey found low adherence to these guidelines. Additionally, even experiments that use appropriate procedures remain subject to individual variation that statistical methods cannot correct. For example, since ideal reference genes do not exist, the widely used method of normalizing RT-qPCR data to reference genes generates background noise that affects the accuracy of measured changes in mRNA levels. However, current RT-qPCR data reporting styles ignore this source of variation. In this commentary, we direct researchers to appropriate procedures, outline a method to present the remaining uncertainty in data accuracy, and propose an intuitive way to select reference genes to minimize uncertainty. Reporting the uncertainty in data accuracy also serves for quality assessment, enabling researchers and peer reviewers to confidently evaluate the reliability of gene expression data. © 2014 American Society of Plant Biologists. All rights reserved.
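To make the commentary's point concrete, the sketch below normalizes a target gene against several reference genes separately and reports the spread of fold-changes across references, rather than a single value; the Cq values are invented and 100% amplification efficiency (fold = 2^-ddCq) is assumed.

```python
import statistics as st

cq_target = {"control": 24.1, "treated": 22.6}
cq_refs = {
    "ref1": {"control": 18.0, "treated": 18.3},
    "ref2": {"control": 20.5, "treated": 20.1},
    "ref3": {"control": 15.2, "treated": 15.6},
}

# One ddCq fold-change per reference gene.
fold_changes = []
for cq in cq_refs.values():
    dcq_ctrl = cq_target["control"] - cq["control"]
    dcq_trt  = cq_target["treated"] - cq["treated"]
    fold_changes.append(2.0 ** -(dcq_trt - dcq_ctrl))

mean = st.geometric_mean(fold_changes)
print(f"fold change {mean:.2f} (range {min(fold_changes):.2f}"
      f"-{max(fold_changes):.2f} across reference genes)")
```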
Improving power and robustness for detecting genetic association with extreme-value sampling design.
Chen, Hua Yun; Li, Mingyao
2011-12-01
Extreme-value sampling design that samples subjects with extremely large or small quantitative trait values is commonly used in genetic association studies. Samples in such designs are often treated as "cases" and "controls" and analyzed using logistic regression. Such a case-control analysis ignores the potential dose-response relationship between the quantitative trait and the underlying trait locus and thus may lead to loss of power in detecting genetic association. An alternative approach to analyzing such data is to model the dose-response relationship by a linear regression model. However, parameter estimation from this model can be biased, which may lead to inflated type I errors. We propose a robust and efficient approach that takes into consideration both the biased sampling design and the potential dose-response relationship. Extensive simulations demonstrate that the proposed method is more powerful than the traditional logistic regression analysis and is more robust than the linear regression analysis. We applied our method to the analysis of a candidate gene association study on high-density lipoprotein cholesterol (HDL-C), which includes study subjects with extremely high or low HDL-C levels. Using our method, we identified several SNPs showing a stronger evidence of association with HDL-C than the traditional case-control logistic regression analysis. Our results suggest that it is important to appropriately model the quantitative traits and to adjust for the biased sampling when a dose-response relationship exists in extreme-value sampling designs. © 2011 Wiley Periodicals, Inc.
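A toy contrast of the two analysis styles on simulated extreme-sampled data follows: a case-control test that discards the trait values within the tails versus a dose-response regression that retains them. The proposed robust and efficient estimator itself is not reproduced here; the simulation parameters are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 20000
g = rng.binomial(2, 0.3, n)              # SNP genotype, additive coding
trait = 0.15 * g + rng.normal(0, 1, n)   # weak dose-response effect

lo, hi = np.quantile(trait, [0.1, 0.9])  # keep only the two tails
keep = (trait <= lo) | (trait >= hi)
g_s, t_s = g[keep], trait[keep]
case = (t_s >= hi).astype(int)

# (i) Case-control style: only tail membership is used.
table = np.histogram2d(case, g_s, bins=[2, 3])[0]
chi2_stat, p_cc, _, _ = stats.chi2_contingency(table)

# (ii) Dose-response style: trait values within the tails are used.
lin = stats.linregress(g_s, t_s)

print("case-control p:", p_cc)
print("linear-trend p:", lin.pvalue)
```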
[Progress in stable isotope labeled quantitative proteomics methods].
Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui
2013-06-01
Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter of which have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, which support the rapid development of biological research. In this work, we discuss the progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantitative proteomics, and then give our opinions on the outlook for proteome quantification methods.
Method and platform standardization in MRM-based quantitative plasma proteomics.
Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H
2013-12-16
There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.
Automated quantitative assessment of proteins' biological function in protein knowledge bases.
Mayr, Gabriele; Lepperdinger, Günter; Lackner, Peter
2008-01-01
Primary protein sequence data are archived in databases together with information regarding corresponding biological functions. In this respect, UniProt/Swiss-Prot is currently the most comprehensive collection and it is routinely cross-examined when trying to unravel the biological role of hypothetical proteins. Bioscientists frequently extract single entries and further evaluate those on a subjective basis. In lieu of a standardized procedure for scoring the existing knowledge regarding individual proteins, we here report on a computer-assisted method, which we applied to score the present knowledge about any given Swiss-Prot entry. Applying this quantitative score allows proteins to be compared not only with respect to their sequence but also with respect to how comprehensively their function has been characterized. pfs analysis may also be applied for quality control of individual entries or for database management in order to rank entry listings.
Multifractal detrended cross-correlation analysis in the MENA area
NASA Astrophysics Data System (ADS)
El Alaoui, Marwane; Benbachir, Saâd
2013-12-01
In this paper, we investigated multifractal cross-correlations qualitatively and quantitatively using a cross-correlation test and the multifractal detrended cross-correlation analysis method (MF-DCCA) for markets in the MENA area. We used cross-correlation coefficients to measure the level of this correlation. The analysis concerns four stock market indices of Morocco, Tunisia, Egypt and Jordan. The countries chosen are signatories of the Agadir Agreement concerning the establishment of a free trade area comprising Arab Mediterranean countries. We computed the bivariate generalized Hurst exponent, Rényi exponent and singularity spectrum for each pair of indices to measure the cross-correlations quantitatively. By analyzing the results, we found that multifractal cross-correlations exist between all of these markets. We compared the spectrum widths of these indices and also identified which pair of indices has the strongest multifractal cross-correlation.
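As a concrete reference for the quantities named above, the following sketch computes the MF-DCCA fluctuation function and the bivariate generalized Hurst exponent h_xy(q) for synthetic series; the detrending order, the scale grid, and the q = 0 convention are common textbook choices, not necessarily those used in the paper.

```python
# Sketch of the MF-DCCA fluctuation function used to estimate the
# bivariate generalized Hurst exponent h_xy(q).
import numpy as np

def mfdcca_hurst(x, y, q, scales, order=1):
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())  # profiles
    logF = []
    for s in scales:
        n_seg = len(X) // s
        t = np.arange(s)
        f2 = []
        for v in range(n_seg):
            idx = np.arange(v * s, (v + 1) * s)
            rx = X[idx] - np.polyval(np.polyfit(t, X[idx], order), t)
            ry = Y[idx] - np.polyval(np.polyfit(t, Y[idx], order), t)
            f2.append(np.mean(rx * ry))        # detrended covariance
        f2 = np.abs(np.asarray(f2))
        if q != 0:
            Fq = np.mean(f2 ** (q / 2)) ** (1 / q)
        else:                                  # logarithmic average for q = 0
            Fq = np.exp(0.5 * np.mean(np.log(f2)))
        logF.append(np.log(Fq))
    # h_xy(q) is the slope of log F_q(s) against log s
    return np.polyfit(np.log(scales), logF, 1)[0]

rng = np.random.default_rng(1)
x, y = rng.standard_normal(4096), rng.standard_normal(4096)
print(mfdcca_hurst(x, y, q=2, scales=[16, 32, 64, 128, 256]))  # ~0.5 for noise
```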
Hori, Fumitaka; Harada, Yuji; Kuretake, Tatsumi; Uno, Shigeyasu
2016-01-01
A detection method for gold nanoparticles in chromatography paper has been developed for simple, cost-effective and reliable quantitation of immunochromatographic strip tests. The time course of the solution resistance in chromatography paper carrying the gold nanoparticle solution is measured electrochemically by chrono-impedimetry. The dependence of the solution resistance on the concentration of gold nanoparticles has been successfully observed. The main factor increasing the solution resistance may be obstruction of ion transport due to the presence of gold nanoparticles. The existence of gold nanoparticles at 1.92 × 10⁹ particles/mL in an indistinctly colored chromatography paper is also identified by a solution resistance measurement. This indicates that the solution resistance assay has the potential to lower the detection limit of the conventional qualitative assay.
Detecting cell death with optical coherence tomography and envelope statistics
NASA Astrophysics Data System (ADS)
Farhat, Golnaz; Yang, Victor X. D.; Czarnota, Gregory J.; Kolios, Michael C.
2011-02-01
Currently no standard clinical or preclinical noninvasive method exists to monitor cell death based on morphological changes at the cellular level. In our past work we have demonstrated that quantitative high frequency ultrasound imaging can detect cell death in vitro and in vivo. In this study we apply quantitative methods previously used with high frequency ultrasound to optical coherence tomography (OCT) to detect cell death. The ultimate goal of this work is to use these methods for optically-based clinical and preclinical cancer treatment monitoring. Optical coherence tomography data were acquired from acute myeloid leukemia cells undergoing three modes of cell death. Significant increases in integrated backscatter were observed for cells undergoing apoptosis and mitotic arrest, while necrotic cells induced a decrease. These changes appear to be linked to structural changes observed in histology obtained from the cell samples. Signal envelope statistics were analyzed from fittings of the generalized gamma distribution to histograms of envelope intensities. The parameters from this distribution demonstrated sensitivities to morphological changes in the cell samples. These results indicate that OCT integrated backscatter and first order envelope statistics can be used to detect and potentially differentiate between modes of cell death in vitro.
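For readers wanting to reproduce the envelope-statistics step, a minimal sketch of fitting a generalized gamma distribution to envelope intensities with SciPy follows; the synthetic Rayleigh-distributed data and the parameter values are illustrative assumptions, not the study's measurements.

```python
# Hedged sketch: fit a generalized gamma distribution to an OCT-like
# envelope histogram, in the spirit of the envelope statistics above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
envelope = stats.rayleigh.rvs(scale=1.0, size=5000, random_state=rng)

# scipy.stats.gengamma has shape parameters (a, c); a Rayleigh envelope
# corresponds roughly to a = 1, c = 2, which the fit should recover.
a, c, loc, scale = stats.gengamma.fit(envelope, floc=0)
print(f"a = {a:.2f}, c = {c:.2f}, scale = {scale:.2f}")
```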
Electrochemical and Infrared Absorption Spectroscopy Detection of SF₆ Decomposition Products.
Dong, Ming; Zhang, Chongxing; Ren, Ming; Albarracín, Ricardo; Ye, Rixin
2017-11-15
Sulfur hexafluoride (SF₆) gas-insulated electrical equipment is widely used in high-voltage (HV) and extra-high-voltage (EHV) power systems. Partial discharge (PD) and local heating can occur in the electrical equipment because of insulation faults, which results in SF₆ decomposition and ultimately generates several types of decomposition products. These SF₆ decomposition products can be qualitatively and quantitatively detected with relevant detection methods, and such detection contributes to diagnosing the internal faults and evaluating the security risks of the equipment. At present, multiple detection methods exist for analyzing the SF₆ decomposition products, and electrochemical sensing (ES) and infrared (IR) spectroscopy are well suited for application in online detection. In this study, the combination of ES with IR spectroscopy is used to detect SF₆ gas decomposition. First, the characteristics of these two detection methods are studied, and the data analysis matrix is established. Then, a qualitative and quantitative analysis ES-IR model is established by adopting a two-step approach. A SF₆ decomposition detector is designed and manufactured by combining an electrochemical sensor and IR spectroscopy technology. The detector is used to detect SF₆ gas decomposition and is verified to reliably and accurately detect the gas components and concentrations.
Gong, Kuang; Yang, Jaewon; Kim, Kyungsang; El Fakhri, Georges; Seo, Youngho; Li, Quanzheng
2018-05-23
Positron Emission Tomography (PET) is a functional imaging modality widely used in neuroscience studies. To obtain meaningful quantitative results from PET images, attenuation correction is necessary during image reconstruction. For PET/MR hybrid systems, PET attenuation correction is challenging because Magnetic Resonance (MR) images do not reflect attenuation coefficients directly. To address this issue, we present deep neural network methods to derive the continuous attenuation coefficients for brain PET imaging from MR images. With only Dixon MR images as the network input, the existing U-net structure was adopted, and analysis using forty patient data sets shows it is superior to other Dixon-based methods. When both Dixon and zero echo time (ZTE) images are available, we propose a modified U-net structure, named GroupU-net, to efficiently make use of both Dixon and ZTE information through group convolution modules as the network goes deeper. Quantitative analysis based on fourteen real patient data sets demonstrates that both network approaches perform better than the standard methods, and the proposed network structure further reduces the PET quantification error compared to the U-net structure. © 2018 Institute of Physics and Engineering in Medicine.
Kang, Wonseok; Yu, Soohwan; Seo, Doochun; Jeong, Jaeheon; Paik, Joonki
2015-09-10
In very high-resolution (VHR) push-broom-type satellite sensor data, both destriping and denoising methods have become chronic problems and attracted major research advances in the remote sensing fields. Since the estimation of the original image from a noisy input is an ill-posed problem, a simple noise removal algorithm cannot preserve the radiometric integrity of satellite data. To solve these problems, we present a novel method to correct VHR data acquired by a push-broom-type sensor by combining wavelet-Fourier and multiscale non-local means (NLM) filters. After the wavelet-Fourier filter separates the stripe noise from the mixed noise in the wavelet low- and selected high-frequency sub-bands, random noise is removed using the multiscale NLM filter in both low- and high-frequency sub-bands without loss of image detail. The performance of the proposed method is compared to various existing methods on a set of push-broom-type sensor data acquired by Korean Multi-Purpose Satellite 3 (KOMPSAT-3) with severe stripe and random noise, and the results of the proposed method show significantly improved enhancement results over existing state-of-the-art methods in terms of both qualitative and quantitative assessments.
Uncertainties have a meaning: Information entropy as a quality measure for 3-D geological models
NASA Astrophysics Data System (ADS)
Wellmann, J. Florian; Regenauer-Lieb, Klaus
2012-03-01
Analyzing, visualizing and communicating uncertainties are important issues as geological models can never be fully determined. To date, there exists no general approach to quantify uncertainties in geological modeling. We propose here to use information entropy as an objective measure to compare and evaluate model and observational results. Information entropy, introduced in the 1950s, assigns to every location in the model a scalar value quantifying predictability. We show that this method not only provides a quantitative insight into model uncertainties but, due to the underlying concept of information entropy, can be related to questions of data integration (i.e. how is the model quality interconnected with the used input data) and model evolution (i.e. does new data - or a changed geological hypothesis - optimize the model). In other words, information entropy is a powerful measure for data assimilation and inversion. As a first test of feasibility, we present the application of the new method to the visualization of uncertainties in geological models, here understood as structural representations of the subsurface. Applying the concept of information entropy to a suite of simulated models, we can clearly identify (a) uncertain regions within the model, even for complex geometries; (b) the overall uncertainty of a geological unit, which is, for example, of great relevance in any type of resource estimation; and (c) a mean entropy for the whole model, important to track model changes with one overall measure. These results cannot easily be obtained with existing standard methods. The results suggest that information entropy is a powerful method to visualize uncertainties in geological models, and to classify quantitatively the indefiniteness of single units and the mean entropy of a model. Due to the relationship of this measure to the missing information, we expect the method to have great potential in many types of geoscientific data assimilation problems, beyond pure visualization.
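The per-cell entropy described in (a)-(c) is simple to compute over an ensemble of models; the sketch below uses a randomly generated ensemble purely for illustration, not the paper's simulated geological suites.

```python
# Minimal sketch of per-cell information entropy over a suite of
# simulated geological models: each cell's entropy measures how
# unpredictable its unit assignment is across the ensemble.
import numpy as np

def cell_entropy(models):
    """models: (n_models, nx, ny) integer array of geological unit IDs."""
    units = np.unique(models)
    # p[k, i, j]: fraction of models assigning unit k to cell (i, j)
    p = np.stack([(models == u).mean(axis=0) for u in units])
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -np.sum(np.where(p > 0, p * np.log2(p), 0.0), axis=0)
    return h                                    # 0 = fully determined cell

rng = np.random.default_rng(3)
ensemble = rng.integers(0, 3, size=(50, 20, 20))   # 50 models, 3 units
H = cell_entropy(ensemble)
print(H.mean())   # mean entropy: one overall measure to track model changes
```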
Measurement of plasma hydrogen sulfide in vivo and in vitro
Shen, Xinggui; Pattillo, Christopher B.; Pardue, Sibile; Bir, Shyamal C.; Wang, Rui; Kevil, Christopher G.
2015-01-01
The gasotransmitter hydrogen sulfide is known to regulate multiple cellular functions during normal and pathophysiological states. However, a paucity of concise information exists regarding quantitative amounts of hydrogen sulfide involved in physiological and pathological responses. This is primarily due to disagreement among various methods employed to measure free hydrogen sulfide. In this article, we describe a very sensitive method of measuring the presence of H2S in plasma down to nanomolar levels, using monobromobimane (MBB). The current standard assay using methylene blue provides erroneous results that do not actually measure H2S. The method presented herein involves derivatization of sulfide with excess MBB in 100 mM Tris–HCl buffer (pH 9.5, 0.1 mM DTPA) for 30 min in 1% oxygen at room temperature. The fluorescent product sulfide-dibimane (SDB) is analyzed by RP-HPLC using an eclipse XDB-C18 (4.6×250 mm) column with gradient elution by 0.1% (v/v) trifluoroacetic acid in acetonitrile. The limit of detection for sulfide-dibimane is 2 nM and the SDB product is very stable over time, allowing batch storage and analysis. In summary, our MBB method is suitable for sensitive quantitative measurement of free hydrogen sulfide in multiple biological samples such as plasma, tissue and cell culture lysates, or media. PMID:21276849
Moore, Talia Y; Cooper, Kimberly L; Biewener, Andrew A; Vasudevan, Ramanarayan
2017-09-05
Mechanistically linking movement behaviors and ecology is key to understanding the adaptive evolution of locomotion. Predator evasion, a behavior that enhances fitness, may depend upon short bursts or complex patterns of locomotion. However, such movements are poorly characterized by existing biomechanical metrics. We present methods based on the entropy measure of randomness from Information Theory to quantitatively characterize the unpredictability of non-steady-state locomotion. We then apply the method by examining sympatric rodent species whose escape trajectories differ in dimensionality. Unlike the speed-regulated gait use of cursorial animals to enhance locomotor economy, bipedal jerboa (family Dipodidae) gait transitions likely enhance maneuverability. In field-based observations, jerboa trajectories are significantly less predictable than those of quadrupedal rodents, likely increasing predator evasion ability. Consistent with this hypothesis, jerboas exhibit lower anxiety in open fields than quadrupedal rodents, a behavior that varies inversely with predator evasion ability. Our unpredictability metric expands the scope of quantitative biomechanical studies to include non-steady-state locomotion in a variety of evolutionary and ecologically significant contexts. Biomechanical understanding of animal gait and maneuverability has primarily been limited to species with more predictable, steady-state movement patterns. Here, the authors develop a method to quantify movement predictability, and apply the method to study escape-related movement in several species of desert rodents.
Molecularly imprinted polymers for the detection of illegal drugs and additives: a review.
Xiao, Deli; Jiang, Yue; Bi, Yanping
2018-04-04
This review (with 154 refs.) describes the current status of using molecularly imprinted polymers in the extraction and quantitation of illicit drugs and additives. The review starts with an introduction into some synthesis methods (lump MIPs, spherical MIPs, surface imprinting) of MIPs using illicit drugs and additives as templates. The next section covers applications, with subsections on the detection of illegal additives in food, of doping in sports, and of illicit addictive drugs. A particular focus is directed towards current limitations and challenges: the optimization of methods for preparation of MIPs, their applicability to aqueous samples, the leakage of template molecules, and the identification of the best balance between adsorption capacity and selectivity factor. Finally, the need for convincing characterization methods, the lack of uniform parameters for defining selectivity, and the merits and demerits of MIPs prepared using nanomaterials are addressed. Strategies are suggested to solve existing problems, and future developments are discussed with respect to a more widespread use in relevant fields. Graphical abstract: This review gives a comprehensive overview of the advances made in molecular imprinting of polymers for use in the extraction and quantitation of illicit drugs and additives. Methods for syntheses, highlighted applications, limitations and current challenges are specifically addressed.
Modeling of Continuum Manipulators Using Pythagorean Hodograph Curves.
Singh, Inderjeet; Amara, Yacine; Melingui, Achille; Mani Pathak, Pushparaj; Merzouki, Rochdi
2018-05-10
Research on continuum manipulators is increasingly developing in the context of bionic robotics because of their many advantages over conventional rigid manipulators. Due to their soft structure, they have inherent flexibility, which makes controlling them with high performance a huge challenge. Before elaborating a control strategy for such robots, it is essential first to reconstruct the behavior of the robot by developing an approximate behavioral model. This can be kinematic or dynamic, depending on the operating conditions of the robot itself. Kinematically, two types of modeling methods exist to describe the robot behavior: quantitative methods, which are model-based, and qualitative methods, which are learning-based. In kinematic modeling of continuum manipulators, the assumption of constant curvature is often made to simplify the model formulation. In this work, a quantitative modeling method is proposed, based on Pythagorean hodograph (PH) curves. The aim is to obtain a three-dimensional reconstruction of the shape of the continuum manipulator with variable curvature, allowing the calculation of its inverse kinematic model (IKM). The PH-based kinematic modeling of continuum manipulators performs considerably better than other kinematic modeling methods regarding position accuracy, shape reconstruction, and time/cost of model calculation, in two cases: free-load manipulation and variable-load manipulation. This modeling method is applied to the compact bionic handling assistant (CBHA) manipulator for validation. The results are compared with other IKMs developed for the CBHA manipulator.
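The defining property that PH curves contribute can be shown in a few lines. The planar sketch below (the paper works in 3D with variable curvature) builds a hodograph from preimage polynomials u(t), v(t) so that the speed |r'(t)| = u² + v² is itself a polynomial and arc length integrates exactly; the particular u, v are arbitrary illustrative choices.

```python
# Illustrative sketch of the planar Pythagorean hodograph property,
# not the paper's 3D manipulator model.
import numpy as np

u = np.poly1d([1.0, 0.5])        # u(t), v(t): illustrative linear preimages
v = np.poly1d([0.3, 1.0])

dx = u * u - v * v               # hodograph components r'(t)
dy = 2 * u * v
speed = u * u + v * v            # |r'(t)| is polynomial -- the PH property

x, y, s = dx.integ(), dy.integ(), speed.integ()
t = np.linspace(0.0, 1.0, 200)

# Exact (closed-form) arc length vs. numerical quadrature of |r'(t)|.
exact = s(1.0) - s(0.0)
sp = np.hypot(dx(t), dy(t))
numeric = float(np.sum((sp[1:] + sp[:-1]) * np.diff(t)) / 2)
print(exact, numeric)            # the two agree
```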
A SVM-based quantitative fMRI method for resting-state functional network detection.
Song, Xiaomu; Chen, Nan-kuei
2014-09-01
Resting-state functional magnetic resonance imaging (fMRI) aims to measure baseline neuronal connectivity independent of specific functional tasks and to capture changes in the connectivity due to neurological diseases. Most existing network detection methods rely on a fixed threshold to identify functionally connected voxels under the resting state. Due to fMRI non-stationarity, the threshold cannot adapt to variation of data characteristics across sessions and subjects, and generates unreliable mapping results. In this study, a new method is presented for resting-state fMRI data analysis. Specifically, resting-state network mapping is formulated as an outlier detection process that is implemented using a one-class support vector machine (SVM). The results are refined by using a spatial-feature-domain prototype selection method and two-class SVM reclassification. The final decision on each voxel is made by comparing its probabilities of being functionally connected versus unconnected, rather than by applying a threshold. Multiple features for resting-state analysis were extracted and examined using an SVM-based feature selection method, and the most representative features were identified. The proposed method was evaluated using synthetic and experimental fMRI data. A comparison study was also performed with independent component analysis (ICA) and correlation analysis. The experimental results show that the proposed method can provide comparable or better network detection performance than ICA and correlation analysis. The method is potentially applicable to various resting-state quantitative fMRI studies. Copyright © 2014 Elsevier Inc. All rights reserved.
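A hedged sketch of the outlier-detection step follows: a one-class SVM flags voxels whose feature vectors deviate from the bulk as candidate network voxels. The three-dimensional features, class separation, and nu parameter are invented for illustration, not the paper's feature set.

```python
# One-class SVM as an outlier detector over voxel feature vectors.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(4)
background = rng.standard_normal((980, 3))          # "unconnected" voxels
network = rng.standard_normal((20, 3)) + 3.0        # "connected" voxels
features = np.vstack([background, network])

ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(features)
labels = ocsvm.predict(features)                    # -1 = outlier
print("flagged as candidate network voxels:", int(np.sum(labels == -1)))
```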
NASA Astrophysics Data System (ADS)
Zhou, Yitian; Zhou, Ping; Xin, Yinqiang; Wang, Jie; Zhu, Zhiqiang; Hu, Ji; Wei, Shicheng; Ma, Hongwei
2014-11-01
Telomerase plays an important role in governing the life span of cells through its capacity to extend telomeres. As high telomerase activity has been found specifically in stem cells and cancer cells, various methods have been developed for the evaluation of telomerase activity. To overcome the time-consuming procedures and complicated manipulations of existing methods, we developed a novel method, named Telomeric Repeat Elongation Assay based on Quartz crystal microbalance (TREAQ), to monitor telomerase activity during the self-renewal and differentiation of human induced pluripotent stem cells (hiPSCs). TREAQ results indicated that hiPSCs possess invariable telomerase activity for 11 passages on Matrigel and show a steady decline in telomerase activity when differentiated for different periods, which was confirmed with an existing gold-standard method. The pluripotency of hiPSCs during differentiation could be estimated by monitoring telomerase activity and compared with the expression levels of pluripotency marker genes via quantitative real-time PCR. Regular assessment of factors associated with pluripotency or stemness is expensive and consumes excessive amounts of sample; thus, TREAQ could be a promising alternative technology for routine monitoring of telomerase activity and estimation of the pluripotency of stem cells.
Impact of diet on the design of waste processors in CELSS
NASA Technical Reports Server (NTRS)
Waleh, Ahmad; Kanevsky, Valery; Nguyen, Thoi K.; Upadhye, Ravi; Wydeven, Theodore
1991-01-01
The preliminary results of a design analysis for a waste processor which employs existing technologies and takes into account the constraints of human diet are presented. The impact of diet is determined by using a model and an algorithm developed for the control and management of diet in a Controlled Ecological Life Support System (CELSS). A material and energy balance model for thermal oxidation of waste is developed which is consistent with both physical/chemical methods of incineration and supercritical water oxidation. The two models yield, respectively, a quantitative analysis of the diet and waste streams and the specific design parameters for waste processors. The results demonstrate that existing technologies can meet the demands of waste processing, but the choice and design of the processors or processing methods will be sensitive to the constraints of diet. The numerical examples are chosen to display the nature and extent of the gap in the available experimental information about CELSS requirements.
NASA Technical Reports Server (NTRS)
Lingbloom, Mike S.
2008-01-01
During redesign of the Space Shuttle reusable solid rocket motor (RSRM), NASA amended the contract with ATK Launch Systems (then Morton Thiokol Inc.) with Change Order 966 to implement a contamination control and cleanliness verification method. The change order required: (1) a quantitative inspection method; (2) a written record of actual contamination levels versus a known reject level; (3) a method more sensitive than the existing methods of visual and black light inspection. Black light inspection is only useful for inspection of contaminants that fluoresce near the 365 nm spectral line and is not useful for inspection of most silicones, which will not produce strong fluorescence. Black light inspection conducted by a qualified inspector under controlled light is capable of detecting Conoco HD-2 grease in gross amounts and is very subjective due to operator sensitivity. Optically stimulated electron emission (OSEE), developed at the Materials and Process Laboratory at Marshall Space Flight Center (MSFC), was selected to satisfy Change Order 966. OSEE offers several important advantages over existing laboratory methods of similar sensitivity (e.g., spectroscopy and nonvolatile residue sampling): shorter turnaround time, real-time capability, and full-coverage inspection capability. Laboratory methods require sample gathering and in-lab analysis, which sometimes takes several days to produce results; this is not practical in a production environment. In addition, these methods do not offer full-coverage inspection of large components.
Methods for assessing geodiversity
NASA Astrophysics Data System (ADS)
Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco
2017-04-01
The accepted systematics of geodiversity assessment methods will be presented in three categories: qualitative, quantitative and qualitative-quantitative. Qualitative methods are usually descriptive methods that are suited to nominal and ordinal data. Quantitative methods use a different set of parameters and indicators to determine the characteristics of geodiversity in the area being researched. Qualitative-quantitative methods are a good combination of the collection of quantitative data (i.e. digital) and cause-effect data (i.e. relational and explanatory). It seems that at the current stage of the development of geodiversity research methods, qualitative-quantitative methods are the most advanced and best assess the geodiversity of the study area. Their particular advantage is the integration of data from different sources and with different substantive content. Among the distinguishing features of the quantitative and qualitative-quantitative methods for assessing geodiversity is their wide use within geographic information systems, both at the stage of data collection and data integration, as well as numerical processing and presentation. The unresolved problem for these methods, however, is the possibility of their validation. It seems that currently the best method of validation is direct field confrontation. Looking to the next few years, the development of qualitative-quantitative methods connected with cognitive issues should be expected, oriented towards ontology and the Semantic Web.
Bisso, Paul W; Tai, Michelle; Katepalli, Hari; Bertrand, Nicolas; Blankschtein, Daniel; Langer, Robert
2018-01-10
Hydrophobic self-assembly pairs diverse chemical precursors and simple formulation processes to access a vast array of functional colloids. Exploration of this design space, however, is stymied by lack of broadly general, high-throughput colloid characterization tools. Here, we show that a narrow structural subset of fluorescent, zwitterionic molecular rotors, dialkylaminostilbazolium sulfonates [DASS] with intermediate-length alkyl tails, fills this major analytical void by quantitatively sensing hydrophobic interfaces in microplate format. DASS dyes supersede existing interfacial probes by avoiding off-target fluorogenic interactions and dye aggregation while preserving hydrophobic partitioning strength. To illustrate the generality of this approach, we demonstrate (i) a microplate-based technique for measuring mass concentration of small (20-200 nm), dilute (submicrogram sensitivity) drug delivery nanoparticles; (ii) elimination of particle size, surfactant chemistry, and throughput constraints on quantifying the complex surfactant/metal oxide adsorption isotherms critical for environmental remediation and enhanced oil recovery; and (iii) more reliable self-assembly onset quantitation for chemically and structurally distinct amphiphiles. These methods could streamline the development of nanotechnologies for a broad range of applications.
March Cerdà, J C; Prieto Rodríguez, M A; Hernán García, M; Solas Gaspar, O
1999-01-01
Against the background of the debate between the two current approaches in health science research (qualitative and quantitative), the paper argues the need to combine techniques that contribute to a better knowledge of populations and communities and that offer effective solutions to different problems. The article analyses the usefulness of qualitative methods, describes the techniques and procedures most frequently used to guarantee the validity and reliability of research findings, and ends by arguing the need to use qualitative and quantitative approaches together. Working together or learning from each other in this way will enrich research and interventions in the public health and health management fields. Qualitative methods are useful for a sound understanding of a given issue under investigation or evaluation, taking into account the point of view of the participants under research. Key techniques, listed from the most structured to the least structured, include: structured interview, Delphi, nominal group, case study, semistructured interview, focal group, brainstorming, discussion group, in-depth interview, life story and participant observation.
Volcano collapse promoted by hydrothermal alteration and edifice shape, Mount Rainier, Washington
Reid, M.E.; Sisson, T.W.; Brien, D.L.
2001-01-01
Catastrophic collapses of steep volcano flanks threaten many populated regions, and understanding factors that promote collapse could save lives and property. Large collapses of hydrothermally altered parts of Mount Rainier have generated far-traveled debris flows; future flows would threaten densely populated parts of the Puget Sound region. We evaluate edifice collapse hazards at Mount Rainier using a new three-dimensional slope stability method incorporating detailed geologic mapping and subsurface geophysical imaging to determine distributions of strong (fresh) and weak (altered) rock. Quantitative three-dimensional slope stability calculations reveal that sizeable flank collapse (>0.1 km³) is promoted by voluminous, weak, hydrothermally altered rock situated high on steep slopes. These conditions exist only on Mount Rainier's upper west slope, consistent with the Holocene debris-flow history. Widespread alteration on lower flanks or concealed in regions of gentle slope high on the edifice does not greatly facilitate collapse. Our quantitative stability assessment method can also provide useful hazard predictions using reconnaissance geologic information and is a potentially rapid and inexpensive new tool for aiding volcano hazard assessments.
Aspects of human biometeorology in past, present and future.
Höppe, P
1997-02-01
Human biometeorology is quite an old science: in the times of Hippokrates in ancient Greece, the influence of weather changes on physiological processes in the human body was already considered to exist. However, not until the progress of modern statistics, physics and physiology in the course of this century provided quantitative methods did human biometeorology become an acknowledged natural science. In the first half of this century, the general objective was primarily to explain the phenomena of bodily reactions to weather changes. In the second half of this century, quantitative descriptions of thermal interchanges between the human body and the environment by means of energy balance models of the human body have gained increasing importance. The methods of modern human biometeorology are increasingly acknowledged by workers in disciplines of potential application, such as urban or regional planners or air-conditioning engineers. Human biometeorology tries to assess all atmospheric influences in their entirety, including the air pollution pattern. The discipline considers itself a branch of science tied closely to environmental meteorology and environmental medicine.
Quantitative analysis of a reconstruction method for fully three-dimensional PET.
Suckling, J; Ott, R J; Deehan, B J
1992-03-01
The major advantage of positron emission tomography (PET) using large area planar detectors over scintillator-based commercial ring systems is the potentially larger (by a factor of two or three) axial field-of-view (FOV). However, to achieve the space invariance of the point spread function necessary for Fourier filtering a polar angle rejection criterion is applied to the data during backprojection resulting in a trade-off between FOV size and sensitivity. A new algorithm due to Defrise and co-workers developed for list-mode data overcomes this problem with a solution involving the division of the image into several subregions. A comparison between the existing backprojection-then-filter algorithm and the new method (with three subregions) has been made using both simulated and real data collected from the MUP-PET positron camera. Signal-to-noise analysis reveals that improvements of up to a factor of 1.4 are possible resulting from an increased data usage of up to a factor of 2.5 depending on the axial extent of the imaged object. Quantitation is also improved.
Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D
2014-02-01
Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, small-volume samples (e.g., 1 L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR, but might have existed based on Bayesian inference. Our integrated approach that quantifies uncertainty provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods. © 2013 Elsevier B.V. All rights reserved.
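The inference behind "might have existed based on Bayesian inference" can be sketched with a toy grid posterior; this is not the authors' model, and the flat prior, assayed volume, and Poisson sampling assumption are all illustrative.

```python
# Toy Bayesian sketch: posterior over a viral concentration when qPCR
# returns a non-detect, assuming Poisson sampling of genome copies into
# the effectively assayed volume.
import numpy as np

conc = np.linspace(0.01, 50, 2000)       # copies per liter (grid)
dx = conc[1] - conc[0]
v_assayed = 0.1                          # liters effectively assayed (assumed)
prior = np.ones_like(conc)               # flat prior over the grid

# Non-detect likelihood: probability that zero copies land in the
# assayed volume under Poisson sampling.
like_nondetect = np.exp(-conc * v_assayed)
post = prior * like_nondetect
post /= post.sum() * dx                  # normalize to a density

# Even with no detection, the posterior still assigns mass to nonzero
# concentrations -- the pathogen "might have existed".
print("P(conc > 10 copies/L | non-detect) =",
      post[conc > 10].sum() * dx)
```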
Brennan, David J; Bauer, Greta R; Bradley, Kaitlin; Tran, Oth Vilaythong
2017-01-01
Research on sexual minority men (gay, bisexual, and other men who have sex with men) was examined with regard to the measures of sexual orientation used, the methods of research, and the main health outcomes under study. A systematic review of English-language quantitative studies was conducted focused on the health of sexual minority men published in 2010 (n = 250). The results provide a snapshot of the literature and revealed that research on sexual minority men overwhelmingly focused on HIV, STIs, and sexual health for which sexual orientation was most commonly defined behaviorally. For topics of mental health or body/fitness outcomes, sexual orientation was most commonly defined by identity. Most study samples were venue-based, and only 8.8% of published papers drew data from population-based samples. The findings suggest that there exists a need for research on sexual minority men's health beyond STIs and HIV that will examine mental and physical health outcomes beyond sexual risk, uses probability-based samples, and addresses intersectional concerns related to race/ethnicity and age.
Determination of the complete structure of natural lecithins.
Kuksis, A; Marai, L
1967-05-01
A method is described for the separation, identification, and quantitative estimation of the individual molecular species occurring in natural lecithin mixtures. Purified lecithin preparations are converted into diglyceride acetates by enzymic dephosphorylation and acetylation. The diglyceride acetates are separated on the basis of the degree of unsaturation and the molecular geometry by means of chromatography on thin layers of silica gel which are impregnated with silver nitrate. The various acetates thus resolved are separately recovered from the plates and diluted with tridecanoin internal standard; the quantitative distribution of the molecular weights is determined by gas chromatography. Suitable aliquots of the saturated and unsaturated diglyceride acetates are further analyzed for over-all and for positional distribution of fatty acids. The identity and proportions of the various lecithins are deduced by integration and normalization of all the experimental data. Where doubt exists, specific diglyceride acetates are isolated by preparative gas chromatography, and their fatty acid composition is determined. The method is illustrated with data obtained for the mixed lecithins of egg yolk. The general approach is applicable to the determination of the structure of other phospholipids of comparable complexity.
Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.
2016-01-01
Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts to better understand the commutability of reference materials have been made. Existing approaches to evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a "deviation-from-ideal" (DFI) approach to evaluate the commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss advantages and limitations of the DFI approach as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654
Schalken, Naomi; Rietbergen, Charlotte
2017-01-01
Objective: The goal of this systematic review was to examine the reporting quality of the method section of quantitative systematic reviews and meta-analyses from 2009 to 2016 in the field of industrial and organizational psychology with the help of the Meta-Analysis Reporting Standards (MARS), and to update previous research, such as the study of Aytug et al. (2012) and Dieckmann et al. (2009). Methods: A systematic search for quantitative systematic reviews and meta-analyses was conducted in the top 10 journals in the field of industrial and organizational psychology between January 2009 and April 2016. Data were extracted on study characteristics and items of the method section of MARS. A cross-classified multilevel model was analyzed, to test whether publication year and journal impact factor (JIF) were associated with the reporting quality scores of articles. Results: Compliance with MARS in the method section was generally inadequate in the random sample of 120 articles. Variation existed in the reporting of items. There were no significant effects of publication year and journal impact factor (JIF) on the reporting quality scores of articles. Conclusions: The reporting quality in the method section of systematic reviews and meta-analyses was still insufficient, therefore we recommend researchers to improve the reporting in their articles by using reporting standards like MARS. PMID:28878704
Nasu, Mamiko; Nemoto, Takayuki; Mimura, Hisashi; Sako, Kazuhiro
2013-01-01
Most pharmaceutical drug substances and excipients in formulations exist in a crystalline or amorphous form, and an understanding of their state during manufacture and storage is critically important, particularly in formulated products. Carbon-13 (¹³C) solid-state nuclear magnetic resonance (NMR) spectroscopy is useful for studying the chemical and physical state of pharmaceutical solids in a formulated product. We developed two new selective signal excitation methods in ¹³C solid-state NMR to extract the spectrum of a target component from such a mixture. These methods were based on equalization of the proton relaxation time in a single domain via rapid intraproton spin diffusion and the difference in proton spin-lattice relaxation time in the rotating frame (¹H T1ρ) of individual components in the mixture. Introduction of simple pulse sequences to one-dimensional experiments reduced data acquisition time and increased flexibility. We then demonstrated these methods on a commercially available drug and on a mixture of two saccharides, in which the ¹³C signals of the target components were selectively excited, and showed them to be applicable to the quantitative analysis of individual components in solid mixtures, such as formulated products, polymorphic mixtures, or mixtures of crystalline and amorphous phases. Copyright © 2012 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Ugryumova, Nadya; Bonesi, Marco; Matcher, Stephen J.
2008-02-01
Polarization-sensitive optical coherence tomography has been used to resolve fast-axis fibre orientation in three-dimensional space. Previously we demonstrated that apparent variations in the polar angle orientation of collagen fibers exist along the sagittal ridge of the equine third metacarpophalangeal joint. A quantitative method based on multiple angles of illumination has been proposed to determine the polar angle of the collagen fibers. This method, however, ignored the full 3D structure by assuming that the long axis of the collagen fibers lay within the plane of incidence. A new quantitative method based on the theory of light propagation in uniaxial materials is described which avoids this assumption. To test this method we performed control experiments on a sample of equine tendon (this tissue has a well defined c-axis lying along the long axis of the tendon). Several samples of tendon were cut to achieve a planar surface inclined at -20° to the long axis. An additional 30° rotation provided a non-zero azimuthal angle. The surface was then imaged using incident beam angles of -40°, -20°, 0°, +20°, and +40° in two orthogonal planes. Values for both the polar and azimuthal angles were then derived using a numerical optimisation procedure. Results agreed qualitatively with the nominal values but suggested that the accuracy was limited by our method of determining the apparent birefringence.
NASA Astrophysics Data System (ADS)
Ugryumova, Nadya; Matcher, Stephen J.
2007-02-01
Polarization-sensitive optical coherence tomography has been used to resolve fast-axis fibre orientation in three-dimensional space. Previously we demonstrated that apparent variations in the polar angle orientation of collagen fibers exist along the sagittal ridge of the equine third metacarpophalangeal joint. A quantitative method based on multiple angles of illumination has been proposed to determine the polar angle of the collagen fibers. This method, however, ignored the full 3-D structure by assuming that the long axis of the collagen fibers lay within the plane of incidence. A new quantitative method based on the theory of light propagation in uniaxial materials is described which avoids this assumption. To test this method we performed control experiments on a sample of equine tendon (this tissue has a well defined c-axis lying along the long axis of the tendon). Several samples of tendon were cut to achieve a planar surface inclined at -20° to the long axis. An additional 30° rotation provided a non-zero azimuthal angle. The surface was then imaged using incident beam angles of -40°, -20°, 0°, +20°, and +40° in two orthogonal planes. Values for both the polar and azimuthal angles were then derived using a numerical optimisation procedure. Results agreed qualitatively with the nominal values but suggested that the accuracy was limited by our method of determining the apparent birefringence.
Rode, Tone Mari; Berget, Ingunn; Langsrud, Solveig; Møretrø, Trond; Holck, Askild
2009-07-01
Microorganisms are constantly exposed to new and altered growth conditions, and respond by changing gene expression patterns. Several methods for studying gene expression exist. During the last decade, the analysis of microarrays has been one of the most common approaches applied for large scale gene expression studies. A relatively new method for gene expression analysis is MassARRAY, which combines real competitive-PCR and MALDI-TOF (matrix-assisted laser desorption/ionization time-of-flight) mass spectrometry. In contrast to microarray methods, MassARRAY technology is suitable for analysing a larger number of samples, though for a smaller set of genes. In this study we compare the results from MassARRAY with microarrays on gene expression responses of Staphylococcus aureus exposed to acid stress at pH 4.5. RNA isolated from the same stress experiments was analysed using both the MassARRAY and the microarray methods. The MassARRAY and microarray methods showed good correlation. Both MassARRAY and microarray estimated somewhat lower fold changes compared with quantitative real-time PCR (qRT-PCR). The results confirmed the up-regulation of the urease genes in acidic environments, and also indicated the importance of metal ion regulation. This study shows that the MassARRAY technology is suitable for gene expression analysis in prokaryotes, and has advantages when a set of genes is being analysed for an organism exposed to many different environmental conditions.
Quantification of febuxostat polymorphs using powder X-ray diffraction technique.
Qiu, Jing-bo; Li, Gang; Sheng, Yue; Zhu, Mu-rong
2015-03-25
Febuxostat is a pharmaceutical compound with more than 20 polymorphs, of which form A is most widely used and usually exists in a mixed polymorphic form with form G. In the present study, a quantification method for polymorphic forms A and G of febuxostat (FEB) has been developed using powder X-ray diffraction (PXRD). Prior to development of the quantification method, the pure polymorphic forms A and G are characterized. A continuous scan with a scan rate of 3° min⁻¹ over an angular range of 3-40° 2θ is applied for the construction of the calibration curve using the characteristic peaks of form A at 12.78° 2θ (I/I₀ 100%) and form G at 11.72° 2θ (I/I₀ 100%). The linear regression analysis data for the calibration plots show a good linear relationship, with R² = 0.9985 with respect to peak area in the concentration range 10-60 wt.%. The method is validated for precision, recovery and ruggedness. The limits of detection and quantitation are 1.5% and 4.6%, respectively. The obtained results prove that the method is repeatable, sensitive and accurate. The proposed PXRD method can be applied for the quantitative analysis of mixtures of febuxostat polymorphs (forms A and G). Copyright © 2015 Elsevier B.V. All rights reserved.
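The calibration arithmetic behind such a PXRD method reduces to a linear regression of characteristic-peak area on weight fraction; the data points in the sketch below are invented for illustration and only the 10-60 wt.% range is taken from the abstract.

```python
# Sketch of a PXRD calibration curve: peak area of form A regressed on
# its weight fraction in standard mixtures, then inverted to quantify
# an unknown mixture.
import numpy as np

frac_A = np.array([10, 20, 30, 40, 50, 60])            # wt.% form A
peak_area = np.array([105, 212, 300, 415, 506, 612])   # integrated area (a.u.)

slope, intercept = np.polyfit(frac_A, peak_area, 1)
r2 = np.corrcoef(frac_A, peak_area)[0, 1] ** 2
print(f"area = {slope:.2f} * wt% + {intercept:.2f},  R^2 = {r2:.4f}")

# Inverting the calibration quantifies an unknown mixture:
unknown_area = 350.0
print("estimated form A content: %.1f wt%%"
      % ((unknown_area - intercept) / slope))
```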
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gigase, Yves
2007-07-01
The uncertainty on characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package, one has to combine these various uncertainties. This paper discusses an approach to this problem, based on the use of the log-normal distribution, which is both elegant and easy to use. It can provide, for example, quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other more complex characteristics such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, more in particular in those decision processes where the uncertainty on the amount of activity is considered to be important, such as in probability risk assessment or the definition of criteria for acceptance or categorization.
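The log-normal convenience the abstract exploits is that a product of independent log-normal factors is again log-normal, with log-means adding and log-variances adding; the sketch below combines two such factors into an uncertainty interval, with all geometric means and geometric standard deviations invented for illustration.

```python
# Combining multiplicative uncertainty factors under a log-normal model:
# a characteristic modeled as factor1 * factor2 stays log-normal.
import numpy as np

# Each factor: (geometric mean, geometric standard deviation). Values
# are hypothetical, e.g. a measured key-nuclide activity and a scaling
# factor relating it to a hard-to-measure nuclide.
factors = [(5.0, 1.8), (0.2, 2.5)]

mu = sum(np.log(gm) for gm, gsd in factors)            # log-means add
sigma = np.sqrt(sum(np.log(gsd) ** 2 for gm, gsd in factors))  # log-variances add

gm_total, gsd_total = np.exp(mu), np.exp(sigma)
# 95% interval of a log-normal: geometric mean multiplied/divided by gsd^1.96
lo, hi = gm_total / gsd_total ** 1.96, gm_total * gsd_total ** 1.96
print(f"estimate {gm_total:.2f}, 95% interval [{lo:.3f}, {hi:.2f}]")
```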
Single Molecule Approaches in RNA-Protein Interactions.
Serebrov, Victor; Moore, Melissa J
RNA-protein interactions govern every aspect of RNA metabolism, and aberrant RNA-binding proteins are the cause of hundreds of genetic diseases. Quantitative measurements of these interactions are necessary in order to understand mechanisms leading to diseases and to develop efficient therapies. Existing methods of RNA-protein interactome capture can afford a comprehensive snapshot of RNA-protein interaction networks but lack the ability to characterize the dynamics of these interactions. As all ensemble methods, their resolution is also limited by statistical averaging. Here we discuss recent advances in single molecule techniques that have the potential to tackle these challenges. We also provide a thorough overview of single molecule colocalization microscopy and the essential protein and RNA tagging and detection techniques.
NASA Astrophysics Data System (ADS)
Arai, Hiroyuki; Miyagawa, Isao; Koike, Hideki; Haseyama, Miki
We propose a novel technique for estimating the number of people in a video sequence; it has the advantages of being stable even in crowded situations and needing no ground-truth data. By quantitatively analyzing the geometric relationships between image pixels and the volumes they intersect in the real world, the method lets a foreground image directly indicate the number of people. Because foreground detection is possible even in crowded situations, the proposed method can be applied in such situations. Moreover, it can estimate the number of people in an a priori manner, so it needs no ground-truth data, unlike existing feature-based estimation techniques. Experiments show the validity of the proposed method.
Estimation of total alkaloid in Chitrakadivati by UV-Spectrophotometer.
Ajanal, Manjunath; Gundkalle, Mahadev B; Nayak, Shradda U
2012-04-01
Herbal formulation standardization by adopting newer techniques is the need of the hour in the field of the Ayurvedic pharmaceutical industry, and very few reports exist; studies of this kind would certainly widen the herbal research area. Chitrakadivati is one such popular herbal formulation used in Ayurveda. Many of its ingredients are known for the presence of alkaloids. The presence of alkaloids was tested qualitatively by Dragendorff's method and then subjected to quantitative estimation by UV-spectrophotometer. This method is based on the reaction between alkaloids and bromocresol green (BCG). The study disclosed that 9 of the 16 ingredients contain alkaloids. Chitrakadivati showed an alkaloid concentration of 0.16%, which is significantly higher than that of its individual ingredients.
A relative performance analysis of atmospheric Laser Doppler Velocimeter methods.
NASA Technical Reports Server (NTRS)
Farmer, W. M.; Hornkohl, J. O.; Brayton, D. B.
1971-01-01
Evaluation of the effectiveness of atmospheric applications of a Laser Doppler Velocimeter (LDV) at a wavelength of about 0.5 micrometer in conjunction with dual scatter LDV illuminating techniques, or at a wavelength of 10.6 micrometer with local oscillator LDV illuminating techniques. Equations and examples are given to provide a quantitative basis for LDV system selection and performance criteria in atmospheric research. The comparative study shows that specific ranges and conditions exist where performance of one of the methods is superior to that of the other. It is also pointed out that great care must be exercised in choosing system parameters that optimize a particular LDV designed for atmospheric applications.
Amador, Carolina; Chen, Shigao; Manduca, Armando; Greenleaf, James F.; Urban, Matthew W.
2017-01-01
Quantitative ultrasound elastography is increasingly being used in the assessment of chronic liver disease. Many studies have reported ranges of liver shear wave velocities values for healthy individuals and patients with different stages of liver fibrosis. Nonetheless, ongoing efforts exist to stabilize quantitative ultrasound elastography measurements by assessing factors that influence tissue shear wave velocity values, such as food intake, body mass index (BMI), ultrasound scanners, scanning protocols, ultrasound image quality, etc. Time-to-peak (TTP) methods have been routinely used to measure the shear wave velocity. However, there is still a need for methods that can provide robust shear wave velocity estimation in the presence of noisy motion data. The conventional TTP algorithm is limited to searching for the maximum motion in time profiles at different spatial locations. In this study, two modified shear wave speed estimation algorithms are proposed. The first method searches for the maximum motion in both space and time (spatiotemporal peak, STP); the second method applies an amplitude filter (spatiotemporal thresholding, STTH) to select points with motion amplitude higher than a threshold for shear wave group velocity estimation. The two proposed methods (STP and STTH) showed higher precision in shear wave velocity estimates compared to TTP in phantom. Moreover, in a cohort of 14 healthy subjects STP and STTH methods improved both the shear wave velocity measurement precision and the success rate of the measurement compared to conventional TTP. PMID:28092532
Amador Carrascal, Carolina; Chen, Shigao; Manduca, Armando; Greenleaf, James F; Urban, Matthew W
2017-04-01
Quantitative ultrasound elastography is increasingly being used in the assessment of chronic liver disease. Many studies have reported ranges of liver shear wave velocity values for healthy individuals and patients with different stages of liver fibrosis. Nonetheless, ongoing efforts exist to stabilize quantitative ultrasound elastography measurements by assessing factors that influence tissue shear wave velocity values, such as food intake, body mass index, ultrasound scanners, scanning protocols, and ultrasound image quality. Time-to-peak (TTP) methods have been routinely used to measure the shear wave velocity. However, there is still a need for methods that can provide robust shear wave velocity estimation in the presence of noisy motion data. The conventional TTP algorithm is limited to searching for the maximum motion in time profiles at different spatial locations. In this paper, two modified shear wave speed estimation algorithms are proposed. The first method searches for the maximum motion in both space and time [spatiotemporal peak (STP)]; the second method applies an amplitude filter [spatiotemporal thresholding (STTH)] to select points with motion amplitude higher than a threshold for shear wave group velocity estimation. The two proposed methods (STP and STTH) showed higher precision in shear wave velocity estimates compared with TTP in phantom. Moreover, in a cohort of 14 healthy subjects, STP and STTH methods improved both the shear wave velocity measurement precision and the success rate of the measurement compared with conventional TTP.
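A minimal sketch of the baseline TTP group-velocity estimator that the STP and STTH variants refine: find the peak time of the motion profile at each lateral position, then regress position on peak time. The simulated plane-wave motion field, pulse width, and noise level are invented for illustration.

```python
# Time-to-peak (TTP) shear wave group velocity estimation on a
# synthetic motion field: a Gaussian pulse traveling at 2 m/s.
import numpy as np

c_true = 2.0                                  # true shear wave speed (m/s)
x = np.linspace(0.0, 0.02, 40)                # lateral positions (m)
t = np.linspace(0.0, 0.015, 300)              # time (s)
T, X = np.meshgrid(t, x)                      # motion[x_i, t_j]
motion = np.exp(-((T - X / c_true) / 5e-4) ** 2)
motion += 0.05 * np.random.default_rng(5).standard_normal(motion.shape)

ttp = t[np.argmax(motion, axis=1)]            # peak time at each position
speed = np.polyfit(ttp, x, 1)[0]              # x = c * t, so slope is speed
print(f"TTP group velocity estimate: {speed:.2f} m/s")
```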
Magnetic resonance imaging of the subthalamic nucleus for deep brain stimulation.
Chandran, Arjun S; Bynevelt, Michael; Lind, Christopher R P
2016-01-01
The subthalamic nucleus (STN) is one of the most important stereotactic targets in neurosurgery, and its accurate imaging is crucial. With improving MRI sequences there is impetus for direct targeting of the STN. High-quality, distortion-free images are paramount. Image reconstruction techniques appear to show the greatest promise in balancing the issue of geometrical distortion and STN edge detection. Existing spin echo- and susceptibility-based MRI sequences are compared with new image reconstruction methods. Quantitative susceptibility mapping is the most promising technique for stereotactic imaging of the STN.
Tip/tilt-compensated through-focus scanning optical microscopy
NASA Astrophysics Data System (ADS)
Lee, Jun Ho; Park, Jun Hyung; Jeong, Dohwan; Shin, Eun Ji; Park, Chris
2016-11-01
Through-Focus Scanning Optical Microscopy (TSOM), with nanometer-scale lateral and vertical sensitivity matching those of scanning electron microscopy, has been demonstrated to be useful for 3D inspection and metrology. There have been sensitivity and instability issues in acquiring through-focus images because TSOM 3D information is indirectly extracted by differentiating a target TSOM image from reference TSOM images. This paper first reports on the optical-axis instability that occurs during the scanning process of TSOM when implemented in an existing patterned-wafer inspection tool by moving the wafer plane; this is followed by quantitative confirmation of the optical/mechanical instability using a new TSOM tool on an optical bench with a Shack-Hartmann wavefront sensor and a tip/tilt sensor. Then, this paper proposes two tip/tilt-compensated TSOM optical acquisition methods that can be applied with adaptive optics. The first method simply adopts a tip/tilt mirror with a quad cell in a simple closed loop, while the second method adopts a high-order deformable mirror with a Shack-Hartmann sensor. The second method is able to correct high-order residual aberrations as well as to perform through-focus scanning without z-axis movement, while the first method is easier to implement in pre-existing wafer inspection systems with only minor modification.
Yousefi, Siavash; Qin, Jia; Zhi, Zhongwei; Wang, Ruikang K
2013-02-01
Optical microangiography is an imaging technology that is capable of providing detailed functional blood flow maps within microcirculatory tissue beds in vivo. Some practical issues, however, exist when displaying and quantifying the microcirculation that perfuses the scanned tissue volume. These issues include: (I) the probing light is subject to specular reflection when it shines onto the sample, and the unevenness of the tissue surface makes the light energy entering the tissue non-uniform over the entire scanned tissue volume; (II) biological tissue is heterogeneous in nature, meaning that the scattering and absorption properties of the tissue attenuate the probe beam. These physical limitations can result in local contrast degradation and non-uniform micro-angiogram images. In this paper, we propose a post-processing method that uses Rayleigh contrast-limited adaptive histogram equalization to increase the contrast and improve the overall appearance and uniformity of optical micro-angiograms without saturating the vessel intensity or changing the physical meaning of the micro-angiograms. The qualitative and quantitative performance of the proposed method is compared with those of common histogram equalization and contrast enhancement methods. We demonstrate that the proposed method outperforms other existing approaches. The proposed method is not limited to optical microangiography and can be used in other imaging modalities such as photo-acoustic tomography and scanning laser confocal microscopy.
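For readers who want to experiment with the general idea, the sketch below applies scikit-image's CLAHE to a stand-in image. Note that skimage's equalize_adapthist uses a uniform target histogram, whereas the paper's variant remaps toward a Rayleigh distribution, so this is the generic technique rather than the proposed method:

```python
import numpy as np
from skimage import exposure

rng = np.random.default_rng(2)
angiogram = rng.random((256, 256))   # stand-in for a micro-angiogram

# Global histogram equalization, a common baseline the paper compares against
global_eq = exposure.equalize_hist(angiogram)

# Contrast-limited adaptive histogram equalization (uniform-histogram CLAHE)
enhanced = exposure.equalize_adapthist(
    angiogram,
    kernel_size=32,    # tile size: sets how local the equalization is
    clip_limit=0.01,   # caps contrast amplification to avoid saturation
)
```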
Sun, Bo; Xue, Sheng-Ling; Zhang, Fen; Luo, Zhao-Peng; Wu, Ming-Zhu; Chen, Qing; Tang, Hao-Ru; Lin, Fu-Cheng; Yang, Jun
2015-11-17
Nornicotine production in Nicotiana tabacum is undesirable because it is the precursor of the carcinogen N'-nitrosonornicotine. In some individual burley tobacco plants, a large proportion of the nicotine can be converted to nornicotine; this nicotine conversion is mediated primarily by enzymatic N-demethylation of nicotine, which is controlled mainly by CYP82E4. Here we report a novel strategy based on a quantitative real-time polymerase chain reaction (qPCR) method, which estimates the nicotine conversion ratio by examining the transcript level of CYP82E4 in burley leaves and requires no ethylene induction prior to detection. The assay was linear in a range from 1 × 10¹ to 1 × 10⁵ copies/mL of serially diluted standards, and also showed high specificity and reproducibility (93%-99%). To assess its applicability, 55 plants of burley cultivar Ky8959 at the leaf-maturing stage were analyzed, and the results were in accordance with those from a gas chromatography-mass spectrometry (GC-MS) method. Moreover, a linear correlation existed between conversion level and CYP82E4 transcript abundance. Taken together, the quantitative real-time PCR assay is standardized, rapid and reproducible for estimation of nicotine conversion level in vivo, and is expected to shed new light on the monitoring of burley tobacco converters.
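Quantification against serially diluted standards can be sketched as a standard-curve fit; the Ct values below are made up and the helper name is hypothetical:

```python
# Absolute qPCR quantification: fit Ct against log10(copies) of the standards,
# report amplification efficiency from the slope, and invert for unknowns.
import numpy as np
from scipy.stats import linregress

copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])     # standard dilutions (copies/mL)
ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])    # measured Ct (illustrative)

fit = linregress(np.log10(copies), ct)
efficiency = 10 ** (-1 / fit.slope) - 1          # ~1.0 means 100% efficient

def copies_from_ct(ct_unknown):
    return 10 ** ((ct_unknown - fit.intercept) / fit.slope)

print(f"efficiency: {efficiency:.2f}, unknown: {copies_from_ct(25.2):.0f} copies/mL")
```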
Feng, Chenghong; Bi, Zhe; Tang, Hongxiao
2015-01-06
Electrospray mass spectrometry has been reported as a novel technique for Al species identification, but to date, the working mechanism is not clear and no unanimous method exists for spectrum analysis of traditional Al salt flocculants, let alone for analysis of polyaluminum chloride (PAC) flocculants. Therefore, this paper introduces a novel theoretical calculation method to identify Al species from a mass spectrum, based on deducing changes in m/z (mass-to-charge ratio) and molecular formulas of oligomers in five typical PAC flocculants. The use of reference chemical species was specifically proposed in the method to guarantee the uniqueness of the assigned species. The charge and mass reduction of the Al cluster was found to proceed by hydrolysis, gasification, and change of hydroxyl on the oxy bridge. The novel method was validated both qualitatively and quantitatively by comparing the results to those obtained with ²⁷Al NMR spectroscopy.
Ju, Hyunjin; Lee, Deuck Hang; Cho, Hae-Chang; Kim, Kang Su; Yoon, Seyoon; Seo, Soo-Yeon
2014-01-01
In this study, hydrophilic chemical grout using silanol (HCGS) was adopted to overcome the performance limitations of epoxy materials used for strengthening existing buildings and civil engineering structures. The enhanced material performances of HCGS were introduced, and applied to the section enlargement method, which is one of the typical structural strengthening methods used in practice. To evaluate the excellent structural strengthening performance of the HCGS, structural tests were conducted on reinforced concrete beams, and analyses on the flexural behaviors of test specimens were performed by modified partial interaction theory (PIT). In particular, to improve the constructability of the section enlargement method, an advanced strengthening method was proposed, in which the precast panel was directly attached to the bottom of the damaged structural member by HCGS, and the degree of connection of the test specimens, strengthened by the section enlargement method, were quantitatively evaluated by PIT-based analysis. PMID:28788708
Creating analytically divergence-free velocity fields from grid-based data
NASA Astrophysics Data System (ADS)
Ravu, Bharath; Rudman, Murray; Metcalfe, Guy; Lester, Daniel R.; Khakhar, Devang V.
2016-10-01
We present a method, based on B-splines, to calculate a C2-continuous analytic vector potential from discrete 3D velocity data on a regular grid. A continuous, analytically divergence-free velocity field can then be obtained from the curl of the potential. This field can be used to robustly and accurately integrate particle trajectories in incompressible flow fields. Based on the method of Finn and Chacon (2005) [10], this new method ensures that the analytic velocity field matches the grid values almost everywhere, with errors that are two to four orders of magnitude lower than those of existing methods. We demonstrate its application to three different problems (each in a different coordinate system) and provide details of the specifics required in each case. We show how the additional accuracy of the method results in qualitatively and quantitatively superior trajectories that in turn allow more accurate identification of Lagrangian coherent structures.
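A minimal 2D analog of the construction, assuming scipy's spline interpolation in place of the authors' 3D B-spline machinery: interpolate a stream function and differentiate it analytically, so the resulting velocity field is divergence-free by construction:

```python
# In 2D a stream function psi gives u = d(psi)/dy, v = -d(psi)/dx, and
# div(u, v) = d2(psi)/dxdy - d2(psi)/dydx = 0 identically.
import numpy as np
from scipy.interpolate import RectBivariateSpline

x = np.linspace(0, 1, 33)
y = np.linspace(0, 1, 33)
X, Y = np.meshgrid(x, y, indexing="ij")
psi_grid = np.sin(np.pi * X) * np.sin(np.pi * Y)   # sample stream function

psi = RectBivariateSpline(x, y, psi_grid, kx=3, ky=3)   # C2 cubic spline

def velocity(xq, yq):
    u = psi.ev(xq, yq, dx=0, dy=1)    # analytic y-derivative
    v = -psi.ev(xq, yq, dx=1, dy=0)   # analytic x-derivative
    return u, v

u, v = velocity(0.37, 0.58)
print("u, v at query point:", u, v)
```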
Trans-dimensional MCMC methods for fully automatic motion analysis in tagged MRI.
Smal, Ihor; Carranza-Herrezuelo, Noemí; Klein, Stefan; Niessen, Wiro; Meijering, Erik
2011-01-01
Tagged magnetic resonance imaging (tMRI) is a well-known noninvasive method allowing quantitative analysis of regional heart dynamics. Its clinical use has so far been limited, in part due to the lack of robustness and accuracy of existing tag tracking algorithms in dealing with low (and intrinsically time-varying) image quality. In this paper, we propose a novel probabilistic method for tag tracking, implemented by means of Bayesian particle filtering and a trans-dimensional Markov chain Monte Carlo (MCMC) approach, which efficiently combines information about the imaging process and tag appearance with prior knowledge about the heart dynamics obtained by means of non-rigid image registration. Experiments using synthetic image data (with ground truth) and real data (with expert manual annotation) from preclinical (small animal) and clinical (human) studies confirm that the proposed method yields higher consistency, accuracy, and intrinsic tag reliability assessment in comparison with other frequently used tag tracking methods.
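The core tracking engine can be illustrated with a basic bootstrap particle filter in one dimension; the paper's method is considerably richer (trans-dimensional MCMC, registration-derived dynamic priors), so treat this only as the underlying idea with invented noise levels:

```python
# Bootstrap particle filter: predict with a motion prior, weight by the
# observation likelihood, resample, and report the posterior mean.
import numpy as np

rng = np.random.default_rng(3)
n_particles, n_frames = 500, 60
truth = 10.0 + np.cumsum(0.3 * rng.standard_normal(n_frames))
obs = truth + 0.5 * rng.standard_normal(n_frames)   # noisy tag detections

particles = 10.0 + rng.standard_normal(n_particles)
estimates = []
for z in obs:
    particles = particles + 0.3 * rng.standard_normal(n_particles)  # predict
    w = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)                 # weight
    w /= w.sum()
    particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample
    estimates.append(particles.mean())

print("RMSE:", np.sqrt(np.mean((np.asarray(estimates) - truth) ** 2)))
```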
Germain, Ronald N
2017-10-16
A dichotomy exists in the field of vaccinology about the promise versus the hype associated with application of "systems biology" approaches to rational vaccine design. Some feel it is the only way to efficiently uncover currently unknown parameters controlling desired immune responses or discover what elements actually mediate these responses. Others feel that traditional experimental, often reductionist, methods for incrementally unraveling complex biology provide a more solid way forward, and that "systems" approaches are costly ways to collect data without gaining true insight. Here I argue that both views are inaccurate. This is largely because of confusion about what can be gained from classical experimentation versus statistical analysis of large data sets (bioinformatics) versus methods that quantitatively explain emergent properties of complex assemblies of biological components, with the latter reflecting what was previously called "physiology." Reductionist studies will remain essential for generating detailed insight into the functional attributes of specific elements of biological systems, but such analyses lack the power to provide a quantitative and predictive understanding of global system behavior. But by employing (1) large-scale screening methods for discovery of unknown components and connections in the immune system (omics), (2) statistical analysis of large data sets (bioinformatics), and (3) the capacity of quantitative computational methods to translate these individual components and connections into models of emergent behavior (systems biology), we will be able to better understand how the overall immune system functions and to determine with greater precision how to manipulate it to produce desired protective responses. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.
Hou, Sen; Sun, Lili; Wieczorek, Stefan A; Kalwarczyk, Tomasz; Kaminski, Tomasz S; Holyst, Robert
2014-01-15
Fluorescent double-stranded DNA (dsDNA) molecules labeled at both ends are commonly produced by annealing of complementary single-stranded DNA (ssDNA) molecules, labeled with fluorescent dyes at the same (3' or 5') end. Because the labeling efficiency of ssDNA is smaller than 100%, the resulting dsDNA have two dyes, one dye or none. Existing methods are insufficient to measure the percentage of the doubly-labeled dsDNA component in a fluorescent DNA sample; it is even difficult to distinguish the doubly-labeled component from the singly-labeled one. Accurate measurement of the percentage of the doubly-labeled dsDNA component is a critical prerequisite for quantitative biochemical measurements and has puzzled scientists for decades. We established a fluorescence correlation spectroscopy (FCS) system to measure the percentage of doubly labeled dsDNA (PDL) in the total fluorescent dsDNA pool. The method is based on comparative analysis of the given sample and a reference dsDNA sample prepared by adding a certain amount of unlabeled ssDNA to the original ssDNA solution. From FCS autocorrelation functions, we obtain the number of fluorescent dsDNA molecules in the focal volume of the confocal microscope and PDL. We also calculate the labeling efficiency of ssDNA. The method requires a minimal amount of material: the samples have DNA concentrations in the nanomolar range and volumes of tens of microliters. We verify our method by using the restriction enzyme Hind III to cleave the fluorescent dsDNA. The kinetics of the reaction depends strongly on PDL, a critical parameter for quantitative biochemical measurements. Copyright © 2013 Elsevier B.V. All rights reserved.
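A worked consequence of assuming independent end-labeling (an assumption for illustration, not a claim from the paper): with labeling efficiency p per strand, the doubly-labeled fraction of the fluorescent pool follows in closed form:

```python
# Doubly labeled with probability p^2; fluorescent at all with probability
# 1 - (1 - p)^2, so PDL = p^2 / (1 - (1 - p)^2) = p / (2 - p).
def pdl_from_labeling_efficiency(p: float) -> float:
    return p / (2.0 - p)

for p in (0.5, 0.8, 0.95):
    print(f"labeling efficiency {p:.2f} -> PDL {pdl_from_labeling_efficiency(p):.2f}")
```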
NASA Technical Reports Server (NTRS)
Rosenfeld, Daniel; Short, David A.; Atlas, David
1990-01-01
A theory is developed which establishes the basis for the use of rainfall areas within preset thresholds as a measure of either the instantaneous areawide rain rate of convective storms or the total volume of rain from an individual storm over its lifetime. The method is based upon the existence of a well-behaved pdf of rain rate, either from the many storms at one instant or from a single storm during its life. The generality of the instantaneous areawide method was examined by applying it to quantitative radar data sets from the GARP Atlantic Tropical Experiment (GATE), South Africa, Texas, and Darwin (Australia). It is shown that the pdf's developed for each of these areas are consistent with the theory.
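The threshold idea can be sketched numerically under an assumed lognormal rain-rate pdf: the areawide rain rate equals the fractional area above a threshold times a multiplier fixed by the pdf shape, so measuring the area yields the rain rate. Parameters below are illustrative:

```python
from scipy.stats import lognorm

sigma, scale = 1.2, 4.0                  # assumed pdf shape (rates in mm/h)
pdf = lognorm(s=sigma, scale=scale)
tau = 5.0                                # rain-rate threshold (mm/h)

area_fraction = pdf.sf(tau)              # P(R > tau): fractional raining area
mean_rate = pdf.mean()                   # areawide mean rain rate E[R]
multiplier = mean_rate / area_fraction   # fixed for a well-behaved pdf

# Calibrate the multiplier once, then rain rate follows from area alone:
print(f"E[R] = {multiplier * area_fraction:.2f} mm/h (recovers {mean_rate:.2f})")
```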
Cumulative Risk and Impact Modeling on Environmental Chemical and Social Stressors.
Huang, Hongtai; Wang, Aolin; Morello-Frosch, Rachel; Lam, Juleen; Sirota, Marina; Padula, Amy; Woodruff, Tracey J
2018-03-01
The goal of this review is to identify cumulative modeling methods used to evaluate combined effects of exposures to environmental chemicals and social stressors. The specific review question is: What are the existing quantitative methods used to examine the cumulative impacts of exposures to environmental chemical and social stressors on health? There has been an increase in literature that evaluates combined effects of exposures to environmental chemicals and social stressors on health using regression models; very few studies applied other data mining and machine learning techniques to this problem. The majority of studies we identified used regression models to evaluate combined effects of multiple environmental and social stressors. With proper study design and appropriate modeling assumptions, additional data mining methods may be useful to examine combined effects of environmental and social stressors.
An accurate method of extracting fat droplets in liver images for quantitative evaluation
NASA Astrophysics Data System (ADS)
Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie
2015-03-01
The steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study we propose a method that can automatically detect and exclude regions with many fat droplets by using feature values of color, shape and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.
Mixed Methods in Biomedical and Health Services Research
Curry, Leslie A.; Krumholz, Harlan M.; O’Cathain, Alicia; Plano Clark, Vicki L.; Cherlin, Emily; Bradley, Elizabeth H.
2013-01-01
Mixed methods studies, in which qualitative and quantitative methods are combined in a single program of inquiry, can be valuable in biomedical and health services research, where the complementary strengths of each approach can yield greater insight into complex phenomena than either approach alone. Although interest in mixed methods is growing among science funders and investigators, written guidance on how to conduct and assess rigorous mixed methods studies is not readily accessible to the general readership of peer-reviewed biomedical and health services journals. Furthermore, existing guidelines for publishing mixed methods studies are not well known or applied by researchers and journal editors. Accordingly, this paper is intended to serve as a concise, practical resource for readers interested in core principles and practices of mixed methods research. We briefly describe mixed methods approaches and present illustrations from published biomedical and health services literature, including in cardiovascular care, summarize standards for the design and reporting of these studies, and highlight four central considerations for investigators interested in using these methods. PMID:23322807
Hilborn, Jennifer V; Strauss, Esther; Hultsch, David F; Hunter, Michael A
2009-05-01
A growing body of research suggests that substantial variability exists among cognitive abilities within individuals. This within-person variability across cognitive domains is termed dispersion. The present study investigated the relationship between aging and dispersion of cognitive functions both quantitatively (overall levels of dispersion) and qualitatively (patterns of dispersion) in a sample of 304 nondemented, older adults aged 64 to 92 years (M = 74.02). Quantitatively, higher levels of dispersion were observed in the old-old adults (aged 75-92 years) and those identified as having experienced cognitive decline, suggesting that dispersion level may serve as a marker of cognitive integrity. Qualitatively, three distinct dispersion profiles were identified through clustering methods, and these were found to be related to demographic, health, and performance characteristics of the individuals, suggesting that patterns of dispersion may be meaningful indicators of individual differences.
Holographic 3D imaging through diffuse media by compressive sampling of the mutual intensity
NASA Astrophysics Data System (ADS)
Falldorf, Claas; Klein, Thorsten; Agour, Mostafa; Bergmann, Ralf B.
2017-05-01
We present a method for holographic imaging through a volume scattering material, which is based on self-reference and light with good spatial but limited temporal coherence. In contrast to existing techniques, we do not require a separate reference wave; thus our approach provides great advantages in terms of the flexibility of the measurement system. The main applications are remote sensing and investigation of moving objects through gaseous streams, bubbles or foggy water, for example. Furthermore, due to the common-path nature, the system is also insensitive to mechanical disturbances. The measurement result is a complex amplitude which is comparable to a phase-shifted digital hologram and therefore allows 3D imaging, numerical refocusing and quantitative phase-contrast imaging. As an example of application, we present measurements of the quantitative phase contrast of the epidermis of an onion through a volume scattering material.
Ma, Junshui; Bayram, Sevinç; Tao, Peining; Svetnik, Vladimir
2011-03-15
After a review of the ocular artifact reduction literature, a high-throughput method designed to reduce the ocular artifacts in multichannel continuous EEG recordings acquired at clinical EEG laboratories worldwide is proposed. The proposed method belongs to the category of component-based methods, and does not rely on any electrooculography (EOG) signals. Based on a concept that all ocular artifact components exist in a signal component subspace, the method can uniformly handle all types of ocular artifacts, including eye-blinks, saccades, and other eye movements, by automatically identifying ocular components from decomposed signal components. This study also proposes an improved strategy to objectively and quantitatively evaluate artifact reduction methods. The evaluation strategy uses real EEG signals to synthesize realistic simulated datasets with different amounts of ocular artifacts. The simulated datasets enable us to objectively demonstrate that the proposed method outperforms some existing methods when no high-quality EOG signals are available. Moreover, the results of the simulated datasets improve our understanding of the involved signal decomposition algorithms, and provide us with insights into the inconsistency regarding the performance of different methods in the literature. The proposed method was also applied to two independent clinical EEG datasets involving 28 volunteers and over 1000 EEG recordings. This effort further confirms that the proposed method can effectively reduce ocular artifacts in large clinical EEG datasets in a high-throughput fashion. Copyright © 2011 Elsevier B.V. All rights reserved.
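As a stand-in for the component-based idea (the paper's own decomposition and identification steps differ), here is a minimal ICA sketch that flags and removes blink-like components without any EOG channel; the channel layout, correlation threshold and blink model are assumptions:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
n_samples, n_channels = 5000, 8
eeg = rng.standard_normal((n_samples, n_channels))        # stand-in EEG
blink = (rng.random(n_samples) < 0.002).astype(float)
blink = np.convolve(blink, np.hanning(120), mode="same")  # blink-shaped bursts
eeg[:, :2] += 8.0 * blink[:, None]                        # contaminate frontal channels

ica = FastICA(n_components=n_channels, random_state=0)
sources = ica.fit_transform(eeg)                          # (samples, components)

frontal = eeg[:, :2].mean(axis=1)                         # crude frontal proxy
corr = [abs(np.corrcoef(sources[:, k], frontal)[0, 1]) for k in range(n_channels)]
ocular = [k for k, c in enumerate(corr) if c > 0.6]       # flagged components

sources[:, ocular] = 0.0                                  # zero ocular sources
eeg_clean = ica.inverse_transform(sources)                # reconstruct the EEG
```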
Metastable nanobubbles at the solid-liquid interface due to contact angle hysteresis.
Nishiyama, Takashi; Yamada, Yutaka; Ikuta, Tatsuya; Takahashi, Koji; Takata, Yasuyuki
2015-01-27
Nanobubbles exist at solid-liquid interfaces between pure water and hydrophobic surfaces with very high stability, lasting in certain cases up to several days. Not only semispherical but also other shapes, such as micropancakes, are known to exist at such interfaces. However, doubt has been raised as to whether or not the nanobubbles are gas-phase entities. In this study, surface nanobubbles at a pure water-highly ordered pyrolytic graphite (HOPG) interface were investigated by peak force quantitative nanomechanics (PF-QNM). Multiple isolated nanobubbles generated by the solvent-exchange method were present on the terraced areas, avoiding the steps of the HOPG surface. Adjacent nanobubbles coalesced and formed metastable nanobubbles. Coalescence was enhanced by the PF-QNM measurement. We determined that nanobubbles can exist for a long time because of nanoscale contact angle hysteresis at the water-HOPG interface. Moreover, the hydrophilic steps of HOPG were avoided during coalescence, providing evidence that the nanobubbles are truly gas phase.
Scattering of sound by atmospheric turbulence predictions in a refractive shadow zone
NASA Technical Reports Server (NTRS)
Mcbride, Walton E.; Bass, Henry E.; Raspet, Richard; Gilbert, Kenneth E.
1990-01-01
According to ray theory, regions exist in an upward refracting atmosphere where no sound should be present. Experiments show, however, that appreciable sound levels penetrate these so-called shadow zones. Two mechanisms contribute to sound in the shadow zone: diffraction and turbulent scattering of sound. Diffractive effects can be pronounced at lower frequencies but are small at high frequencies. In the short wavelength limit, then, scattering due to turbulence should be the predominant mechanism involved in producing the sound levels measured in shadow zones. No existing analytical method includes turbulence effects in the prediction of sound pressure levels in upward refractive shadow zones. In order to obtain quantitative average sound pressure level predictions, a numerical simulation of the effect of atmospheric turbulence on sound propagation is performed. The simulation is based on scattering from randomly distributed scattering centers ('turbules'). Sound pressure levels are computed for many realizations of a turbulent atmosphere. Predictions from the numerical simulation are compared with existing theories and experimental data.
Ultrasensitive Genotypic Detection of Antiviral Resistance in Hepatitis B Virus Clinical Isolates
Fang, Jie; Wichroski, Michael J.; Levine, Steven M.; Baldick, Carl J.; Mazzucco, Charles E.; Walsh, Ann W.; Kienzle, Bernadette K.; Rose, Ronald E.; Pokornowski, Kevin A.; Colonno, Richard J.; Tenney, Daniel J.
2009-01-01
Amino acid substitutions that confer reduced susceptibility to antivirals arise spontaneously through error-prone viral polymerases and are selected as a result of antiviral therapy. Resistance substitutions first emerge in a fraction of the circulating virus population, below the limit of detection by nucleotide sequencing of either the population or limited sets of cloned isolates. These variants can expand under drug pressure to dominate the circulating virus population. To enhance detection of these viruses in clinical samples, we established a highly sensitive quantitative, real-time allele-specific PCR assay for hepatitis B virus (HBV) DNA. Sensitivity was accomplished using a high-fidelity DNA polymerase and oligonucleotide primers containing locked nucleic acid bases. Quantitative measurement of resistant and wild-type variants was accomplished using sequence-matched standards. Detection methodology that was not reliant on hybridization probes, and assay modifications, minimized the effect of patient-specific sequence polymorphisms. The method was validated using samples from patients chronically infected with HBV through parallel sequencing of large numbers of cloned isolates. Viruses with resistance to lamivudine and other l-nucleoside analogs and entecavir, involving 17 different nucleotide substitutions, were reliably detected at levels at or below 0.1% of the total population. The method worked across HBV genotypes. Longitudinal analysis of patient samples showed earlier emergence of resistance on therapy than was seen with sequencing methodologies, including some cases of resistance that existed prior to treatment. In summary, we established and validated an ultrasensitive method for measuring resistant HBV variants in clinical specimens, which enabled earlier, quantitative measurement of resistance to therapy. PMID:19433559
NASA Astrophysics Data System (ADS)
Smith, Elizabeth Myhra
The interactions of peripheral membrane proteins with both membrane lipids and proteins are vital for many cellular processes including membrane trafficking, cellular signaling, and cell growth/regulation. Building accurate biophysical models of these processes requires quantitative characterization of the behavior of peripheral membrane proteins, yet methods to quantify their interactions inside living cells are very limited. Because peripheral membrane proteins usually exist both in membrane-bound and cytoplasmic forms, the separation of these two populations is a key challenge. This thesis aims at addressing this challenge by extending fluorescence fluctuation spectroscopy (FFS) to simultaneously measure the oligomeric state of peripheral membrane proteins in the cytoplasm and at the plasma membrane. We developed a new method based on z-scan FFS that accounts for the fluorescence contributions from cytoplasmic and membrane layers by incorporating a fluorescence intensity z-scan through the cell. H-Ras-EGFP served as a model system to demonstrate the feasibility of the technique. The resolvability and stability of z-scanning was determined as well as the oligomeric state of H-Ras-EGFP at the plasma membrane and in the cytoplasm. Further, we successfully characterized the binding affinity of a variety of proteins to the plasma membrane by quantitative analysis of the z-scan fluorescence intensity profile. This analysis method, which we refer to as z-scan fluorescence profile deconvolution, was further used in combination with dual-color competition studies to determine the lipid specificity of protein binding. Finally, we applied z-scan FFS to provide insight into the early assembly steps of the HTLV-1 retrovirus.
QuASAR: quantitative allele-specific analysis of reads
Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger
2015-01-01
Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375
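The basic ASE test that QuASAR builds on can be sketched as a binomial test of a 1:1 allelic ratio; QuASAR itself jointly infers genotypes and models base-call error and over-dispersion, so this shows only the core idea with invented counts:

```python
from scipy.stats import binomtest

ref_reads, alt_reads = 63, 37   # illustrative read counts at a het site
eps = 0.01                      # assumed symmetric base-call error rate
# Under no ASE at a true heterozygote, P(ref read) stays 0.5 even with
# symmetric errors; a homozygous site miscalled as het would instead show
# P(minor allele) near eps, which is why genotype uncertainty matters.
result = binomtest(ref_reads, ref_reads + alt_reads, p=0.5)
print("two-sided p-value for allelic imbalance:", result.pvalue)
```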
Li, Xingyu; Plataniotis, Konstantinos N
2015-07-01
In digital histopathology, tasks of segmentation and disease diagnosis are achieved by quantitative analysis of image content. However, color variation in image samples makes it challenging to produce reliable results. This paper introduces a complete normalization scheme to address the problem of color variation in histopathology images jointly caused by inconsistent biopsy staining and nonstandard imaging conditions. Method: Different from existing normalization methods that either address a partial cause of color variation or lump the causes together, our method identifies causes of color variation based on a microscopic imaging model and addresses inconsistency in biopsy imaging and staining by an illuminant normalization module and a spectral normalization module, respectively. In evaluation, we use two public datasets that are representative of histopathology images commonly received in clinics to examine the proposed method from the aspects of robustness to system settings, performance consistency against achromatic pixels, and normalization effectiveness in terms of histological information preservation. As the saturation-weighted statistics proposed in this study generate stable and reliable color cues for stain normalization, our scheme is robust to system parameters and insensitive to image content and achromatic colors. Extensive experimentation suggests that our approach outperforms state-of-the-art normalization methods, as the proposed method is the only approach that succeeds in preserving histological information after normalization. The proposed color normalization solution should be useful for mitigating the effects of color variation in pathology images on subsequent quantitative analysis.
Benković, Goran; Skrlin, Ana; Madić, Tomislav; Debeljak, Zeljko; Medić-Šarić, Marica
2014-09-01
Current methods for the determination of impurities with different charge-to-volume ratios are limited, especially in terms of sensitivity and precision. The main goal of this research was to establish a quantitative method for the determination of impurities with charges differing from that of recombinant human granulocyte colony-stimulating factor (rhG-CSF, filgrastim), with superior precision and sensitivity compared to existing methods. A CZE method has been developed, optimized, and validated for the purity assessment of filgrastim in liquid pharmaceutical formulations. Optimal separation of filgrastim from the related impurities with different charges was achieved on a 50 μm id fused-silica capillary of a total length of 80.5 cm. A BGE containing 100 mM phosphoric acid adjusted to pH 7.0 with triethanolamine was used. The applied voltage was 20 kV while the temperature was maintained at 25°C. UV detection was set to 200 nm. The method was validated in terms of selectivity/specificity, linearity, precision, LOD, LOQ, stability, and robustness. Linearity was observed in the concentration range of 6-600 μg/mL and the LOQ was determined to be 0.3% relative to a filgrastim concentration of 0.6 mg/mL. Other validation parameters were also found to be acceptable; thus the method was successfully applied for a quantitative purity assessment of filgrastim in a finished drug product. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Li, Jun; Tibshirani, Robert
2015-01-01
We discuss the identification of features that are associated with an outcome in RNA-Sequencing (RNA-Seq) and other sequencing-based comparative genomic experiments. RNA-Seq data takes the form of counts, so models based on the normal distribution are generally unsuitable. The problem is especially challenging because different sequencing experiments may generate quite different total numbers of reads, or ‘sequencing depths’. Existing methods for this problem are based on Poisson or negative binomial models: they are useful but can be heavily influenced by ‘outliers’ in the data. We introduce a simple, nonparametric method with resampling to account for the different sequencing depths. The new method is more robust than parametric methods. It can be applied to data with quantitative, survival, two-class or multiple-class outcomes. We compare our proposed method to Poisson and negative binomial-based methods in simulated and real data sets, and find that our method discovers more consistent patterns than competing methods. PMID:22127579
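A minimal sketch of the resampling idea, with assumed details: thin each sample's counts to the smallest library depth, then apply a rank-based two-class test per gene:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(5)
# Six samples whose library depths differ ~4x between classes; per-gene
# expression is actually identical, so any raw difference is a depth artifact.
counts = rng.poisson(lam=[20, 22, 25, 80, 85, 90], size=(1000, 6))
depths = counts.sum(axis=0)
target = depths.min()

thinned = rng.binomial(counts, target / depths)   # equalize depths by thinning
labels = np.array([0, 0, 0, 1, 1, 1])             # two-class outcome

stat, p = mannwhitneyu(thinned[0, labels == 0], thinned[0, labels == 1])
print("gene 0 p-value after depth correction:", p)   # no longer spuriously small
```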
Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...
Somatic coliphages as surrogates for enteroviruses in sludge hygienization treatments.
Martín-Díaz, Julia; Casas-Mangas, Raquel; García-Aljaro, Cristina; Blanch, Anicet R; Lucena, Francisco
2016-01-01
Conventional bacterial indicators have serious drawbacks in providing information about the persistence of viral pathogens during sludge hygienization treatments. This calls for the search for alternative viral indicators. The ability of somatic coliphages (SOMCPH) to act as surrogates for enteroviruses was assessed in 47 sludge samples subjected to novel treatment processes. SOMCPH, infectious enteroviruses and genome copies of enteroviruses were monitored. Only one of these groups, the bacteriophages, was present in the sludge at concentrations that allowed the evaluation of treatment performance. An indicator/pathogen relationship of 4 log10 (PFU/g dw) was found between SOMCPH and infective enteroviruses, and their detection accuracy was assessed. The obtained results and the existence of rapid and standardized methods encourage the inclusion of SOMCPH quantification in future sludge directives. In addition, an existing real-time quantitative polymerase chain reaction (RT-qPCR) for enteroviruses was adapted and applied.
Yang, Ruifeng; Song, Guangjun; Guan, Wenli; Wang, Qian; Liu, Yan; Wei, Lai
2016-02-01
Qualitative HBsAg assays have been used to screen for HBV infection for decades, and interest in quantitative assays has recently been rejuvenated. We aimed to evaluate and compare the performance of a novel ultra-sensitive quantitative assay, the Lumipulse assay, with the Architect and Elecsys assays. As screening methods, specificity was compared using 2043 consecutive clinical routine samples. As quantitative assays, precision and accuracy were assessed. Sera from 112 treatment-naïve chronic hepatitis B patients, four patients undergoing antiviral therapy and one patient with acute infection were tested to compare the correlations. Samples with concurrent HBsAg/anti-HBs were also quantified. The Lumipulse assay precisely quantified ultra-low levels of HBsAg (0.004 IU/mL). It identified an additional 0.98% (20/2043) of clinical samples with trace amounts of HBsAg. The three assays displayed excellent linear correlations irrespective of genotypes and S-gene mutations (R² > 0.95, P < 0.0001), although minor quantitative biases existed. The Lumipulse assay did not yield higher HBsAg concentrations in samples with concomitant anti-HBs. Compared with the other assays, the Lumipulse assay is sensitive and specific for detecting HBsAg. The interpretation of extremely low-level results, however, is challenging. Quantitative HBsAg results by different assays are highly correlated, but they should be interpreted interchangeably only after conversion to eliminate the biases. Copyright © 2015 Elsevier B.V. All rights reserved.
HIV prevention needs for men who have sex with men in Swaziland.
Sithole, Bhekie
2017-12-01
Men who have sex with men (MSM) have a high HIV burden and also often face multiple other challenges accessing HIV services, including legal and social issues. Although Swaziland recently started responding with interventions for MSM, significant gaps still exist both in information and programming. This study aimed to explore the HIV prevention needs of MSM in Swaziland, including factors elevating their risks and vulnerabilities to HIV infection; to find out what HIV prevention strategies exist; and to determine how best to meet the prevention needs of MSM. A total of 50 men who reported anal sex with other men in the past 12 months were recruited through simple respondent-driven sampling. They completed either a structured quantitative survey (n = 35) or participated in a semi-structured qualitative interview (n = 15). Both quantitative and qualitative findings indicated perceived and experienced stigma among MSM. This predominantly manifested as internalised stigma, which may lead to alcohol abuse and risky sexual behaviours. At least 83% (29/35) of the quantitative sample had been labelled with derogatory terms because of their sexual orientation, while 66% (23/35) had experienced being avoided. There was limited knowledge of risk practices: when asked, 54% (19/35) of quantitative respondents reported that vaginal and anal sex carry an equal risk of HIV infection. Participants also had little knowledge of new HIV prevention methods such as pre-exposure prophylaxis (PrEP) and rectal microbicides. MSM needs included safe spaces in the form of drop-in centres and non-hostile HIV services. Although Swaziland recently started interventions for key populations, including MSM, there is still a general lack of information to inform managers and implementers about the HIV prevention needs of MSM in Swaziland. Such information is crucial for those designing HIV policies and programmes. Research is needed to increase knowledge of the HIV prevention needs of key populations, including MSM.
Electrochemical and Infrared Absorption Spectroscopy Detection of SF6 Decomposition Products
Dong, Ming; Ren, Ming; Ye, Rixin
2017-01-01
Sulfur hexafluoride (SF6) gas-insulated electrical equipment is widely used in high-voltage (HV) and extra-high-voltage (EHV) power systems. Partial discharge (PD) and local heating can occur in the electrical equipment because of insulation faults, which results in SF6 decomposition and ultimately generates several types of decomposition products. These SF6 decomposition products can be qualitatively and quantitatively detected with relevant detection methods, and such detection contributes to diagnosing internal faults and evaluating the security risks of the equipment. At present, multiple detection methods exist for analyzing SF6 decomposition products, and electrochemical sensing (ES) and infrared (IR) spectroscopy are well suited for online detection. In this study, the combination of ES with IR spectroscopy is used to detect SF6 gas decomposition. First, the characteristics of these two detection methods are studied, and the data analysis matrix is established. Then, a qualitative and quantitative ES-IR analysis model is established by adopting a two-step approach. An SF6 decomposition detector is designed and manufactured by combining an electrochemical sensor and IR spectroscopy technology. The detector is used to detect SF6 gas decomposition and is verified to detect the gas components and concentrations reliably and accurately. PMID:29140268
Silver, Matt; Montana, Giovanni
2012-01-01
Where causal SNPs (single nucleotide polymorphisms) tend to accumulate within biological pathways, the incorporation of prior pathways information into a statistical model is expected to increase the power to detect true associations in a genetic association study. Most existing pathways-based methods rely on marginal SNP statistics and do not fully exploit the dependence patterns among SNPs within pathways. We use a sparse regression model, with SNPs grouped into pathways, to identify causal pathways associated with a quantitative trait. Notable features of our “pathways group lasso with adaptive weights” (P-GLAW) algorithm include the incorporation of all pathways in a single regression model, an adaptive pathway weighting procedure that accounts for factors biasing pathway selection, and the use of a bootstrap sampling procedure for the ranking of important pathways. P-GLAW takes account of the presence of overlapping pathways and uses a novel combination of techniques to optimise model estimation, making it fast to run, even on whole genome datasets. In a comparison study with an alternative pathways method based on univariate SNP statistics, our method demonstrates high sensitivity and specificity for the detection of important pathways, showing the greatest relative gains in performance where marginal SNP effect sizes are small. PMID:22499682
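The core estimation step can be sketched as group-lasso regression solved by proximal gradient descent; P-GLAW adds adaptive weights, overlap handling and bootstrap ranking on top of this, and the data below are simulated:

```python
import numpy as np

def group_soft_threshold(beta, groups, lam, step):
    out = beta.copy()
    for g in groups:                          # g: index array for one pathway
        norm = np.linalg.norm(beta[g])
        scale = max(0.0, 1.0 - step * lam / norm) if norm > 0 else 0.0
        out[g] = scale * beta[g]              # shrink the whole group at once
    return out

def group_lasso(X, y, groups, lam=1.0, n_iter=500):
    n, p = X.shape
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)   # 1/L for (1/2n)||y - Xb||^2
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n            # gradient of the fit term
        beta = group_soft_threshold(beta - step * grad, groups, lam, step)
    return beta

rng = np.random.default_rng(6)
X = rng.standard_normal((200, 30))
groups = [np.arange(0, 10), np.arange(10, 20), np.arange(20, 30)]
y = X[:, :10] @ rng.standard_normal(10)       # only the first pathway is causal
beta = group_lasso(X, y, groups, lam=1.5)
print([round(float(np.linalg.norm(beta[g])), 2) for g in groups])
```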
Houiste, Céline; Auguste, Cécile; Macrez, Céline; Dereux, Stéphanie; Derouet, Angélique; Anger, Pascal
2009-02-01
Low-molecular-weight heparins (LMWHs) are widely used in the management of thrombosis and acute coronary syndromes. They are obtained by the enzymatic or chemical depolymerization of porcine intestinal heparin. Enoxaparin sodium, a widely used LMWH, has a unique and reproducible oligosaccharide profile which is determined by the origin of the starting material and a tightly controlled manufacturing process. Although other enoxaparin-like LMWHs do exist, specific release criteria including the origin of the crude heparin utilized for their production, have not been established. A quantitative polymerase chain reaction method has been developed to ensure the purity of the porcine origin of crude heparin, with a DNA detection limit as low as 1 ppm for bovine, or 10 ppm for ovine contaminants. This method is routinely used as the release acceptance criterion during enoxaparin sodium manufacturing. Furthermore, when the process removes DNA, other analytical techniques can be used to assess any contamination. Disaccharide profiling after exhaustive depolymerization can determine the presence of at least 10% bovine or 20% ovine material; multivariate analysis is useful to perform the data analysis. Consistent with the availability of newer technology, these methods should be required as acceptance criteria for crude heparins used in the manufacture of LMWHs to ensure their safety, quality, and immunologic profile.
2017-01-01
We report a novel molecular assay, based on helicase-dependent amplification (HDA), for the detection of enterococci as markers for fecal pollution in water. This isothermal assay targets the same Enterococcus 23S rRNA gene region as the existing quantitative polymerase chain reaction (qPCR) assays of U.S. Environmental Protection Agency Methods 1611 and 1609 but can be entirely performed on a simple heating block. The developed Enterococcus HDA assay successfully discriminated 15 enterococcal from 15 non-enterococcal reference strains and reliably detected 48 environmental isolates of enterococci. The limit of detection was 25 target copies per reaction, only 3 times higher than that of qPCR. The applicability of the assay was tested on 30 environmental water sample DNA extracts, simulating a gradient of fecal pollution. Despite the isothermal nature of the reaction, the HDA results were consistent with those of the qPCR reference. Given this performance, we conclude that the developed Enterococcus HDA assay has great potential as a qualitative molecular screening method for resource-limited settings when combined with compatible up- and downstream processes. This amplification strategy can pave the way for developing a new generation of rapid, low-cost, and field-deployable molecular diagnostic tools for water quality monitoring. PMID:28541661
Pappa, Katherine; Doty, Tasha; Taff, Steven D.; Kniepmann, Kathy; Foster, Erin R.
2017-01-01
Aims: To explore the potential influence of the Stanford Chronic Disease Self-Management Program (CDSMP) on social support in Parkinson disease (PD). Methods: This was a quasi-experimental mixed methods design. Volunteers with PD (n=27) and care partners (n=6) completed the CDSMP, questionnaires of social support and self-management outcomes, and an interview about social support in relation to CDSMP participation. PD participants (n=19) who did not participate in the CDSMP completed the questionnaires for quantitative comparison purposes. Results: Regarding the quantitative data, there were no significant effects of CDSMP participation on social support questionnaire scores; however, there were some positive correlations between changes in social support and changes in self-management outcomes from pre- to post-CDSMP participation. Three qualitative themes emerged from the interviews: lack of perceived change in amount and quality of social support, positive impact on existing social networks, and benefit from participating in a supportive PD community. Conclusions: Although participants did not acknowledge major changes in social support, there were some social support-related benefits of CDSMP participation for PD participants and care partners. These findings provide a starting point for more in-depth studies of social support and self-management in this population. PMID:29203950
Zhou, Teng; Han, Guoqiang; Li, Bing Nan; Lin, Zhizhe; Ciaccio, Edward J; Green, Peter H; Qin, Jing
2017-06-01
Celiac disease is one of the most common diseases in the world. Capsule endoscopy is an alternative way to visualize the entire small intestine without invasiveness to the patient. It is useful for characterizing celiac disease, but hours are needed to manually analyze the retrospective data of a single patient. Computer-aided quantitative analysis by a deep learning method helps in alleviating the workload during analysis of the retrospective videos. Capsule endoscopy clips from 6 celiac disease patients and 5 controls were preprocessed for training. Frames with a large field of opaque extraluminal fluid or air bubbles were removed automatically by a pre-selection algorithm. The frames were then cropped and intensity-corrected prior to frame rotation in the proposed new method. GoogLeNet was trained with these frames. Capsule endoscopy clips from 5 additional celiac disease patients and 5 additional control patients were then used for testing. The trained GoogLeNet was able to distinguish the frames from capsule endoscopy clips of celiac disease patients vs. controls. A quantitative measurement with evaluation of confidence was developed to assess the severity level of pathology in the subjects. Relying on the evaluation confidence, GoogLeNet achieved 100% sensitivity and specificity for the testing set. A t-test confirmed that the evaluation confidence is significant for distinguishing celiac disease patients from controls. Furthermore, the evaluation confidence may also relate to the severity level of small bowel mucosal lesions. A deep convolutional neural network was thus established for quantitative measurement of the existence and degree of pathology throughout the small intestine, which may improve computer-aided clinical techniques to assess mucosal atrophy and other etiologies in real time with videocapsule endoscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.
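A minimal PyTorch sketch of the transfer-learning setup described (torchvision >= 0.13 API assumed): a pretrained GoogLeNet with its classifier replaced for the two-class frame task. Preprocessing, data loading and the training schedule are omitted, and all hyperparameters are assumptions:

```python
import torch
import torch.nn as nn
from torchvision import models

model = models.googlenet(weights=models.GoogLeNet_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)     # celiac vs control head

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

frames = torch.randn(4, 3, 224, 224)              # stand-in preprocessed frames
labels = torch.tensor([0, 1, 0, 1])

optimizer.zero_grad()
logits = model(frames)                            # one illustrative train step
loss = criterion(logits, labels)
loss.backward()
optimizer.step()

# A per-frame "evaluation confidence" can be read from softmax probabilities
confidence = torch.softmax(logits, dim=1).max(dim=1).values
```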
Barcode extension for analysis and reconstruction of structures
NASA Astrophysics Data System (ADS)
Myhrvold, Cameron; Baym, Michael; Hanikel, Nikita; Ong, Luvena L.; Gootenberg, Jonathan S.; Yin, Peng
2017-03-01
Collections of DNA sequences can be rationally designed to self-assemble into predictable three-dimensional structures. The geometric and functional diversity of DNA nanostructures created to date has been enhanced by improvements in DNA synthesis and computational design. However, existing methods for structure characterization typically image the final product or laboriously determine the presence of individual, labelled strands using gel electrophoresis. Here we introduce a new method of structure characterization that uses barcode extension and next-generation DNA sequencing to quantitatively measure the incorporation of every strand into a DNA nanostructure. By quantifying the relative abundances of distinct DNA species in product and monomer bands, we can study the influence of geometry and sequence on assembly. We have tested our method using 2D and 3D DNA brick and DNA origami structures. Our method is general and should be extensible to a wide variety of DNA nanostructures.
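The quantification step implied here can be sketched as barcode counting over reads from product and monomer bands; the barcode map and reads below are invented for illustration:

```python
from collections import Counter

barcodes = {"strand_01": "ACGTACGT", "strand_02": "TTGGCCAA"}  # assumed map

def barcode_counts(reads, barcodes):
    counts = Counter()
    for read in reads:
        for name, bc in barcodes.items():
            if bc in read:                  # exact-match search for the sketch
                counts[name] += 1
    return counts

product_reads = ["NNACGTACGTNN", "NNACGTACGTNN", "NNTTGGCCAANN"]
monomer_reads = ["NNTTGGCCAANN", "NNTTGGCCAANN", "NNACGTACGTNN"]

prod = barcode_counts(product_reads, barcodes)
mono = barcode_counts(monomer_reads, barcodes)
for name in barcodes:
    ratio = prod[name] / (prod[name] + mono[name])  # incorporation fraction
    print(name, f"incorporated fraction ~ {ratio:.2f}")
```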
From conditioning shampoo to nanomechanics and haptics of human hair.
Wood, Claudia; Sugiharto, Albert Budiman; Max, Eva; Fery, Andreas
2011-01-01
Shampoo treatment and hair conditioning have a direct impact on our wellbeing via properties like combability and the haptic perception of hair. Therefore, systematic investigations leading to quality improvement of hair care products are of major interest. The aim of our work is a better understanding of complex testing and its correlation with quantitative parameters. The motivation for developing physical testing methods for hair feel is that an ingredient supplier like BASF can only find new, not yet toxicologically approved chemistries for hair cosmetics if an in-vitro method exists. In this work, the effects of different shampoo treatments with conditioning polymers are investigated. The employed physical test methods, dry friction measurements and AFM, observe friction phenomena on a macroscopic as well as on a nanoscale directly on hair. They are an approach to complement sensory evaluation with an objective in-vitro method.
Machine learning methods in chemoinformatics
Mitchell, John B O
2014-01-01
Machine learning algorithms are generally developed in computer science or adjacent disciplines and find their way into chemical modeling by a process of diffusion. Though particular machine learning methods are popular in chemoinformatics and quantitative structure–activity relationships (QSAR), many others exist in the technical literature. This discussion is methods-based and focused on some algorithms that chemoinformatics researchers frequently use. It makes no claim to be exhaustive. We concentrate on methods for supervised learning, predicting the unknown property values of a test set of instances, usually molecules, based on the known values for a training set. Particularly relevant approaches include Artificial Neural Networks, Random Forest, Support Vector Machine, k-Nearest Neighbors and naïve Bayes classifiers. WIREs Comput Mol Sci 2014, 4:468–481. doi:10.1002/wcms.1183 PMID:25285160
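As a small illustration of one workhorse method from this review, here is a scikit-learn Random Forest QSAR sketch; the descriptors are random stand-ins for what a chemoinformatics toolkit such as RDKit would compute:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.standard_normal((300, 50))          # molecular descriptors (stand-in)
y = X[:, 0] - 2.0 * X[:, 1] + 0.3 * rng.standard_normal(300)   # activity

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean().round(2))
```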
Computer-based objective quantitative assessment of pulmonary parenchyma via x-ray CT
NASA Astrophysics Data System (ADS)
Uppaluri, Renuka; McLennan, Geoffrey; Sonka, Milan; Hoffman, Eric A.
1998-07-01
This paper is a review of our recent studies using a texture-based tissue characterization method called the Adaptive Multiple Feature Method (AMFM). This computerized method is automated and performs tissue classification based upon training acquired on a set of representative examples. The AMFM has been applied to several different discrimination tasks involving normal subjects, subjects with interstitial lung disease, smokers, asbestos-exposed subjects, and subjects with cystic fibrosis. The AMFM has also been applied to data acquired using different scanners and scanning protocols. The AMFM has been shown to be successful, and better than other existing techniques, in discriminating the tissues under consideration. We demonstrate that the AMFM is considerably more sensitive and specific in characterizing the lung, especially in the presence of mixed pathology, compared with more commonly used methods. Evidence is presented suggesting that the AMFM is highly sensitive to some of the earliest disease processes.
19 CFR 206.14 - Contents of petition.
Code of Federal Regulations, 2010-2014 CFR (identical text in each annual edition, 2010-04-01 through 2014-04-01)
...; (e) Data showing injury. Quantitative data indicating the nature and extent of injury to the domestic... maintain existing levels of expenditures for research and development; (iii) The extent to which the U.S... adjustment to import competition. (i) Imports from NAFTA countries. Quantitative data indicating the share of...
Kushida, Ikuo
2012-03-01
The objective of this study was to develop a quantitative crystallinity analysis method for the bulk drug of E1010 ((+)-(4R,5S,6S)-6-[(R)-1-hydroxyethyl]-3-[(2S,4S)-2-[(R)-1-hydroxy-1-[(R)-pyrrolidin-3-yl]methyl]pyrrolidin-4-yl]thio-4-methyl-7-oxo-1-azabicyclo[3.2.0]hept-2-ene-2-carboxylic acid monohydrochloride), a novel carbapenem antibiotic. X-ray analyses, thermal analyses and hygroscopicity measurements were used to elucidate the crystal structure and the solid state properties. To develop a quantitative method for the crystallinity of E1010 bulk drug, the relationship between the enthalpy change obtained by differential scanning calorimetry (DSC) and the crystalline form ratio was investigated. E1010 bulk drug was found to exist as a crystalline trihydrate formed of two layers, i.e., a layer of the E1010 free form and a layer consisting of chloride ions and water molecules. Thermal analysis showed an endothermic peak, derived from dehydration with loss of the crystal lattice, with an onset at around 100°C. The enthalpy change value for the endothermic peak correlated well with crystalline content in binary physical mixtures of the crystalline trihydrate and the amorphous form. In addition, for nine lots of the bulk drug, a positive correlation between the enthalpy change and chemical stability in the solid state was observed. This quantitative analysis of crystallinity using DSC could be applied to the quality control of the bulk drug to detect variability among manufacturing batches and to estimate the chemical stability of partially amorphous samples. © 2011 The Author. JPP © 2011 Royal Pharmaceutical Society.
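The calibration idea in the abstract, enthalpy change of the dehydration endotherm varying linearly with crystalline content as established from binary crystalline/amorphous mixtures, reduces to a one-variable line fit. A minimal sketch, with hypothetical DSC values in place of measured data:

```python
import numpy as np
from scipy.stats import linregress

crystalline_frac = np.array([0.0, 0.25, 0.5, 0.75, 1.0])  # known mixtures
delta_h = np.array([2.1, 28.5, 55.0, 81.2, 107.9])        # J/g by DSC (fake)

fit = linregress(crystalline_frac, delta_h)

def crystallinity(dh_sample):
    """Invert the calibration line to estimate crystalline fraction."""
    return (dh_sample - fit.intercept) / fit.slope

print(f"r^2 = {fit.rvalue**2:.4f}; est. crystallinity = {crystallinity(66.0):.2f}")
```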
Burbelo, Peter D; Goldman, Radoslav; Mattson, Thomas L
2005-01-01
Background Assays detecting human antigen-specific antibodies are medically useful. However, the usefulness of existing simple immunoassay formats is limited by technical considerations such as sera antibodies to contaminants in insufficiently pure antigen, a problem likely exacerbated when antigen panels are screened to obtain clinically useful data. Results We developed a novel and simple immunoprecipitation technology for identifying clinical sera containing antigen-specific antibodies and for generating quantitative antibody response profiles. This method is based on fusing protein antigens to an enzyme reporter, Renilla luciferase (Ruc), and expressing these fusions in mammalian cells, where mammalian-specific post-translational modifications can be added. After mixing crude extracts, sera, and protein A/G beads together and incubating, during which the Ruc-antigen fusion becomes immobilized on the A/G beads, antigen-specific antibody is quantitated by washing the beads, adding coelenterazine substrate, and measuring light production. We have characterized this technology with sera from patients having three different types of cancers. We show that 20–85% of these sera contain significant titers of antibodies against at least one of five frequently mutated and/or overexpressed tumor-associated proteins. Five of six colon cancer sera tested gave responses that were statistically significantly greater than the average plus three standard deviations of 10 control sera. The results of competition experiments, preincubating positive sera with unmodified E. coli-produced antigens, varied dramatically. Conclusion This technology has several advantages over current quantitative immunoassays including its relative simplicity, its avoidance of problems associated with E. coli-produced antigens and its use of antigens that can carry mammalian or disease-specific post-translational modifications. This assay should be generally useful for analyzing sera for antibodies recognizing any protein or its post-translational modifications. PMID:16109166
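The positivity rule described (a serum scores positive when its signal exceeds the control mean plus three standard deviations) is easy to state in code. A minimal sketch with invented light-unit values:

```python
import numpy as np

controls = np.array([1800, 2100, 1650, 1900, 2300,
                     2050, 1750, 1980, 2200, 1870])   # 10 control sera (LU)
cutoff = controls.mean() + 3 * controls.std(ddof=1)

patient_lu = {"colon_01": 15400, "colon_02": 2100, "colon_03": 9800}
positives = {sid: lu for sid, lu in patient_lu.items() if lu > cutoff}
print(f"cutoff = {cutoff:.0f} LU; positives: {sorted(positives)}")
```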
Mirzoian, Armen; Mabud, Abdul
2006-01-01
A procedure to analyze ethyl carbamate (EC) by gas chromatography/mass spectrometry was optimized and validated. Deuterated EC (d5-EC) was added to the samples as an internal standard, followed by extraction on crosslinked polystyrene cartridges using minimal volumes of ethyl acetate. The EC response was measured in selected ion monitoring (SIM) mode and found to be linear in the range between the limit of quantitation (10 microg/L) and 1000 microg/L. EC recoveries varied from 92 to 112%, with an average value of 100 +/- 8%. The procedure compared well (r2 = 0.9970) with the existing AOAC Official Method, with the added benefits of minimal solvent usage and reduced matrix interferences.
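For readers unfamiliar with internal-standard quantitation, the sketch below shows the usual workflow the abstract implies: the EC/d5-EC peak-area ratio is regressed against standard concentrations over the stated 10-1000 microg/L linear range, then inverted for samples. All numbers are invented.

```python
import numpy as np

conc = np.array([10, 50, 100, 250, 500, 1000])                  # ug/L EC standards
area_ratio = np.array([0.011, 0.052, 0.105, 0.26, 0.51, 1.02])  # EC / d5-EC

slope, intercept = np.polyfit(conc, area_ratio, 1)

def ec_conc(sample_ratio):
    """Back-calculate EC concentration (ug/L) from a sample's area ratio."""
    return (sample_ratio - intercept) / slope

print(f"sample at ratio 0.20 -> {ec_conc(0.20):.0f} ug/L")
```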
NASA Technical Reports Server (NTRS)
Stiller, G. P.; Gunson, M. R.; Lowes, L. L.; Abrams, M. C.; Raper, O. F.; Farmer, C. B.; Zander, R.; Rinsland, C. P.
1995-01-01
A simple, classical, and expedient method for the retrieval of atmospheric pressure-temperature profiles has been applied to the high-resolution infrared solar absorption spectra obtained with the atmospheric trace molecule spectroscopy (ATMOS) instrument. The basis for this method is a rotational analysis of retrieved apparent abundances from CO2 rovibrational absorption lines, employing existing constituent concentration retrieval software used in the analysis of data returned by ATMOS. Pressure-temperature profiles derived from spectra acquired during the ATLAS 1 space shuttle mission of March-April 1992 are quantitatively evaluated and compared with climatological and meteorological data as a means of assessing the validity of this approach.
Crisan, Anamaria; McKee, Geoffrey; Munzner, Tamara
2018-01-01
Background Microbial genome sequencing is now being routinely used in many clinical and public health laboratories. Understanding how to report complex genomic test results to stakeholders who may have varying familiarity with genomics—including clinicians, laboratorians, epidemiologists, and researchers—is critical to the successful and sustainable implementation of this new technology; however, there are no evidence-based guidelines for designing such a report in the pathogen genomics domain. Here, we describe an iterative, human-centered approach to creating a report template for communicating tuberculosis (TB) genomic test results. Methods We used Design Study Methodology—a human centered approach drawn from the information visualization domain—to redesign an existing clinical report. We used expert consults and an online questionnaire to discover various stakeholders’ needs around the types of data and tasks related to TB that they encounter in their daily workflow. We also evaluated their perceptions of and familiarity with genomic data, as well as its utility at various clinical decision points. These data shaped the design of multiple prototype reports that were compared against the existing report through a second online survey, with the resulting qualitative and quantitative data informing the final, redesigned, report. Results We recruited 78 participants, 65 of whom were clinicians, nurses, laboratorians, researchers, and epidemiologists involved in TB diagnosis, treatment, and/or surveillance. Our first survey indicated that participants were largely enthusiastic about genomic data, with the majority agreeing on its utility for certain TB diagnosis and treatment tasks and many reporting some confidence in their ability to interpret this type of data (between 58.8% and 94.1%, depending on the specific data type). When we compared our four prototype reports against the existing design, we found that for the majority (86.7%) of design comparisons, participants preferred the alternative prototype designs over the existing version, and that both clinicians and non-clinicians expressed similar design preferences. Participants showed clearer design preferences when asked to compare individual design elements versus entire reports. Both the quantitative and qualitative data informed the design of a revised report, available online as a LaTeX template. Conclusions We show how a human-centered design approach integrating quantitative and qualitative feedback can be used to design an alternative report for representing complex microbial genomic data. We suggest experimental and design guidelines to inform future design studies in the bioinformatics and microbial genomics domains, and suggest that this type of mixed-methods study is important to facilitate the successful translation of pathogen genomics in the clinic, not only for clinical reports but also more complex bioinformatics data visualization software. PMID:29340235
Mannetje, Andrea 't; Steenland, Kyle; Checkoway, Harvey; Koskela, Riitta-Sisko; Koponen, Matti; Attfield, Michael; Chen, Jingqiong; Hnizdo, Eva; DeKlerk, Nicholas; Dosemeci, Mustafa
2002-08-01
Comprehensive quantitative silica exposure estimates over time, measured in the same units across a number of cohorts, would make possible a pooled exposure-response analysis for lung cancer. Such an analysis would help clarify the continuing controversy regarding whether silica causes lung cancer. Existing quantitative exposure data for 10 silica-exposed cohorts were retrieved from the original investigators. Occupation- and time-specific exposure estimates were either adopted/adapted or developed for each cohort, and converted to milligram per cubic meter (mg/m(3)) respirable crystalline silica. Quantitative exposure assignments were typically based on a large number (thousands) of raw measurements, or otherwise consisted of exposure estimates by experts (for two cohorts). Median exposure level of the cohorts ranged between 0.04 and 0.59 mg/m(3) respirable crystalline silica. Exposure estimates were partially validated via their successful prediction of silicosis in these cohorts. Existing data were successfully adopted or modified to create comparable quantitative exposure estimates over time for 10 silica-exposed cohorts, permitting a pooled exposure-response analysis. The difficulties encountered in deriving common exposure estimates across cohorts are discussed. Copyright 2002 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Gu, Hui-Wen; Zhang, Shan-Hui; Wu, Bai-Chun; Chen, Wu; Wang, Jing-Bo; Liu, Yang
2018-07-01
Oil-field wastewaters contain high levels of polycyclic aromatic hydrocarbons (PAHs), which have to be analyzed to assess the environmental effects before discharge. In this work, a green fluorimetric detection method that combines excitation-emission matrix (EEM) fluorescence with the parallel factor analysis (PARAFAC) algorithm was developed for the first time to achieve the direct and simultaneous determination of six U.S. EPA PAHs in two different kinds of complex oil-field wastewaters. Owing to the distinctive "second-order advantage", neither time-consuming sample pretreatments nor toxic organic reagents were involved in the determination. By using the environment-friendly "mathematical separation" of PARAFAC, satisfactory quantitative results and reasonable spectral profiles for the six PAHs were successfully extracted from the total EEM signals of oil-field wastewaters without the need for chromatographic separation. The limits of detection of the six PAHs were in the range of 0.09-0.72 ng mL-1, and the average spiked recoveries were between (89.4 ± 4.8)% and (109.1 ± 5.8)%, with average relative predictive errors <2.93%. In order to further confirm the accuracy of the proposed method, the same batch of oil-field wastewater samples was analyzed by the recognized GC-MS method. A t-test demonstrated that no significant differences existed between the quantitative results of the two methods. Given its advantages of being green, fast, low-cost, and highly sensitive, the proposed method is expected to serve as an appealing alternative for multi-residue analysis of overlapped PAHs in complex wastewater samples.
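A hedged sketch of the EEM-PARAFAC decomposition using the tensorly library (not the authors' code): a samples x excitation x emission tensor is decomposed with a nonnegative PARAFAC of rank 6, one component per PAH, and the sample-mode scores are what a calibration against spiked standards would convert to concentrations. The data here are random stand-ins.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac

# Stand-in EEM tensor: 20 samples x 40 excitation x 60 emission wavelengths.
eem = np.random.rand(20, 40, 60)

weights, factors = non_negative_parafac(tl.tensor(eem), rank=6, n_iter_max=200)
scores, excitation, emission = factors

# scores[:, k] is proportional to the concentration of component k in each
# sample; fitting a line to spiked standards would convert scores to ng/mL.
```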
Terminology Guideline for Classifying Offshore Wind Energy Resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beiter, Philipp; Musial, Walt
The purpose of this guideline is to establish a clear and consistent vocabulary for conveying offshore wind resource potential and to interpret this vocabulary in terms that are familiar to the oil and gas (O&G) industry. This involves clarifying and refining existing definitions of offshore wind energy resource classes. The terminology developed in this guideline represents one of several possible sets of vocabulary that may differ with respect to their purpose, data availability, and comprehensiveness. It was customized to correspond with established offshore wind practices and existing renewable energy industry terminology (e.g. DOE 2013, Brown et al. 2015) while conforming to established fossil resource classification as closely as possible. The developers of the guideline recognize the fundamental differences that exist between fossil and renewable energy resources with respect to availability, accessibility, lifetime, and quality. Any quantitative comparison between fossil and renewable energy resources, including offshore wind, is therefore limited. For instance, O&G resources are finite and there may be significant uncertainty associated with the amount of the resource. In contrast, aboveground renewable resources, such as offshore wind, do not generally deplete over time but can vary significantly subhourly, daily, seasonally, and annually. The intent of this guideline is to make these differences transparent and develop an offshore wind resource classification that conforms to established fossil resource classifications where possible. This guideline also provides methods to quantitatively compare certain offshore wind energy resources to O&G resource classes for specific applications. Finally, this guideline identifies areas where analogies to established O&G terminology may be inappropriate or subject to misinterpretation.
A study on reliability of power customer in distribution network
NASA Astrophysics Data System (ADS)
Liu, Liyuan; Ouyang, Sen; Chen, Danling; Ma, Shaohua; Wang, Xin
2017-05-01
The existing power supply reliability index system is oriented to the power system and does not consider actual electricity availability on the customer side. In addition, it cannot reflect outages or customer equipment shutdowns caused by instantaneous interruptions and power quality problems. This paper therefore presents a systematic study of the reliability of power customers. By comparison with power supply reliability, customer-side reliability is defined and its evaluation requirements are extracted. An index system, consisting of seven customer indexes and two contrast indexes, is designed to describe customer-side reliability in terms of continuity and availability. In order to comprehensively and quantitatively evaluate the reliability of power customers in distribution networks, an evaluation method is proposed based on an improved entropy method and the punishment weighting principle. Practical application has shown that the proposed index system and evaluation method are reasonable and effective.
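The abstract names an improved entropy method for index weighting but gives no formulas; the sketch below shows only the standard entropy-weighting step it builds on (normalize indicator columns, compute information entropy, weight by one minus entropy). The improvement and the punishment weighting are not reproduced.

```python
import numpy as np

def entropy_weights(X):
    """X: (customers x indicators) matrix of positive reliability indexes."""
    P = X / X.sum(axis=0)                      # column-normalize to proportions
    k = 1.0 / np.log(X.shape[0])
    logP = np.log(P, out=np.zeros_like(P), where=P > 0)
    entropy = -k * (P * logP).sum(axis=0)      # information entropy per index
    d = 1.0 - entropy                          # degree of diversification
    return d / d.sum()                         # normalized weights

X = np.random.rand(30, 9) + 0.1               # 30 customers, 9 indexes (fake)
print(entropy_weights(X).round(3))
```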
Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface
2017-02-01
ECBC-TR-1426 (Vipin Rastogi). ... 1. Introduction: Members of the U.S. Environmental... 2.4 Experimental Design: Each quantitative method was performed three times on three consecutive days. For the CD runs, three...
Chen, Wenan; McDonnell, Shannon K; Thibodeau, Stephen N; Tillmans, Lori S; Schaid, Daniel J
2016-11-01
Functional annotations have been shown to improve both the discovery power and fine-mapping accuracy in genome-wide association studies. However, the optimal strategy to incorporate the large number of existing annotations is still not clear. In this study, we propose a Bayesian framework to incorporate functional annotations in a systematic manner. We compute the maximum a posteriori solution and use cross validation to find the optimal penalty parameters. By extending our previous fine-mapping method CAVIARBF into this framework, we require only summary statistics as input. We also derived an exact calculation of Bayes factors using summary statistics for quantitative traits, which is necessary when a large proportion of trait variance is explained by the variants of interest, such as in fine mapping expression quantitative trait loci (eQTL). We compared the proposed method with PAINTOR using different strategies to combine annotations. Simulation results show that the proposed method achieves the best accuracy in identifying causal variants among the different strategies and methods compared. We also find that for annotations with moderate effects from a large annotation pool, screening annotations individually and then combining the top annotations can produce overly optimistic results. We applied these methods on two real data sets: a meta-analysis result of lipid traits and a cis-eQTL study of normal prostate tissues. For the eQTL data, incorporating annotations significantly increased the number of potential causal variants with high probabilities. Copyright © 2016 by the Genetics Society of America.
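For intuition only: the paper derives an exact Bayes factor from summary statistics, which is not reproduced here; the snippet shows the simpler, widely used approximate Bayes factor of Wakefield (2009), which needs only an effect estimate, its standard error, and a prior variance W on the effect size.

```python
import math

def wakefield_abf(beta_hat, se, W=0.04):
    """Approximate Bayes factor for a nonzero effect vs the null."""
    V = se ** 2                     # variance of the effect estimate
    z2 = (beta_hat / se) ** 2       # squared z-score
    return math.sqrt(V / (V + W)) * math.exp(0.5 * z2 * W / (V + W))

print(wakefield_abf(beta_hat=0.12, se=0.03))
```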
Zhou, Y.; Ojeda-May, P.; Nagaraju, M.; Pu, J.
2016-01-01
Adenosine triphosphate (ATP)-binding cassette (ABC) transporters are ubiquitous ATP-dependent membrane proteins involved in translocations of a wide variety of substrates across cellular membranes. To understand the chemomechanical coupling mechanism as well as functional asymmetry in these systems, a quantitative description of how ABC transporters hydrolyze ATP is needed. Complementary to experimental approaches, computer simulations based on combined quantum mechanical and molecular mechanical (QM/MM) potentials have provided new insights into the catalytic mechanism in ABC transporters. Quantitatively reliable determination of the free energy requirement for enzymatic ATP hydrolysis, however, requires substantial statistical sampling on QM/MM potential. A case study shows that brute force sampling of ab initio QM/MM (AI/MM) potential energy surfaces is computationally impractical for enzyme simulations of ABC transporters. On the other hand, existing semiempirical QM/MM (SE/MM) methods, although affordable for free energy sampling, are unreliable for studying ATP hydrolysis. To close this gap, a multiscale QM/MM approach named reaction path–force matching (RP–FM) has been developed. In RP–FM, specific reaction parameters for a selected SE method are optimized against AI reference data along reaction paths by employing the force matching technique. The feasibility of the method is demonstrated for a proton transfer reaction in the gas phase and in solution. The RP–FM method may offer a general tool for simulating complex enzyme systems such as ABC transporters. PMID:27498639
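A toy illustration of the force-matching step named in the abstract (the real RP-FM workflow couples semiempirical QM/MM forces to ab initio references along reaction paths): parameters of a cheap force model are fit by least squares so its forces match reference forces at sampled geometries. The Morse-like model and all values are stand-ins.

```python
import numpy as np
from scipy.optimize import least_squares

r = np.linspace(0.8, 2.5, 40)    # bond lengths along a toy reaction path
# Reference forces (playing the role of ab initio data): Morse force with
# d = 1.0, a = 1.5, r0 = 1.1.
f_ref = 2.0 * 1.5 * (np.exp(-1.5 * (r - 1.1)) - np.exp(-3.0 * (r - 1.1)))

def f_model(params, r):
    d, a, r0 = params             # adjustable, SE-like parameters
    return 2.0 * d * a * (np.exp(-a * (r - r0)) - np.exp(-2 * a * (r - r0)))

fit = least_squares(lambda p: f_model(p, r) - f_ref, x0=[1.0, 1.0, 1.0])
print("fitted parameters:", fit.x.round(3))
```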
From themes to hypotheses: following up with quantitative methods.
Morgan, David L
2015-06-01
One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.
Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G
2017-12-01
Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation in quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, with quantitation data presented. Linearity, bias, and other metrics are presented, along with recommendations on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.
Aggarwal, Priya; Gupta, Anubha
2017-12-01
A number of reconstruction methods have been proposed recently for accelerated functional Magnetic Resonance Imaging (fMRI) data collection. However, existing methods suffer from greater artifacts at high acceleration factors. This paper addresses the issue of accelerating fMRI collection via undersampled k-space measurements combined with a proposed method based on l1-l1 norm constraints, wherein we impose the first l1-norm sparsity penalty on the voxel time series (temporal data) in the transformed domain and the second l1-norm sparsity penalty on the successive differences of the same temporal data. Hence, we name the proposed method the Double Temporal Sparsity based Reconstruction (DTSR) method. The robustness of the proposed DTSR method has been thoroughly evaluated both at the subject level and at the group level on real fMRI data. Results are presented at various acceleration factors. Quantitative analysis in terms of Peak Signal-to-Noise Ratio (PSNR) and other metrics, and qualitative analysis in terms of reproducibility of brain Resting State Networks (RSNs), demonstrate that the proposed method is accurate and robust. In addition, the proposed DTSR method preserves brain networks that are important for studying fMRI data. Compared to existing methods, the DTSR method shows promising potential, with an improvement of 10-12 dB in PSNR at acceleration factors up to 3.5 on resting-state fMRI data. Simulation results on real data demonstrate that the DTSR method can be used to acquire accelerated fMRI with accurate detection of RSNs. Copyright © 2017 Elsevier Ltd. All rights reserved.
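A toy sketch of the two penalties the DTSR objective combines, on a 1-D denoising problem rather than true k-space reconstruction (sampling masks, transforms, and the actual solver are simplified away): an l1 term on the signal and an l1 term on its successive differences, handled here with a prox step and a crude subgradient step.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding: proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def l1_l1_denoise(y, lam1=0.05, lam2=0.2, n_iter=300, step=0.2):
    """Toy analogue: min_x 0.5*||x - y||^2 + lam1*||x||_1 + lam2*||Dx||_1."""
    x = y.copy()
    for _ in range(n_iter):
        s = np.sign(np.diff(x))          # subgradient of the difference term
        g = x - y                        # gradient of the data-fit term
        g[1:] += lam2 * s
        g[:-1] -= lam2 * s
        x = soft(x - step * g, step * lam1)   # prox step for the first l1 term
    return x

t = np.linspace(0, 1, 200)
clean = (t > 0.3).astype(float) - (t > 0.7)    # piecewise-constant signal
noisy = clean + 0.15 * np.random.randn(t.size)
print(np.abs(l1_l1_denoise(noisy) - clean).mean())
```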
Advanced quantitative measurement methodology in physics education research
NASA Astrophysics Data System (ADS)
Wang, Jing
The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods and interpretation of the results obtained by these methods shall be connected to the education background. In this connecting process, the issues of educational models are often raised. Many widely used statistical methods do not make assumptions on the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumption and parameter estimation, and are complicated mathematically. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods under physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. The two theories are both applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarity and difference between the two theories, and the possible explanation to the difference. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The current popular measures of association fail under some extremely unbalanced conditions. However, the occurrence of these conditions is not rare in educational data. Two popular association measures, the Pearson's correlation and the tetrachoric correlation are examined. A new method, model based association is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. 
Results show that all the methods have their advantages and disadvantages. Special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed. Typical misunderstanding and misusage of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The reasoning ability structures for U.S. and Chinese students at different educational levels are given by the analysis. A final discussion on the advanced quantitative assessment methodology and the pure mathematical methodology is presented at the end.
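The CTT side of the first comparison rests on two item statistics that are simple to compute; a minimal sketch on a fake 0/1 response matrix (item difficulty as proportion correct, discrimination as the corrected point-biserial correlation):

```python
import numpy as np

def ctt_item_stats(R):
    """R: (students x items) binary response matrix (1 = correct)."""
    difficulty = R.mean(axis=0)                      # proportion correct
    total = R.sum(axis=1)
    discrimination = np.array([np.corrcoef(R[:, j], total - R[:, j])[0, 1]
                               for j in range(R.shape[1])])
    return difficulty, discrimination

R = (np.random.rand(500, 30) > 0.4).astype(int)      # fake FCI-like responses
p, r_pb = ctt_item_stats(R)
print(p[:5].round(2), r_pb[:5].round(2))
```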
Nonstandard Work Schedules and Partnership Quality: Quantitative and Qualitative Findings
ERIC Educational Resources Information Center
Mills, Melinda; Taht, Kadri
2010-01-01
This article questions existing findings and provides new evidence about the consequences of nonstandard work schedules on partnership quality. Using quantitative couple data from The Netherlands Kinship Panel Study (NKPS) (N = 3,016) and semistructured qualitative interviews (N = 34), we found that, for women, schedules with varying hours…
Conceptions and Practices of Assessment: A Case of Teachers Representing Improvement Conception
ERIC Educational Resources Information Center
Azis, Astuti
2015-01-01
Despite numerous quantitative studies on teachers' conceptions and practices of assessment, little research exists regarding the unique assessment environment of Indonesia. This study uses both quantitative and qualitative data to examine how Indonesian junior high school teachers understand assessment and how their conceptions of assessment…
A Transformative Model for Undergraduate Quantitative Biology Education
ERIC Educational Resources Information Center
Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.
2010-01-01
The "BIO2010" report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3)…
ERIC Educational Resources Information Center
McDaniel, Mark A.; Stoen, Siera M.; Frey, Regina F.; Markow, Zachary E.; Hynes, K. Mairin; Zhao, Jiuqing; Cahill, Michael J.
2016-01-01
The existing literature indicates that interactive-engagement (IE) based general physics classes improve conceptual learning relative to more traditional lecture-oriented classrooms. Very little research, however, has examined quantitative problem-solving outcomes from IE based relative to traditional lecture-based physics classes. The present…
Servant Leadership and Its Impact on Classroom Climate and Student Achievement
ERIC Educational Resources Information Center
Mulligan, Daniel F.
2016-01-01
The purpose of this quantitative research was to see to what degree a relationship existed between servant leadership, classroom climate, and student achievement in a collegiate environment. This was a quantitative, correlational study. The foundational theories for this research included servant leadership and organizational climate that pertain…
A Quantitative Assessment of Test Anxiety and Human-Animal Interaction in College Students
ERIC Educational Resources Information Center
Dluzynski, Jessica L.
2017-01-01
Existing research on human-animal interactions has established that engaging with an animal may reduce anxiety-like behaviors (Acheson et al., 2013; Sobota, Mihara, Forrest, Featherstone, & Siegel, 2015; Yates, 2012) and lower physiological response in stressful situations (Campo & Uchino, 2013). This quantitative research study expanded…
Shi, Chuan; Goldberg, Shalom; Lin, Tricia; Dudkin, Vadim; Widdison, Wayne; Harris, Luke; Wilhelm, Sharon; Jmeian, Yazen; Davis, Darryl; O'Neil, Karyn; Weng, Naidong; Jian, Wenying
2018-04-17
Bioanalysis of antibody-drug conjugates (ADCs) is challenging due to the complex, heterogeneous nature of their structures and their complicated catabolism. To fully describe the pharmacokinetics (PK) of an ADC, several analytes are commonly quantified, including total antibody, conjugate, and payload. Among them, conjugate is the most challenging to measure, because it requires detection of both small and large molecules as one entity. Existing approaches to quantify the conjugated species of ADCs involve a ligand binding assay (LBA) for conjugated antibody or hybrid LBA/liquid chromatography/tandem mass spectrometry (LC/MS/MS) for quantitation of conjugated drug. In our current work on a protein-drug conjugate (PDC) using the Centyrin scaffold, similar in concept to ADCs but with a smaller protein size, an alternative method to quantify the conjugate using a surrogate peptide approach was utilized. The His-tagged proteins were isolated from biological samples using immobilized metal affinity chromatography (IMAC), followed by trypsin digestion. The tryptic peptide containing the linker attached to the payload was used as a surrogate for the conjugate and monitored by LC/MS/MS analysis. During method development and its application, we found that hydrolysis of the succinimide ring of the linker was ubiquitous, taking place at many stages during the lifetime of the PDC, including in the initial drug product, in vivo in circulation in the animals, and ex vivo during the trypsin digestion step of sample preparation. We have shown that hydrolysis during trypsin digestion is concentration-independent and consistent during the workflow, and therefore has no impact on assay performance. However, for samples that have undergone extensive hydrolysis prior to trypsin digestion, significant bias could be introduced if only the non-hydrolyzed form is considered in the quantitation. Therefore, it is important to incorporate succinimide hydrolysis products in the quantitation method in order to provide an accurate estimate of the total conjugate level. More importantly, the LC/MS/MS-based method described here provides a useful tool to quantitatively evaluate succinimide hydrolysis of ADCs in vivo, which has previously been reported to have a significant impact on their stability, exposure, and efficacy.
Li, Chuan-Xi; Chen, Peng; Wang, Ru-Jing; Wang, Xiu-Jie; Su, Ya-Ru; Li, Jinyan
2014-01-01
Mining Protein-Protein Interactions (PPIs) from the fast-growing biomedical literature has been proven an effective approach for identifying biological regulatory networks. This paper presents a novel method based on an Interaction Relation Ontology (IRO), which specifies and organises words describing various protein interaction relationships. Our method is a two-stage PPI extraction method. First, the IRO is applied in a binary classifier to determine whether a sentence contains a relation or not. Then, the IRO guides PPI extraction by building sentence dependency parse trees. Comprehensive quantitative evaluations and detailed analyses demonstrate the strong performance of the IRO on relation sentence classification and PPI extraction. Our PPI extraction method yielded recalls of around 80% and 90% and F1 scores of around 54% and 66% on the AIMed and BioInfer corpora, respectively, which are superior to most existing extraction methods.
Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol
2011-02-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.
Quantitative Tools for Examining the Vocalizations of Juvenile Songbirds
Wellock, Cameron D.; Reeke, George N.
2012-01-01
The singing of juvenile songbirds is highly variable and not well stereotyped, a feature that makes it difficult to analyze with existing computational techniques. We present here a method suitable for analyzing such vocalizations, windowed spectral pattern recognition (WSPR). Rather than performing pairwise sample comparisons, WSPR measures the typicality of a sample against a large sample set. We also illustrate how WSPR can be used to perform a variety of tasks, such as sample classification, song ontogeny measurement, and song variability measurement. Finally, we present a novel measure, based on WSPR, for quantifying the apparent complexity of a bird's singing. PMID:22701474
Global human capital: integrating education and population.
Lutz, Wolfgang; KC, Samir
2011-07-29
Almost universally, women with higher levels of education have fewer children. Better education is associated with lower mortality, better health, and different migration patterns. Hence, the global population outlook depends greatly on further progress in education, particularly of young women. By 2050, the highest and lowest education scenarios--assuming identical education-specific fertility rates--result in world population sizes of 8.9 and 10.0 billion, respectively. Better education also matters for human development, including health, economic growth, and democracy. Existing methods of multi-state demography can quantitatively integrate education into standard demographic analysis, thus adding the "quality" dimension.
Intelligence's likelihood and evolutionary time frame
NASA Astrophysics Data System (ADS)
Bogonovich, Marc
2011-04-01
This paper outlines hypotheses relevant to the evolution of intelligent life and encephalization in the Phanerozoic. If general principles are inferable from patterns of Earth life, implications could be drawn for astrobiology. Many of the outlined hypotheses, relevant data, and associated evolutionary and ecological theory are not frequently cited in astrobiological journals; thus an opportunity exists to evaluate the reviewed hypotheses from an astrobiological perspective. A quantitative method is presented for testing one of the reviewed hypotheses (hypothesis i, the diffusion hypothesis). Questions are presented throughout, which illustrate that the question of intelligent life's likelihood can be expressed as multiple, broadly ranging, more tractable questions.
Sutton, Katherine S; Stratton, Natalie; Pytyck, Jennifer; Kolla, Nathan J; Cantor, James M
2015-01-01
Hypersexuality remains an increasingly common but poorly understood patient complaint. Despite diversity in clinical presentations of patients referred for hypersexuality, the literature has maintained treatment approaches that are assumed to apply to the entire phenomenon. This approach has proven ineffective, despite its application over several decades. The present study used quantitative methods to examine demographic, mental health, and sexological correlates of common clinical subtypes of hypersexuality referrals. Findings support the existence of subtypes, each with distinct clusters of features. Paraphilic hypersexuals reported greater numbers of sexual partners, more substance abuse, initiation to sexual activity at an earlier age, and novelty as a driving force behind their sexual behavior. Avoidant masturbators reported greater levels of anxiety, delayed ejaculation, and use of sex as an avoidance strategy. Chronic adulterers reported premature ejaculation and later onset of puberty. Designated patients were less likely to report substance abuse, employment, or finance problems. Although quantitative, this article nonetheless presents a descriptive study in which the underlying typology emerged from features most salient in routine sexological assessment. Future studies might apply purely empirical statistical techniques, such as cluster analyses, to ascertain to what extent similar typologies emerge when examined prospectively.
[Quantitative estimation of the sources of urban atmospheric CO2 by carbon isotope composition].
Liu, Wei; Wei, Nan-Nan; Wang, Guang-Hua; Yao, Jian; Zeng, You-Shi; Fan, Xue-Bo; Geng, Yan-Hong; Li, Yan
2012-04-01
To effectively reduce urban carbon emissions and to verify the effectiveness of current urban carbon emission reduction projects, the sources of urban atmospheric CO2 must be estimated quantitatively and correctly. Since little carbon isotope fractionation occurs during transport from pollution sources to the receptor, the carbon isotope composition can be used for source apportionment. In the present study, a method was established to quantitatively estimate the sources of urban atmospheric CO2 from the carbon isotope composition. Diurnal and height variations of the concentrations of CO2 derived from biomass, vehicle exhaust, and coal burning were further determined for atmospheric CO2 in the Jiading district of Shanghai. Biomass-derived CO2 accounts for the largest portion of atmospheric CO2. The concentrations of CO2 derived from coal burning are larger at night (00:00, 04:00 and 20:00) than in the daytime (08:00, 12:00 and 16:00), and increase with height. Those derived from vehicle exhaust decrease with increasing height. The diurnal and height variations of the sources reflect the emission and transport characteristics of atmospheric CO2 in the Jiading district of Shanghai.
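A hedged sketch of the isotope mass-balance idea (the paper's end-member values and any extra constraints, such as radiocarbon or concentration data, are not in the abstract): source fractions are solved from sum(f_i) = 1 and sum(f_i * d13C_i) = observed d13C. With three sources and two equations the system is underdetermined, so this is illustration only.

```python
import numpy as np
from scipy.optimize import nnls

d13c_sources = np.array([-26.0, -28.5, -24.0])  # biomass, vehicle, coal (assumed)
d13c_obs = -26.4                                 # measured atmospheric CO2

# Rows: fractions sum to one; mixture reproduces the observed d13C.
A = np.vstack([np.ones(3), d13c_sources])
b = np.array([1.0, d13c_obs])
fractions, _ = nnls(A, b)                        # nonnegative least squares
print(dict(zip(["biomass", "vehicle", "coal"], fractions.round(2))))
```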
NASA Astrophysics Data System (ADS)
Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.
We have developed a Quantitative Structure-Activity Relationship (QSAR)-based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationship (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximation (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using them to predict sensor activities for test analytes not included in the training set used for model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict the response of an existing sensing film to new target analytes.
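As a stand-in for the Genetic Function Approximation model selection (an evolutionary search over descriptor subsets), the sketch below fits a sparse linear model mapping molecular descriptors to sensor response; the shapes and data are invented, and lasso is named plainly as a substitute, not the authors' method.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 25))        # 60 exposures x 25 descriptors (fake)
y = X @ (rng.standard_normal(25) * 0.1) + rng.standard_normal(60) * 0.05

model = LassoCV(cv=5).fit(X, y)
kept = np.flatnonzero(model.coef_)       # descriptors retained by the fit
print(f"retained {kept.size} descriptors; R^2 = {model.score(X, y):.2f}")
```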
NASA Technical Reports Server (NTRS)
Garcia, M. J.; Thomas, J. D.; Greenberg, N.; Sandelski, J.; Herrera, C.; Mudd, C.; Wicks, J.; Spencer, K.; Neumann, A.; Sankpal, B.;
2001-01-01
Digital format is rapidly emerging as a preferred method for displaying and retrieving echocardiographic studies. The qualitative diagnostic accuracy of Moving Pictures Experts Group (MPEG-1) compressed digital echocardiographic studies has been previously reported. The goals of the present study were to compare quantitative measurements derived from MPEG-1 recordings with the super-VHS (sVHS) videotape clinical standard. Six reviewers performed blinded measurements from still-frame images selected from 20 echocardiographic studies that were simultaneously acquired in sVHS and MPEG-1 formats. Measurements were obtainable in 1401 (95%) of 1486 MPEG-1 variables compared with 1356 (91%) of 1486 sVHS variables (P < .001). Excellent agreement existed between MPEG-1 and sVHS 2-dimensional linear measurements (r = 0.97; MPEG-1 = 0.95[sVHS] + 1.1 mm; P < .001; Delta = 9% +/- 10%), 2-dimensional area measurements (r = 0.89), color jet areas (r = 0.87, P < .001), and Doppler velocities (r = 0.92, P < .001). Interobserver variability was similar for both sVHS and MPEG-1 readings. Our results indicate that quantitative off-line measurements from MPEG-1 digitized echocardiographic studies are feasible and comparable to those obtained from sVHS.
Development and application of a DNA microarray-based yeast two-hybrid system
Suter, Bernhard; Fontaine, Jean-Fred; Yildirimman, Reha; Raskó, Tamás; Schaefer, Martin H.; Rasche, Axel; Porras, Pablo; Vázquez-Álvarez, Blanca M.; Russ, Jenny; Rau, Kirstin; Foulle, Raphaele; Zenkner, Martina; Saar, Kathrin; Herwig, Ralf; Andrade-Navarro, Miguel A.; Wanker, Erich E.
2013-01-01
The yeast two-hybrid (Y2H) system is the most widely applied methodology for systematic protein–protein interaction (PPI) screening and the generation of comprehensive interaction networks. We developed a novel Y2H interaction screening procedure using DNA microarrays for high-throughput quantitative PPI detection. Applying a global pooling and selection scheme to a large collection of human open reading frames, proof-of-principle Y2H interaction screens were performed for the human neurodegenerative disease proteins huntingtin and ataxin-1. Using systematic controls for unspecific Y2H results and quantitative benchmarking, we identified and scored a large number of known and novel partner proteins for both huntingtin and ataxin-1. Moreover, we show that this parallelized screening procedure and the global inspection of Y2H interaction data are uniquely suited to define specific PPI patterns and their alteration by disease-causing mutations in huntingtin and ataxin-1. This approach takes advantage of the specificity and flexibility of DNA microarrays and of the existence of solid-related statistical methods for the analysis of DNA microarray data, and allows a quantitative approach toward interaction screens in human and in model organisms. PMID:23275563
Methods of quantitative risk assessment: The case of the propellant supply system
NASA Astrophysics Data System (ADS)
Merz, H. A.; Bienz, A.
1984-08-01
As a consequence of the disastrous accident in Lapua (Finland) in 1976, where an explosion in a cartridge loading facility killed 40 and injured more than 70 persons, efforts were undertaken to examine and improve the safety of such installations. An ammunition factory in Switzerland considered the replacement of the manual supply of propellant hoppers by a new pneumatic supply system. This would reduce the maximum quantity of propellant in the hoppers to a level, where an accidental ignition would no longer lead to a detonation, and this would drastically limit the effects on persons. A quantitative risk assessment of the present and the planned supply system demonstrated that, in this particular case, the pneumatic supply system would not reduce the risk enough to justify the related costs. In addition, it could be shown that the safety of the existing system can be improved more effectively by other safety measures at considerably lower costs. Based on this practical example, the advantages of a strictly quantitative risk assessment for the safety planning in explosives factories are demonstrated. The methodological background of a risk assessment and the steps involved in the analysis are summarized. In addition, problems of quantification are discussed.
Studying learning in the healthcare setting: the potential of quantitative diary methods.
Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke
2015-08-01
Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.
Yousefi, Siavash; Qin, Jia; Zhi, Zhongwei
2013-01-01
Optical microangiography is an imaging technology capable of providing detailed functional blood flow maps within microcirculatory tissue beds in vivo. However, some practical issues exist when displaying and quantifying the microcirculation that perfuses the scanned tissue volume. These issues include: (I) the probing light is subject to specular reflection when it shines onto the sample, and the unevenness of the tissue surface means the light energy entering the tissue is not uniform over the entire scanned volume; (II) biological tissue is heterogeneous in nature, meaning the scattering and absorption properties of the tissue attenuate the probe beam. These physical limitations can result in local contrast degradation and non-uniform micro-angiogram images. In this paper, we propose a post-processing method that uses Rayleigh contrast-limited adaptive histogram equalization to increase the contrast and improve the overall appearance and uniformity of optical micro-angiograms without saturating the vessel intensity or changing the physical meaning of the micro-angiograms. The qualitative and quantitative performance of the proposed method is compared with that of common histogram equalization and contrast enhancement methods. We demonstrate that the proposed method outperforms other existing approaches. The proposed method is not limited to optical microangiography and can be used in other imaging modalities such as photoacoustic tomography and scanning laser confocal microscopy. PMID:23482880
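scikit-image ships a generic CLAHE that can stand in for the contrast-limited step; note that its equalize_adapthist uses a uniform target histogram, not the Rayleigh-shaped target the paper's variant applies, so this is only a partial sketch:

```python
import numpy as np
from skimage import exposure

angiogram = np.random.rand(512, 512)     # stand-in for an en-face angiogram
enhanced = exposure.equalize_adapthist(angiogram, kernel_size=64,
                                       clip_limit=0.01)
```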
Dawes, Jo; Deaton, Stuart; Greenwood, Nan
2017-06-30
The purpose of this study was to appraise referrals of homeless patients to physiotherapy services and explore perceptions of barriers to access. This exploratory mixed-method study used a follow-up qualitative extension to a core quantitative research design. Over 9 months, quantitative data were gathered from the healthcare records of homeless patients referred to physiotherapy by a general practitioner (GP) practice, including the number of referrals and demographic data of all homeless patients referred. Corresponding physiotherapy records of those people referred to physiotherapy were searched for the outcome of their care. Qualitative semi-structured telephone interviews, based on the quantitative findings, were carried out with staff involved in patient care at the referring GP practice and were used to expand on those findings. Two primary care sites provided data for this study: a GP practice dedicated exclusively to homeless people and the physiotherapy department receiving their referrals. Quantitative data from the healthcare records of 34 homeless patient referrals to physiotherapy were collected and analysed. In addition, five staff involved in patient care were interviewed. Thirty-four referrals of homeless people were made to physiotherapy in a 9-month period. It was possible to match 25 of these to records from the physiotherapy department. Nine (36%) patients did not attend their first appointment; seven (28%) attended an initial appointment, but did not attend a subsequent appointment and were discharged from the service; five (20%) completed treatment and four patients (16%) had ongoing treatment. Semi-structured interviews revealed potential barriers preventing homeless people from accessing physiotherapy services, the complex factors being faced by those making referrals and possible ways to improve physiotherapy access. Homeless people with musculoskeletal problems may fail to access physiotherapy treatment, but opportunities exist to make access to physiotherapy easier. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Comparison of MRI-based estimates of articular cartilage contact area in the tibiofemoral joint.
Henderson, Christopher E; Higginson, Jill S; Barrance, Peter J
2011-01-01
Knee osteoarthritis (OA) detrimentally impacts the lives of millions of older Americans through pain and decreased functional ability. Unfortunately, the pathomechanics and associated deviations from joint homeostasis that OA patients experience are not well understood. Alterations in mechanical stress in the knee joint may play an essential role in OA; however, existing literature in this area is limited. The purpose of this study was to evaluate the ability of an existing magnetic resonance imaging (MRI)-based modeling method to estimate articular cartilage contact area in vivo. Imaging data of both knees were collected on a single subject with no history of knee pathology at three knee flexion angles. Intra-observer reliability and sensitivity studies were also performed to determine the role of operator-influenced elements of the data processing on the results. The method's articular cartilage contact area estimates were compared with existing contact area estimates in the literature. The method demonstrated an intra-observer reliability of 0.95 when assessed using Pearson's correlation coefficient and was found to be most sensitive to changes in the cartilage tracings on the peripheries of the compartment. The articular cartilage contact area estimates at full extension were similar to those reported in the literature. The relationships between tibiofemoral articular cartilage contact area and knee flexion were also qualitatively and quantitatively similar to those previously reported. The MRI-based knee modeling method was found to have high intra-observer reliability, sensitivity to peripheral articular cartilage tracings, and agreement with previous investigations when using data from a single healthy adult. Future studies will implement this modeling method to investigate the role that mechanical stress may play in progression of knee OA through estimation of articular cartilage contact area.
A test for selection employing quantitative trait locus and mutation accumulation data.
Rice, Daniel P; Townsend, Jeffrey P
2012-04-01
Evolutionary biologists attribute much of the phenotypic diversity observed in nature to the action of natural selection. However, for many phenotypic traits, especially quantitative phenotypic traits, it has been challenging to test for the historical action of selection. An important challenge for biologists studying quantitative traits, therefore, is to distinguish between traits that have evolved under the influence of strong selection and those that have evolved neutrally. Most existing tests for selection employ molecular data, but selection also leaves a mark on the genetic architecture underlying a trait. In particular, the distribution of quantitative trait locus (QTL) effect sizes and the distribution of mutational effects together provide information regarding the history of selection. Despite the increasing availability of QTL and mutation accumulation data, such data have not yet been effectively exploited for this purpose. We present a model of the evolution of QTL and employ it to formulate a test for historical selection. To provide a baseline for neutral evolution of the trait, we estimate the distribution of mutational effects from mutation accumulation experiments. We then apply a maximum-likelihood-based method of inference to estimate the range of selection strengths under which such a distribution of mutations could generate the observed QTL. Our test thus represents the first integration of population genetic theory and QTL data to measure the historical influence of selection.
On normality, ethnicity, and missing values in quantitative trait locus mapping
Labbe, Aurélie; Wormald, Hanna
2005-01-01
Background: This paper deals with the detection of significant linkage for quantitative traits using a variance components approach. Microsatellite markers were obtained for the Genetic Analysis Workshop 14 Collaborative Study on the Genetics of Alcoholism data. Ethnic heterogeneity, highly skewed quantitative measures, and a high rate of missing values are all present in this dataset and well known to impact upon linkage analysis. This makes it a good candidate for investigation. Results: As expected, we observed a number of changes in LOD scores, especially for chromosomes 1, 7, and 18, in relation to the three factors studied. A dramatic example of such changes can be found in chromosome 7. Highly significant linkage to one of the quantitative traits became insignificant when a proper normalizing transformation of the trait was used and when analysis was carried out on an ethnically homogeneous subset of the original pedigrees. Conclusion: In agreement with existing literature, transforming a trait to ensure normality using a Box-Cox transformation is highly recommended in order to avoid false-positive linkages. Furthermore, pedigrees should be sorted by ethnic groups and analyses should be carried out separately. Finally, one should be aware that the inclusion of covariates with a high rate of missing values considerably reduces the number of subjects included in the model. In such a case, the loss in power may be large. Imputation methods are then recommended. PMID:16451664
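As an illustration of the normalizing transformation recommended above, the sketch below applies a Box-Cox transform to a skewed quantitative trait with SciPy; the trait values are simulated, not the COGA data.

```python
# Minimal sketch, assuming a strictly positive, right-skewed trait:
# Box-Cox normalization before linkage analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
trait = rng.lognormal(mean=1.0, sigma=0.8, size=200)  # skewed trait

transformed, lmbda = stats.boxcox(trait)  # MLE of the Box-Cox lambda
print(f"estimated lambda = {lmbda:.2f}")
print(f"skewness before = {stats.skew(trait):.2f}, "
      f"after = {stats.skew(transformed):.2f}")
```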
Muleme, James; Kankya, Clovice; Ssempebwa, John C.; Mazeri, Stella; Muwonge, Adrian
2017-01-01
Knowledge, attitude, and practice (KAP) studies guide the implementation of public health interventions (PHIs), and they are important tools for political persuasion. The design and implementation of PHIs assumes a linear KAP relationship, i.e., an awareness campaign results in the desirable societal behavioral change. However, there is no robust framework for testing this relationship before and after PHIs. Here, we use qualitative and quantitative data on pesticide usage to test this linear relationship, identify associated context-specific factors, and assemble a framework that could be used to guide and evaluate PHIs. We used data from a cross-sectional mixed methods study on pesticide usage. Quantitative data were collected using a structured questionnaire from 167 households representing 1,002 individuals. Qualitative data were collected from key informants and focus group discussions. Quantitative data were analyzed in R 3.2.0, and qualitative data were subjected to thematic analysis. Our framework shows that a KAP linear relationship only existed for households with a low knowledge score, suggesting that an awareness campaign would only be effective for ~37% of the households. Context-specific socioeconomic factors explain why this relationship does not hold for households with high knowledge scores. These findings are essential for developing targeted, cost-effective, and sustainable interventions on pesticide usage and other PHIs with context-specific modifications. PMID:29276703
Intelligent model-based diagnostics for vehicle health management
NASA Astrophysics Data System (ADS)
Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki
2003-08-01
The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
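The paper pairs quantitative simulation with graph-based dependency models; the sketch below illustrates the dependency-model half with a toy D-matrix, in which each row is a hypothetical fault signature over a set of tests. This is an illustration of the general technique, not the authors' implementation.

```python
# Hedged illustration: single-fault isolation with a dependency (D-) matrix.
# Rows are hypothetical faults, columns are tests; 1 means the fault
# causes that test to fail.
import numpy as np

D = np.array([[1, 1, 0],   # fault: worn sensor
              [0, 1, 1],   # fault: leaking valve
              [1, 0, 1]])  # fault: ECU software error
faults = ["worn sensor", "leaking valve", "ECU software error"]

observed = np.array([0, 1, 1])  # test outcomes: pass, fail, fail

# Keep the faults whose signature matches the observed outcomes exactly.
candidates = [f for f, row in zip(faults, D) if np.array_equal(row, observed)]
print("consistent fault(s):", candidates or ["no single-fault explanation"])
```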
Assessment of the Casualty Risk of Multiple Meteorological Hazards in China
Xu, Wei; Zhuo, Li; Zheng, Jing; Ge, Yi; Gu, Zhihui; Tian, Yugang
2016-01-01
A study of the frequency, intensity, and risk of extreme climatic events or natural hazards is important for assessing the impacts of climate change. Many models have been developed to assess the risk of multiple hazards; however, most of the existing approaches can only model the relative levels of risk. This paper reports the development of a method for the quantitative assessment of the risk of multiple hazards based on information diffusion. This method was used to assess the risks of loss of human lives from 11 types of meteorological hazards in China at the prefectural and provincial levels. Risk curves of multiple hazards were obtained for each province and the risks at the 10-year, 20-year, 50-year, and 100-year return periods were mapped. The results show that the provinces (municipalities, autonomous regions) in southeastern China are at higher risk of multiple meteorological hazards as a result of their geographical location and topography. The results of this study can be used as references for the management of meteorological disasters in China. The model can be used to quantitatively calculate the risks of casualty, direct economic losses, building collapse, and agricultural losses for any hazards at different spatial scales. PMID:26901210
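A minimal sketch of the normal information diffusion idea named above, under stated assumptions: each observation in a small loss sample spreads unit probability mass over a grid of monitoring points through a Gaussian kernel, and exceedance probabilities then yield return-period loss levels. The sample and the diffusion-width coefficient are illustrative assumptions, not the paper's data or exact estimator.

```python
# Hedged sketch of normal information diffusion for a small hazard-loss
# sample (hypothetical annual casualty counts).
import numpy as np

losses = np.array([2.0, 5.0, 9.0, 14.0, 30.0])   # hypothetical sample
u = np.linspace(0.0, 50.0, 101)                  # monitoring points
n = losses.size
# Diffusion width: one published coefficient choice, treated here as an
# assumption rather than the paper's exact formula.
h = 1.4208 * (losses.max() - losses.min()) / (n - 1)

kernel = np.exp(-(losses[:, None] - u[None, :]) ** 2 / (2.0 * h ** 2))
kernel /= kernel.sum(axis=1, keepdims=True)      # each sample spreads unit mass
p = kernel.sum(axis=0) / n                       # diffused probability estimate

exceed = p[::-1].cumsum()[::-1]                  # P(loss >= u), non-increasing
for T in (10, 20, 50):                           # return periods in years
    idx = min(int(np.searchsorted(-exceed, -1.0 / T)), u.size - 1)
    print(f"{T}-year loss level ~ {u[idx]:.1f}")
```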
Polymenidou, Magdalini; Verghese-Nikolakaki, Susan; Groschup, Martin; Chaplin, Melanie J; Stack, Mick J; Plaitakis, Andreas; Sklaviadis, Theodoros
2002-01-01
Background: Transmissible spongiform encephalopathies (TSEs) are neurodegenerative diseases affecting both humans and animals. They are associated with post-translational conversion of the normal cellular prion protein (PrPC) into a heat- and protease-resistant abnormal isoform (PrPSc). Detection of PrPSc in individuals is widely utilized for the diagnosis of prion diseases. Methods: TSE brain tissue samples have been processed in order to quantitatively isolate PrPSc. The protocol includes an initial homogenization, digestion with proteinase K and salt precipitation. Results: Here we show that over 97 percent of the PrPSc present can be precipitated from infected brain material using this simple salting-out procedure for proteins. No chemically harsh conditions are used during the process in order to conserve the native quality of the isolated protein. Conclusion: The resulting PrPSc-enriched preparation should provide a suitable substrate for analyzing the structure of the prion agent and for scavenging for other molecules with which it may associate. In comparison with most methods that exist today, the one described in this study is rapid, cost-effective and does not demand expensive laboratory equipment. PMID:12370086
Pappa, Katherine; Doty, Tasha; Taff, Steven D; Kniepmann, Kathy; Foster, Erin R
2017-01-01
To explore the potential influence of the Stanford Chronic Disease Self-Management Program (CDSMP) on social support in Parkinson disease (PD). This was a quasi-experimental mixed methods design. Volunteers with PD (n=27) and care partners (n=6) completed the CDSMP, questionnaires of social support and self-management outcomes, and an interview about social support in relation to CDSMP participation. PD participants (n=19) who did not participate in the CDSMP completed the questionnaires for quantitative comparison purposes. Regarding the quantitative data, there were no significant effects of CDSMP participation on social support questionnaire scores; however, there were some positive correlations between changes in social support and changes in self-management outcomes from pre- to post-CDSMP participation. Three qualitative themes emerged from the interviews: lack of perceived change in amount and quality of social support, positive impact on existing social networks, and benefit from participating in a supportive PD community. Although participants did not acknowledge major changes in social support, there were some social support-related benefits of CDSMP participation for PD participants and care partners. These findings provide a starting point for more in-depth studies of social support and self-management in this population.
NASA Astrophysics Data System (ADS)
El-Nour, K. M. A.; Salam, E. T. A.; Soliman, H. M.; Orabi, A. S.
2017-03-01
A new optical sensor was developed for rapid, highly sensitive screening for the existence of biogenic amines (BAs) in poultry meat samples. Gold nanoparticles (GNPs) with particle sizes of 11-19 nm function as a fast and sensitive biosensor for detection of histamine resulting from bacterial decarboxylation of histidine, a spoilage marker for stored poultry meat. Upon reaction with histamine, the red color of the GNPs converts into deep blue. The appearance of the blue color coincides with BA concentrations high enough to induce symptoms of poisoning. This biosensor enables semi-quantitative detection of the analyte in real samples by the naked eye. Quality evaluation is carried out by measuring histamine and histidine using different analytical techniques such as UV-vis, FTIR, and fluorescence spectroscopy as well as TEM. A rapid quantitative readout of samples by UV-vis and fluorescence methods with standard instrumentation was proposed, requiring far less time than chromatographic and electrophoretic methods. A sensitivity and limit of detection (LOD) of 6.59 × 10⁻⁴ and 0.6 μM, respectively, were determined for histamine as a spoilage marker, with a correlation coefficient (R²) of 0.993.
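A minimal sketch of how a sensitivity (calibration slope), R², and LOD of this kind are typically derived from a linear calibration curve, assuming the common LOD = 3.3·σ/slope convention; the concentrations and absorbances below are invented, not the paper's measurements.

```python
# Hedged sketch: slope, R^2, and LOD from a hypothetical linear
# calibration of absorbance against histamine concentration.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])            # micromolar
absorbance = np.array([0.031, 0.060, 0.118, 0.242, 0.486])

slope, intercept = np.polyfit(conc, absorbance, 1)
pred = slope * conc + intercept
ss_res = ((absorbance - pred) ** 2).sum()
ss_tot = ((absorbance - absorbance.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot

sigma = (absorbance - pred).std(ddof=2)               # residual std. error
lod = 3.3 * sigma / slope                             # ICH-style convention

print(f"slope = {slope:.4f} AU/uM, R^2 = {r2:.3f}, LOD ~ {lod:.2f} uM")
```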
NASA Astrophysics Data System (ADS)
Lach, Denise
2014-01-01
While interdisciplinary research is increasingly practiced as a way to transcend the limitations of individual disciplines, our concepts and methods are primarily rooted in the disciplines that shape the way we think about the world and how we conduct research. While natural and social scientists may share a general understanding of how science is conducted, disciplinary differences in methodologies quickly emerge during interdisciplinary research efforts. This paper briefly introduces and reviews different philosophical underpinnings of quantitative and qualitative methodological approaches and introduces the idea that a pragmatic, realist approach may allow natural and social scientists to work together productively. While realism assumes that there is a reality that exists independently of our perceptions, the work of scientists is to explore the mechanisms by which actions cause meaningful outcomes and the conditions under which the mechanisms can act. Our task as interdisciplinary researchers is to use the insights of our disciplines in the context of the problem to co-produce an explanation for the variables of interest. Research on qualities necessary for successful interdisciplinary researchers is also discussed along with recent efforts by funding agencies and academia to increase capacities for interdisciplinary research.
PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data
Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh
2016-01-01
Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25% to 45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectral quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314
Challenges in Developing Models Describing Complex Soil Systems
NASA Astrophysics Data System (ADS)
Simunek, J.; Jacques, D.
2014-12-01
Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools to integrate our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been enthusiastic, especially after it discovered that these models are consequently highly complex: they require a large number of parameters, not all of which can be easily (or at all) measured or identified and which are often associated with large uncertainties, and they demand from their users a deep knowledge of most of the implemented physical, mechanical, chemical, and biological processes. Real, or perceived, complexity of these models then discourages users from using them even for relatively simple applications, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.
Murase, Atsunobu; Miki, Ryohei; Motomura, Hiroyuki
2017-01-01
Understanding the distributional patterns of individual animal groups with respect to coastal topology and the local physical environment provides essential foundational frameworks for marine zoogeography. In the northwestern Pacific waters of Japan, the distributional pattern of some cool-temperate species of marine fishes suggests the existence of a biogeographic boundary corresponding to a long sandy shore on the eastern coast of Kyushu, southern Japan. The existence of this hypothetical biogeographic boundary was tested by mapping the southern distributional limit of two species of cool-temperate intertidal gobies, Chaenogobius annularis and C. gulosus, which are endemic to East Asia and common in rock pools within their range in the Japanese Archipelago. Distribution and abundance were assessed by survey of museum collections from south-east Kyushu (i.e., the entire coasts of Kagoshima and Miyazaki prefectures); and a quantitative survey of the abundance of these gobies in rock pools at various sites around the hypothesized boundary on the eastern coast of Kyushu, including the subtropical Tanega-shima Island. The museum collection survey showed different distribution patterns between the two species: C. annularis was distributed along the entire coasts of south-east Kyushu including subtropical islands, whereas C. gulosus was distributed along these coasts, including one site on a subtropical island, except for an area south of the hypothesized boundary on the eastern coast of Kyushu. The density and occurrence rates of C. annularis in rock pools decreased with latitude, it being absent from a subtropical island, and C. gulosus was not detected from sites south of the hypothesized boundary. The qualitative survey showed that the southernmost records of C. annularis and C. gulosus were the adjacent subtropical islands (Yaku-shima and Tanega-shima islands respectively), although the quantitative survey suggested that their normal range of distribution was limited to the southern part of the Kyushu mainland. A combination of qualitative and quantitative survey methods in the present study highlighted that the southernmost record of a certain species may not necessarily indicate the true limit of its distribution. The distribution of C. gulosus supports the existence of the hypothetical biogeographic boundary, and the different distribution patterns of the two species may be caused by differences in their early life histories. PMID:29362541
Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol
2011-01-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032
Qualitative versus quantitative methods in psychiatric research.
Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S
2012-01-01
Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of events using quantitative methods; qualitative methods, however, provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of research, qualitative methods can be employed in questionnaire design, diagnostic criteria establishment, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be confirmed statistically using quantitative methods. The tendency to combine both qualitative and quantitative methods as complementary methods has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative strengths of qualitative methods as well as the experimental dimensions of quantitative methods.
Coltharp, Carla; Kessler, Rene P.; Xiao, Jie
2012-01-01
Localization-based superresolution microscopy techniques such as Photoactivated Localization Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) have allowed investigations of cellular structures with unprecedented optical resolutions. One major obstacle to interpreting superresolution images, however, is the overcounting of molecule numbers caused by fluorophore photoblinking. Using both experimental and simulated images, we determined the effects of photoblinking on the accurate reconstruction of superresolution images and on quantitative measurements of structural dimension and molecule density made from those images. We found that structural dimension and relative density measurements can be made reliably from images that contain photoblinking-related overcounting, but accurate absolute density measurements, and consequently faithful representations of molecule counts and positions in cellular structures, require the application of a clustering algorithm to group localizations that originate from the same molecule. We analyzed how applying a simple algorithm with different clustering thresholds (tThresh and dThresh) affects the accuracy of reconstructed images, and developed an easy method to select optimal thresholds. We also identified an empirical criterion to evaluate whether an imaging condition is appropriate for accurate superresolution image reconstruction with the clustering algorithm. Both the threshold selection method and imaging condition criterion are easy to implement within existing PALM clustering algorithms and experimental conditions. The main advantage of our method is that it generates a superresolution image and molecule position list that faithfully represents molecule counts and positions within a cellular structure, rather than only summarizing structural properties into ensemble parameters. This feature makes it particularly useful for cellular structures of heterogeneous densities and irregular geometries, and allows a variety of quantitative measurements tailored to specific needs of different biological systems. PMID:23251611
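A toy sketch of the kind of clustering step the paper describes, under stated assumptions: localizations within dThresh of each other whose frame gap is at most tThresh are merged into one molecule. The thresholds, the example localization list, and the single-pass greedy linking are illustrative choices, not the authors' exact algorithm.

```python
# Hedged sketch of blinking correction with distance and time thresholds.
import numpy as np

# columns: frame, x (nm), y (nm); a hypothetical localization list
locs = np.array([[1, 100.0, 200.0],
                 [2, 102.0, 198.0],   # same molecule, blink continues
                 [6, 101.0, 201.0],   # 4-frame gap: new molecule if tThresh < 4
                 [2, 500.0, 500.0]])

def merge_blinks(locs, d_thresh=50.0, t_thresh=3):
    order = np.argsort(locs[:, 0])     # process in frame order
    molecules = []                     # each entry: [last_frame, x, y, count]
    for frame, x, y in locs[order]:
        for m in molecules:
            close = np.hypot(x - m[1], y - m[2]) <= d_thresh
            if close and frame - m[0] <= t_thresh:
                m[0], m[3] = frame, m[3] + 1   # extend this molecule
                break
        else:
            molecules.append([frame, x, y, 1]) # start a new molecule
    return molecules

print(len(merge_blinks(locs)), "molecules after clustering")  # expect 3
```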
QuASAR: quantitative allele-specific analysis of reads.
Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger
2015-04-15
Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu. Supplementary material is available at Bioinformatics online.
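A minimal sketch of the baseline ASE test described above (not QuASAR itself): count RNA-seq reads supporting each allele at a heterozygous site and test the null of a 1:1 ratio with an exact binomial test. The read counts are hypothetical.

```python
# Hedged sketch: exact binomial test of a 1:1 allelic ratio at one
# heterozygous SNP (hypothetical read counts).
from scipy.stats import binomtest

ref_reads, alt_reads = 62, 38
result = binomtest(ref_reads, n=ref_reads + alt_reads, p=0.5)
print(f"allelic ratio = {ref_reads / (ref_reads + alt_reads):.2f}, "
      f"p = {result.pvalue:.4f}")
```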
Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris
2011-01-01
Purpose: Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography, leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows the creation of PVE-corrected images, it is based on a 2D global correlation model, which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods: A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results: Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artefacts and significantly improving the quality of the corrected images and their quantitative accuracy. Conclusions: A new 3D local model was proposed for voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037
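A heavily simplified sketch of the wavelet detail-transfer idea behind MMA-style correction, assuming the PyWavelets package: decompose co-registered PET and MR volumes, inject scaled MR detail sub-bands into the PET decomposition, and reconstruct. The single global coupling factor alpha is exactly the global-correlation simplification that the paper's 3D local model is designed to replace; the volumes are synthetic stand-ins.

```python
# Hedged sketch of global wavelet detail transfer (one decomposition level).
import numpy as np
import pywt

rng = np.random.default_rng(1)
mr = rng.normal(size=(32, 32, 32))                      # stand-in MR volume
pet = 0.5 * mr + rng.normal(scale=0.5, size=mr.shape)   # noisy stand-in PET

c_pet = pywt.dwtn(pet, "db2")   # 3-D decomposition: keys 'aaa', 'aad', ...
c_mr = pywt.dwtn(mr, "db2")

alpha = 0.5                     # assumed global PET-MR coupling factor
corrected = {k: (v if k == "aaa" else v + alpha * c_mr[k])
             for k, v in c_pet.items()}   # inject scaled MR details only
pet_corrected = pywt.idwtn(corrected, "db2")
print(pet_corrected.shape)
```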
Mainali, Laxman; Camenisch, Theodore G; Hyde, James S; Subczynski, Witold K
2017-12-01
The presence of integral membrane proteins induces the formation of distinct domains in the lipid bilayer portion of biological membranes. Qualitative application of both continuous wave (CW) and saturation recovery (SR) electron paramagnetic resonance (EPR) spin-labeling methods allowed discrimination of the bulk, boundary, and trapped lipid domains. A recently developed method, which is based on the CW EPR spectra of phospholipid (PL) and cholesterol (Chol) analog spin labels, allows evaluation of the relative amount of PLs (% of total PLs) in the boundary plus trapped lipid domain and the relative amount of Chol (% of total Chol) in the trapped lipid domain [M. Raguz, L. Mainali, W. J. O'Brien, and W. K. Subczynski (2015), Exp. Eye Res., 140:179-186]. Here, a new method is presented that, based on SR EPR spin-labeling, allows quantitative evaluation of the relative amounts of PLs and Chol in the trapped lipid domain of intact membranes. This new method complements the existing one, allowing acquisition of more detailed information about the distribution of lipids between domains in intact membranes. The methodological transition of the SR EPR spin-labeling approach from qualitative to quantitative is demonstrated. The abilities of this method are illustrated for intact cortical and nuclear fiber cell plasma membranes from porcine eye lenses. Statistical analysis (Student's t-test) of the data allowed determination of the separations of mean values above which differences can be treated as statistically significant (P ≤ 0.05) and can be attributed to sources other than preparation/technique.
Quantitative descriptions of generalized arousal, an elementary function of the vertebrate brain
Quinkert, Amy Wells; Vimal, Vivek; Weil, Zachary M.; Reeke, George N.; Schiff, Nicholas D.; Banavar, Jayanth R.; Pfaff, Donald W.
2011-01-01
We review a concept of the most primitive, fundamental function of the vertebrate CNS, generalized arousal (GA). Three independent lines of evidence indicate the existence of GA: statistical, genetic, and mechanistic. Here we ask, is this concept amenable to quantitative analysis? Answering in the affirmative, four quantitative approaches have proven useful: (i) factor analysis, (ii) information theory, (iii) deterministic chaos, and (iv) application of a Gaussian equation. It strikes us that, to date, not just one but at least four different quantitative approaches seem necessary for describing different aspects of scientific work on GA. PMID:21555568
Kato, Tomohiko; Saita, Takahiro
2011-03-16
The magnetism of Pd₁₋ₓMnₓ is investigated theoretically. A localized spin model for Mn spins that interact with short-range antiferromagnetic interactions and long-range ferromagnetic interactions via itinerant d electrons is set up, with no adjustable parameters. A multicanonical Monte Carlo simulation, combined with a procedure of symmetry breaking, is employed to discriminate between the ferromagnetic and spin glass orders. The transition temperature and the low-temperature phase are determined from the temperature variation of the specific heat and the probability distributions of the ferromagnetic order parameter and the spin glass order parameter at different concentrations. The calculation results reveal that only the ferromagnetic phase exists at x < 0.02, that only the spin glass phase exists at x > 0.04, and that the two phases coexist at intermediate concentrations. This result agrees semi-quantitatively with experimental results.
He, Qian; Freakley, Simon J; Edwards, Jennifer K; Carley, Albert F; Borisevich, Albina Y; Mineo, Yuki; Haruta, Masatake; Hutchings, Graham J; Kiely, Christopher J
2016-09-27
The identity of active species in supported gold catalysts for low temperature carbon monoxide oxidation remains an unsettled debate. With large amounts of experimental evidence supporting theories of either gold nanoparticles or sub-nm gold species being active, it was recently proposed that a size-dependent activity hierarchy should exist. Here we study the diverging catalytic behaviours after heat treatment of Au/FeOₓ materials prepared via co-precipitation and deposition precipitation methods. After ruling out any support effects, the gold particle size distributions in different catalysts are quantitatively studied using aberration corrected scanning transmission electron microscopy (STEM). A counting protocol is developed to reveal the true particle size distribution from HAADF-STEM images, which reliably includes all the gold species present. Correlation of the populations of the various gold species present with catalysis results demonstrates that a size-dependent activity hierarchy must exist in the Au/FeOₓ catalyst.
Pharmacologic studies in vulnerable populations: Using the pediatric experience.
Zimmerman, Kanecia; Gonzalez, Daniel; Swamy, Geeta K; Cohen-Wolkowiez, Michael
2015-11-01
Historically, few data exist to guide dosing in children and pregnant women. Multiple barriers to inclusion of these vulnerable populations in clinical trials have led to this paucity of data. However, federal legislation targeted at pediatric therapeutics, innovative clinical trial design, use of quantitative clinical pharmacology methods, pediatric thought leadership, and collaboration have successfully overcome many existing barriers. This success has resulted in improved knowledge on pharmacokinetics, safety, and efficacy of therapeutics in children. To date, research in pregnant women has not been characterized by similar success. Wide gaps in knowledge remain despite the common use of therapeutics in pregnancy. Given the similar barriers to drug research and development in pediatric and pregnant populations, the route toward success in children may serve as a model for the advancement of drug development and appropriate drug administration in pregnant women.
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast
Pang, Wei; Coghill, George M.
2015-01-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377
ERIC Educational Resources Information Center
Berksteiner, Earl J.
2013-01-01
The purpose of this quantitative descriptive correlational study was to determine if associations existed between middle- and early-college (MEC) principals' leadership styles, teacher motivation, and teacher satisfaction. MEC programs were programs designed to assist high school students who were not served well in a traditional setting (Middle…
ERIC Educational Resources Information Center
Chudgar, Amita; Luschei, Thomas F.
2016-01-01
The objective of this commentary is to call attention to the feasibility and importance of large-scale, systematic, quantitative analysis in international and comparative education research. We contend that although many existing databases are under- or unutilized in quantitative international-comparative research, these resources present the…
Group Projects and Civic Engagement in a Quantitative Literacy Course
ERIC Educational Resources Information Center
Dewar, Jacqueline; Larson, Suzanne; Zachariah, Thomas
2011-01-01
We describe our approach to incorporating a civic engagement component into a quantitative literacy (QL) course and the resulting gains in student learning, confidence, and awareness of local civic issues. We revised the existing QL course by including semester-long group projects involving local community issues that students could investigate…
APA Reporting Standards in Quantitative Research Dissertations from an Online EdD Program
ERIC Educational Resources Information Center
Salgado, Griselle
2013-01-01
This study was an investigation of the reporting practices in dissertations with quantitative research designs produced by students enrolled in an online Doctor of Education (EdD) program, one that follows the American Psychological Association (APA) standards for reporting research. Limited, empirical information exists about the competencies in…
Measurements in quantitative research: how to select and report on research instruments.
Hagan, Teresa L
2014-07-01
Measures exist to numerically represent degrees of attributes. Quantitative research is based on measurement and is conducted in a systematic, controlled manner. These measures enable researchers to perform statistical tests, analyze differences between groups, and determine the effectiveness of treatments. If something is not measurable, it cannot be tested.
ERIC Educational Resources Information Center
Walker-Glenn, Michelle Lynn
2010-01-01
Although most high schools espouse school-wide literacy initiatives, few schools place equal emphasis on numeracy, or quantitative literacy. This lack of attention to quantitative skills is ironic in light of documented deficiencies in student mathematics achievement. While significant research exists regarding best practices for mathematics…
ERIC Educational Resources Information Center
Yu, Wei
2013-01-01
This dissertation applied the quantitative approach to the data gathered from online survey questionnaires regarding the three objects: Information Technology (IT) Portfolio Management, IT-Business Alignment, and IT Project Deliverables. By studying this data, this dissertation uncovered the underlying relationships that exist between the…
NASA Astrophysics Data System (ADS)
Xu, Lei; Zheng, Xiaoxiang; Zhang, Hengyi; Yu, Yajun
1998-09-01
Accurate edge detection of retinal vessels is a prerequisite for quantitative analysis of subtle morphological changes of retinal vessels under different pathological conditions. A novel method for edge detection of retinal vessels is presented in this paper. Methods: (1) Wavelet-based image preprocessing. (2) The signed edge detection algorithm and mathematical morphological operations are applied to obtain the approximate regions that contain retinal vessels. (3) By convolving the preprocessed image with a LoG operator only on the detected approximate regions of retinal vessels, followed by edge refinement, clear edge maps of the retinal vessels are obtained quickly. Results: A detailed performance evaluation against existing techniques is given to demonstrate the strengths of our method. Conclusions: True edge locations of retinal vessels can be detected quickly, with continuous vessel structures, fewer non-vessel segments, and insensitivity to noise. The method is also suitable for other application fields such as road edge detection.
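A minimal sketch of step (3) under stated assumptions: a Laplacian-of-Gaussian response computed with SciPy, with zero crossings kept only inside a previously detected approximate vessel region. The random image and rectangular mask are placeholders for the wavelet preprocessing and morphology-based region detection of steps (1)-(2).

```python
# Hedged sketch: LoG edge detection restricted to a vessel-region mask.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.random((128, 128))            # stand-in preprocessed image
mask = np.zeros_like(image, dtype=bool)
mask[40:90, 60:70] = True                 # placeholder approximate vessel region

log = ndimage.gaussian_laplace(image, sigma=2.0)
zero_cross = np.sign(log)[:, :-1] != np.sign(log)[:, 1:]  # horizontal crossings
edges = np.zeros_like(mask)
edges[:, :-1] = zero_cross & mask[:, :-1]  # keep edges inside the mask only
print(edges.sum(), "edge pixels inside the vessel region")
```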
Ehlers, Jan P.; Kaap-Fröhlich, Sylvia; Mahler, Cornelia; Scherer, Theresa; Huber, Marion
2017-01-01
Background: More and more institutions worldwide and in German-speaking countries are developing and establishing interprofessional seminars in undergraduate education of health professions. In order to evaluate the different didactic approaches and different outcomes regarding the anticipated interprofessional competencies, it is necessary to apply appropriate instruments. Cross-cultural instruments are particularly helpful for international comparability. The Interprofessional Education working group of the German Medical Association (GMA) aims at identifying existing instruments for the evaluation of interprofessional education in order to make recommendations for German-speaking countries. Methods: A systematic literature search was performed on the websites of international interprofessional organisations (CAIPE, EIPEN, AIPEN), as well as in the PubMed and Cinahl databases. Reviews focusing on quantitative instruments to evaluate competencies according to the modified Kirkpatrick competency levels were searched for. Psychometrics, language/country and setting, in which the instrument was applied, were recorded. Results: Six reviews out of 73 search hits were included. A large number of instruments were identified; however, their psychometrics and the applied setting were very heterogeneous. The instruments can mainly be assigned to Kirkpatrick levels 1, 2a & 2b. Most instruments have been developed in English but their psychometrics were not always reported rigorously. Only very few instruments are available in German. Conclusion: It is difficult to find appropriate instruments in German. Internationally, there are different approaches and objectives in the measurement and evaluation of interprofessional competencies. The question arises whether it makes sense to translate existing instruments or to go through the lengthy process of developing new ones. The evaluation of interprofessional seminars with quantitative instruments remains mainly on Kirkpatrick levels 1 and 2. Levels 3 and 4 can probably only be assessed with qualitative or mixed methods. German language instruments are necessary. PMID:28890927
Husch, Andreas; V Petersen, Mikkel; Gemmar, Peter; Goncalves, Jorge; Hertel, Frank
2018-01-01
Deep brain stimulation (DBS) is a neurosurgical intervention where electrodes are permanently implanted into the brain in order to modulate pathologic neural activity. The post-operative reconstruction of the DBS electrodes is important for efficient stimulation parameter tuning. A major limitation of existing approaches for electrode reconstruction from post-operative imaging, one that prevents routine clinical use, is that they are manual or semi-automatic, and thus both time-consuming and subjective. Moreover, the existing methods rely on a simplified model of a straight line electrode trajectory, rather than the more realistic curved trajectory. The main contribution of this paper is that for the first time we present a highly accurate and fully automated method for electrode reconstruction that considers curved trajectories. The robustness of our proposed method is demonstrated using a multi-center clinical dataset consisting of N = 44 electrodes. In all cases the electrode trajectories were successfully identified and reconstructed. In addition, the accuracy is demonstrated quantitatively using a high-accuracy phantom with known ground truth. In the phantom experiment, the method could detect individual electrode contacts with high accuracy and the trajectory reconstruction reached an error level below 100 μm (0.046 ± 0.025 mm). An implementation of the method is made publicly available such that it can directly be used by researchers or clinicians. This constitutes an important step towards future integration of lead reconstruction into standard clinical care.
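A minimal sketch of the curved-trajectory idea under stated assumptions: fit a smooth 3-D spline through ordered electrode-artifact centroids from post-operative imaging rather than a straight line. The points are hypothetical voxel coordinates; the real pipeline's artifact detection and ordering are not shown.

```python
# Hedged sketch: smoothing cubic spline through hypothetical centroids
# ordered along the lead, then resampled as a curved trajectory.
import numpy as np
from scipy.interpolate import splprep, splev

pts = np.array([[10.0, 12.0, 0.0], [10.4, 12.5, 5.0], [11.2, 13.4, 10.0],
                [12.5, 14.8, 15.0], [14.1, 16.5, 20.0]])

tck, _ = splprep([pts[:, 0], pts[:, 1], pts[:, 2]], s=0.5, k=3)
u_fine = np.linspace(0.0, 1.0, 100)
x, y, z = splev(u_fine, tck)              # resampled curved trajectory
print(np.column_stack([x, y, z])[:3])     # first few points along the curve
```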
Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P
2014-09-01
Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling, and stated-choice surveys) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria, and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied to medicines, with fewer BRAs in devices. Despite substantial differences in the characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed.
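A toy example of one approach the paper assesses, a weighted-sum multi-criteria decision analysis aggregating benefit and risk criteria for two hypothetical imaging devices; the weights and scores are invented for illustration.

```python
# Hedged MCDA sketch: weighted-sum aggregation of normalized criterion
# scores (risk criteria scored so that higher = safer).
import numpy as np

criteria = ["diagnostic accuracy", "exam time", "radiation safety"]
weights = np.array([0.5, 0.2, 0.3])        # stakeholder priorities, sum to 1
scores = {"device A": np.array([0.8, 0.6, 0.4]),
          "device B": np.array([0.7, 0.9, 0.7])}

for name, s in scores.items():
    print(f"{name}: benefit-risk score = {weights @ s:.2f}")
```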
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hankind, B.E.; Karkalits, O.C.
1978-09-01
The presence of large volumes of hot water (250-425 F) containing dissolved natural gas in the Gulf of Mexico coastal areas at depths of 5,000 to 25,000 feet (the geopressured zone) has been known for several years. Because natural gas and oil from conventional production methods were relatively inexpensive prior to 1973, and because foreign oil was readily available, no economic incentive existed for developing this resource. With the oil embargo and the resulting rapid escalation in prices of oil and gas since 1973, a new urgency exists for examining the economic potential of the geopressured-geothermal resource. The main objective of the research reported here was to determine the volume of gas dissolved in the geopressured water, as well as the qualitative and quantitative composition of the water and the dissolved gas. A further objective was to use an existing shut-in gas well so that drilling time and the attendant costs could be avoided.
Density matters: Review of approaches to setting organism-based ballast water discharge standards
Lee II; Frazier; Ruiz
2010-01-01
As part of their effort to develop national ballast water discharge standards under NPDES permitting, the Office of Water requested that WED scientists identify and review existing approaches to generating organism-based discharge standards for ballast water. Six potential approaches were identified, and the utility and uncertainties of each approach were evaluated. During the process of reviewing the existing approaches, the WED scientists, in conjunction with scientists at the USGS and Smithsonian Institution, developed a new approach (per capita invasion probability or "PCIP") that addresses many of the limitations of the previous methodologies. The PCIP approach allows risk managers to generate quantitative discharge standards using historical invasion rates, ballast water discharge volumes, and ballast water organism concentrations. The statistical power of sampling ballast water, for both the validation of ballast water treatment systems and ship-board compliance monitoring, is limited with the existing methods, though it should be possible to obtain sufficient samples during treatment validation. The report will go to a National Academy of Sciences expert panel that will use it in their evaluation of approaches to developing ballast water discharge standards for the Office of Water.
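A minimal arithmetic sketch of the per capita invasion probability idea as described: divide a historical invasion rate by total propagule delivery (discharge volume times organism concentration), then invert the relation to see what organism concentration a chosen acceptable invasion rate would imply. All numbers are invented for illustration.

```python
# Hedged sketch of the PCIP arithmetic with invented regional numbers.
invasions_per_year = 0.5          # historical invasion rate
discharge_m3_per_year = 2.0e7     # ballast water discharged
organisms_per_m3 = 1.0e4          # mean organism concentration

pcip = invasions_per_year / (discharge_m3_per_year * organisms_per_m3)

target_rate = 0.01                # policy-chosen acceptable invasions/year
allowed_conc = target_rate / (pcip * discharge_m3_per_year)
print(f"PCIP = {pcip:.2e} invasions per organism discharged")
print(f"implied standard ~ {allowed_conc:.0f} organisms/m^3")
```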
NASA Astrophysics Data System (ADS)
Kudomi, Nobuyuki; Watabe, Hiroshi; Hayashi, Takuya; Iida, Hidehiro
2007-04-01
Cerebral metabolic rate of oxygen (CMRO2), oxygen extraction fraction (OEF) and cerebral blood flow (CBF) images can be quantified using positron emission tomography (PET) by administering ¹⁵O-labelled water (H₂¹⁵O) and oxygen (¹⁵O₂). Conventionally, those images are measured with separate scans for three tracers, C¹⁵O for cerebral blood volume (CBV), H₂¹⁵O for CBF and ¹⁵O₂ for CMRO2, with additional waiting times between the scans in order to minimize the influence of the radioactivity from the previous tracers, which results in a relatively long study period. We have proposed a dual tracer autoradiographic (DARG) approach (Kudomi et al 2005), which enabled us to measure CBF, OEF and CMRO2 rapidly by sequentially administering H₂¹⁵O and ¹⁵O₂ within a short time. Because quantitative CBF and CMRO2 values are sensitive to the arterial input function, it is necessary to obtain an accurate input function, and a drawback of this approach is that it requires separation of the measured arterial blood time-activity curve (TAC) into pure water and oxygen input functions in the presence of residual radioactivity from the first injected tracer. For this separation, frequent manual sampling was previously required. The present paper describes two calculation methods, namely a linear and a model-based method, to separate the measured arterial TAC into its water and oxygen components. In order to validate these methods, we first generated a blood TAC for the DARG approach by combining the water and oxygen input functions obtained in a series of PET studies on normal human subjects. The combined data were then separated into water and oxygen components by the present methods. CBF and CMRO2 were calculated using those separated input functions and the tissue TAC. Errors in the CBF and CMRO2 values obtained with the DARG approach remained within an acceptable range (below 5%) when the area under the curve of the input function of the second tracer was larger than half that of the first. Bias and deviation in those values were also comparable to those of the conventional method when noise was imposed on the arterial TAC. We conclude that the present calculation-based methods can be of use for quantitatively calculating CBF and CMRO2 with the DARG approach.
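A simplified sketch of the linear separation idea, under stated assumptions: if template shapes for the water and oxygen components are available, the measured dual-tracer blood curve can be decomposed by linear least squares. The gamma-variate templates and noise level are synthetic stand-ins for measured input functions, not the paper's method in detail.

```python
# Hedged sketch: least-squares separation of a dual-tracer blood curve
# into water and oxygen components using synthetic template shapes.
import numpy as np

t = np.linspace(0.0, 180.0, 181)                      # seconds
water_tpl = (t / 30.0) ** 2 * np.exp(-t / 30.0)       # H2(15)O-like shape
oxy_tpl = np.where(t > 90.0,
                   ((t - 90.0) / 20.0) ** 2 * np.exp(-(t - 90.0) / 20.0),
                   0.0)                               # delayed 15O2-like shape

true_w, true_o = 1.5, 0.8
rng = np.random.default_rng(2)
measured = true_w * water_tpl + true_o * oxy_tpl + rng.normal(0, 0.02, t.size)

A = np.column_stack([water_tpl, oxy_tpl])             # design matrix
(w, o), *_ = np.linalg.lstsq(A, measured, rcond=None)
print(f"recovered scales: water = {w:.2f}, oxygen = {o:.2f}")
```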
Song, Yan; Feng, Jun; Dang, Ying; Zhao, Chao; Zheng, Jie; Ruan, Litao
2017-12-01
The aim of this study was to determine the relationship between plaque echo, thickness and neovascularization in different stenosis groups using quantitative and semi-quantitative contrast-enhanced ultrasound (CEUS) in patients with carotid atherosclerosis plaque. A total of 224 plaques were divided into mild stenosis (<50%; 135 plaques, 60.27%), moderate stenosis (50%-69%; 39 plaques, 17.41%) and severe stenosis (70%-99%; 50 plaques, 22.32%) groups. Quantitative and semi-quantitative methods were used to assess plaque neovascularization and determine the relationship between plaque echo, thickness and neovascularization. Correlation analysis revealed no relationship of neovascularization with plaque echo in the groups using either quantitative or semi-quantitative methods. Furthermore, there was no correlation of neovascularization with plaque thickness using the semi-quantitative method. The ratio of areas under the curve (RAUC) was negatively correlated with plaque thickness (r = -0.317, p = 0.001) in the mild stenosis group. With the quartile method, plaque thickness of the mild stenosis group was divided into four groups, with significant differences between the 1.5-2.2 mm and ≥3.5 mm groups (p = 0.002), 2.3-2.8 mm and ≥3.5 mm groups (p <0.001) and 2.9-3.4 mm and ≥3.5 mm groups (p <0.001). Both semi-quantitative and quantitative CEUS methods characterizing neovascularization of plaque are equivalent with respect to assessing relationships between neovascularization, echogenicity and thickness. However, the quantitative method could fail for plaque <3.5 mm because of motion artifacts.
Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul
2017-02-01
Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. We therefore constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can reproduce evolved herbicide resistance on realistic timescales. Sensitivity analyses showed that genetic and management parameters strongly influenced the rate of quantitative herbicide resistance evolution, whilst biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential uses of the model are discussed. © 2016 Society of Chemical Industry.
Analysis and improvement measures of flight delay in China
NASA Astrophysics Data System (ADS)
Zang, Yuhang
2017-03-01
Firstly, this paper establishes a principal component regression model to analyze flight-delay data quantitatively, using principal component analysis to extract three principal component factors of flight delays. The least squares method is then applied to these factors, and back-substitution yields the regression equation, which shows that the main cause of flight delays is the airlines themselves, followed by weather and traffic. To address the controllable part of the problem, traffic flow control, an adaptive genetic queuing model is established for the runway terminal area. An optimization method is then developed for fifteen aircraft landing simultaneously on three runways, based on Beijing Capital International Airport; comparing the results with the existing FCFS (first-come, first-served) algorithm demonstrates the superiority of the model.
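The principal component regression step described above can be sketched as follows; this is a minimal illustration assuming a placeholder factor matrix X (candidate delay factors) and delay rates y, not the paper's actual data.

```python
# Principal component regression: PCA to three components, then least squares,
# then back-substitution to coefficients on the (standardized) original factors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((100, 6))    # placeholder factors (e.g. airline, weather, traffic metrics)
y = rng.random(100)         # placeholder flight-delay rate

Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=3)              # retain three principal component factors
Z = pca.fit_transform(Xs)
reg = LinearRegression().fit(Z, y)     # least-squares fit on the components

# back-substitute to express the fit in terms of the standardized factors
beta = pca.components_.T @ reg.coef_
print(beta)
```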
Determining the properties of accretion-gap neutron stars
NASA Technical Reports Server (NTRS)
Kluzniak, Wlodzimierz; Michelson, Peter; Wagoner, Robert V.
1990-01-01
If neutron stars have radii as small as has been argued by some, observations of accretion-powered X-rays could verify the existence of innermost stable circular orbits (predicted by general relativity) around weakly magnetized neutron stars. This may be done by detecting X-ray emission from clumps of matter before and after they cross the gap (where matter cannot be supported by rotation) between the inner accretion disk and the stellar surface. Assuming the validity of general relativity, it would then be possible to determine the masses of such neutron stars independently of any knowledge of binary orbital parameters. If an accurate mass determination were already available through any of the methods conventionally used, the new mass determination method proposed here could then be used to quantitatively test strong field effects of gravitational theory.
Multi-objective decision-making under uncertainty: Fuzzy logic methods
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1995-01-01
Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
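As an illustration of the kind of fuzzy multi-objective selection described above, the following sketch uses the classical Bellman-Zadeh min-aggregation rule; the alternatives, objectives and membership degrees are invented, and the NASA software itself is not reproduced here.

```python
# Fuzzy multi-objective selection: each alternative gets a membership degree
# per fuzzy objective; aggregate with min (worst objective), then pick the max.
import numpy as np

# membership[i, j]: degree to which alternative i satisfies fuzzy objective j
membership = np.array([
    [0.8, 0.6, 0.9],   # alternative A
    [0.7, 0.9, 0.5],   # alternative B
    [0.9, 0.4, 0.8],   # alternative C
])

overall = membership.min(axis=1)   # an alternative is only as good as its worst objective
best = int(np.argmax(overall))
print(f"chosen alternative: {best}, score: {overall[best]:.2f}")
```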
A new theoretical approach to analyze complex processes in cytoskeleton proteins.
Li, Xin; Kolomeisky, Anatoly B
2014-03-20
Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because utilized theoretical mean-field models neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on analysis of probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.
Glioma grading using cell nuclei morphologic features in digital pathology images
NASA Astrophysics Data System (ADS)
Reza, Syed M. S.; Iftekharuddin, Khan M.
2016-03-01
This work proposes a computationally efficient cell nuclei morphologic feature analysis technique to characterize brain gliomas in tissue slide images. Our contributions are two-fold: 1) an optimized cell nuclei segmentation method based on the pros and cons of existing techniques in the literature, and 2) extraction of representative features by k-means clustering of nuclei morphologic features, including area, perimeter, eccentricity, and major axis length. This clustering-based representative feature extraction avoids the shortcomings of extensive tile-based [1][2] and nuclear-score-based [3] methods for brain glioma grading in pathology images. A multilayer perceptron (MLP) is used to classify the extracted features into two tumor types: glioblastoma multiforme (GBM) and low grade glioma (LGG). Quantitative scores such as precision, recall, and accuracy are obtained using 66 clinical patients' images from The Cancer Genome Atlas (TCGA) [4] dataset. An average accuracy of ~94% from 10-fold cross-validation confirms the efficacy of the proposed method.
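A rough sketch of the clustering-then-classification pipeline (k-means on nuclei morphologic features, then an MLP) is given below; the features and labels are synthetic placeholders, and details such as the number of clusters are assumptions rather than the authors' settings.

```python
# Per-image representative features via k-means centroids, then MLP grading.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def representative_features(nuclei_feats, k=4):
    """nuclei_feats: (n_nuclei, 4) array of area, perimeter, eccentricity
    and major-axis length for one slide image. Returns the k cluster
    centroids (sorted by area for a stable ordering), flattened."""
    centers = KMeans(n_clusters=k, n_init=10).fit(nuclei_feats).cluster_centers_
    return centers[np.argsort(centers[:, 0])].ravel()

# one synthetic "image" per patient: 200 nuclei x 4 morphologic features
X = np.array([representative_features(rng.random((200, 4))) for _ in range(66)])
y = rng.integers(0, 2, size=66)          # 0 = LGG, 1 = GBM (placeholder labels)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```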
Human body segmentation via data-driven graph cut.
Li, Shifeng; Lu, Huchuan; Shao, Xingqing
2014-11-01
Human body segmentation is a challenging and important problem in computer vision. Existing methods usually entail a time-consuming training phase for prior knowledge learning, with complex shape matching for body segmentation. In this paper, we propose a data-driven method that integrates top-down body pose information and bottom-up low-level visual cues for segmenting humans in static images within the graph cut framework. The key idea of our approach is to first exploit human kinematics to search for body part candidates via dynamic programming as high-level evidence; body-part classifiers then provide bottom-up cues of human body distribution as low-level evidence. All the evidence collected from the top-down and bottom-up procedures is integrated in a graph cut framework for human body segmentation. Qualitative and quantitative experimental results demonstrate the merits of the proposed method in segmenting human bodies with arbitrary poses from cluttered backgrounds.
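The graph cut machinery underlying this kind of segmentation can be sketched as a standard s-t min-cut over unary (data) and pairwise (smoothness) terms; the sketch below is generic and does not reproduce the paper's pose-driven and part-classifier terms.

```python
# Binary segmentation by s-t min-cut: terminal edges carry unary label costs,
# neighbour edges carry smoothness costs; solved with networkx's minimum_cut.
import networkx as nx
import numpy as np

def binary_graph_cut(fg_cost, bg_cost, smoothness=1.0):
    """fg_cost / bg_cost: HxW penalties for labeling a pixel fg / bg."""
    H, W = fg_cost.shape
    G = nx.DiGraph()
    for i in range(H):
        for j in range(W):
            p = (i, j)
            G.add_edge('s', p, capacity=float(bg_cost[i, j]))  # paid if p labeled background
            G.add_edge(p, 't', capacity=float(fg_cost[i, j]))  # paid if p labeled foreground
            for q in ((i + 1, j), (i, j + 1)):                 # 4-neighbour smoothness terms
                if q[0] < H and q[1] < W:
                    G.add_edge(p, q, capacity=smoothness)
                    G.add_edge(q, p, capacity=smoothness)
    _, (source_side, _) = nx.minimum_cut(G, 's', 't')
    mask = np.zeros((H, W), dtype=bool)
    for node in source_side - {'s'}:
        mask[node] = True
    return mask

# toy example: a bright 2x2 blob is recovered as foreground
img = np.zeros((4, 4)); img[1:3, 1:3] = 1.0
print(binary_graph_cut(1.0 - img, img, smoothness=0.2).astype(int))
```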
Local Descriptors of Dynamic and Nondynamic Correlation.
Ramos-Cordoba, Eloy; Matito, Eduard
2017-06-13
Quantitatively accurate electronic structure calculations rely on the proper description of electron correlation. A judicious choice of the approximate quantum chemistry method depends upon the importance of dynamic and nondynamic correlation, which is usually assessed by scalar measures. Existing measures of electron correlation do not consider separately the regions of Cartesian space where dynamic or nondynamic correlation is most important. We introduce real-space descriptors of dynamic and nondynamic electron correlation that admit orbital decomposition. Integration of the local descriptors yields global numbers that can be used to quantify dynamic and nondynamic correlation. Illustrative examples over different chemical systems with varying electron correlation regimes demonstrate the capabilities of the local descriptors. Since the expressions only require orbitals and occupation numbers, they can be readily applied in the context of local correlation methods, hybrid methods, density matrix functional theory, and fractional-occupancy density functional theory.
Single image super-resolution via an iterative reproducing kernel Hilbert space method.
Deng, Liang-Jian; Guo, Weihong; Huang, Ting-Zhu
2016-11-01
Image super-resolution, a process to enhance image resolution, has important applications in satellite imaging, high definition television, medical imaging, etc. Many existing approaches use multiple low-resolution images to recover one high-resolution image. In this paper, we present an iterative scheme to solve single image super-resolution problems. It recovers a high quality high-resolution image from solely one low-resolution image without using a training data set. We solve the problem from image intensity function estimation perspective and assume the image contains smooth and edge components. We model the smooth components of an image using a thin-plate reproducing kernel Hilbert space (RKHS) and the edges using approximated Heaviside functions. The proposed method is applied to image patches, aiming to reduce computation and storage. Visual and quantitative comparisons with some competitive approaches show the effectiveness of the proposed method.
Total Internal Reflection Microscopy (TIRM) as a nondestructive surface damage assessment tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, Z.M.; Cohen, S.J.; Taylor, J.R.
1994-10-01
An easy to use, nondestructive method for evaluating subsurface damage in polished substrates has been established at LLNL. Subsurface damage has been related to laser damage in coated optical components used in high power, high repetition rate laser systems. Total Internal Reflection Microscopy (TIRM) has been shown to be a viable nondestructive technique for analyzing subsurface damage in optical components. A successful TIRM system has been established for evaluating subsurface damage on fused silica components. Laser light scattering from subsurface damage sites is collected through a Nomarski microscope. These images are then captured by a CCD camera for analysis on a computer. A variety of optics, including components with intentional subsurface damage due to grinding and polishing, have been analyzed and their TIRM images compared to an existing destructive etching method. Methods for quantitative measurement of subsurface damage are also discussed.
Flag-based detection of weak gas signatures in long-wave infrared hyperspectral image sequences
NASA Astrophysics Data System (ADS)
Marrinan, Timothy; Beveridge, J. Ross; Draper, Bruce; Kirby, Michael; Peterson, Chris
2016-05-01
We present a flag manifold based method for detecting chemical plumes in long-wave infrared hyperspectral movies. The method encodes temporal and spatial information related to a hyperspectral pixel into a flag, or nested sequence of linear subspaces. The technique used to create the flags pushes information about the background clutter, ambient conditions, and potential chemical agents into the leading elements of the flags. Exploiting this temporal information allows for a detection algorithm that is sensitive to the presence of weak signals. This method is compared to existing techniques qualitatively on real data and quantitatively on synthetic data to show that the flag-based algorithm consistently performs better on data when the SINRdB is low, and beats the ACE and MF algorithms in probability of detection for low probabilities of false alarm even when the SINRdB is high.
Algamal, Z Y; Lee, M H
2017-01-01
A high-dimensional quantitative structure-activity relationship (QSAR) classification model typically contains a large number of irrelevant and redundant descriptors. In this paper, a new descriptor selection approach for QSAR classification model estimation is proposed by adding a new weight inside the L1-norm. The experimental results of classifying the anti-hepatitis C virus activity of thiourea derivatives demonstrate that the proposed descriptor selection method performs effectively and competitively compared with other existing penalized methods in terms of classification performance on both the training and the testing datasets. Moreover, the results obtained from the stability test and applicability domain indicate a robust QSAR classification model. It is evident from the results that the developed QSAR classification model could conceivably be employed for further high-dimensional QSAR classification studies.
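A weighted L1 penalty of this general kind can be emulated with the standard rescaling trick (divide each descriptor by its weight, fit a plain L1-penalized model, then map the coefficients back). The sketch below uses placeholder data and an illustrative weight choice, not the paper's exact estimator.

```python
# Weighted-L1 descriptor selection via feature rescaling + L1 logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.standard_normal((120, 50))           # 50 candidate descriptors (placeholder)
y = (X[:, 0] - X[:, 3] + 0.5 * rng.standard_normal(120) > 0).astype(int)

# adaptive-style weights: descriptors weakly correlated with y are penalized more
w = 1.0 / (np.abs(np.corrcoef(X.T, y)[:-1, -1]) + 1e-6)
Xw = X / w                                   # rescale so plain L1 acts as weighted L1
clf = LogisticRegression(penalty='l1', solver='liblinear', C=0.5).fit(Xw, y)
beta = clf.coef_.ravel() / w                 # coefficients back on the original scale
print("selected descriptors:", np.flatnonzero(beta))
```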
78 FR 70059 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-22
... utilization of qualitative interviews (as opposed to quantitative statistical methods). In consultation with research experts, we have...
Xu, Yupeng; Yan, Ke; Kim, Jinman; Wang, Xiuying; Li, Changyang; Su, Li; Yu, Suqin; Xu, Xun; Feng, Dagan David
2017-09-01
Worldwide, polypoidal choroidal vasculopathy (PCV) is a common vision-threatening exudative maculopathy, and pigment epithelium detachment (PED) is an important clinical characteristic. Thus, precise and efficient PED segmentation is necessary for PCV clinical diagnosis and treatment. We propose a dual-stage learning framework via deep neural networks (DNN) for automated PED segmentation in PCV patients to avoid issues associated with manual PED segmentation (subjectivity, manual segmentation errors, and high time consumption). The optical coherence tomography scans of fifty patients were quantitatively evaluated with different algorithms and clinicians. Dual-stage DNN outperformed existing PED segmentation methods for all segmentation accuracy parameters, including true positive volume fraction (85.74 ± 8.69%), dice similarity coefficient (85.69 ± 8.08%), positive predictive value (86.02 ± 8.99%) and false positive volume fraction (0.38 ± 0.18%). Dual-stage DNN achieves accurate PED quantitative information, works with multiple types of PEDs and agrees well with manual delineation, suggesting that it is a potential automated assistant for PCV management.
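The overlap metrics reported above can be computed from a predicted and a manually delineated mask as in the sketch below; note that conventions for the FPVF denominator vary between papers.

```python
# Segmentation overlap metrics (TPVF, DSC, PPV, FPVF) from boolean masks.
import numpy as np

def overlap_metrics(pred, truth):
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    tn = np.logical_and(~pred, ~truth).sum()
    return {
        "TPVF": tp / (tp + fn),             # fraction of the true volume recovered
        "DSC":  2 * tp / (2 * tp + fp + fn),
        "PPV":  tp / (tp + fp),
        "FPVF": fp / (fp + tn),             # one common denominator convention
    }

truth = np.zeros((64, 64), dtype=bool); truth[20:40, 20:40] = True
pred  = np.zeros((64, 64), dtype=bool); pred[22:42, 21:41] = True
print({k: round(v, 3) for k, v in overlap_metrics(pred, truth).items()})
```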
Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane
2016-09-01
Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer patients result in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated in a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology, can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Shrader, Sarah; Farland, Michelle Z; Danielson, Jennifer; Sicat, Brigitte; Umland, Elena M
2017-08-01
Objective. To identify and describe the available quantitative tools that assess interprofessional education (IPE) relevant to pharmacy education. Methods. A systematic approach was used to identify quantitative IPE assessment tools relevant to pharmacy education. The search strategy included the National Center for Interprofessional Practice and Education Resource Exchange (Nexus) website, a systematic search of the literature, and a manual search of journals deemed likely to include relevant tools. Results. The search identified a total of 44 tools from the Nexus website, 158 abstracts from the systematic literature search, and 570 abstracts from the manual search. A total of 36 assessment tools met the criteria to be included in the summary, and their application to IPE relevant to pharmacy education was discussed. Conclusion. Each of the tools has advantages and disadvantages. No single comprehensive tool exists to fulfill assessment needs. However, numerous tools are available that can be mapped to IPE-related accreditation standards for pharmacy education.
A comprehensive and quantitative exploration of thousands of viral genomes.
Mahmoudabadi, Gita; Phillips, Rob
2018-04-19
The complete assembly of viral genomes from metagenomic datasets (short genomic sequences gathered from environmental samples) has proven to be challenging, so there are significant blind spots when we view viral genomes through the lens of metagenomics. One approach to overcoming this problem is to leverage the thousands of complete viral genomes that are publicly available. Here we describe our efforts to assemble a comprehensive resource that provides a quantitative snapshot of viral genomic trends - such as gene density, noncoding percentage, and abundances of functional gene categories - across thousands of viral genomes. We have also developed a coarse-grained method for visualizing viral genome organization for hundreds of genomes at once, and have explored the extent of the overlap between bacterial and bacteriophage gene pools. Existing viral classification systems were developed prior to the sequencing era, so we present our analysis in a way that allows us to assess the utility of the different classification systems for capturing genomic trends. © 2018, Mahmoudabadi et al.
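Statistics of the kind surveyed here, such as gene density and noncoding percentage, can be computed per genome from an annotated record; the sketch below uses Biopython on a placeholder GenBank file and merges overlapping CDS intervals before measuring the coding fraction.

```python
# Gene density and noncoding percentage from one annotated viral genome.
from Bio import SeqIO

rec = SeqIO.read("phage.gb", "genbank")     # placeholder GenBank file
L = len(rec.seq)

cds = sorted((int(f.location.start), int(f.location.end))
             for f in rec.features if f.type == "CDS")
gene_density = len(cds) / (L / 1000.0)      # genes per kb

# merge overlapping CDS intervals so shared bases are counted once
merged = []
for start, end in cds:
    if merged and start <= merged[-1][1]:
        merged[-1][1] = max(merged[-1][1], end)
    else:
        merged.append([start, end])
coding = sum(e - s for s, e in merged)

print(f"gene density: {gene_density:.2f} genes/kb, "
      f"noncoding: {100.0 * (1 - coding / L):.1f}%")
```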
Nordic in Nature: Friluftsliv and Environmental Connectedness
ERIC Educational Resources Information Center
Beery, Thomas H.
2013-01-01
This study explored the question of whether a relationship exists between the Nordic cultural idea of friluftsliv and the psychological construct of environmental connectedness (EC). This quantitative study employed a correlational design with existing data from the Swedish Outdoor Recreation in Change national survey. Results indicate that there…
Risky Group Decision-Making Method for Distribution Grid Planning
NASA Astrophysics Data System (ADS)
Li, Cunbin; Yuan, Jiahang; Qi, Zhiqiang
2015-12-01
With rapidly growing electricity use and increasing renewable energy penetration, distribution grid planning has attracted more and more research attention. To address the drawbacks of existing research, this paper proposes a new risky group decision-making method for distribution grid planning. Firstly, a mixed index system with qualitative and quantitative indices is built. To account for the fuzziness of linguistic evaluation, a cloud model is chosen to realize the "quantitative to qualitative" transformation, and interval-number decision matrices are constructed according to the "3En" principle. Each m-dimensional interval-number decision vector is regarded as a hyper-cuboid in the m-dimensional attribute space, and a two-level orthogonal experiment is used to arrange points within it uniformly and dispersedly. The number of points is determined by the run count of the two-level orthogonal array, and these points form a distribution point set representing the decision-making alternative. To eliminate the influence of correlation among indices, the Mahalanobis distance is used to calculate the distance from each solution to the others, so that the dynamic set of solutions serves as the reference. Secondly, because the decision-maker's attitude can affect the results, a prospect value function is defined based on the signal-to-noise ratio (SNR) from the Mahalanobis-Taguchi system, yielding the comprehensive prospect value and ranking of each alternative. Finally, the validity and reliability of the method are illustrated by examples, which demonstrate its advantages over existing approaches.
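The Mahalanobis-distance step can be sketched as follows: distances between alternatives are computed with the inverse covariance of the indices so that correlated indices are not double-counted; the decision matrix below is a placeholder.

```python
# Pairwise Mahalanobis distances between planning alternatives.
import numpy as np
from scipy.spatial.distance import mahalanobis

A = np.array([[0.7, 0.5, 0.9],     # each row: one planning alternative,
              [0.6, 0.8, 0.4],     # each column: one (normalized) index
              [0.9, 0.6, 0.7],
              [0.5, 0.7, 0.8]])

VI = np.linalg.pinv(np.cov(A, rowvar=False))   # inverse covariance of the indices
D = np.array([[mahalanobis(a, b, VI) for b in A] for a in A])
print(D.round(3))
```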
Iyer, Janani; Wang, Qingyu; Le, Thanh; Pizzo, Lucilla; Grönke, Sebastian; Ambegaokar, Surendra S.; Imai, Yuzuru; Srivastava, Ashutosh; Troisí, Beatriz Llamusí; Mardon, Graeme; Artero, Ruben; Jackson, George R.; Isaacs, Adrian M.; Partridge, Linda; Lu, Bingwei; Kumar, Justin P.; Girirajan, Santhosh
2016-01-01
About two-thirds of the vital genes in the Drosophila genome are involved in eye development, making the fly eye an excellent genetic system to study cellular function and development, neurodevelopment/degeneration, and complex diseases such as cancer and diabetes. We developed a novel computational method, implemented as Flynotyper software (http://flynotyper.sourceforge.net), to quantitatively assess the morphological defects in the Drosophila eye resulting from genetic alterations affecting basic cellular and developmental processes. Flynotyper utilizes a series of image processing operations to automatically detect the fly eye and the individual ommatidium, and calculates a phenotypic score as a measure of the disorderliness of ommatidial arrangement in the fly eye. As a proof of principle, we tested our method by analyzing the defects due to eye-specific knockdown of Drosophila orthologs of 12 neurodevelopmental genes to accurately document differential sensitivities of these genes to dosage alteration. We also evaluated eye images from six independent studies assessing the effect of overexpression of repeats, candidates from peptide library screens, and modifiers of neurotoxicity and developmental processes on eye morphology, and show strong concordance with the original assessment. We further demonstrate the utility of this method by analyzing 16 modifiers of sine oculis obtained from two genome-wide deficiency screens of Drosophila and accurately quantifying the effect of its enhancers and suppressors during eye development. Our method will complement existing assays for eye phenotypes, and increase the accuracy of studies that use fly eyes for functional evaluation of genes and genetic interactions. PMID:26994292
Spatial data analysis for exploration of regional scale geothermal resources
NASA Astrophysics Data System (ADS)
Moghaddam, Majid Kiavarz; Noorollahi, Younes; Samadzadegan, Farhad; Sharifi, Mohammad Ali; Itoi, Ryuichi
2013-10-01
Defining a comprehensive conceptual model of the resources sought is one of the most important steps in geothermal potential mapping. In this study, Fry analysis as a spatial distribution method, and 5% well existence, distance distribution, weights of evidence (WofE) and evidential belief function (EBF) methods as spatial association methods, were applied comparatively to known geothermal occurrences and to publicly available regional-scale geoscience data in Akita and Iwate provinces within the Tohoku volcanic arc, in northern Japan. Fry analysis and rose diagrams revealed similar directional patterns among geothermal wells and volcanoes, NNW-, NNE- and NE-trending faults, hot springs and fumaroles. Among the spatial association methods, WofE defined a conceptual model consistent with real-world conditions, as confirmed by expert opinion. The results of the spatial association analyses quantitatively indicated that the known geothermal occurrences are strongly spatially associated with geological features such as volcanoes, craters and NNW-, NNE- and NE-trending faults, and with geochemical features such as hot springs, hydrothermal alteration zones and fumaroles. The geophysical evidence comprises temperature gradients over 100 °C/km and heat flow over 100 mW/m2. In general, geochemical and geophysical data were better evidence layers than geological data for exploring geothermal resources. The spatial analyses of the case study area suggested that quantitative knowledge from hydrothermal geothermal resources is significantly useful for further exploration and for geothermal potential mapping in the study region. The results can also be extended to regions with similar characteristics.
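The weights-of-evidence calculation behind such spatial association analyses can be sketched with cell counts: W+ and W- compare the conditional probabilities of the evidence given presence versus absence of occurrences, and their difference (the contrast C) summarizes the strength of association. The counts below are illustrative, where B is an evidence layer (e.g. proximity to NNE-trending faults) and D the known geothermal occurrences.

```python
# Weights of evidence (W+, W-) and contrast C from study-area cell counts.
import math

def weights_of_evidence(n_BD, n_B, n_D, n_total):
    """n_BD: cells with evidence and occurrence; n_B: cells with evidence;
    n_D: cells with occurrence; n_total: all cells in the study area."""
    p_B_given_D    = n_BD / n_D
    p_B_given_notD = (n_B - n_BD) / (n_total - n_D)
    w_plus  = math.log(p_B_given_D / p_B_given_notD)
    w_minus = math.log((1 - p_B_given_D) / (1 - p_B_given_notD))
    return w_plus, w_minus, w_plus - w_minus   # contrast C measures association

print(weights_of_evidence(n_BD=30, n_B=2000, n_D=50, n_total=100000))
```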
Examining the Personal Nature of the K-14 Engineering Pipeline for Young Women
NASA Astrophysics Data System (ADS)
Gurski, Jennifer Sue
This mixed-methods study examined young women's perceptions of their K-14 STEM pipeline experiences and their resulting choice to enter and persist in an engineering major. Despite the increase of women in the STEM workforce, women remain underrepresented among engineering majors (Beasley & Fischer, 2012; Heilbronner, 2012; Neihart & Teo, 2013). Few studies utilize a retrospective approach to understand how the culmination of young women's K-14 experiences has influenced their formation of individually held perceptions that lead to engineering persistence. This study's aim is to utilize a mixed-methods approach to answer the following research question: How do young women's perceptions of their K-14 STEM experiences influence their decision to enroll and persist in an engineering major? These perceptions are explored through an ethnographic approach focusing on young women enrolled in engineering programs during their junior and senior years of study at a small private liberal arts university with eight engineering majors. The mixed-methods approach follows a sequential design method (Creswell, 2013) and utilizes questions in a quantitative Likert-type survey from the Academic Pathways for People Learning Engineering (APPLES) survey (Eris, Chachra, Chen, Sheppard, & Ludlow, 2010) and the Motivated Strategy Learning Questionnaire (MSLQ) (Pintrich, Smith, Garcia, & McKeachie, 1991). The quantitative study results will lead to the development of open-ended, structured questions for conducting a qualitative focus group. Anonymity of all participants is maintained. Keywords: STEM, young women, perceptions, pipeline, intervention, underrepresentation, engineering, persistence, retrospective, self-efficacy.
Quantitative determination of wool in textile by near-infrared spectroscopy and multivariate models.
Chen, Hui; Tan, Chao; Lin, Zan
2018-08-05
The wool content in textiles is a key quality index, and its quantitative analysis is important because adulteration is common in both raw and finished textiles. Conventional methods can be complicated, destructive, time-consuming and environmentally unfriendly, so developing a quick, easy-to-use and green alternative is attractive. This work explores the feasibility of combining near-infrared (NIR) spectroscopy with several partial least squares (PLS)-based algorithms and the elastic component regression (ECR) algorithm for measuring wool content in textile. A total of 108 cloth samples with wool content ranging from 0% to 100% (w/w) were collected, all with compositions actually found on the market. The dataset was divided equally into training and test sets for developing and validating calibration models. When using local PLS, the original spectral axis was split into 20 sub-intervals. No obvious difference in performance was seen among the local PLS models. The ECR model is comparable or superior to the other models owing to its flexibility, i.e., it forms a transition state from PCR to PLS. ECR combined with the NIR technique may thus be a potential method for determining wool content in textile products. In addition, it might have regulatory advantages by avoiding time-consuming and environmentally unfriendly chemical analysis. Copyright © 2018 Elsevier B.V. All rights reserved.
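A full-spectrum PLS calibration of the kind compared in this study can be sketched as below, assuming NIR spectra in X and reference wool percentages in y; the data are synthetic stand-ins and the number of latent variables is an arbitrary choice.

```python
# PLS calibration for wool content from NIR spectra, with a held-out test set.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.random((108, 700))              # 108 samples x 700 wavelength channels
y = rng.uniform(0, 100, 108)            # reference wool content, % (w/w)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
pls = PLSRegression(n_components=8).fit(X_tr, y_tr)   # 8 latent variables (assumed)
rmsep = mean_squared_error(y_te, pls.predict(X_te).ravel()) ** 0.5
print(f"RMSEP: {rmsep:.1f} % wool")
```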
Quantitative rotating frame relaxometry methods in MRI.
Gilani, Irtiza Ali; Sepponen, Raimo
2016-06-01
Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ) and the rotating frame transverse relaxation rate constant (R2ρ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ. The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.
Sensitive detection of porcine DNA in processed animal proteins using a TaqMan real-time PCR assay.
Pegels, N; González, I; Fernández, S; García, T; Martín, R
2012-01-01
A TaqMan real-time PCR method was developed for specific detection of porcine-prohibited material in industrial feeds. The assay combines the use of a porcine-specific primer pair, which amplifies a 79 bp fragment of the mitochondrial (mt) 12S rRNA gene, and a locked nucleic acid (LNA) TaqMan probe complementary to a target sequence lying between the porcine-specific primers. The nuclear 18S rRNA gene system, yielding a 77 bp amplicon, was employed as a positive amplification control to monitor the total content of amplifiable DNA in the samples. The specificity of the porcine primers-probe system was verified against different animal and plant species, including mammals, birds and fish. The applicability of the real-time PCR protocol to detect the presence of porcine mt DNA in feeds was determined through the analysis of 190 industrial feeds (19 known reference and 171 blind samples) subjected to stringent processing treatments. The performance of the method allows qualitative and highly sensitive detection of short fragments from porcine DNA in all the industrial feeds declared to contain porcine material. Although the method has quantitative potential, the real quantitative capability of the assay is limited by the existing variability in terms of composition and processing conditions of the feeds, which affect the amount and quality of amplifiable DNA.
Zhao, Ziqing W; Roy, Rahul; Gebhardt, J Christof M; Suter, David M; Chapman, Alec R; Xie, X Sunney
2014-01-14
Superresolution microscopy based on single-molecule centroid determination has been widely applied to cellular imaging in recent years. However, quantitative imaging of the mammalian nucleus has been challenging due to the lack of 3D optical sectioning methods for normal-sized cells, as well as the inability to accurately count the absolute copy numbers of biomolecules in highly dense structures. Here we report a reflected light-sheet superresolution microscopy method capable of imaging inside the mammalian nucleus with superior signal-to-background ratio as well as molecular counting with single-copy accuracy. Using reflected light-sheet superresolution microscopy, we probed the spatial organization of transcription by RNA polymerase II (RNAP II) molecules and quantified their global extent of clustering inside the mammalian nucleus. Spatiotemporal clustering analysis that leverages the blinking photophysics of specific organic dyes showed that the majority (>70%) of the transcription foci originate from single RNAP II molecules, and no significant clustering between RNAP II molecules was detected within the length scale of the reported diameter of "transcription factories." Colocalization measurements of RNAP II molecules equally labeled by two spectrally distinct dyes confirmed the primarily unclustered distribution, arguing against a prevalent existence of transcription factories in the mammalian nucleus as previously proposed. The methods developed in our study pave the way for quantitative mapping and stoichiometric characterization of key biomolecular species deep inside mammalian cells.
Detailed Hydrographic Feature Extraction from High-Resolution LiDAR Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danny L. Anderson
Detailed hydrographic feature extraction from high-resolution light detection and ranging (LiDAR) data is investigated. Methods for quantitatively evaluating and comparing such extractions are presented, including the use of sinuosity and longitudinal root-mean-square-error (LRMSE). These metrics are then used to quantitatively compare stream networks in two studies. The first study examines the effect of raster cell size on watershed boundaries and stream networks delineated from LiDAR-derived digital elevation models (DEMs). The study confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes generally yielded better stream network delineations, based on sinuosity and LRMSE. The second study demonstrates a new method of delineating a stream directly from LiDAR point clouds, without the intermediate step of deriving a DEM. Direct use of LiDAR point clouds could improve the efficiency and accuracy of hydrographic feature extractions. The direct delineation method developed herein, termed "mDn", is an extension of the D8 method that has been used for several decades with gridded raster data. The method divides the region around a starting point into sectors, uses the LiDAR data points within each sector to determine an average slope, and selects the sector with the greatest downward slope to determine the direction of flow. An mDn delineation was compared with a traditional grid-based delineation, using TauDEM, and other readily available, common stream data sets. Although the TauDEM delineation yielded a sinuosity that more closely matches the reference, the mDn delineation yielded a sinuosity higher than either the TauDEM method or the existing published stream delineations. Furthermore, stream delineation using the mDn method yielded the smallest LRMSE.
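The sector-based flow-direction step of the mDn method, as described above, can be sketched directly: points around the current position are binned into azimuthal sectors, each sector receives an average slope, and flow follows the steepest downward sector. The point cloud and sector count below are placeholders.

```python
# Sector-averaged slope flow direction from raw LiDAR points (mDn-style step).
import numpy as np

def mdn_flow_direction(points, origin, n_sectors=8):
    """points: (n, 3) LiDAR x, y, z; origin: (3,) current position.
    Returns the azimuth (radians) of the steepest downward sector."""
    d = points - origin
    dist = np.hypot(d[:, 0], d[:, 1])
    keep = dist > 0
    az = np.arctan2(d[keep, 1], d[keep, 0]) % (2 * np.pi)
    slope = d[keep, 2] / dist[keep]              # rise over horizontal run
    sector = np.minimum((az * n_sectors / (2 * np.pi)).astype(int), n_sectors - 1)
    mean_slope = np.array([slope[sector == s].mean() if np.any(sector == s)
                           else np.inf for s in range(n_sectors)])
    best = int(np.argmin(mean_slope))            # most negative = steepest downhill
    return (best + 0.5) * 2 * np.pi / n_sectors

rng = np.random.default_rng(0)
pts = rng.random((500, 3)); pts[:, 2] = 0.1 * pts[:, 0]    # plane tilted up toward +x
print(mdn_flow_direction(pts, np.array([0.5, 0.5, 0.05]))) # ~pi: downhill toward -x
```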