ERIC Educational Resources Information Center
Kenan-Smalls, Yottie Marie
2011-01-01
The purpose of this quantitative study was to investigate diversity and inclusion from an age perspective among information technology (IT) professionals who were categorized into four generations in today's workforce: Traditionalists, Baby Boomers, Generation X, and Generation Y. At the same time, this study sought to examine motivational…
Schilling, Birgit; Gibson, Bradford W.; Hunter, Christie L.
2017-01-01
Data-independent acquisition is a powerful mass spectrometry technique that enables comprehensive MS and MS/MS analysis of all detectable species, providing an information-rich data file that can be mined deeply. Here, we describe how to acquire high-quality SWATH® Acquisition data to be used for large quantitative proteomic studies. We specifically focus on using variable-sized Q1 windows for acquisition of MS/MS data to generate higher-specificity quantitative data. PMID:28188533
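A common way to realize variable-sized Q1 windows is to let the empirical precursor density set the boundaries, so that crowded m/z regions get narrow windows and sparse regions get wide ones. The following is a minimal sketch of that idea, assuming a hypothetical list of observed precursor m/z values; the window count, m/z range, and overlap are illustrative parameters, not values from the protocol.

```python
import numpy as np

def variable_q1_windows(precursor_mz, n_windows=60, mz_min=400.0,
                        mz_max=1250.0, overlap=1.0):
    """Divide the precursor m/z range into variable-width Q1 windows so that
    each window contains roughly the same number of observed precursors."""
    mz = np.sort(np.clip(np.asarray(precursor_mz, dtype=float), mz_min, mz_max))
    # Equal-population boundaries from the empirical precursor distribution.
    edges = np.quantile(mz, np.linspace(0.0, 1.0, n_windows + 1))
    edges[0], edges[-1] = mz_min, mz_max
    # Add a small overlap between adjacent windows, as is common in SWATH schemes.
    return [(lo - overlap / 2, hi + overlap / 2)
            for lo, hi in zip(edges[:-1], edges[1:])]

# Example: the densely populated tryptic-peptide region gets narrower windows.
rng = np.random.default_rng(0)
demo_mz = rng.normal(650, 120, 5000)  # hypothetical precursor m/z values
for lo, hi in variable_q1_windows(demo_mz, n_windows=8)[:4]:
    print(f"{lo:7.1f} - {hi:7.1f}")
```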
ERIC Educational Resources Information Center
Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.
2011-01-01
Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…
NASA Astrophysics Data System (ADS)
Wuhrer, R.; Moran, K.
2014-03-01
Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has reduced considerably, with the ability to map minor and trace elements very accurately due to the larger detector area and higher-count-rate detectors. Live X-ray imaging can now be performed, with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps, including elemental relationship (scatter diagram) creation, elemental ratio mapping, chemical phase mapping (CPM), and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical backscatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow the user to predict and verify where problems are likely to occur in the images, and are especially helpful for examining possible interface artefacts. Post-processing techniques to improve the quantitation of X-ray map data and to achieve improved characterisation are covered in this paper.
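At its core, per-pixel quantification applies the same matrix correction used in point analysis to every pixel: the measured k-ratio is converted to a concentration estimate through the Z, A, and F factors. Below is a minimal sketch of that per-pixel step; in real software the Z, A, and F factors are themselves recomputed iteratively from the evolving composition estimate, and all arrays here are hypothetical.

```python
import numpy as np

def quantitative_map(k_ratio, z_factor, a_factor, f_factor):
    """Per-pixel matrix correction for one element: concentration is
    approximately the measured k-ratio times the Z, A and F factors."""
    return k_ratio * z_factor * a_factor * f_factor

# Hypothetical 4x4 maps for a single element. In practice Z, A and F are
# recomputed per pixel from the full local composition until self-consistent.
k = np.full((4, 4), 0.30)
Z = np.full((4, 4), 1.05)
A = np.full((4, 4), 0.92)
F = np.full((4, 4), 0.99)
conc = quantitative_map(k, Z, A, F)
# Summing the concentration maps of all elements present gives the
# quantitative total map; deviations from 100% flag problem pixels.
print(f"estimated concentration: {100 * conc[0, 0]:.1f} wt%")
```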
Knowledge and information generated using new tools/methods collectively called "Omics" technologies could have a profound effect on qualitative and quantitative characterizations of human health risk assessments.
The suffix "Omics" is a descriptor used for a series of e...
The system of technical diagnostics of the industrial safety information network
NASA Astrophysics Data System (ADS)
Repp, P. V.
2017-01-01
This research is devoted to problems of safety of the industrial information network. Basic sub-networks ensuring reliable operation of the elements of the industrial Automatic Process Control System were identified. The core tasks of technical diagnostics of industrial information safety were presented. The structure of the technical diagnostics system for information safety was proposed. It includes two parts: a generator of cyber-attacks and a virtual model of the enterprise information network. The virtual model was obtained by scanning a real enterprise network. A new classification of cyber-attacks was proposed. This classification enables one to design an efficient generator of cyber-attack sets for testing the virtual model of the industrial information network. The Monte Carlo numerical method (with Sobol LPτ sequences) and Markov chains were considered as design methods for the cyber-attack generation algorithm. The proposed system also includes a diagnostic analyzer performing expert functions. The stability factor (Kstab) was selected as an integrative quantitative indicator of network reliability. This factor is determined by the weight of the sets of cyber-attacks that identify vulnerabilities of the network. The weight depends on the frequency and complexity of the cyber-attacks, the degree of damage, and the complexity of remediation. The proposed Kstab is an effective integral quantitative measure of information network reliability.
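The abstract does not give a closed-form expression for Kstab, so the sketch below is only one plausible reading: each attack set gets a weight built from its frequency, complexity, damage, and remediation effort, and the stability factor is one minus the normalized weight of the sets that exposed a vulnerability. All field names and the weighting formula are assumptions for illustration, not the paper's definition.

```python
from dataclasses import dataclass

@dataclass
class AttackSet:
    frequency: float    # expected attempts per year
    complexity: float   # attacker effort, 0..1 (higher = harder to mount)
    damage: float       # expected damage if successful, 0..1
    remediation: float  # effort needed to remediate, 0..1
    exposed: bool       # did this set reveal a vulnerability of the model?

def stability_factor(attack_sets):
    """Illustrative Kstab: 1 minus the normalized total weight of those
    cyber-attack sets that exposed a vulnerability of the virtual network."""
    def weight(a):
        # Easy, frequent, damaging, hard-to-fix attacks weigh the most.
        return a.frequency * (1 - a.complexity) * a.damage * a.remediation
    total = sum(weight(a) for a in attack_sets)
    exposed = sum(weight(a) for a in attack_sets if a.exposed)
    return 1.0 if total == 0 else 1.0 - exposed / total

tests = [AttackSet(10, 0.2, 0.9, 0.7, True), AttackSet(50, 0.8, 0.3, 0.2, False)]
print(f"Kstab = {stability_factor(tests):.3f}")
```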
Quantitative autistic trait measurements index background genetic risk for ASD in Hispanic families.
Page, Joshua; Constantino, John Nicholas; Zambrana, Katherine; Martin, Eden; Tunc, Ilker; Zhang, Yi; Abbacchi, Anna; Messinger, Daniel
2016-01-01
Recent studies have indicated that quantitative autistic traits (QATs) of parents reflect inherited liabilities that may index background genetic risk for clinical autism spectrum disorder (ASD) in their offspring. Moreover, preferential mating for QATs has been observed as a potential factor in concentrating autistic liabilities in some families across generations. Heretofore, intergenerational studies of QATs have focused almost exclusively on Caucasian populations; the present study explored these phenomena in a well-characterized Hispanic population. The present study examined QAT scores in siblings and parents of 83 Hispanic probands meeting research diagnostic criteria for ASD, and 64 non-ASD controls, using the Social Responsiveness Scale-2 (SRS-2). Ancestry of the probands was characterized by genotype, using information from 541,929 single nucleotide polymorphic markers. In families of Hispanic children with an ASD diagnosis, the pattern of quantitative trait correlations observed between ASD-affected children and their first-degree relatives (ICCs on the order of 0.20), between unaffected first-degree relatives in ASD-affected families (sibling/mother ICC = 0.36; sibling/father ICC = 0.53), and between spouses (mother/father ICC = 0.48) was in keeping with the influence of transmitted background genetic risk and strong preferential mating for variation in quantitative autistic trait burden. Results from analysis of ancestry-informative genetic markers among probands in this sample were consistent with those from other Hispanic populations. Quantitative autistic traits represent measurable indices of inherited liability to ASD in Hispanic families. The accumulation of autistic traits occurs within generations, between spouses, and across generations, among Hispanic families affected by ASD. The occurrence of preferential mating for QATs, the magnitude of which may vary across cultures, constitutes a mechanism by which background genetic liability for ASD can accumulate in a given family in successive generations.
Huan, Tao; Li, Liang
2015-07-21
Generating precise and accurate quantitative information on metabolomic changes in comparative samples is important for metabolomics research, where technical variations in the metabolomic data should be minimized in order to reveal biological changes. We report a method and software program, IsoMS-Quant, for extracting quantitative information from a metabolomic data set generated by chemical isotope labeling (CIL) liquid chromatography mass spectrometry (LC-MS). Unlike previous work, which relied on the mass spectral peak ratio of the highest-intensity peak pair to measure the relative quantity difference of a differentially labeled metabolite, this new program reconstructs the chromatographic peaks of the light- and heavy-labeled metabolite pair and then calculates the ratio of their peak areas to represent the relative concentration difference in two comparative samples. Using chromatographic peaks to perform relative quantification is shown to be more precise and accurate. IsoMS-Quant is integrated with IsoMS for picking peak pairs and with Zero-fill for retrieving missing peak pairs in the initial peak-pair table generated by IsoMS, forming a complete tool for processing CIL LC-MS data. This program can be freely downloaded from the www.MyCompoundID.org web site for noncommercial use.
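The core quantitative step, reconstructing the light and heavy chromatographic peaks and taking the ratio of their areas, reduces to numerical integration of the two extracted-ion chromatograms. A minimal sketch follows, with hypothetical Gaussian EICs standing in for reconstructed peaks; real data would first require peak detection and boundary finding.

```python
import numpy as np

def peak_area_ratio(rt, light_eic, heavy_eic):
    """Relative quantification from reconstructed chromatographic peaks:
    integrate each extracted-ion chromatogram and take the area ratio,
    rather than the intensity ratio of a single spectral peak pair."""
    light_area = np.trapz(light_eic, rt)
    heavy_area = np.trapz(heavy_eic, rt)
    return light_area / heavy_area

# Hypothetical Gaussian peaks for a light/heavy labeled metabolite pair.
rt = np.linspace(300, 330, 121)  # retention time, seconds
light = 8e4 * np.exp(-0.5 * ((rt - 315.0) / 3) ** 2)
heavy = 4e4 * np.exp(-0.5 * ((rt - 315.4) / 3) ** 2)
print(f"light/heavy = {peak_area_ratio(rt, light, heavy):.2f}")  # ~2.0
```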
NASA Astrophysics Data System (ADS)
Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.
2010-02-01
We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of obtained images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency, are proposed for differentiating structural information between various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation, as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea, indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to the randomness in fiber orientation. Regions with variations in the maximum spatial frequency, but with relatively constant orientation, suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple, yet powerful, tool for extracting information from images that is not obvious in real space. This technique can be used as a quantitative biomarker to assess the structure of collagen fibers that may change due to damage from disease or physical injury.
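Both markers come from the 2D Fourier spectrum of the SHG image: fiber orientation appears rotated by 90° in frequency space, and the maximum spatial frequency is the spectral extent above a noise floor. The sketch below is a generic implementation of that idea, not the authors' exact pipeline; the pixel size, power floor, and synthetic test image are assumptions.

```python
import numpy as np

def ft_shg_markers(image, pixel_size_um=0.5, power_floor=0.01):
    """Two quantitative markers from an SHG image: preferred fiber orientation
    (from the dominant spectral peak, rotated 90 degrees back into real space)
    and maximum spatial frequency (largest radius with power above a floor)."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean()))) ** 2
    fy = np.fft.fftshift(np.fft.fftfreq(image.shape[0], d=pixel_size_um))
    fx = np.fft.fftshift(np.fft.fftfreq(image.shape[1], d=pixel_size_um))
    FX, FY = np.meshgrid(fx, fy)
    i, j = np.unravel_index(spec.argmax(), spec.shape)
    preferred = (np.degrees(np.arctan2(FY[i, j], FX[i, j])) + 90.0) % 180.0
    radius = np.hypot(FX, FY)
    f_max = radius[spec > power_floor * spec.max()].max()
    return preferred, f_max  # degrees, cycles per micron

# Synthetic "vertical fiber" image: stripes varying along x.
img = np.tile(np.sin(np.linspace(0, 20 * np.pi, 128)), (128, 1))
print(ft_shg_markers(img))  # preferred orientation ~90 degrees
```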
[Information about electroconvulsive therapy on the internet].
Degraeve, G; Van Heeringen, C; Audenaert, K
2006-01-01
This article aims to provide a quantitative and qualitative assessment of the information about electroconvulsive therapy that is currently available on the internet. We carried out a quantitative assessment by entering five search terms into eight (meta)search engines. We achieved our qualitative assessment by visiting the first twenty websites generated by each search on one of the search engines, in particular Google (www.google.com), and by scoring these websites with an adapted Sandvik-score. We conclude that the scored websites are technically sound but are incomplete as far as content is concerned.
Analysis of arsenical metabolites in biological samples.
Hernandez-Zavala, Araceli; Drobna, Zuzana; Styblo, Miroslav; Thomas, David J
2009-11-01
Quantitation of iAs and its methylated metabolites in biological samples provides dosimetric information needed to understand dose-response relations. Here, methods are described for separation of inorganic and mono-, di-, and trimethylated arsenicals by thin layer chromatography. This method has been extensively used to track the metabolism of the radionuclide [73As] in a variety of in vitro assay systems. In addition, a hydride generation-cryotrapping-gas chromatography-atomic absorption spectrometric method is described for the quantitation of arsenicals in biological samples. This method uses pH-selective hydride generation to differentiate among arsenicals containing trivalent or pentavalent arsenic.
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and these are examined with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Nonlinear pattern analysis of ventricular premature beats by mutual information
NASA Technical Reports Server (NTRS)
Osaka, M.; Saitoh, H.; Yokoshima, T.; Kishida, H.; Hayakawa, H.; Cohen, R. J.
1997-01-01
The frequency of ventricular premature beats (VPBs) has been related to the risk of mortality. However, little is known about the temporal pattern of occurrence of VPBs and its relationship to autonomic activity. Hence, we applied a general correlation measure, mutual information, to quantify how VPBs are generated over time. We also used mutual information to determine the correlation between VPB production and heart rate in order to evaluate effects of autonomic activity on VPB production. We examined twenty subjects with more than 3000 VPBs/day and simulated random time series of VPB occurrence. We found that mutual information values could be used to characterize quantitatively the temporal patterns of VPB generation. Our data suggest that VPB production is not random and VPBs generated with a higher value of mutual information may be more greatly affected by autonomic activity.
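Mutual information between two discretized series can be estimated from their joint histogram. The sketch below is a rough stand-in for the paper's correlation measure rather than its exact estimator: it computes MI in bits between a hypothetical per-minute VPB count series and the concurrent heart rate series, with an illustrative bin count and simulated data.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Mutual information (bits) between two binned time series, e.g. the
    VPB count per interval and the mean heart rate in the same interval."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0  # avoid log(0); zero cells contribute nothing
    return float((pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum())

rng = np.random.default_rng(1)
hr = rng.normal(70, 8, 1440)                      # minute-by-minute heart rate
vpb_random = rng.poisson(2, 1440)                 # random VPB production
vpb_coupled = rng.poisson(np.clip((hr - 55) / 8, 0.1, None))  # HR-dependent
print(mutual_information(vpb_random, hr))   # near 0: no coupling
print(mutual_information(vpb_coupled, hr))  # clearly larger
```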
NASA Astrophysics Data System (ADS)
Suman, Rakesh; O'Toole, Peter
2014-03-01
Here we report a novel label-free, high-contrast and quantitative method for imaging live cells. The technique reconstructs an image from overlapping diffraction patterns using a ptychographical algorithm. The algorithm utilises both amplitude and phase data from the sample to report on quantitative changes related to the refractive index (RI) and thickness of the specimen. We report the ability of this technique to generate high-contrast images, to visualise neurite elongation in neuronal cells, and to provide a measure of cell proliferation.
Harn, Nicholas R; Hunt, Suzanne L; Hill, Jacqueline; Vidoni, Eric; Perry, Mark; Burns, Jeffrey M
2017-08-01
Establishing reliable methods for interpreting elevated cerebral amyloid-β plaque on PET scans is increasingly important for radiologists, as availability of PET imaging in clinical practice increases. We examined a 3-step method to detect plaque in cognitively normal older adults, focusing on the additive value of quantitative information during the PET scan interpretation process. Fifty-five (18)F-florbetapir PET scans were evaluated by 3 experienced raters. Scans were first visually interpreted as having "elevated" or "nonelevated" plaque burden ("Visual Read"). Images were then processed using standardized quantitative analysis software (MIMneuro) to generate whole-brain and region-of-interest SUV ratios. This "Quantitative Read" was considered elevated if at least 2 of 6 regions of interest had an SUV ratio of more than 1.1. The final interpretation combined both visual and quantitative data ("VisQ Read"). Cohen kappa values were assessed as a measure of interpretation agreement. Plaque was elevated in 25.5% to 29.1% of the 165 total Visual Reads. Interrater agreement was strong (kappa = 0.73-0.82) and consistent with reported values. Quantitative Reads were elevated in 45.5% of participants. Final VisQ Reads changed from initial Visual Reads in 16 interpretations (9.7%), with most changing from "nonelevated" Visual Reads to "elevated." These changed interpretations demonstrated lower plaque quantification than those initially read as "elevated" that remained unchanged. Interrater variability improved for VisQ Reads with the addition of quantitative information (kappa = 0.88-0.96). Inclusion of quantitative information increases consistency of PET scan interpretations for early detection of cerebral amyloid-β plaque accumulation.
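The quantitative rule in the abstract is explicit (elevated if at least 2 of 6 regions exceed an SUV ratio of 1.1), but how raters combined the visual and quantitative reads is not. The sketch below encodes the stated rule and one plausible combination policy, keeping the visual impression unless the quantitative evidence clearly contradicts it; that policy and the margin parameter are assumptions, not the study protocol.

```python
def visq_read(visual_elevated, roi_suvr, threshold=1.1, min_regions=2,
              margin=0.05):
    """Three-step interpretation: Visual Read, Quantitative Read (elevated if
    >= min_regions of the ROIs exceed the SUVr threshold), then a combined
    VisQ Read that defers to quantitation only when it clearly disagrees."""
    quant_elevated = sum(r > threshold for r in roi_suvr) >= min_regions
    clearly = (sum(r > threshold + margin for r in roi_suvr) >= min_regions
               or all(r < threshold - margin for r in roi_suvr))
    if quant_elevated != visual_elevated and clearly:
        visq = quant_elevated
    else:
        visq = visual_elevated
    return visual_elevated, quant_elevated, visq

# Borderline quantitation: the visual read is retained in this example.
print(visq_read(False, [1.15, 1.12, 1.08, 1.05, 1.18, 1.02]))
```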
Fundamental quantitative security in quantum key generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuen, Horace P.
2010-12-15
We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation, including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K, when the attacker just gets at K before it is used in a cryptographic context, and its composition security, when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.
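For readers unfamiliar with the criteria named here, the two definitions below are the ones standardly meant in the QKD literature; they are supplied as background under assumed notation, not quoted from the paper. The statistical distance measures how far the attacker's distribution on K is from uniform, and d is the trace-distance criterion on the joint key-eavesdropper state.

```latex
\[
  \delta(P_K, U) \;=\; \frac{1}{2} \sum_{k \in \mathcal{K}}
    \Bigl| P_K(k) - \frac{1}{|\mathcal{K}|} \Bigr|,
  \qquad
  d \;=\; \frac{1}{2} \bigl\| \rho_{KE} - \rho_U \otimes \rho_E \bigr\|_1 ,
\]
% where \rho_U is the maximally mixed state on the key system and
% \rho_E is the attacker's reduced state.
```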
Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.
Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E
2007-01-01
This paper describes a set of tools for performing measurements of objects in a virtual-reality-based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue-engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig
2014-08-01
The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to increasing the understanding of mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans; and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for a KE, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.
Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2018-01-01
Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.
Sun, Wanxin; Chang, Shi; Tai, Dean C S; Tan, Nancy; Xiao, Guangfa; Tang, Huihuan; Yu, Hanry
2008-01-01
Liver fibrosis is associated with an abnormal increase in extracellular matrix in chronic liver diseases. Quantitative characterization of fibrillar collagen in intact tissue is essential for both fibrosis studies and clinical applications. Commonly used methods, histological staining followed by either semiquantitative or computerized image analysis, have limited sensitivity and accuracy, as well as operator-dependent variation. The fibrillar collagen in sinusoids of normal livers could be observed through second-harmonic generation (SHG) microscopy. The two-photon excited fluorescence (TPEF) images, recorded simultaneously with SHG, clearly revealed the hepatocyte morphology. We have systematically optimized the parameters for quantitative SHG/TPEF imaging of liver tissue and developed fully automated image analysis algorithms to extract information on collagen changes and cell necrosis. Subtle changes in the distribution and amount of collagen and in cell morphology are quantitatively characterized in SHG/TPEF images. By comparison to traditional staining, such as Masson's trichrome and Sirius red, SHG/TPEF is a sensitive quantitative tool for automated collagen characterization in liver tissue. Our system allows for enhanced detection and quantification of sinusoidal collagen fibers in fibrosis research and clinical diagnostics.
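A basic automated readout of this kind segments collagen in the SHG channel, segments tissue in the TPEF channel, and reports the collagen area as a fraction of tissue area. The sketch below is an illustrative version of that step only, not the authors' optimized algorithms; the default thresholds and the synthetic images are assumptions.

```python
import numpy as np

def collagen_percentage(shg, tpef, shg_thresh=None, tpef_thresh=None):
    """Segment collagen in the SHG channel and tissue in the TPEF channel,
    then report collagen area as a percentage of tissue area.
    Thresholds default to simple intensity statistics."""
    shg_thresh = shg.mean() + 2 * shg.std() if shg_thresh is None else shg_thresh
    tpef_thresh = tpef.mean() if tpef_thresh is None else tpef_thresh
    collagen = shg > shg_thresh
    tissue = tpef > tpef_thresh
    return 100.0 * collagen.sum() / max(tissue.sum(), 1)

rng = np.random.default_rng(2)
shg = rng.gamma(2.0, 10.0, (256, 256))    # sparse bright fibers + background
tpef = rng.gamma(5.0, 20.0, (256, 256))   # hepatocyte autofluorescence
print(f"collagen area: {collagen_percentage(shg, tpef):.1f}% of tissue")
```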
The quantitative control and matching of an optical false color composite imaging system
NASA Astrophysics Data System (ADS)
Zhou, Chengxian; Dai, Zixin; Pan, Xizhe; Li, Yinxi
1993-10-01
Design of an imaging system for optical false color composites (OFCC) capable of high-precision density-exposure-time control and color balance is presented. The system provides high-quality FCC image data that can be analyzed using a quantitative calculation method. The quality requirement for each part of the image generation system is defined, and the distribution of satellite remote sensing image information is analyzed. The proposed technology makes it possible to present remote sensing image data more effectively and accurately.
Walzthoeni, Thomas; Joachimiak, Lukasz A; Rosenberger, George; Röst, Hannes L; Malmström, Lars; Leitner, Alexander; Frydman, Judith; Aebersold, Ruedi
2015-12-01
Chemical cross-linking in combination with mass spectrometry generates distance restraints of amino acid pairs in close proximity on the surface of native proteins and protein complexes. In this study we used quantitative mass spectrometry and chemical cross-linking to quantify differences in cross-linked peptides obtained from complexes in spatially discrete states. We describe a generic computational pipeline for quantitative cross-linking mass spectrometry consisting of modules for quantitative data extraction and statistical assessment of the obtained results. We used the method to detect conformational changes in two model systems: firefly luciferase and the bovine TRiC complex. Our method discovers and explains the structural heterogeneity of protein complexes using only sparse structural information.
NASA Astrophysics Data System (ADS)
Ozturk, Nilay; Yilmaz-Tuzun, Ozgul
2017-12-01
This study investigated preservice elementary science teachers' (PSTs) informal reasoning regarding socioscientific issues (SSI), their epistemological beliefs, and the relationship between informal reasoning and epistemological beliefs. From several SSIs, nuclear power usage was selected for this study. A total of 647 Turkish PSTs enrolled in three large universities in Turkey completed the open-ended questionnaire, which assessed the participants' informal reasoning about the target SSI, and Schommer's (1990) Epistemological Questionnaire. The participants' epistemological beliefs were assessed quantitatively and their informal reasoning was assessed both qualitatively and quantitatively. The findings revealed that PSTs preferred to generate evidence-based arguments rather than intuitive-based arguments; however, they failed to generate quality evidence and present different types of evidence to support their claims. Furthermore, among the reasoning quality indicators, PSTs mostly generated supportive argument construction. Regarding the use of reasoning modes, types of risk arguments and political-oriented arguments emerged as the new reasoning modes. The study demonstrated that the PSTs had different epistemological beliefs in terms of innate ability, omniscient authority, certain knowledge, and quick learning. Correlational analyses revealed that there was a strong negative correlation between the PSTs' certain knowledge and counterargument construction, and there were negative correlations between the PSTs' innate ability, certain knowledge, and quick learning dimensions of epistemological beliefs and their total argument construction. This study has implications for both science teacher education and the practice of science education. For example, PST teacher education programs should give sufficient importance to training teachers that are skillful and knowledgeable regarding SSIs. To achieve this, specific SSI-related courses should form part of science teacher education programs.
INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION
Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...
ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) ARCVIEW EXTENTION
Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape metrics, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, waters...
A Review of Recent Developments in X-Ray Diagnostics for Turbulent and Optically Dense Rocket Sprays
NASA Technical Reports Server (NTRS)
Radke, Christopher; Halls, Benjamin; Kastengren, Alan; Meyer, Terrence
2017-01-01
Highly efficient mixing and atomization of fuel and oxidizer are important factors in many propulsion and power generation applications. To better quantify breakup and mixing in atomizing sprays, several diagnostic techniques have been developed to collect droplet information and spray statistics. Several optics-based techniques, such as Ballistic Imaging and SLIPI, have previously demonstrated qualitative measurements in optically dense sprays; however, these techniques have produced limited quantitative information in the near-injector region. To complement these advances, a recent wave of developments utilizing synchrotron-based X-rays has been successfully implemented, facilitating the collection of quantitative measurements in optically dense sprays.
Essentiality, toxicity, and uncertainty in the risk assessment of manganese.
Boyes, William K
2010-01-01
Risk assessments of manganese by inhalation or oral routes of exposure typically acknowledge the duality of manganese as an essential element at low doses and a toxic metal at high doses. Previously, however, risk assessors were unable to describe manganese pharmacokinetics quantitatively across dose levels and routes of exposure, to account for mass balance, and to incorporate this information into a quantitative risk assessment. In addition, the prior risk assessment of inhaled manganese conducted by the U.S. Environmental Protection Agency (EPA) identified a number of specific factors that contributed to uncertainty in the risk assessment. In response to a petition regarding the use of a fuel additive containing manganese, methylcyclopentadienyl manganese tricarbonyl (MMT), the U.S. EPA developed a test rule under the U.S. Clean Air Act that required, among other things, the generation of pharmacokinetic information. This information was intended not only to aid in the design of health outcome studies, but also to help address uncertainties in the risk assessment of manganese. To date, the work conducted in response to the test rule has yielded substantial pharmacokinetic data. This information will enable the generation of physiologically based pharmacokinetic (PBPK) models capable of making quantitative predictions of tissue manganese concentrations following inhalation and oral exposure, across dose levels, and accounting for factors such as duration of exposure, different species of manganese, and changes of age, gender, and reproductive status. The work accomplished in response to the test rule, in combination with other scientific evidence, will enable future manganese risk assessments to consider tissue dosimetry more comprehensively than was previously possible.
Global, quantitative and dynamic mapping of protein subcellular localization.
Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg Hh
2016-06-09
Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology.
Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip
2010-01-01
The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251
Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.
Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan
2017-01-01
Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
Generating new knowledge in cardiac interventions.
Blackstone, Eugene H
2013-06-01
Cardiac interventions are among the most quantitatively studied therapies. It is important for all involved with cardiac interventions to understand how information generated from observations made during patient care is transformed into data suitable for analysis, to appreciate at a high level what constitutes appropriate analyses of those data, to effectively evaluate inferences drawn from those analyses, and to apply new knowledge to better care for individual patients.
A Discriminant Distance Based Composite Vector Selection Method for Odor Classification
Choi, Sang-Il; Jeong, Gu-Min
2014-01-01
We present a composite vector selection method for an effective electronic nose system that performs well even in noisy environments. Each composite vector generated from an electronic nose data sample is evaluated by computing the discriminant distance. By quantitatively measuring the amount of discriminative information in each composite vector, composite vectors containing informative variables can be distinguished, and the final composite features for odor classification are extracted using the selected composite vectors. Using only the informative composite vectors can also help extract better composite features than using all the generated composite vectors. Experimental results with different volatile organic compound data show that the proposed system has good classification performance even in a noisy environment compared to other methods. PMID:24747735
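A standard way to score each candidate by "discriminant distance" is a Fisher-style separability measure: squared distance between class means over the summed within-class variances. The sketch below ranks candidates that way and keeps the top k; it treats each composite vector as a single scalar feature per sample, which is a simplification, and the scoring formula is an illustrative stand-in for the paper's exact definition.

```python
import numpy as np

def discriminant_distance(features, labels):
    """Fisher-style separability of one candidate feature for two classes:
    squared mean difference over the summed within-class variances."""
    a, b = (features[labels == c] for c in np.unique(labels)[:2])
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var() + 1e-12)

def select_composites(candidates, labels, k=10):
    """Rank candidate composite vectors (columns of `candidates`) by
    discriminant distance and keep the k most informative ones."""
    scores = np.array([discriminant_distance(candidates[:, j], labels)
                       for j in range(candidates.shape[1])])
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(3)
labels = np.repeat([0, 1], 50)
X = rng.normal(0, 1, (100, 40))
X[labels == 1, :5] += 2.0  # only the first 5 candidates are informative
print(select_composites(X, labels, k=5))  # indices 0..4, in some order
```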
INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION
Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...
Impulse Magnetic Fields Generated by Electrostatic Discharges in Protoplanetary Nebulae
NASA Technical Reports Server (NTRS)
Tunyi, I.; Guba, P.; Roth, L. E.; Timko, M.
2002-01-01
We examine quantitative aspects associated with the hypothesis of nebular lightnings as a source of impulse magnetic fields. Our findings support our previous accretion model in which a presence of impulse magnetic fields was of a key necessity. Additional information is contained in the original extended abstract.
Analytical aids in land management planning
David R. Betters
1978-01-01
Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
Quantitation of human milk proteins and their glycoforms using multiple reaction monitoring (MRM).
Huang, Jincui; Kailemia, Muchena J; Goonatilleke, Elisha; Parker, Evan A; Hong, Qiuting; Sabia, Rocchina; Smilowitz, Jennifer T; German, J Bruce; Lebrilla, Carlito B
2017-01-01
Human milk plays a substantial role in child growth and development and determines the child's nutritional and health status. Despite the importance of the proteins and glycoproteins in human milk, very little quantitative information, especially on their site-specific glycosylation, is known. As more functions of milk proteins and other components continue to emerge, their fine-detailed quantitative information is becoming a key factor in milk research efforts. The present work utilizes a sensitive label-free MRM method to quantify seven milk proteins (α-lactalbumin, lactoferrin, secretory immunoglobulin A, immunoglobulin G, immunoglobulin M, α1-antitrypsin, and lysozyme) using their unique peptides while, at the same time, quantifying their site-specific N-glycosylation relative to the protein abundance. The method is highly reproducible, has a low limit of quantitation, and accounts for differences in glycosylation due to variations in protein amounts. The method described here expands our knowledge about human milk proteins and provides vital details that could be used in monitoring the health of the infant and even the mother. Graphical Abstract: the glycopeptide EICs generated from QQQ analysis.
IsobariQ: software for isobaric quantitative proteomics using IPTL, iTRAQ, and TMT.
Arntzen, Magnus Ø; Koehler, Christian J; Barsnes, Harald; Berven, Frode S; Treumann, Achim; Thiede, Bernd
2011-02-04
Isobaric peptide labeling plays an important role in relative quantitative comparisons of proteomes. Isobaric labeling techniques utilize MS/MS spectra for relative quantification, which can be either based on the relative intensities of reporter ions in the low mass region (iTRAQ and TMT) or on the relative intensities of quantification signatures throughout the spectrum due to isobaric peptide termini labeling (IPTL). Due to the increased quantitative information found in MS/MS fragment spectra generated by the recently developed IPTL approach, new software was required to extract the quantitative information. IsobariQ was specifically developed for this purpose; however, support for the reporter ion techniques iTRAQ and TMT is also included. In addition, to address recently emphasized issues about heterogeneity of variance in proteomics data sets, IsobariQ employs the statistical software package R and variance stabilizing normalization (VSN) algorithms available therein. Finally, the functionality of IsobariQ is validated with data sets of experiments using 6-plex TMT and IPTL. Notably, protein substrates resulting from cleavage by proteases can be identified as shown for caspase targets in apoptosis.
Couderc, Jean-Philippe
2010-01-01
The sharing of scientific data reinforces open scientific inquiry; it encourages diversity of analysis and opinion while promoting new research and facilitating the education of next generations of scientists. In this article, we present an initiative for the development of a repository containing continuous electrocardiographic information and the associated clinical information. This information is shared with the worldwide scientific community in order to improve quantitative electrocardiology and cardiac safety. First, we present the objectives of the initiative and its mission. Then, we describe the resources available in this initiative along three components: data, expertise, and tools. The data available in the Telemetric and Holter ECG Warehouse (THEW) include continuous ECG signals and associated clinical information. The initiative attracted various academic and private partners whose expertise covers a wide range of research areas related to quantitative electrocardiography; their contribution to the THEW promotes cross-fertilization of scientific knowledge, resources, and ideas that will advance the field of quantitative electrocardiography. Finally, the tools of the THEW include software and servers to access and review the data available in the repository. To conclude, the THEW is an initiative developed to benefit the scientific community and to advance the field of quantitative electrocardiography and cardiac safety. It is a new repository designed to complement existing ones such as Physionet, the AHA-BIH Arrhythmia Database, and the CSE database. The THEW hosts unique datasets from clinical trials and drug safety studies that, so far, were not available to the worldwide scientific community. PMID:20863512
Miller, C.; Waddell, K.; Tang, N.
2010-01-01
RP-122 Peptide quantitation using Multiple Reaction Monitoring (MRM) has been established as an important methodology for biomarker verification and validation. This requires high throughput combined with high sensitivity to analyze potentially thousands of target peptides in each sample. Dynamic MRM allows the system to acquire the required MRMs of a peptide only during a retention window corresponding to when that peptide is eluting. This reduces the number of concurrent MRMs and therefore improves quantitation and sensitivity. MRM Selector allows the user to generate an MRM transition list with retention time information from discovery data obtained on a QTOF MS system. This list can be directly imported into the triple quadrupole acquisition software. However, situations can exist where (a) the list contains more MRM transitions than allowable under the ideal acquisition conditions chosen (allowing for cycle time and chromatography conditions), or (b) too many transitions fall in a certain retention time region, which would result in an unacceptably low dwell time and cycle time. A new tool, MRM Viewer, has been developed to help users automatically generate multiple dynamic MRM methods from a single MRM list. In this study, a list of 3293 MRM transitions from a human plasma sample was compiled. A single dynamic MRM method with 3293 transitions results in a minimum dwell time of 2.18 ms. Using MRM Viewer, we can generate three dynamic MRM methods with a minimum dwell time of 20 ms, which gives better-quality MRM quantitation. This tool facilitates both high throughput and high sensitivity for MRM quantitation.
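The dwell-time arithmetic here is simple: within any retention-time region the cycle time is shared among all transitions whose windows overlap, so the worst-case dwell is the cycle time divided by the peak concurrency. The sketch below computes that number and greedily splits a transition list into several methods until a target minimum dwell is met; the cycle time, the synthetic windows, and the round-robin split are illustrative assumptions, not how MRM Viewer actually partitions.

```python
def min_dwell_ms(transitions, cycle_time_ms=500):
    """Worst-case dwell time of a dynamic MRM method: cycle time divided by
    the maximum number of transitions whose retention windows overlap."""
    events = sorted((t, d) for start, end in transitions
                    for t, d in ((start, 1), (end, -1)))
    worst = cur = 0
    for _, d in events:
        cur += d
        worst = max(worst, cur)
    return cycle_time_ms / max(worst, 1)

def split_methods(transitions, cycle_time_ms=500, min_dwell=20.0):
    """Greedy round-robin split of a transition list into several dynamic
    MRM methods until each method meets the target minimum dwell time."""
    n = 1
    while True:
        methods = [transitions[i::n] for i in range(n)]
        if all(min_dwell_ms(m, cycle_time_ms) >= min_dwell for m in methods):
            return methods
        n += 1

# Hypothetical retention windows (start, end) in seconds:
windows = [(100 + 0.5 * i, 160 + 0.5 * i) for i in range(200)]
print(len(split_methods(windows)), "methods needed")
```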
NASA Technical Reports Server (NTRS)
Hoebel, Louis J.
1993-01-01
The problem of plan generation (PG) and the problem of plan execution monitoring (PEM), including updating, queries, and resource-bounded replanning, have different reasoning and representation requirements. PEM requires the integration of qualitative and quantitative information. PEM is the receiving of data about the world in which a plan or agent is executing. The problem is to quickly determine the relevance of the data, the consistency of the data with respect to the expected effects, and whether execution should continue. Only spatial and temporal aspects of the plan are addressed for relevance in this work. Current temporal reasoning systems are deficient in computational aspects or expressiveness. This work presents a hybrid qualitative and quantitative system that is fully expressive in its assertion language while offering certain computational efficiencies. In order to proceed, methods incorporating approximate reasoning using hierarchies, notions of locality, constraint expansion, and absolute parameters need to be used and are shown to be useful for the anytime nature of PEM.
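As background for the quantitative half of such a hybrid reasoner, a standard building block is the simple temporal network, whose metric constraints can be checked for consistency with all-pairs shortest paths. The sketch below is that textbook check, offered as context rather than as this paper's actual system; the example constraints are hypothetical.

```python
INF = float("inf")

def stn_consistent(n, constraints):
    """Check a simple temporal network over n time points. Constraints are
    (i, j, lo, hi), meaning lo <= t_j - t_i <= hi. Floyd-Warshall tightens
    the distance graph; a negative diagonal entry means the quantitative
    constraints conflict (a negative cycle in the graph)."""
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j, lo, hi in constraints:
        d[i][j] = min(d[i][j], hi)    # t_j - t_i <= hi
        d[j][i] = min(d[j][i], -lo)   # t_i - t_j <= -lo
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return all(d[i][i] >= 0 for i in range(n))

# Plan step A starts 0-5 min after a reference, B starts 10-20 min after A,
# but telemetry says B began at most 8 min after the reference: inconsistent.
print(stn_consistent(3, [(0, 1, 0, 5), (1, 2, 10, 20), (0, 2, 0, 8)]))  # False
```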
Exploring generational cohort work satisfaction in hospital nurses.
Gordon, Pamela Ann
2017-07-03
Purpose: Although extensive research exists regarding job satisfaction, many previous studies used a more restrictive, quantitative methodology. The purpose of this qualitative study is to capture the perceptions of hospital nurses within generational cohorts regarding their work satisfaction. Design/methodology/approach: A preliminary qualitative, phenomenological study design explored hospital nurses' work satisfaction within generational cohorts: Baby Boomers (1946-1964), Generation X (1965-1980) and Millennials (1981-2000). A South Florida hospital provided the venue for the research. In all, 15 full-time staff nurses, segmented into generational cohorts, participated in personal interviews to determine themes related to seven established factors of work satisfaction: pay, autonomy, task requirements, administration, doctor-nurse relationship, interaction and professional status. Findings: An analysis of the transcribed interviews confirmed the importance of the seven factors of job satisfaction. Similarities and differences between the generational cohorts related to a combination of stages of life and generational attributes. Practical implications: The results of any qualitative research relate only to the specific venue studied and are not generalizable. However, the information gleaned from this study is transferable, and other organizations are encouraged to conduct their own research and compare the results. Originality/value: This study is unique, as the seven factors from an extensively used and highly respected quantitative research instrument were applied as the basis for this qualitative inquiry into generational cohort job satisfaction in a hospital setting.
Qualitative and quantitative analysis of solar hydrogen generation literature from 2001 to 2014.
Maghami, Mohammad Reza; Asl, Shahin Navabi; Rezadad, Mohammad Esmaeil; Ale Ebrahim, Nader; Gomes, Chandima
Solar hydrogen generation is one of the new topics in the field of renewable energy. Recently, the rate of investigation into hydrogen generation has grown dramatically in many countries. Many studies have been done on hydrogen generation from natural resources such as wind, solar, and coal. In this work we evaluated the global scientific production of solar hydrogen generation papers from 2001 to 2014 in all journals across the subject categories of the Science Citation Index compiled by the Institute for Scientific Information (ISI), Philadelphia, USA. "Solar hydrogen generation" was used as the keyword to search titles, abstracts, and keywords. The published output analysis showed that research on hydrogen generation from the sun steadily increased over the past 14 years, and the annual paper production in 2013 was about three times that of 2010. The number of papers considered in this research is 141, published from 2001 to 2014. There are clear distinctions among author keywords used in publications from the five most prolific countries, namely the USA, China, Australia, Germany and India, in solar hydrogen studies. Quantitative and qualitative analysis methods were used to evaluate the development of global scientific production in this specific research field. The analytical results provide several key findings and an overview of hydrogen production via solar generation.
ERIC Educational Resources Information Center
Tractenberg, Rochelle E.
2017-01-01
Statistical literacy is essential to an informed citizenry; and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards "big" data: while automated analyses can exploit massive amounts of data, the interpretation--and possibly more importantly, the replication--of results are…
The Science and Art of Grand Rond de Jambe en l'air: Applications to Teaching and Performance
ERIC Educational Resources Information Center
Wilson, Margaret
2008-01-01
Recommendations for the teaching and performance of grand rond de jambe en l'air are presented. Incorporating both quantitative and qualitative data, information generated from biomechanical analysis is balanced with observations from the testing, questionnaire data, and interview data. The voices of the participants in the research contribute to an…
A Primer for Handling Missing Values in the Analysis of Education and Training Data
ERIC Educational Resources Information Center
Gemici, Sinan; Bednarz, Alice; Lim, Patrick
2012-01-01
Quantitative research in vocational education and training (VET) is routinely affected by missing or incomplete information. However, the handling of missing data in published VET research is often sub-optimal, leading to a real risk of generating results that can range from being slightly biased to being plain wrong. Given that the growing…
ERIC Educational Resources Information Center
Ingleby, Ewan; Hedges, Clive
2012-01-01
This article is based on quantitative and qualitative data that have been generated since 2009 on the study skills needs of early years practitioners working in England. The research has identified that developing information technology skills appears to be a particular professional development need for these practitioners. The practitioners are…
Mining reflective continuing medical education data for family physician learning needs.
Lewis, Denice Colleen; Pluye, Pierre; Rodriguez, Charo; Grad, Roland
2016-04-06
A mixed methods study (sequential explanatory design) examined the potential of mining the data generated by the consumers of continuing medical education (CME) programs for the developers of CME programs. The quantitative data generated by family physicians, through applying the information assessment method to CME content, were presented to key informants from the CME planning community through a qualitative description study. The data were shown to have many potential applications, including supporting the creation of CME content, CME program planning, and personal learning portfolios.
Exploring pain pathophysiology in patients.
Sommer, Claudia
2016-11-04
Although animal models of pain have brought invaluable information on basic processes underlying pain pathophysiology, translation to humans is a problem. This Review summarizes what information has been gained by the direct study of patients with chronic pain. The techniques discussed range from patient phenotyping using quantitative sensory testing to specialized nociceptor neurophysiology, imaging methods of peripheral nociceptors, analyses of body fluids, genetics and epigenetics, and the generation of sensory neurons from patients via induced pluripotent stem cells.
Imaging mRNA In Vivo, from Birth to Death.
Tutucci, Evelina; Livingston, Nathan M; Singer, Robert H; Wu, Bin
2018-05-20
RNA is the fundamental information transfer system in the cell. The ability to follow single messenger RNAs (mRNAs) from transcription to degradation with fluorescent probes gives quantitative information about how the information is transferred from DNA to proteins. This review focuses on the latest technological developments in the field of single-mRNA detection and their usage to study gene expression in both fixed and live cells. By describing the application of these imaging tools, we follow the journey of mRNA from transcription to decay in single cells, with single-molecule resolution. We review current theoretical models for describing transcription and translation that were generated by single-molecule and single-cell studies. These methods provide a basis to study how single-molecule interactions generate phenotypes, fundamentally changing our understanding of gene expression regulation.
Quantitative multimodality imaging in cancer research and therapy.
Yankeelov, Thomas E; Abramson, Richard G; Quarles, C Chad
2014-11-01
Advances in hardware and software have enabled the realization of clinically feasible, quantitative multimodality imaging of tissue pathophysiology. Earlier efforts relating to multimodality imaging of cancer have focused on the integration of anatomical and functional characteristics, such as PET-CT and single-photon emission CT (SPECT-CT), whereas more-recent advances and applications have involved the integration of multiple quantitative, functional measurements (for example, multiple PET tracers, varied MRI contrast mechanisms, and PET-MRI), thereby providing a more-comprehensive characterization of the tumour phenotype. The enormous amount of complementary quantitative data generated by such studies is beginning to offer unique insights into opportunities to optimize care for individual patients. Although important technical optimization and improved biological interpretation of multimodality imaging findings are needed, this approach can already be applied informatively in clinical trials of cancer therapeutics using existing tools. These concepts are discussed herein.
Quantitative recurrence for free semigroup actions
NASA Astrophysics Data System (ADS)
Carvalho, Maria; Rodrigues, Fagner B.; Varandas, Paulo
2018-03-01
We consider finitely generated free semigroup actions on a compact metric space and obtain quantitative information on Poincaré recurrence, average first return time and hitting frequency for the random orbits induced by the semigroup action. In addition, we relate the recurrence to balls with the rates of expansion of the semigroup generators and with the topological entropy of the semigroup action. Finally, we establish a partial variational principle and prove an ergodic optimization result for this kind of dynamical action. MC has been financially supported by CMUP (UID/MAT/00144/2013), which is funded by FCT (Portugal) with national (MEC) and European structural funds (FEDER) under the partnership agreement PT2020. FR and PV were partially supported by BREUDS. PV has also benefited from a fellowship awarded by CNPq-Brazil and is grateful to the Faculty of Sciences of the University of Porto for the excellent research conditions.
Wu, Shu-lian; Li, Hui; Zhang, Xiao-man; Chen, Wei R; Wang, Yun-Xia
2014-01-01
Quantitative characterization of skin collagen during photo-thermal response and its regeneration process is an important but difficult task. In this study, the morphology and spectral characteristics of collagen during photo-thermal response and its light-induced remodeling process were obtained by second-harmonic generation microscopy in vivo. Texture features of the collagen, the orientation index and the fractal dimension, were extracted by image processing. The aim of this study was to detect the information hidden in skin texture during the process of photo-thermal response and its regeneration. Quantitative relations between injured collagen and the texture features were established for further analysis of the injury characteristics. Our results show that it is feasible to determine the main impacts of phototherapy on the skin, and that texture features are important for understanding the process of collagen remodeling after photo-thermal injury.
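The abstract does not spell out how the fractal dimension was computed; one standard estimate for a segmented (binary) collagen image is box counting. A hedged Python sketch, with the image, threshold, and box sizes all assumed:

```python
# Sketch (assumed approach): box-counting fractal dimension of a binarized
# collagen image. The paper's exact image-processing pipeline may differ.
import numpy as np

def box_count_dimension(binary_img, sizes=(2, 4, 8, 16, 32)):
    counts = []
    h, w = binary_img.shape
    for s in sizes:
        # count boxes of side s containing at least one foreground pixel
        n = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                if binary_img[i:i+s, j:j+s].any():
                    n += 1
        counts.append(n)
    # slope of log N(s) vs log(1/s) estimates the fractal dimension
    coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return coeffs[0]

img = np.random.rand(128, 128) > 0.7   # toy stand-in for a segmented image
print(box_count_dimension(img))
```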
Dual function microscope for quantitative DIC and birefringence imaging
NASA Astrophysics Data System (ADS)
Li, Chengshuai; Zhu, Yizheng
2016-03-01
A spectral multiplexing interferometry (SXI) method is presented for integrated birefringence and phase gradient measurement on label-free biological specimens. With SXI, the retardation and orientation of sample birefringence are simultaneously encoded onto two separate spectral carrier waves, generated by a crystal retarder oriented at a specific angle. Thus sufficient information for birefringence determination can be obtained from a single interference spectrum, eliminating the need for multiple acquisitions with mechanical rotation or electrical modulation. In addition, with the insertion of a Nomarski prism, the setup can then acquire quantitative differential interference contrast images. Red blood cells infected by malaria parasites are imaged for birefringence retardation as well as phase gradient. The results demonstrate that the SXI approach can achieve both quantitative phase imaging and birefringence imaging with a single, high-sensitivity system.
Quantitative metrics for evaluating the phased roll-out of clinical information systems.
Wong, David; Wu, Nicolas; Watkinson, Peter
2017-09-01
We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and judged acceptable. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
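The abstract does not give the optimisation routine; the Python sketch below illustrates the general idea under assumed data. Given a matrix of historical ward-to-ward transfer counts, it greedily picks the next ward to migrate so that flow from new-system wards into still-old wards stays small. The flow matrix, the greedy heuristic, and the function name are all hypothetical; the paper's actual optimisation over its graphical patient-flow model may differ.

```python
# Illustrative sketch, not the authors' code: order wards for roll-out by
# greedily minimising transfers from new-system wards to old-system wards.
import numpy as np

def greedy_rollout_order(flow):
    """flow[i][j] = historical patient transfers from ward i to ward j."""
    n = flow.shape[0]
    remaining = set(range(n))
    order = []
    while remaining:
        # pick the ward whose outflow into still-old wards is smallest
        best = min(remaining,
                   key=lambda w: sum(flow[w][j] for j in remaining if j != w))
        order.append(best)
        remaining.remove(best)
    return order

flow = np.array([[0, 5, 1],
                 [2, 0, 9],
                 [0, 1, 0]])
print(greedy_rollout_order(flow))  # -> [2, 1, 0] for this toy matrix
```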
Ana and the Internet: a review of pro-anorexia websites.
Norris, Mark L; Boydell, Katherine M; Pinhas, Leora; Katzman, Debra K
2006-09-01
The purpose of this article is to describe the content of pro-anorexia websites, both qualitatively and quantitatively. An Internet search protocol was developed to identify pro-anorexia websites. A grounded theory approach was used to generate themes from Internet-based information. Basic descriptive analysis was employed to report on key website characteristics. Twenty pro-anorexia websites met inclusion criteria. Saturation of themes was achieved after review of 12 websites. Key website characteristics included purpose of website (75%), information about webmaster (67%), website disclaimers (58%), and information on "tips and tricks" (67%). Religious metaphors, lifestyle descriptions, and "thinspiration" (inspirational photo galleries and quotes that aim to serve as motivators for weight loss) were frequently present. A total of 10 themes were generated. The most prevalent themes included control, success, and perfection. Health-care providers and caregivers should be aware of pro-anorexia websites and their content, as these websites contain information that promote and support anorexia nervosa. Copyright (c) 2006 by Wiley Periodicals, Inc.
Microscale bioprocess optimisation.
Micheletti, Martina; Lye, Gary J
2006-12-01
Microscale processing techniques offer the potential to speed up the delivery of new drugs to the market, reducing development costs and increasing patient benefit. These techniques have application across both the chemical and biopharmaceutical sectors. The approach involves the study of individual bioprocess operations at the microlitre scale using either microwell or microfluidic formats. In both cases the aim is to generate quantitative bioprocess information early on, so as to inform bioprocess design and speed translation to the manufacturing scale. Automation can enhance experimental throughput and will facilitate the parallel evaluation of competing biocatalyst and process options.
Influence of study goals on study design and execution.
Kirklin, J W; Blackstone, E H; Naftel, D C; Turner, M E
1997-12-01
From the viewpoint of a clinician who makes recommendations to patients about choosing from the multiple possible management schemes, quantitative information derived from statistical analyses of observational studies is useful. Although random assignment of therapy is optimal, appropriately performed studies in which therapy has been nonrandomly "assigned" are considered acceptable, albeit occasionally with limitations in inferences. The analyses are considered most useful when they generate multivariable equations suitable for predicting time-related outcomes in individual patients. Graphic presentations improve communication with patients and facilitate truly informed consent.
The use of semi-structured interviews for the characterisation of farmer irrigation practices
NASA Astrophysics Data System (ADS)
O'Keeffe, Jimmy; Buytaert, Wouter; Mijic, Ana; Brozović, Nicholas; Sinha, Rajiv
2016-05-01
For the development of sustainable and realistic water security, generating information on the behaviours, characteristics, and drivers of users, as well as on the resource itself, is essential. In this paper we present a methodology for collecting qualitative and quantitative data on water use practices through semi-structured interviews. This approach facilitates the collection of detailed information on actors' decisions in a convenient and cost-effective manner. Semi-structured interviews are organised around a topic guide, which helps lead the conversation in a standardised way while allowing sufficient opportunity for relevant issues to emerge. In addition, they can be used to obtain certain types of quantitative data. While not as accurate as direct measurements, they can provide useful information on local practices and users' insights. We present an application of the methodology on farmer water use in two districts in the state of Uttar Pradesh in northern India. By means of 100 farmer interviews, information was collected on various aspects of irrigation practices, including irrigation water volumes, irrigation cost, water source, and their spatial variability. Statistical analyses of the information, along with data visualisation, are also presented, indicating a significant variation in irrigation practices both within and between districts. Our application shows that semi-structured interviews are an effective and efficient method of collecting both qualitative and quantitative information for the assessment of drivers, behaviours, and their outcomes in a data-scarce region. The collection of this type of data could significantly improve insights on water resources, leading to more realistic management options and increased water security in the future.
The use of semi-structured interviews for the characterisation of farmer irrigation practices
NASA Astrophysics Data System (ADS)
O'Keeffe, J.; Buytaert, W.; Mijic, A.; Brozovic, N.; Sinha, R.
2015-08-01
Generating information on the behaviours, characteristics and drivers of users, as well as on the resource itself, is vital in developing sustainable and realistic water security options. In this paper we present a methodology for collecting qualitative and quantitative data on water use practices through semi-structured interviews. This approach facilitates the collection of detailed information on actors' decisions in a convenient and cost-effective manner. The interview is organised around a topic guide, which helps lead the conversation in a standardised way while allowing sufficient opportunity to identify relevant issues previously unknown to the researcher. In addition, semi-structured interviews can be used to obtain certain types of quantitative data. While not as accurate as direct measurements, they can provide useful information on local practices and farmers' insights. We present an application of the methodology in two districts in the State of Uttar Pradesh in North India. By means of 100 farmer interviews, information was collected on various aspects of irrigation practices, including irrigation water volumes, irrigation cost, water source and their spatial variability. A statistical analysis of the information, along with some data visualisation, is also presented, which highlights a significant variation in irrigation practices both within and between the districts. Our application shows that semi-structured interviews are an effective and efficient method of collecting both qualitative and quantitative information for the assessment of drivers, behaviours and their outcomes in a data-scarce region. The collection of this type of data could significantly improve insights into water resources, leading to more realistic management options and increased water security in the future.
The memory remains: Understanding collective memory in the digital age
García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha
2017-01-01
Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects. PMID:28435881
Development of efficient test methods that can generate reliable data to inform risk assessment is an on-going challenge in the field of ecotoxicology. In the present study we evaluated whether a 96 h in vivo assay focused on a small number of quantitative real-time polymerase ch...
The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next-generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Act...
Croft, Nathan P.; de Verteuil, Danielle A.; Smith, Stewart A.; Wong, Yik Chun; Schittenhelm, Ralf B.; Tscharke, David C.; Purcell, Anthony W.
2015-01-01
The generation of antigen-specific reagents is a significant bottleneck in the study of complex pathogens that express many hundreds to thousands of different proteins, or of emerging strains of viruses that display pandemic potential and therefore require rapid investigation. In these instances the development of antibodies, for example, can be prohibitively expensive to cover the full pathogen proteome, or the lead time may be unacceptably long in urgent cases where new highly pathogenic viral strains emerge. Because genomic information on such pathogens can be rapidly acquired, this opens up avenues for using mass spectrometric approaches to study pathogen antigen expression and host responses, and for screening the utility of therapeutics. In particular, data-independent acquisition (DIA) modalities on high-resolution mass spectrometers generate spectral information on all components of a complex sample, providing depth of coverage hitherto only seen in genomic deep sequencing. The spectral information generated by DIA can be iteratively interrogated for potentially any protein of interest, providing both evidence of protein expression and quantitation. Here we apply a solely DIA mass spectrometry based methodology to profile the viral antigen expression in cells infected with vaccinia virus up to 9 h post infection, without the need for antigen-specific antibodies or other reagents. We demonstrate deep coverage of the vaccinia virus proteome using a SWATH-MS acquisition approach, extracting quantitative kinetics of 100 virus proteins within a single experiment. The results highlight the complexity of vaccinia protein expression, complementing what is known at the transcriptomic level, and provide a valuable resource and technique for future studies of viral infection and replication kinetics. Furthermore, they highlight the utility of DIA and mass spectrometry in the dissection of host-pathogen interactions. PMID:25755296
Dong, Yang; Qi, Ji; He, Honghui; He, Chao; Liu, Shaoxiong; Wu, Jian; Elson, Daniel S; Ma, Hui
2017-08-01
Polarization imaging has been recognized as a potentially powerful technique for probing the microstructural information and optical properties of complex biological specimens. Recently, we have reported a Mueller matrix microscope by adding the polarization state generator and analyzer (PSG and PSA) to a commercial transmission-light microscope, and applied it to differentiate human liver and cervical cancerous tissues with fibrosis. In this paper, we apply the Mueller matrix microscope for quantitative detection of human breast ductal carcinoma samples at different stages. The Mueller matrix polar decomposition and transformation parameters of the breast ductal tissues in different regions and at different stages are calculated and analyzed. For more quantitative comparisons, several widely-used image texture feature parameters are also calculated to characterize the difference in the polarimetric images. The experimental results indicate that the Mueller matrix microscope and the polarization parameters can facilitate the quantitative detection of breast ductal carcinoma tissues at different stages.
Data from quantitative label free proteomics analysis of rat spleen.
Dudekula, Khadar; Le Bihan, Thierry
2016-09-01
The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The generated lists of proteins under the different fractionation regimes allow the nature of the identified proteins to be assessed, along with the variability in the quantitative analysis associated with the different sampling strategies, and allow a proper number of replicates to be defined for future quantitative analyses.
Weusten, Jos J A M; Carpay, Wim M; Oosterlaken, Tom A M; van Zuijlen, Martien C A; van de Wiel, Paul A
2002-03-15
For quantitative NASBA-based viral load assays using homogeneous detection with molecular beacons, such as the NucliSens EasyQ HIV-1 assay, a quantitation algorithm is required. During the amplification process there is a constant growth in the concentration of amplicons to which the beacon can bind while generating a fluorescence signal. The overall fluorescence curve contains kinetic information on both amplicon formation and beacon binding, but only the former is relevant for quantitation. In the current paper, mathematical modeling of the relevant processes is used to develop an equation describing the fluorescence curve as a function of the amplification time and the relevant kinetic parameters. This equation allows reconstruction of RNA formation, which is characterized by an exponential increase in concentrations as long as the primer concentrations are not rate limiting and by linear growth over time after the primer pool is depleted. During the linear growth phase, the actual quantitation is based on assessing the amplicon formation rate from the viral RNA relative to that from a fixed amount of calibrator RNA. The quantitation procedure has been successfully applied in the NucliSens EasyQ HIV-1 assay.
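As a rough illustration of the curve shape described (not the paper's full kinetic model, which also accounts for beacon-binding kinetics), amplicon concentration can be modelled as exponential until primer depletion and linear afterwards; quantitation then compares linear-phase formation rates of the target and the calibrator. All parameter names and values in this Python sketch are assumptions:

```python
# Hedged sketch of the described curve shape: exponential amplicon growth
# until primer depletion at t_d, linear growth afterwards.
import math

def amplicon(t, a0=1.0, r=0.8, t_d=10.0):
    if t <= t_d:
        return a0 * math.exp(r * t)
    # linear phase: growth continues at the rate reached at t_d
    slope = a0 * r * math.exp(r * t_d)
    return a0 * math.exp(r * t_d) + slope * (t - t_d)

# quantitation idea: ratio of linear-phase formation rates, target/calibrator
slope_target = amplicon(12.0) - amplicon(11.0)
slope_calib = 0.5 * slope_target           # toy calibrator value
print(slope_target / slope_calib)          # relative viral RNA amount
```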
Quantitation of spatially-localized proteins in tissue samples using MALDI-MRM imaging.
Clemis, Elizabeth J; Smith, Derek S; Camenzind, Alexander G; Danell, Ryan M; Parker, Carol E; Borchers, Christoph H
2012-04-17
MALDI imaging allows the creation of a "molecular image" of a tissue slice. This image is reconstructed from the ion abundances in spectra obtained while rastering the laser over the tissue. These images can then be correlated with tissue histology to detect potential biomarkers of, for example, aberrant cell types. MALDI, however, is known to have problems with ion suppression, making it difficult to correlate measured ion abundance with concentration. It would be advantageous to have a method which could provide more accurate protein concentration measurements, particularly for screening applications or for precise comparisons between samples. In this paper, we report the development of a novel MALDI imaging method for the localization and accurate quantitation of proteins in tissues. This method involves optimization of in situ tryptic digestion, followed by reproducible and uniform deposition of an isotopically labeled standard peptide from a target protein onto the tissue, using an aerosol-generating device. Data are acquired by MALDI multiple reaction monitoring (MRM) mass spectrometry (MS), and accurate peptide quantitation is determined from the ratio of MRM transitions for the endogenous unlabeled proteolytic peptides to the corresponding transitions from the applied isotopically labeled standard peptides. In a parallel experiment, the quantity of the labeled peptide applied to the tissue was determined using a standard curve generated from MALDI time-of-flight (TOF) MS data. This external calibration curve was then used to determine the quantity of endogenous peptide in a given area. All standard curves generated by this method had coefficients of determination greater than 0.97. These proof-of-concept experiments using MALDI MRM-based imaging show the feasibility of precise and accurate quantitation of tissue protein concentrations over 2 orders of magnitude, while maintaining the spatial localization information for the proteins.
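The quantitation arithmetic described, the endogenous amount from the ratio of unlabeled-to-labeled MRM transition areas scaled by the amount of applied standard, reduces to a small calculation. This Python sketch uses made-up numbers and a hypothetical function name:

```python
# Minimal sketch of the quantitation arithmetic (illustrative values):
# endogenous amount = (light/heavy transition-area ratio) x amount of
# labeled standard applied per area, where the applied amount itself comes
# from an external TOF calibration curve.
def endogenous_amount(light_area, heavy_area, standard_fmol_per_area):
    ratio = light_area / heavy_area            # MRM transition ratio
    return ratio * standard_fmol_per_area      # fmol of endogenous peptide

print(endogenous_amount(light_area=8.4e4, heavy_area=2.1e4,
                        standard_fmol_per_area=50.0))  # -> 200.0 fmol
```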
PTMScout, a Web Resource for Analysis of High Throughput Post-translational Proteomics Studies*
Naegle, Kristen M.; Gymrek, Melissa; Joughin, Brian A.; Wagner, Joel P.; Welsch, Roy E.; Yaffe, Michael B.; Lauffenburger, Douglas A.; White, Forest M.
2010-01-01
The rate of discovery of post-translational modification (PTM) sites is increasing rapidly and is significantly outpacing our biological understanding of the function and regulation of those modifications. To help meet this challenge, we have created PTMScout, a web-based interface for viewing, manipulating, and analyzing high throughput experimental measurements of PTMs in an effort to facilitate biological understanding of protein modifications in signaling networks. PTMScout is constructed around a custom database of PTM experiments and contains information from external protein and post-translational resources, including gene ontology annotations, Pfam domains, and Scansite predictions of kinase and phosphopeptide binding domain interactions. PTMScout functionality comprises data set comparison tools, data set summary views, and tools for protein assignments of peptides identified by mass spectrometry. Analysis tools in PTMScout focus on informed subset selection via common criteria and on automated hypothesis generation through subset labeling derived from identification of statistically significant enrichment of other annotations in the experiment. Subset selection can be applied through the PTMScout flexible query interface available for quantitative data measurements and data annotations as well as an interface for importing data set groupings by external means, such as unsupervised learning. We exemplify the various functions of PTMScout in application to data sets that contain relative quantitative measurements as well as data sets lacking quantitative measurements, producing a set of interesting biological hypotheses. PTMScout is designed to be a widely accessible tool, enabling generation of multiple types of biological hypotheses from high throughput PTM experiments and advancing functional assignment of novel PTM sites. PTMScout is available at http://ptmscout.mit.edu. PMID:20631208
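The "statistically significant enrichment of other annotations" that drives automated subset labeling is, in tools of this kind, typically assessed with a hypergeometric test; PTMScout's exact statistics may differ. A minimal sketch, assuming a subset of peptides tested against a background proteome:

```python
# Assumed illustration of annotation-enrichment testing: over-representation
# P value from the hypergeometric distribution.
from scipy.stats import hypergeom

def enrichment_p(n_annotated_in_subset, subset_size,
                 n_annotated_total, background_size):
    # P(X >= observed) when drawing subset_size items without replacement
    return hypergeom.sf(n_annotated_in_subset - 1, background_size,
                        n_annotated_total, subset_size)

# e.g. 12 of 40 subset peptides carry a Pfam domain found on 100 of 2000 sites
print(enrichment_p(12, 40, 100, 2000))
```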
SDAR 1.0: a New Quantitative Toolkit for Analyzing Stratigraphic Data
NASA Astrophysics Data System (ADS)
Ortiz, John; Moreno, Carlos; Cardenas, Andres; Jaramillo, Carlos
2015-04-01
Since the foundation of stratigraphy, geoscientists have recognized that data obtained from stratigraphic columns (SCs), two-dimensional schemes recording descriptions of both geological and paleontological features (e.g., thickness of rock packages, grain size, fossil and lithological components, and sedimentary structures), are key elements for establishing reliable hypotheses about the distribution in space and time of rock sequences, and about ancient sedimentary environmental and paleobiological dynamics. Despite the tremendous advances in the way geoscientists store, plot, and quantitatively analyze sedimentological and paleontological data (e.g., Macrostrat [http://www.macrostrat.org/] and the Paleobiology Database [http://www.paleodb.org/], respectively), there is still a lack of computational methodologies designed to quantitatively examine data from highly detailed SCs. Moreover, stratigraphic information is frequently plotted "manually" using vector graphics editors (e.g., Corel Draw, Illustrator); although stored in a digital format, this information cannot readily be used for any quantitative analysis, so any attempt to examine the stratigraphic data in an analytical fashion necessarily takes further steps. Given these issues, we have developed the software 'Stratigraphic Data Analysis in R' (SDAR), which stores in a database all sedimentological, stratigraphic, and paleontological information collected from an SC, allowing users to generate high-quality graphic plots (including one or multiple features stored in the database). SDAR also encompasses quantitative analyses helping users to quantify stratigraphic information (e.g., grain size, sorting and rounding, proportion of sand/shale). Finally, given that the SDAR analysis module has been written in the open-source, high-level language R [R Development Core Team, 2014], it is already loaded with many of the crucial features required to accomplish basic and complex tasks of statistical analysis (the R language provides more than a hundred spatial libraries that allow users to explore various geostatistical and spatial analyses). Consequently, SDAR allows a deeper exploration of the stratigraphic data collected in the field, and will allow the geoscientific community in the near future to develop complex analyses related to the distribution in space and time of rock sequences, such as lithofacies correlations, by multivariate comparison between empirical SCs and quantitative lithofacies models established from modern sedimentary environments.
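SDAR itself is an R package; purely as an illustration of the kind of quantification mentioned (proportion of sand/shale), here is a Python sketch over a hypothetical bed table of (thickness, lithology) pairs:

```python
# Illustrative sketch of one SDAR-style quantification: lithology
# proportions of a stratigraphic column stored as (thickness, lithology)
# beds. The column data below are invented.
def lithology_proportions(beds):
    total = sum(th for th, _ in beds)
    props = {}
    for th, lith in beds:
        props[lith] = props.get(lith, 0.0) + th / total
    return props

column = [(2.5, "sandstone"), (0.8, "shale"), (1.2, "sandstone"),
          (3.0, "shale")]
print(lithology_proportions(column))
# -> {'sandstone': 0.493..., 'shale': 0.506...}
```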
Molenaar, Remco J; Khurshed, Mohammed; Hira, Vashendriya V V; Van Noorden, Cornelis J F
2018-05-26
Altered cellular metabolism is a hallmark of many diseases, including cancer, cardiovascular diseases and infection. The metabolic motor units of cells are enzymes, and their activity is heavily regulated at many levels, including the transcriptional, mRNA stability, translational, post-translational and functional level. This complex regulation means that conventional quantitative or imaging assays, such as quantitative mRNA experiments, Western blots and immunohistochemistry, yield incomplete information regarding the ultimate activity of enzymes, their function and/or their subcellular localization. Quantitative enzyme cytochemistry and histochemistry (i.e., metabolic mapping) show in-depth information on in situ enzymatic activity and its kinetics, function and subcellular localization in an almost true-to-nature situation. We describe a protocol to detect the activity of dehydrogenases, which are enzymes that perform redox reactions to reduce cofactors such as NAD(P)+ and FAD. Cells and tissue sections are incubated in a medium that is specific for the enzymatic activity of one dehydrogenase. Subsequently, the dehydrogenase that is the subject of investigation performs its enzymatic activity in its subcellular site. In a chemical reaction with the reaction medium, this ultimately generates blue-colored formazan at the site of the dehydrogenase's activity. The formazan's absorbance is therefore a direct measure of the dehydrogenase's activity and can be quantified using monochromatic light microscopy and image analysis. The quantitative aspect of this protocol enables researchers to draw statistical conclusions from these assays. Besides observational studies, this technique can be used for inhibition studies of specific enzymes. In this context, studies benefit from the true-to-nature advantages of metabolic mapping, giving in situ results that may be physiologically more relevant than in vitro enzyme inhibition studies. In all, metabolic mapping is an indispensable technique to study metabolism at the cellular or tissue level. The technique is easy to adopt, provides in-depth, comprehensive and integrated metabolic information and enables rapid quantitative analysis.
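The final quantification step, converting monochromatic transmission images of formazan into absorbance and averaging over a region of interest, is standard Beer-Lambert image analysis. A minimal Python sketch, with all image values and the ROI assumed (the published protocol's exact procedure may differ):

```python
# Sketch of absorbance quantification from a monochromatic image and a
# blank reference, averaged over a region of interest (ROI).
import numpy as np

def mean_absorbance(image, blank, roi):
    """image/blank: 2-D intensity arrays; roi: boolean mask, same shape."""
    transmittance = image[roi] / blank[roi]
    absorbance = -np.log10(np.clip(transmittance, 1e-6, None))
    return absorbance.mean()

blank = np.full((64, 64), 250.0)
image = np.full((64, 64), 130.0)            # darker where formazan formed
roi = np.zeros((64, 64), dtype=bool)
roi[20:40, 20:40] = True
print(mean_absorbance(image, blank, roi))   # ~0.28
```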
Cognitive functioning in patients with affective disorders and schizophrenia: a meta-analysis.
Stefanopoulou, Evgenia; Manoharan, Andiappan; Landau, Sabine; Geddes, John R; Goodwin, Guy; Frangou, Sophia
2009-01-01
There is considerable evidence for cognitive dysfunction in schizophrenia and affective disorders, but the pattern of potential similarities or differences between diagnostic groups remains uncertain. The objective of this study was to conduct a quantitative review of studies on cognitive performance in schizophrenia and affective disorders. Relevant articles were identified through a literature search in major databases for the period between January 1980 and December 2005. Meta-analytic treatment of the original studies revealed widespread cognitive deficits in patients with schizophrenia and affective disorders in intellectual ability and speed of information processing, in encoding and retrieval, in rule discovery, and in response generation and response inhibition. Differences between diagnostic groups were quantitative rather than qualitative.
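The abstract does not state the pooling model; a generic inverse-variance fixed-effect combination of standardized mean differences, shown below purely as an assumed illustration in Python, is the simplest version of such meta-analytic treatment:

```python
# Generic fixed-effect meta-analysis sketch (inverse-variance weighting);
# the study's actual software and model choices are not specified.
def pooled_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return est, se

# toy standardized mean differences (e.g., Cohen's d) from three studies
print(pooled_effect([0.8, 1.1, 0.6], [0.04, 0.09, 0.02]))
```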
NASA Astrophysics Data System (ADS)
Quatela, Alessia; Gilmore, Adam M.; Steege Gall, Karen E.; Sandros, Marinella; Csatorday, Karoly; Siemiarczuk, Alex; Yang, Boqian (Ben); Camenen, Loïc
2018-04-01
We investigate the new simultaneous absorbance-transmission and fluorescence excitation-emission matrix method for rapid and effective characterization of the varying components of a mixture. The method uniquely facilitates correction of fluorescence inner-filter effects to yield quantitative fluorescence spectral information that is largely independent of component concentration. This is significant because it allows one to effectively monitor quantitative component changes using multivariate methods and to generate and evaluate spectral libraries. We present the use of this novel instrument in different fields, i.e., tracking changes in complex mixtures including natural water and wine, as well as monitoring the stability and aggregation of hormones for biotherapeutics.
NASA Astrophysics Data System (ADS)
Singh, Vijay Raj; Yaqoob, Zahid; So, Peter T. C.
2017-02-01
Quantitative phase microscopy (QPM) techniques developed so far belong primarily to high-speed transmitted-light systems that have enough sensitivity to resolve membrane fluctuations and dynamics, but no depth resolution. Therefore, most biomechanics studies using QPM today are confined to simple cells, such as RBCs, without internal organelles. An important step toward greatly extending the biomedical applications of QPM is to develop a next-generation microscope with 3D capability and sufficient temporal resolution to study the biomechanics of complex eukaryotic cells, including the mechanics of their internal compartments. For eukaryotic cells, the depth-sectioning capability is critical and should be sufficient to distinguish nuclear membrane fluctuations from plasma membrane fluctuations. Further, this microscope must provide high temporal resolution, since typical eukaryotic membranes are substantially stiffer than RBCs. A confocal reflectance quantitative phase microscope is presented, based on multi-pinhole scanning, with higher temporal resolution and sensitivity for the nuclear and plasma membranes of eukaryotic cells. The system hardware is built around an array of confocal pinholes generated by using the 'ON' state of a subset of micro-mirrors of a digital micro-mirror device (DMD, from Texas Instruments); high-speed raster scanning provides 14 ms imaging speed in wide-field mode. A common-path interferometer is integrated at the imaging arm for detection of specimens' quantitative phase information. A theoretical investigation of the quantitative phase reconstructed from the system is presented, and the system is applied to measurements of dimensional fluctuations of both the plasma and nuclear membranes of embryonic stem cells.
2017-01-01
Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). Conclusions This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. PMID:28292738
2017-12-01
sessions were correlated quantitatively by the web-based survey, identifying the need to update eSUPPO with specific icons such as Innovation...focus groups and surveys, assesses how well the mobile app meets the needs of the Supply Corps community. The analysis begins by understanding the...the app. After a complete analysis of eSUPPO's current "As-Is" processes, the study takes the information gathered from both the survey and focus
Thiyl radicals and induction of protein degradation
Schöneich, Christian
2016-01-01
Thiyl radicals are important intermediates in the redox biology and chemistry of thiols. These radicals can react via hydrogen transfer with various C-H bonds in peptides and proteins, leading to the generation of carbon-centered radicals, and, potentially, to irreversible protein damage. This review summarizes quantitative information on reaction kinetics and product formation, and discusses the significance of these reactions for protein degradation induced by thiyl radical formation. PMID:26212409
Dynamic Decision Making under Uncertainty and Partial Information
2013-11-14
integral under the natural filtration generated by the Brownian motions . This compact expression potentially enables us to design sub- optimal penalties...bounds on bermudan option price under jump diffusion processes. Quantitative Finance , 2013. Under review, available at http://arxiv.org/abs/1305.4321... Finance , 19:53 – 71, 2009. [3] D.P. Bertsekas. Dynamic Programming and Optimal Control. Athena Scientific, 4th edition, 2012. [4] D.B. Brown and J.E
Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane
2016-09-01
Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer patients results in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated in a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Using mixed methods to identify and answer clinically relevant research questions.
Shneerson, Catherine L; Gale, Nicola K
2015-06-01
The need for mixed methods research in answering health care questions is becoming increasingly recognized because of the complexity of factors that affect health outcomes. In this article, we argue for the value of using a qualitatively driven mixed method approach for identifying and answering clinically relevant research questions. This argument is illustrated by findings from a study on the self-management practices of cancer survivors and the exploration of one particular clinically relevant finding about higher uptake of self-management in cancer survivors who had received chemotherapy treatment compared with those who have not. A cross-sectional study generated findings that formed the basis for the qualitative study, by informing the purposive sampling strategy and generating new qualitative research questions. Using a quantitative research component to supplement a qualitative study can enhance the generalizability and clinical relevance of the findings and produce detailed, contextualized, and rich answers to research questions that would be unachievable through quantitative or qualitative methods alone. © The Author(s) 2015.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beleggia, M.; Helmholtz-Zentrum Berlin für Materialien und Energie, Berlin; Kasama, T.
We apply off-axis electron holography and Lorentz microscopy in the transmission electron microscope to map the electric field generated by a sharp biased metallic tip. A combination of experimental data and modelling provides quantitative information about the potential and the field around the tip. Close to the tip apex, we measure a maximum field intensity of 82 MV/m, corresponding to a field k factor of 2.5, in excellent agreement with theory. In order to verify the validity of the measurements, we use the inferred charge density distribution in the tip region to generate simulated phase maps and Fresnel (out-of-focus) images for comparison with experimental measurements. While the overall agreement is excellent, the simulations also highlight the presence of an unexpected astigmatic contribution to the intensity in a highly defocused Fresnel image, which is thought to result from the geometry of the applied field.
NASA Astrophysics Data System (ADS)
Kowligy, Abijith S.; Lind, Alex; Hickstein, Daniel D.; Carlson, David R.; Timmers, Henry; Nader, Nima; Cruz, Flavio C.; Ycas, Gabriel; Papp, Scott B.; Diddams, Scott A.
2018-04-01
We experimentally demonstrate a simple configuration for mid-infrared (MIR) frequency comb generation in quasi-phase-matched lithium niobate waveguides using the cascaded-χ(2) nonlinearity. With nanojoule-scale pulses from an Er:fiber laser, we observe octave-spanning supercontinuum in the near-infrared with dispersive-wave generation in the 2.5-3 μm region and intra-pulse difference-frequency generation in the 4-5 μm region. By engineering the quasi-phase-matched grating profiles, tunable, narrow-band MIR and broadband MIR spectra are both observed in this geometry. Finally, we perform numerical modeling using a nonlinear envelope equation, which shows good quantitative agreement with the experiment and can be used to inform waveguide designs to tailor the MIR frequency combs. Our results identify a path to a simple single-branch approach to mid-infrared frequency comb generation in a compact platform using commercial Er:fiber technology.
Kowligy, Abijith S; Lind, Alex; Hickstein, Daniel D; Carlson, David R; Timmers, Henry; Nader, Nima; Cruz, Flavio C; Ycas, Gabriel; Papp, Scott B; Diddams, Scott A
2018-04-15
We experimentally demonstrate a simple configuration for mid-infrared (MIR) frequency comb generation in quasi-phase-matched lithium niobate waveguides using the cascaded-χ (2) nonlinearity. With nanojoule-scale pulses from an Er:fiber laser, we observe octave-spanning supercontinuum in the near-infrared with dispersive wave generation in the 2.5-3 μm region and intrapulse difference frequency generation in the 4-5 μm region. By engineering the quasi-phase-matched grating profiles, tunable, narrowband MIR and broadband MIR spectra are both observed in this geometry. Finally, we perform numerical modeling using a nonlinear envelope equation, which shows good quantitative agreement with the experiment-and can be used to inform waveguide designs to tailor the MIR frequency combs. Our results identify a path to a simple single-branch approach to mid-infrared frequency comb generation in a compact platform using commercial Er:fiber technology.
NASA Astrophysics Data System (ADS)
Li, Xuan; Liu, Zhiping; Jiang, Xiaoli; Lodewijks, Gabriel
2018-01-01
Eddy current pulsed thermography (ECPT) is well established for non-destructive testing of electrically conductive materials, featuring the advantages of contactless operation, intuitive detection and efficient heating. The concept of divergence characterization of the damage rate of carbon fibre-reinforced plastic (CFRP)-steel structures can be extended to ECPT thermal pattern characterization. It was found in this study that the use of ECPT technology on CFRP-steel structures generated a sizeable amount of valuable information for comprehensive material diagnostics. The relationship between divergence and transient thermal patterns can be identified and analysed by deploying mathematical models to analyse information about fibre texture, such as orientations, gaps and undulations, in these multi-layered materials. The developed algorithm enabled the removal of information about fibre texture and the extraction of damage features. A model of the CFRP-glue-steel structures with damage was established using COMSOL Multiphysics® software, and quantitative non-destructive damage evaluation of the ECPT image areas was derived. The results of this proposed method illustrate that damaged areas are strongly affected by the available information about fibre texture. The proposed approach can be applied to the detection of impact-induced damage and the quantitative evaluation of CFRP structures.
Kessler, Larry G; Barnhart, Huiman X; Buckler, Andrew J; Choudhury, Kingshuk Roy; Kondratovich, Marina V; Toledano, Alicia; Guimaraes, Alexander R; Filice, Ross; Zhang, Zheng; Sullivan, Daniel C
2015-02-01
The development and implementation of quantitative imaging biomarkers has been hampered by the inconsistent and often incorrect use of terminology related to these markers. Sponsored by the Radiological Society of North America, an interdisciplinary group of radiologists, statisticians, physicists, and other researchers worked to develop a comprehensive terminology to serve as a foundation for quantitative imaging biomarker claims. Where possible, this working group adapted existing definitions derived from national or international standards bodies rather than invent new definitions for these terms. This terminology also serves as a foundation for the design of studies that evaluate the technical performance of quantitative imaging biomarkers and for studies of algorithms that generate the quantitative imaging biomarkers from clinical scans. This paper provides examples of research studies and quantitative imaging biomarker claims that use terminology consistent with these definitions as well as examples of the rampant confusion in this emerging field. We provide recommendations for appropriate use of quantitative imaging biomarker terminological concepts. It is hoped that this document will assist researchers and regulatory reviewers who examine quantitative imaging biomarkers and will also inform regulatory guidance. More consistent and correct use of terminology could advance regulatory science, improve clinical research, and provide better care for patients who undergo imaging studies. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Tucker, George; Loh, Po-Ru; Berger, Bonnie
2013-10-04
Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
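As an assumed illustration of the sampling framework (the actual count-to-probability scheme is left open in the abstract, which names such schemes as a future direction), one can map spectral counts to detection probabilities, sample an ensemble of binary interaction matrices, and average the scores an existing binary-data method assigns to each sample. Everything in this Python sketch, the saturating map, its parameter lam, and the toy scorer, is hypothetical:

```python
# Sketch of the sampling idea: counts -> probabilities -> ensemble of
# binary interaction matrices -> averaged downstream scores.
import numpy as np

rng = np.random.default_rng(0)

def count_to_prob(counts, lam=2.0):
    # saturating map: low counts -> uncertain, high counts -> near-certain
    return 1.0 - np.exp(-np.asarray(counts, float) / lam)

def ensemble_scores(counts, binary_scorer, n_samples=100):
    p = count_to_prob(counts)
    scores = [binary_scorer(rng.random(p.shape) < p)
              for _ in range(n_samples)]
    return np.mean(scores, axis=0)

counts = np.array([[0, 1, 6], [1, 0, 2], [6, 2, 0]])
toy_scorer = lambda adj: adj.sum(axis=1)   # stand-in for a real PPI method
print(ensemble_scores(counts, toy_scorer))
```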
Johnson, Rachel C.; Windell, Sean; Brandes, Patricia L.; Conrad, J. Louise; Ferguson, John; Goertler, Pascale A. L.; Harvey, Brett N.; Heublein, Joseph; Israel, Joshua A.; Kratville, Daniel W.; Kirsch, Joseph E.; Perry, Russell W.; Pisciotto, Joseph; Poytress, William R.; Reece, Kevin; Swart, Brycen G.
2017-01-01
A robust monitoring network that provides quantitative information about the status of imperiled species at key life stages and geographic locations over time is fundamental for sustainable management of fisheries resources. For anadromous species, management actions in one geographic domain can substantially affect abundance of subsequent life stages that span broad geographic regions. Quantitative metrics (e.g., abundance, movement, survival, life history diversity, and condition) at multiple life stages are needed to inform how management actions (e.g., hatcheries, harvest, hydrology, and habitat restoration) influence salmon population dynamics. The existing monitoring network for endangered Sacramento River winter-run Chinook Salmon (SRWRC, Oncorhynchus tshawytscha) in California's Central Valley was compared to conceptual models developed for each life stage and geographic region of the life cycle to identify relevant SRWRC metrics. We concluded that the current monitoring network was insufficient to diagnose when (life stage) and where (geographic domain) chronic or episodic reductions in SRWRC cohorts occur, precluding within- and among-year comparisons. The strongest quantitative data exist in the Upper Sacramento River, where abundance estimates are generated for adult spawners and emigrating juveniles. However, once SRWRC leave the upper river, our knowledge of their identity, abundance, and condition diminishes, despite the juvenile monitoring enterprise. We identified six system-wide recommended actions to strengthen the value of data generated from the existing monitoring network to assess resource management actions: (1) incorporate genetic run identification; (2) develop juvenile abundance estimates; (3) collect data for life history diversity metrics at multiple life stages; (4) expand and enhance real-time fish survival and movement monitoring; (5) collect fish condition data; and (6) provide timely public access to monitoring data in open data formats. To illustrate how updated technologies can enhance the existing monitoring to provide quantitative data on SRWRC, we provide examples of how each recommendation can address specific management issues.
Automatic registration of ICG images using mutual information and perfusion analysis
NASA Astrophysics Data System (ADS)
Kim, Namkug; Seo, Jong-Mo; Lee, June-goo; Kim, Jong Hyo; Park, Kwangsuk; Yu, Hyeong-Gon; Yu, Young Suk; Chung, Hum
2005-04-01
Introduction: Indocyanine green fundus angiography (ICGA) of the eye is a useful method for detecting and characterizing choroidal neovascularization (CNV), the major cause of blindness in people over 65 years of age. To investigate the quantitative analysis of blood flow on ICGA, a systematic approach for automatic registration using mutual information, together with a quantitative analysis, was developed. Methods: Intermittent sequential indocyanine green angiography images were acquired by a Heidelberg retinal angiograph, which uses a laser scanning system for image acquisition. Misalignment of each image, generated by minute eye movements of the patients, was corrected by the mutual information method, because the distribution of the contrast media in the images changes throughout the time sequence. Several regions of interest (ROIs) were selected by a physician, and the intensities of the selected regions were plotted against the time sequence. Results: The registration of ICGA time-sequential images required not only a translational but also a rotational transform. Signal intensities varied according to a gamma-variate function depending on the ROI, and capillary vessels showed more variation in signal intensity than major vessels. CNV showed intermediate variation in signal intensity and a prolonged transit time. Conclusion: The resulting registered images can be used not only for quantitative analysis but also for perfusion analysis. Various investigative approaches to CNV using this method will be helpful in the characterization of lesions and in follow-up.
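Mutual information is computed from the joint intensity histogram of two images; registration then searches over candidate translations and rotations for the transform that maximizes it. A minimal Python sketch of the metric itself (bin count and toy images are assumptions; the clinical software stack is not reproduced here):

```python
# Mutual information between two images via their joint histogram.
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                     # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

a = np.random.rand(64, 64)
print(mutual_information(a, a))                        # high: image vs itself
print(mutual_information(a, np.random.rand(64, 64)))   # near zero: unrelated
```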
Digital storage and analysis of color Doppler echocardiograms
NASA Technical Reports Server (NTRS)
Chandra, S.; Thomas, J. D.
1997-01-01
Color Doppler flow mapping has played an important role in clinical echocardiography. Most of the clinical work, however, has been primarily qualitative. Although qualitative information is very valuable, there is considerable quantitative information stored within the velocity map that has not been extensively exploited so far. Recently, many researchers have shown interest in using the encoded velocities to address clinical problems such as quantification of valvular regurgitation, calculation of cardiac output, and characterization of ventricular filling. In this article, we review some basic physics and engineering aspects of color Doppler echocardiography, as well as the drawbacks of trying to retrieve velocities from videotape data. Digital storage, which plays a critical role in performing quantitative analysis, is discussed in some detail, with special attention to velocity encoding in DICOM 3.0 (the medical image storage standard) and the use of digital compression. Lossy compression can considerably reduce file size with minimal loss of (mostly redundant) information; this is critical for digital storage because of the enormous amount of data generated (a 10-minute study could require 18 gigabytes of storage capacity). Lossy JPEG compression and its impact on quantitative analysis have been studied, showing that images compressed at 27:1 using the JPEG algorithm compare favorably with directly digitized video images, the current gold standard. Some potential applications of these velocities in analyzing proximal convergence zones and mitral inflow, and some areas of future development, are also discussed in the article.
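A back-of-envelope Python sketch shows why compression is decisive here; the frame size, frame rate, and color depth are assumed values, not figures from the article:

```python
# Rough storage estimate for an uncompressed color Doppler study
# (all acquisition parameters below are assumptions).
frames_per_sec = 30
bytes_per_frame = 640 * 480 * 3               # 24-bit color frame
duration_sec = 10 * 60                        # a 10-minute study
raw = frames_per_sec * bytes_per_frame * duration_sec
print(raw / 1e9, "GB raw")                    # ~16.6 GB, of order 18 GB
print(raw / 27 / 1e9, "GB at 27:1 JPEG")      # ~0.6 GB after compression
```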
NASA Astrophysics Data System (ADS)
Zhuo, Shuangmu; Chen, Jianxin; Xie, Shusen; Hong, Zhibin; Jiang, Xingshan
2009-03-01
Intrinsic two-photon excited fluorescence (TPEF) and second-harmonic generation (SHG) signals are shown to differentiate between normal and neoplastic human esophageal stroma. It was found that TPEF and SHG signals from normal and neoplastic stroma exhibit different organization features, providing quantitative information about the biomorphology and biochemistry of tissue. By comparing normal with neoplastic stroma, there were significant differences in collagen-related changes, elastin-related changes, and alteration in proportions of matrix molecules, giving insight into the stromal changes associated with cancer progression and providing substantial potential to be applied in vivo to the clinical diagnosis of epithelial precancers and cancers.
Productivity Measurement in Research and Development Laboratories.
1981-09-01
identified by Szilagyi and Wallace (67:447-453), because these concepts "refer to the adequacy of the information that is generated and employed in...specified time period. Quantitative terms are used to express qualitative judgments. Szilagyi and Wallace (66:457-458) further expand on the...Scientists/Engineers." Unpublished master's thesis, GSM/SM/76D-36, Wright-Patterson AFB OH, December 1976. ADA 036462. 67. Szilagyi, Andrew D. Jr., and Marc
Representation and Reconstruction of Three-dimensional Microstructures in Ni-based Superalloys
2010-12-20
Materialia, 56, pp. 427-437 (2009); • Application of joint histogram and mutual information to registration and data fusion problems in serial...sectioning data sets and synthetically generated microstructures. The method is easy to use, and allows for a quantitative description of shapes. Further...following objectives were achieved: • we have successfully applied 3-D moment invariant analysis to several experimental data sets; • we have extended 2-D
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage analysis is a family-based method for examining whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable because they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis, the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because, in addition to performing linkage analysis, it includes programs to perform data-cleaning procedures and to generate and test genetic models for a quantitative trait. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis, using output from SEGREG, where SEGREG was used to determine the best-fitting statistical model for the trait.
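For reference, the LOD statistic at the heart of model-based linkage analysis (a textbook definition, not something specific to S.A.G.E.) compares the likelihood of the data at a hypothesized recombination fraction θ against free recombination:

```latex
\mathrm{LOD}(\theta) \;=\; \log_{10}\frac{L(\theta)}{L\!\left(\theta = \tfrac{1}{2}\right)}
```

A maximum LOD of 3 or more is conventionally taken as significant evidence of linkage.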
Imaging Performance of Quantitative Transmission Ultrasound
Lenox, Mark W.; Wiskin, James; Lewis, Matthew A.; Darrouzet, Stephen; Borup, David; Hsieh, Scott
2015-01-01
Quantitative Transmission Ultrasound (QTUS) is a tomographic transmission ultrasound modality that is capable of generating 3D speed-of-sound maps of objects in the field of view. It performs this measurement by propagating a plane wave through the medium from a transmitter on one side of a water tank to a high resolution receiver on the opposite side. This information is then used via inverse scattering to compute a speed map. In addition, the presence of reflection transducers allows the creation of a high resolution, spatially compounded reflection map that is natively coregistered to the speed map. A prototype QTUS system was evaluated for measurement and geometric accuracy as well as for the ability to correctly determine speed of sound. PMID:26604918
MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.
Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J
2015-10-15
Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has an enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows for keyword annotation query and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. shallam@mail.ubc.ca Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
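The normalized read-mapping measure above is not spelled out here; a minimal sketch of one common normalization of this kind, RPKM, illustrates the idea (the pipeline's actual measure may differ):

```python
# RPKM-style normalization: makes read counts comparable across genes of
# different lengths and samples of different sequencing depth (illustrative).
def rpkm(read_count, gene_length_bp, total_mapped_reads):
    """Reads per kilobase of gene per million mapped reads."""
    return read_count / (gene_length_bp / 1e3) / (total_mapped_reads / 1e6)

# e.g. 480 reads on a 1.2 kb ORF in a sample with 8 million mapped reads
print(rpkm(480, 1200, 8_000_000))  # -> 50.0
```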
NASA Technical Reports Server (NTRS)
Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh
1998-01-01
An integrated probabilistic approach has been developed to assess composites for high-temperature applications. This approach was used to determine the thermal and mechanical properties, and their probabilistic distributions, of a 5-harness 0/90 Sylramic fiber/CVI-SiC/MI-SiC woven Ceramic Matrix Composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation of its potential application for the HSCT combustor liner. The approach quantified the influences of uncertainties inherent in the constituent properties, called primitive variables, on selected key response variables of the CMC at 2200 F. The quantitative information is presented in the form of Cumulative Distribution Functions (CDFs), Probability Density Functions (PDFs), and primitive-variable sensitivities of the response. Results indicate that the scatter in the response variables was reduced by 30-50% when the uncertainties in the most influential primitive variables were reduced by 50%.
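As a rough illustration of the probabilistic machinery described above, the sketch below propagates scatter in hypothetical primitive variables through a placeholder response function and builds the empirical CDF of the response. The variable names, distributions, and rule-of-mixtures response are invented stand-ins, not the paper's micromechanics model.

```python
# Monte Carlo propagation of primitive-variable uncertainty (illustrative).
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical primitive variables (means and scatter are illustrative only).
fiber_modulus = rng.normal(380.0, 19.0, n)    # GPa
matrix_modulus = rng.normal(350.0, 35.0, n)   # GPa
fiber_fraction = rng.normal(0.40, 0.02, n)    # volume fraction


def response(ef, em, vf):
    """Placeholder rule-of-mixtures stand-in for the composite model."""
    return vf * ef + (1.0 - vf) * em


r = response(fiber_modulus, matrix_modulus, fiber_fraction)
r_sorted = np.sort(r)
cdf = np.arange(1, n + 1) / n  # empirical CDF of the response variable

# A crude sensitivity: correlation of each primitive with the response.
for name, x in [("fiber modulus", fiber_modulus),
                ("matrix modulus", matrix_modulus),
                ("fiber fraction", fiber_fraction)]:
    print(name, round(np.corrcoef(x, r)[0, 1], 3))
```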
How to integrate quantitative information into imaging reports for oncologic patients.
Martí-Bonmatí, L; Ruiz-Martínez, E; Ten, A; Alberich-Bayarri, A
2018-05-01
Nowadays, the images and information generated in imaging tests, as well as the reports that are issued, are digital and represent a reliable source of data. Reports can be classified into three main types according to their content and the type of information they include: organized (free text in natural language); predefined (templates and guidelines elaborated with previously determined natural language, like that used in BI-RADS and PI-RADS); or structured (drop-down menus displaying questions with various possible answers agreed on with the rest of the multidisciplinary team, which use standardized lexicons and are structured in the form of a database whose data can be traced and exploited with statistical tools and data mining). The structured report, compatible with the Management of Radiology Report Templates (MRRT) standard, makes it possible to incorporate quantitative information derived from digital analysis of the acquired images to describe accurately and precisely the properties and behavior of tissues by means of radiomics (characteristics and parameters). In conclusion, structured digital information (images, text, measurements, radiomic features, and imaging biomarkers) should be integrated into computerized reports so that it can be indexed in large repositories. Radiologic databanks are fundamental for exploiting health information, phenotyping lesions and diseases, and extracting conclusions in personalized medicine. Copyright © 2018 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
Second-harmonic patterned polarization-analyzed reflection confocal microscope
NASA Astrophysics Data System (ADS)
Okoro, Chukwuemeka; Toussaint, Kimani C.
2017-08-01
We introduce the second-harmonic patterned polarization-analyzed reflection confocal (SPPARC) microscope, a multimodal imaging platform that integrates Mueller matrix polarimetry with reflection confocal and second-harmonic generation (SHG) microscopy. SPPARC microscopy provides label-free three-dimensional (3-D), SHG-patterned confocal images that lend themselves to spatially dependent, linear polarimetric analysis for extraction of rich polarization information based on the Mueller calculus. To demonstrate its capabilities, we use SPPARC microscopy to analyze both porcine tendon and ligament samples and find differences in both circular degree-of-polarization and depolarization parameters. Moreover, using the collagen-generated SHG signal as an endogenous counterstain, we show that the technique can provide 3-D polarimetric information on the surrounding extrafibrillar matrix plus cells (EFMC) region. These unique characteristics give SPPARC microscopy strong potential for more accurate, quantitative description of microstructural changes in collagen-rich samples in three spatial dimensions.
Review and Synthesize Completed Research Through Systematic Review.
Hopp, Lisa; Rittenmeyer, Leslie
2015-10-01
The evidence-based health care movement has created new opportunities for scholars to generate synthesized sources of evidence. Systematic reviews are rigorous forms of synthesized evidence that scholars can conduct if they have the requisite skills, time, and access to excellent library resources. Systematic reviews play an important role in synthesizing what is known and unknown about a particular health issue. Thus, they have a synergistic relationship with primary research. They can both inform clinical decisions when the evidence is adequate and identify gaps in knowledge to inform research priorities. Systematic reviews can be conducted of quantitative and qualitative evidence to answer many types of questions. They all share characteristics of rigor that arise from a priori protocol development, transparency, exhaustive searching, dual independent reviewers who critically appraise studies using standardized tools, rigor in synthesis, and peer review at multiple stages in the conduct and reporting of the systematic review. © The Author(s) 2015.
Filamentary model in resistive switching materials
NASA Astrophysics Data System (ADS)
Jasmin, Alladin C.
2017-12-01
The need for next-generation computing devices is increasing as the demand for efficient data processing grows. The amount of data generated every second is also increasing, which requires large data storage devices. Oxide-based memory devices are being studied to explore new research frontiers, thanks to modern advances in nanofabrication. Various oxide materials are studied as active layers for non-volatile memory. This technology has potential application in resistive random-access memory (ReRAM) and can be easily integrated into CMOS technologies. The long-term perspective of this research field is to develop devices that mimic how the brain processes information. To realize such applications, a thorough understanding of the charge transport and switching mechanisms is important. A new perspective on multistate resistive switching based on current-induced filament dynamics will be discussed. A simple equivalent circuit of the device gives quantitative information about the nature of the conducting filament at different resistance states.
Method and tool for network vulnerability analysis
Swiler, Laura Painton [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM
2006-03-14
A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
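The path-finding step lends itself to a short illustration: with attack states as nodes and attacker effort as edge weights, a least-cost search surfaces the highest-risk path. The graph below is invented for illustration; the patent's attack templates, configuration files, and epsilon-optimal-path machinery are not reproduced here.

```python
# Minimal weighted attack-graph sketch: least attacker effort = highest risk.
import heapq

# state -> list of (next_state, attacker_effort) edges (invented contents)
graph = {
    "start":       [("scanned", 1.0), ("phished", 2.0)],
    "scanned":     [("exploited", 4.0)],
    "phished":     [("credentials", 1.5)],
    "credentials": [("exploited", 1.0)],
    "exploited":   [("goal", 0.5)],
    "goal":        [],
}


def cheapest_path(graph, source, target):
    """Dijkstra search: the least-effort attack path is the highest-risk one."""
    pq = [(0.0, source, [source])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == target:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []


print(cheapest_path(graph, "start", "goal"))
# -> (5.0, ['start', 'phished', 'credentials', 'exploited', 'goal'])
```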
Quantitative prediction of drug side effects based on drug-related features.
Niu, Yanqing; Zhang, Wen
2017-09-01
Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. Then, we consider several feature combination strategies (direct combination and average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
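A minimal sketch of the weighted-sum scoring and the average scoring ensemble described above; the profiles, weights, and per-feature predictions below are all synthetic stand-ins.

```python
# Weighted side-effect score and average-scoring ensemble (illustrative).
import numpy as np

rng = np.random.default_rng(1)

n_drugs, n_effects = 5, 8
profiles = rng.integers(0, 2, size=(n_drugs, n_effects))  # 1 = effect present
weights = rng.random(n_effects)                           # per-effect severity

scores = profiles @ weights  # one quantitative score per drug

# Hypothetical per-feature predictions (chemical substructures, targets,
# treatment indications), combined by simple averaging.
pred_chem, pred_target, pred_indic = (scores + rng.normal(0, 0.2, n_drugs)
                                      for _ in range(3))
ensemble = (pred_chem + pred_target + pred_indic) / 3.0

print(np.round(scores, 2))
print(np.round(ensemble, 2))
```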
Measuring the performance of visual to auditory information conversion.
Tan, Shern Shiou; Maul, Tomás Henrique Bode; Mennie, Neil Russell
2013-01-01
Visual to auditory conversion systems have been in existence for several decades. Besides being among the front runners in providing visual capabilities to blind users, the auditory cues generated from image sonification systems are still easier to learn and adapt to compared to other similar techniques. Other advantages include low cost, easy customizability, and universality. However, every system developed so far has its own set of strengths and weaknesses. In order to improve these systems further, we propose an automated and quantitative method to measure the performance of such systems. With these quantitative measurements, it is possible to gauge the relative strengths and weaknesses of different systems and rank the systems accordingly. Performance is measured by both the interpretability and also the information preservation of visual to auditory conversions. Interpretability is measured by computing the correlation of inter image distance (IID) and inter sound distance (ISD) whereas the information preservation is computed by applying Information Theory to measure the entropy of both visual and corresponding auditory signals. These measurements provide a basis and some insights on how the systems work. With an automated interpretability measure as a standard, more image sonification systems can be developed, compared, and then improved. Even though the measure does not test systems as thoroughly as carefully designed psychological experiments, a quantitative measurement like the one proposed here can compare systems to a certain degree without incurring much cost. Underlying this research is the hope that a major breakthrough in image sonification systems will allow blind users to cost effectively regain enough visual functions to allow them to lead secure and productive lives.
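The two proposed measures can be sketched compactly: interpretability as the correlation between inter-image distances (IID) and inter-sound distances (ISD), and information preservation via histogram entropy. The data below are random stand-ins for real image/sound pairs, and the entropy ratio is only one plausible reading of the preservation measure.

```python
# Interpretability and information-preservation measures (illustrative).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr

rng = np.random.default_rng(2)

images = rng.random((10, 64))   # 10 images as feature vectors
sounds = rng.random((10, 128))  # their sonified counterparts

# Interpretability: correlation between inter-image and inter-sound distances.
iid = pdist(images)   # pairwise distances among images
isd = pdist(sounds)   # pairwise distances among sounds
r, _ = pearsonr(iid, isd)


def entropy(signal, bins=32):
    """Shannon entropy of a signal's amplitude histogram (bits)."""
    p, _ = np.histogram(signal, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))


# Information preservation: compare entropy before and after conversion.
preservation = entropy(sounds[0]) / entropy(images[0])
print(round(r, 3), round(preservation, 3))
```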
Integrated work-flow for quantitative metabolome profiling of plants, Peucedani Radix as a case.
Song, Yuelin; Song, Qingqing; Liu, Yao; Li, Jun; Wan, Jian-Bo; Wang, Yitao; Jiang, Yong; Tu, Pengfei
2017-02-08
Universal acquisition of reliable information regarding the qualitative and quantitative properties of complicated matrices is the premise for the success of a metabolomics study. Liquid chromatography-mass spectrometry (LC-MS) is now serving as a workhorse for metabolomics; however, LC-MS-based non-targeted metabolomics suffers from some shortcomings, even when cutting-edge techniques have been introduced. Aiming to tackle, to some extent, the drawbacks of conventional approaches, such as redundant information, detector saturation, low sensitivity, and inconstant signal numbers among different runs, a novel and flexible work-flow consisting of three progressive steps is proposed here to profile in depth the quantitative metabolome of plants. The roots of Peucedanum praeruptorum Dunn (Peucedani Radix, PR), which are rich in various coumarin isomers, were employed as a case study to verify the applicability. First, offline two-dimensional LC-MS was utilized for in-depth detection of metabolites in a pooled PR extract, termed the universal metabolome standard (UMS). Second, mass fragmentation rules, notably concerning angular-type pyranocoumarins that are the primary chemical homologues in PR, and available databases were integrated for signal assignment and structural annotation. Third, the optimum collision energy (OCE) as well as the ion transition for multiple reaction monitoring measurement were optimized online with a reference compound-free strategy for each annotated component, and large-scale relative quantification of all annotated components was accomplished by plotting calibration curves via serial dilution of the UMS. It is worthwhile to highlight that the potential of OCE for isomer discrimination is described, and the linearity ranges of the primary ingredients were extended by suppressing their responses. The integrated work-flow is expected to qualify as a promising pipeline for clarifying the quantitative metabolome of plants because it can not only holistically provide qualitative information but also straightforwardly generate an accurate quantitative dataset. Copyright © 2016 Elsevier B.V. All rights reserved.
Comprehensive, Quantitative Risk Assessment of CO₂ Geologic Sequestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lepinski, James
2013-09-30
A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO₂ capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the necessary information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information comes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments were conducted on three (3) sites using the QFMEA model: (1) SACROC Northern Platform CO₂-EOR Site in the Permian Basin, Scurry County, TX, (2) Pump Canyon CO₂-ECBM Site in the San Juan Basin, San Juan County, NM, and (3) Farnsworth Unit CO₂-EOR Site in the Anadarko Basin, Ochiltree County, TX. The sites were sufficiently different from each other to test the robustness of the QFMEA model.
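The ranking step resembles a classic FMEA risk priority number (probability × severity × detection difficulty); the sketch below uses that textbook rule with invented failure modes, whereas the actual QFMEA model also weighs fatality potential and the various cost terms in its Excel workbook.

```python
# FMEA-style risk ranking in the spirit of the QFMEA model (illustrative).
failure_modes = [
    # (name, probability 1-10, severity 1-10, detection difficulty 1-10)
    ("caprock leakage",    3, 9, 8),
    ("wellbore failure",   5, 7, 4),
    ("induced seismicity", 2, 6, 6),
    ("pipeline rupture",   4, 8, 3),
]


def rpn(prob, sev, det):
    """Classic risk priority number: higher = resolve first."""
    return prob * sev * det


ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
for name, p, s, d in ranked:
    print(f"{name:20s} RPN={rpn(p, s, d)}")
```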
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2013 CFR
2013-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2011 CFR
2011-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2014 CFR
2014-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2012 CFR
2012-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M
2017-08-01
Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not (P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not (P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not (P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT (P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.
Atomic force microscopy of starch systems.
Zhu, Fan
2017-09-22
Atomic force microscopy (AFM) generates information on the topography, adhesion, and elasticity of a sample surface by probing it with a tip. Under suitable experimental settings, AFM can image biopolymers of a few nanometers. Starch is a major food and industrial component. AFM has been used to probe the morphology, properties, modifications, and interactions of starches from diverse botanical origins at both the micro- and nano-structural levels. The structural information obtained by AFM supports the blocklet structure of the granules and provides a qualitative and quantitative basis for some physicochemical properties of diverse starch systems. It has become evident that AFM can complement other microscopic techniques to provide novel structural insights into starch systems.
Badran, Hani; Pluye, Pierre; Grad, Roland
2017-03-14
The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. ©Hani Badran, Pierre Pluye, Roland Grad. Originally published in JMIR Medical Education (http://mededu.jmir.org), 14.03.2017.
Three-dimensional magnetic resonance imaging of physeal injury: reliability and clinical utility.
Lurie, Brett; Koff, Matthew F; Shah, Parina; Feldmann, Eric James; Amacker, Nadja; Downey-Zayas, Timothy; Green, Daniel; Potter, Hollis G
2014-01-01
Injuries to the physis are common in children with a subset resulting in an osseous bar and potential growth disturbance. Magnetic resonance imaging allows for detailed assessment of the physis with the ability to generate 3-dimensional physeal models from volumetric data. The purpose of this study was to assess the interrater reliability of physeal bar area measurements generated using a validated semiautomated segmentation technique and to highlight the clinical utility of quantitative 3-dimensional (3D) physeal mapping in pediatric orthopaedic practice. The Radiology Information System/Picture Archiving Communication System (PACS) at our institution was searched to find consecutive patients who were imaged for the purpose of assessing a physeal bar or growth disturbance between December 2006 and October 2011. Physeal segmentation was retrospectively performed by 2 independent operators using semiautomated software to generate physeal maps and bar area measurements from 3-dimensional spoiled gradient recalled echo sequences. Inter-reliability was statistically analyzed. Subsequent surgical management for each patient was recorded from the patient notes and surgical records. We analyzed 24 patients (12M/12F) with a mean age of 11.4 years (range, 5-year to 15-year olds) and 25 physeal bars. Of the physeal bars: 9 (36%) were located in the distal tibia; 8 (32%) in the proximal tibia; 5 (20%) in the distal femur; 1 (4%) in the proximal femur; 1 (4%) in the proximal humerus; and 1 (4%) in the distal radius. The independent operator measurements of physeal bar area were highly correlated with a Pearson correlation coefficient (r) of 0.96 and an intraclass correlation coefficient for average measures of 0.99 (95% confidence interval, 0.97-0.99). Four patients underwent resection of the identified physeal bars, 9 patients were treated with epiphysiodesis, and 1 patient underwent bilateral tibial osteotomies. Semiautomated segmentation of the physis is a reproducible technique for generating physeal maps and accurately measuring physeal bars, providing quantitative and anatomic information that may inform surgical management and prognosis in patients with physeal injury. Level IV.
Analysis of high accuracy, quantitative proteomics data in the MaxQB database.
Schaab, Christoph; Geiger, Tamar; Stoehr, Gabriele; Cox, Juergen; Mann, Matthias
2012-03-01
MS-based proteomics generates rapidly increasing amounts of precise and quantitative information. Analysis of individual proteomic experiments has made great strides, but the crucial ability to compare and store information across different proteome measurements still presents many challenges. For example, it has been difficult to avoid contamination of databases with low quality peptide identifications, to control for the inflation in false positive identifications when combining data sets, and to integrate quantitative data. Although, for example, the contamination with low quality identifications has been addressed by joint analysis of deposited raw data in some public repositories, we reasoned that there should be a role for a database specifically designed for high resolution and quantitative data. Here we describe a novel database termed MaxQB that stores and displays collections of large proteomics projects and allows joint analysis and comparison. We demonstrate the analysis tools of MaxQB using proteome data of 11 different human cell lines and 28 mouse tissues. The database-wide false discovery rate is controlled by adjusting the project specific cutoff scores for the combined data sets. The 11 cell line proteomes together identify proteins expressed from more than half of all human genes. For each protein of interest, expression levels estimated by label-free quantification can be visualized across the cell lines. Similarly, the expression rank order and estimated amount of each protein within each proteome are plotted. We used MaxQB to calculate the signal reproducibility of the detected peptides for the same proteins across different proteomes. Spearman rank correlation between peptide intensity and detection probability of identified proteins was greater than 0.8 for 64% of the proteome, whereas a minority of proteins have negative correlation. This information can be used to pinpoint false protein identifications, independently of peptide database scores. The information contained in MaxQB, including high resolution fragment spectra, is accessible to the community via a user-friendly web interface at http://www.biochem.mpg.de/maxqb.
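The reproducibility check described above can be sketched in a few lines: for the peptides of one protein, rank-correlate intensity with detection frequency and flag negative correlations as possible false identifications. The data below are synthetic stand-ins for real MaxQB entries.

```python
# Spearman rank correlation of peptide intensity vs detection probability
# (illustrative data, not from MaxQB).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)

n_peptides = 12
intensity = rng.lognormal(mean=10, sigma=1, size=n_peptides)
# Detection probability loosely increasing with intensity, plus noise.
detect_prob = np.clip((np.log(intensity) - 8) / 5
                      + rng.normal(0, 0.1, n_peptides), 0, 1)

rho, pval = spearmanr(intensity, detect_prob)
flagged = rho < 0  # negative correlation may indicate a false identification
print(round(rho, 2), flagged)
```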
NASA Astrophysics Data System (ADS)
Torres-Verdin, C.
2007-05-01
This paper describes the successful implementation of a new 3D AVA stochastic inversion algorithm to quantitatively integrate pre-stack seismic amplitude data and well logs. The stochastic inversion algorithm is used to characterize flow units of a deepwater reservoir located in the central Gulf of Mexico. Conventional fluid/lithology sensitivity analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generates typical Class III AVA responses. On the other hand, layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution. Accordingly, AVA stochastic inversion, which combines the advantages of AVA analysis with those of geostatistical inversion, provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties (P-velocity, S-velocity, density) and lithotype (sand-shale) distributions. The quantitative use of rock/fluid information through AVA seismic amplitude data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, yields accurate 3D models of petrophysical properties such as porosity and permeability. Finally, by fully integrating pre-stack seismic amplitude data and well logs, the vertical resolution of the inverted products is higher than that of deterministic inversion methods.
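The Biot-Gassmann analysis mentioned above rests on the standard Gassmann fluid-substitution relation (a textbook result, not a formula specific to this paper): the saturated bulk modulus follows from the dry-rock, mineral, and fluid moduli and the porosity φ, while the shear modulus is unchanged by the fluid. Since the P-wave velocity depends on both moduli and density, a change of pore fluid moves both V_P and ρ, consistent with the sensitivity reported above.

```latex
K_{\mathrm{sat}} \;=\; K_{\mathrm{dry}} \;+\;
\frac{\left(1 - K_{\mathrm{dry}}/K_{\mathrm{min}}\right)^{2}}
     {\phi/K_{\mathrm{fl}} \;+\; (1-\phi)/K_{\mathrm{min}} \;-\; K_{\mathrm{dry}}/K_{\mathrm{min}}^{2}},
\qquad
\mu_{\mathrm{sat}} = \mu_{\mathrm{dry}},
\qquad
V_{P} = \sqrt{\frac{K + \tfrac{4}{3}\mu}{\rho}}
```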
Glawischnig, E; Gierl, A; Tomas, A; Bacher, A; Eisenreich, W
2001-03-01
Information on metabolic networks could provide the basis for the design of targets for metabolic engineering. To study metabolic flux in cereals, developing maize (Zea mays) kernels were grown in sterile culture on medium containing [U-(13)C(6)]glucose or [1,2-(13)C(2)]acetate. After growth, amino acids, lipids, and sitosterol were isolated from kernels as well as from the cobs, and their (13)C isotopomer compositions were determined by quantitative nuclear magnetic resonance spectroscopy. The highly specific labeling patterns were used to analyze the metabolic pathways leading to amino acids and the triterpene on a quantitative basis. The data show that serine is generated from phosphoglycerate, as well as from glycine. Lysine is formed entirely via the diaminopimelate pathway and sitosterol is synthesized entirely via the mevalonate route. The labeling data of amino acids and sitosterol were used to reconstruct the labeling patterns of key metabolic intermediates (e.g. acetyl-coenzyme A, pyruvate, phosphoenolpyruvate, erythrose 4-phosphate, and Rib 5-phosphate) that revealed quantitative information about carbon flux in the intermediary metabolism of developing maize kernels. Exogenous acetate served as an efficient precursor of sitosterol, as well as of amino acids of the aspartate and glutamate family; in comparison, metabolites formed in the plastidic compartments showed low acetate incorporation.
Maximum power point tracker for photovoltaic power plants
NASA Astrophysics Data System (ADS)
Arcidiacono, V.; Corsi, S.; Lambri, L.
The paper describes two different closed-loop control criteria for the maximum power point tracking of the voltage-current characteristic of a photovoltaic generator. The two criteria are discussed and compared, inter alia, with regard to the setting-up problems that they pose. Although a detailed analysis is not embarked upon, the paper also provides some quantitative information on the energy advantages obtained by using electronic maximum power point tracking systems, as compared with the situation in which the point of operation of the photovoltaic generator is not controlled at all. Lastly, the paper presents two high-efficiency MPPT converters for experimental photovoltaic plants of the stand-alone and the grid-interconnected type.
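For illustration, a perturb-and-observe loop, one widely used closed-loop MPPT criterion, is sketched below; it is not claimed to be either of the two criteria the paper compares, and the I-V curve is a toy stand-in for a real photovoltaic generator.

```python
# Perturb-and-observe maximum power point tracking (illustrative).
def pv_current(v, i_sc=5.0, v_oc=40.0):
    """Toy photovoltaic I-V curve (illustrative, not a device model)."""
    return max(i_sc * (1 - (v / v_oc) ** 8), 0.0)


def perturb_and_observe(v0=20.0, step=0.5, iters=200):
    v, p_prev, direction = v0, 0.0, +1
    for _ in range(iters):
        p = v * pv_current(v)
        if p < p_prev:          # power fell: reverse the perturbation
            direction = -direction
        p_prev = p
        v += direction * step   # perturb the operating voltage
    return v, p_prev            # oscillates around the maximum power point


v_mpp, p_mpp = perturb_and_observe()
print(f"operating point ~{v_mpp:.1f} V, {p_mpp:.1f} W")
```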
Label-free three-dimensional imaging of cell nucleus using third-harmonic generation microscopy
NASA Astrophysics Data System (ADS)
Lin, Jian; Zheng, Wei; Wang, Zi; Huang, Zhiwei
2014-09-01
We report the implementation of the combined third-harmonic generation (THG) and two-photon excited fluorescence (TPEF) microscopy for label-free three-dimensional (3-D) imaging of cell nucleus morphological changes in liver tissue. THG imaging shows regular spherical shapes of normal hepatocytes nuclei with inner chromatin structures while revealing the condensation of chromatins and nuclear fragmentations in hepatocytes of diseased liver tissue. Colocalized THG and TPEF imaging provides complementary information of cell nuclei and cytoplasm in tissue. This work suggests that 3-D THG microscopy has the potential for quantitative analysis of nuclear morphology in cells at a submicron-resolution without the need for DNA staining.
Acoustic Facies Analysis of Side-Scan Sonar Data
NASA Astrophysics Data System (ADS)
Dwan, Fa Shu
Acoustic facies analysis methods have allowed the generation of system-independent values of the quantitative seafloor acoustic parameter, backscattering strength, from GLORIA and (TAMU)^2 side-scan sonar data. The resulting acoustic facies parameters enable quantitative comparisons of data collected by different sonar systems, data from different environments, and measurements made with different survey geometries. Backscattering strength values were extracted from the sonar amplitude data by inversion based on the sonar equation. Image processing products reveal seafloor features and patterns of relative intensity. To quantitatively compare data collected at different times or by different systems, and to ground-truth measurements and geoacoustic models, quantitative corrections must be made on any given data set for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified-area contribution, and grazing-angle effects. In the sonar equation, backscattering strength is the sonar parameter directly related to seafloor properties. The GLORIA data used in this study are from the edge of a distal lobe of the Monterey Fan. An interfingered region of strong and weak seafloor signal returns from a flat seafloor region provides an ideal data set for this study. Inversion of imagery data from the region allows the quantitative definition of different acoustic facies. The (TAMU)^2 data used are from a calibration site near the Green Canyon area of the Gulf of Mexico. Acoustic facies analysis techniques were implemented to generate statistical information for acoustic facies based on the estimates of backscattering strength. The backscattering strength values have been compared with Lambert's Law and other functions to parameterize the description of the acoustic facies. The resulting Lambertian constant values range from -26 dB to -36 dB. A modified Lambert relationship, which consists of both intercept and slope terms, appears to represent the BSS versus grazing angle profiles better, based on chi^2 testing and error ellipse generation. Different regression functions, composed of trigonometric functions, were analyzed for different segments of the BSS profiles. A cotangent or sine/cosine function shows promising results for representing the entire grazing-angle span of the BSS profiles.
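The fitting step can be sketched directly: classic Lambert's law models backscattering strength as BSS(θ) = a + 10 log10(sin²θ), and the modified relationship adds a slope b on the angular term. The sketch below fits both constants by least squares; the data points are invented, roughly in the -26 to -36 dB Lambertian range reported above.

```python
# Least-squares fit of the modified Lambert relationship (illustrative data).
import numpy as np

theta = np.radians(np.array([10, 20, 30, 40, 50, 60]))      # grazing angles
bss = np.array([-44.0, -38.5, -35.0, -33.0, -31.5, -30.5])  # dB, illustrative

x = 10 * np.log10(np.sin(theta) ** 2)            # Lambert angular term
A = np.vstack([np.ones_like(x), x]).T
(a, b), *_ = np.linalg.lstsq(A, bss, rcond=None)  # intercept and slope
print(f"intercept a = {a:.1f} dB, slope b = {b:.2f}")
# With b fixed at 1 this reduces to classic Lambert's law, where a is the
# Lambertian constant in dB.
```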
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... Request; Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... clearance. Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... research has proposed that providing quantitative information about product efficacy enables consumers to...
Quantitative proteomics in Giardia duodenalis-Achievements and challenges.
Emery, Samantha J; Lacey, Ernest; Haynes, Paul A
2016-08-01
Giardia duodenalis (syn. G. lamblia and G. intestinalis) is a protozoan parasite of vertebrates and a major contributor to the global burden of diarrheal diseases and gastroenteritis. The publication of multiple genome sequences in the G. duodenalis species complex has provided important insights into parasite biology, and made post-genomic technologies, including proteomics, significantly more accessible. The aims of proteomics are to identify and quantify proteins present in a cell, and assign functions to them within the context of dynamic biological systems. In Giardia, proteomics in the post-genomic era has transitioned from reliance on gel-based systems to utilisation of a diverse array of techniques based on bottom-up LC-MS/MS technologies. Together, these have generated crucial foundations for subcellular proteomes, elucidated intra- and inter-assemblage isolate variation, and identified pathways and markers in differentiation, host-parasite interactions and drug resistance. However, in Giardia, proteomics remains an emerging field, with considerable shortcomings evident from the published research. These include a bias towards assemblage A, a lack of emphasis on quantitative analytical techniques, and limited information on post-translational protein modifications. Additionally, there are multiple areas of research for which proteomic data is not available to add value to published transcriptomic data. The challenge of amalgamating data in the systems biology paradigm necessitates the further generation of large, high-quality quantitative datasets to accurately model parasite biology. This review surveys the current proteomic research available for Giardia and evaluates their technical and quantitative approaches, while contextualising their biological insights into parasite pathology, isolate variation and eukaryotic evolution. Finally, we propose areas of priority for the generation of future proteomic data to explore fundamental questions in Giardia, including the analysis of post-translational modifications, and the design of MS-based assays for validation of differentially expressed proteins in large datasets. Copyright © 2016 Elsevier B.V. All rights reserved.
Vaccine Hesitancy and Online Information: The Influence of Digital Networks.
Getman, Rebekah; Helmi, Mohammad; Roberts, Hal; Yansane, Alfa; Cutler, David; Seymour, Brittany
2017-12-01
This article analyzes the digital childhood vaccination information network for vaccine-hesitant parents. The goal of this study was to explore the structure and influence of vaccine-hesitant content online by generating a database and network analysis of vaccine-relevant content. We used Media Cloud, a searchable big-data platform of over 550 million stories from 50,000 media sources, for quantitative and qualitative study of an online media sample based on keyword selection. We generated a hyperlink network map and measured indegree centrality of the sources and vaccine sentiment for a random sample of 450 stories. 28,122 publications from 4,817 sources met inclusion criteria. Clustered communities formed based on shared hyperlinks; communities tended to link within, not among, each other. The plurality of information was provaccine (46.44%, 95% confidence interval [39.86%, 53.20%]). The most influential sources were in the health community (National Institutes of Health, Centers for Disease Control and Prevention) or mainstream media (New York Times); some user-generated sources also had strong influence and were provaccine (Wikipedia). The vaccine-hesitant community rarely interacted with provaccine content and simultaneously used primary provaccine content within vaccine-hesitant narratives. The sentiment of the overall conversation was consistent with scientific evidence. These findings demonstrate an online environment where scientific evidence online drives vaccine information outside of the vaccine-hesitant community but is also prominently used and misused within the robust vaccine-hesitant community. Future communication efforts should take current context into account; more information may not prevent vaccine hesitancy.
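A minimal sketch of the network measure used above: assemble the hyperlink graph and rank sources by indegree centrality. The edge list is invented; the study's actual graph came from Media Cloud.

```python
# Indegree centrality over a directed hyperlink graph (illustrative edges).
import networkx as nx

edges = [  # (linking source, linked-to source)
    ("blogA", "cdc.gov"), ("blogB", "cdc.gov"), ("blogB", "nih.gov"),
    ("forumC", "blogB"), ("newsD", "nih.gov"), ("blogA", "wikipedia.org"),
    ("forumC", "wikipedia.org"), ("newsD", "cdc.gov"),
]
g = nx.DiGraph(edges)

centrality = nx.in_degree_centrality(g)  # normalized indegree
for node, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node:15s} {c:.2f}")
```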
Next-Generation Terrestrial Laser Scanning to Measure Forest Canopy Structure
NASA Astrophysics Data System (ADS)
Danson, M.
2015-12-01
Terrestrial laser scanners (TLS) are now capable of semi-automatic reconstruction of the structure of complete trees or forest stands and have the potential to provide detailed information on tree architecture and foliage biophysical properties. The trends for the next generation of TLS are towards higher resolution, faster scanning and full-waveform data recording, with mobile, multispectral laser devices. The convergence of these technological advances in the next generation of TLS will allow the production of information for forest and woodland mapping and monitoring that is far more detailed, more accurate, and more comprehensive than any available today. This paper describes recent scientific advances in the application of TLS for characterising forest and woodland areas, drawing on the authors' development of the Salford Advanced Laser Canopy Analyser (SALCA), the activities of the Terrestrial Laser Scanner International Interest Group (TLSIIG), and recent advances in laser scanner technology around the world. The key findings illustrated in the paper are that (i) a complete understanding of system measurement characteristics is required for quantitative analysis of TLS data, (ii) full-waveform data recording is required for extraction of forest biophysical variables, and (iii) multi-wavelength systems provide additional spectral information that is essential for classifying different vegetation components. The paper uses a range of recent experimental TLS measurements to support these findings, and sets out a vision for new research to develop an information-rich future-forest information system, populated by mobile autonomous multispectral TLS devices.
Nomura, J-I; Uwano, I; Sasaki, M; Kudo, K; Yamashita, F; Ito, K; Fujiwara, S; Kobayashi, M; Ogasawara, K
2017-12-01
Preoperative hemodynamic impairment in the affected cerebral hemisphere is associated with the development of cerebral hyperperfusion following carotid endarterectomy. Cerebral oxygen extraction fraction images generated from 7T MR quantitative susceptibility mapping correlate with oxygen extraction fraction images on positron-emission tomography. The present study aimed to determine whether preoperative oxygen extraction fraction imaging generated from 7T MR quantitative susceptibility mapping could identify patients at risk for cerebral hyperperfusion following carotid endarterectomy. Seventy-seven patients with unilateral internal carotid artery stenosis (≥70%) underwent preoperative 3D T2*-weighted imaging using a multiple dipole-inversion algorithm with a 7T MR imager. Quantitative susceptibility mapping images were then obtained, and oxygen extraction fraction maps were generated. Quantitative brain perfusion single-photon emission CT was also performed before and immediately after carotid endarterectomy. ROIs were automatically placed in the bilateral middle cerebral artery territories in all images using a 3D stereotactic ROI template, and affected-to-contralateral ratios in the ROIs were calculated on quantitative susceptibility mapping-oxygen extraction fraction images. Ten patients (13%) showed post-carotid endarterectomy hyperperfusion (cerebral blood flow increases of ≥100% compared with preoperative values in the ROIs on brain perfusion SPECT). Multivariate analysis showed that a high quantitative susceptibility mapping-oxygen extraction fraction ratio was significantly associated with the development of post-carotid endarterectomy hyperperfusion (95% confidence interval, 33.5-249.7; P = .002). Sensitivity, specificity, and positive- and negative-predictive values of the quantitative susceptibility mapping-oxygen extraction fraction ratio for the prediction of the development of post-carotid endarterectomy hyperperfusion were 90%, 84%, 45%, and 98%, respectively. Preoperative oxygen extraction fraction imaging generated from 7T MR quantitative susceptibility mapping identifies patients at risk for cerebral hyperperfusion following carotid endarterectomy. © 2017 by American Journal of Neuroradiology.
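The reported diagnostic metrics follow from a simple 2x2 tabulation of predictions against outcomes; the sketch below computes them for an invented cohort thresholded on the affected-to-contralateral QSM-OEF ratio (the study's actual threshold and data are not reproduced here).

```python
# Sensitivity, specificity, PPV and NPV from a thresholded ratio (illustrative).
import numpy as np

ratio = np.array([1.02, 0.98, 1.15, 1.21, 1.05, 1.30, 0.95, 1.18, 1.01, 1.25])
hyperperfusion = np.array([0, 0, 1, 1, 0, 1, 0, 0, 0, 1])  # 1 = developed it

threshold = 1.10               # hypothetical cutoff
pred = ratio >= threshold

tp = np.sum(pred & (hyperperfusion == 1))
fp = np.sum(pred & (hyperperfusion == 0))
fn = np.sum(~pred & (hyperperfusion == 1))
tn = np.sum(~pred & (hyperperfusion == 0))

print("sensitivity", tp / (tp + fn))
print("specificity", tn / (tn + fp))
print("PPV", tp / (tp + fp))
print("NPV", tn / (tn + fn))
```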
Sardiu, Mihaela E; Gilmore, Joshua M; Carrozza, Michael J; Li, Bing; Workman, Jerry L; Florens, Laurence; Washburn, Michael P
2009-10-06
Protein complexes are key molecular machines executing a variety of essential cellular processes. Despite the availability of genome-wide protein-protein interaction studies, determining the connectivity between proteins within a complex remains a major challenge. Here we demonstrate a method that is able to predict the relationship of proteins within a stable protein complex. We employed a combination of computational approaches and a systematic collection of quantitative proteomics data from wild-type and deletion strain purifications to build a quantitative deletion-interaction network map and subsequently convert the resulting data into an interdependency-interaction model of a complex. We applied this approach to a data set generated from components of the Saccharomyces cerevisiae Rpd3 histone deacetylase complexes, which consists of two distinct small and large complexes that are held together by a module consisting of Rpd3, Sin3 and Ume1. The resulting representation reveals new protein-protein interactions and new submodule relationships, providing novel information for mapping the functional organization of a complex.
Compressive hyperspectral time-resolved wide-field fluorescence lifetime imaging
NASA Astrophysics Data System (ADS)
Pian, Qi; Yao, Ruoyang; Sinsuebphon, Nattawut; Intes, Xavier
2017-07-01
Spectrally resolved fluorescence lifetime imaging and spatial multiplexing have offered information content and collection-efficiency boosts in microscopy, but efficient implementations for macroscopic applications are still lacking. An imaging platform based on time-resolved structured light and hyperspectral single-pixel detection has been developed to perform quantitative macroscopic fluorescence lifetime imaging (MFLI) over a large field of view (FOV) and multiple spectral bands simultaneously. The system makes use of three digital micromirror device (DMD)-based spatial light modulators (SLMs) to generate spatial optical bases and reconstruct N by N images over 16 spectral channels with a time-resolved capability (∼40 ps temporal resolution) using fewer than N² optical measurements. We demonstrate the potential of this new imaging platform by quantitatively imaging near-infrared (NIR) Förster resonance energy transfer (FRET) both in vitro and in vivo. The technique is well suited for quantitative hyperspectral lifetime imaging with a high sensitivity and paves the way for many important biomedical applications.
NASA Astrophysics Data System (ADS)
Ragno, Rino; Ballante, Flavio; Pirolli, Adele; Wickersham, Richard B.; Patsilinakos, Alexandros; Hesse, Stéphanie; Perspicace, Enrico; Kirsch, Gilbert
2015-08-01
Vascular endothelial growth factor receptor-2 (VEGFR-2) is a key element in angiogenesis, the process by which new blood vessels are formed, and is thus an important pharmaceutical target. Here, three-dimensional quantitative structure-activity relationship (3-D QSAR) methods were used to build quantitative screening and pharmacophore models of the VEGFR-2 receptor for the design of inhibitors with improved activities. Most of the available experimental data were used as a training set to derive eight optimized and fully cross-validated mono-probe models and one multi-probe quantitative model. Notable is the use of 262 molecules, aligned following both structure-based and ligand-based protocols, as an external test set, confirming the 3-D QSAR models' predictive capability and their usefulness in designing new VEGFR-2 inhibitors. From a survey of the literature, this appears to be the first generation of a wide-ranging computational medicinal chemistry application to VEGFR-2 inhibitors.
A preliminary study of DTI Fingerprinting on stroke analysis.
Ma, Heather T; Ye, Chenfei; Wu, Jun; Yang, Pengfei; Chen, Xuhui; Yang, Zhengyi; Ma, Jingbo
2014-01-01
DTI (Diffusion Tensor Imaging) is a well-known MRI (Magnetic Resonance Imaging) technique that provides useful structural information about the human brain. However, quantitative measurement of the physiological variation among subtypes of ischemic stroke is not available. An automatic quantitative method for DTI analysis would enhance DTI applications in the clinic. In this study, we propose a DTI Fingerprinting technology to quantitatively analyze white matter tissue, applied here to stroke classification. The TBSS (Tract-Based Spatial Statistics) method was employed to generate masks automatically. To evaluate the clustering performance of the automatic method, lesion ROIs (Regions of Interest) were manually drawn on the DWI images as a reference, and the results from DTI Fingerprinting were compared with those obtained from the reference ROIs. The comparison indicates that DTI Fingerprinting can identify different states of ischemic stroke and has promising potential to provide a more comprehensive measure of DTI data. Further development should be carried out to improve DTI Fingerprinting technology for clinical use.
Plant hormone signaling during development: insights from computational models.
Oliva, Marina; Farcot, Etienne; Vernoux, Teva
2013-02-01
Recent years have seen an impressive increase in our knowledge of the topology of plant hormone signaling networks. The complexity of these topologies has motivated the development of models for several hormones to aid understanding of how signaling networks process hormonal inputs. Such work has generated essential insights into the mechanisms of hormone perception and of regulation of cellular responses such as transcription in response to hormones. In addition, modeling approaches have contributed significantly to exploring how spatio-temporal regulation of hormone signaling contributes to plant growth and patterning. New tools have also been developed to obtain quantitative information on hormone distribution during development and to test model predictions, opening the way for quantitative understanding of the developmental roles of hormones. Copyright © 2012 Elsevier Ltd. All rights reserved.
21 CFR 101.60 - Nutrient content claims for the calorie content of foods.
Code of Federal Regulations, 2014 CFR
2014-04-01
... percent fewer calories than regular cupcakes”); and (B) Quantitative information comparing the level of..., the quantitative information may be located elsewhere on the information panel in accordance with... fewer calories per oz (or 3 oz) than our regular Lasagna”); and (B) Quantitative information comparing...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Schlicher, Bob G
The vulnerability of an information system's security is quantitatively predicted. The information system may receive malicious actions against its security and may receive corrective actions for restoring security. A game-oriented, agent-based model (ABM) is constructed in a simulator application. The game ABM represents security activity in the information system and has two opposing participants, an attacker and a defender, along with probabilistic game rules and allowable game states. A specified number of simulations are run, and in each run a probabilistic subset of the allowable game states is reached. The probability of reaching a specified game state is unknown prior to running each simulation. Data generated during the game states are collected to determine the probability of one or more aspects of security in the information system.
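In the spirit of the game model described above, a toy Markov-style attacker-defender simulation is sketched below: repeated runs estimate the probability of reaching a designated game state. All states, transition probabilities, and names are invented for illustration and are not taken from the patent.

    import random

    # Toy attacker-defender game over abstract security states; the real
    # model's rules and state space would replace these placeholders.
    def step(state, rng):
        if state == "secure":
            return "probed" if rng.random() < 0.30 else "secure"      # attack attempt
        if state == "probed":
            return "breached" if rng.random() < 0.40 else "restored"  # defense races attack
        if state == "breached":
            return "restored" if rng.random() < 0.50 else "breached"  # corrective action
        return "secure"                                               # restored -> secure

    def estimate_breach_probability(n_runs=10000, horizon=50, seed=42):
        """Monte Carlo estimate of reaching the 'breached' state within a horizon."""
        rng = random.Random(seed)
        breaches = 0
        for _ in range(n_runs):
            state = "secure"
            for _ in range(horizon):
                state = step(state, rng)
                if state == "breached":
                    breaches += 1
                    break
        return breaches / n_runs

    print("P(breach within horizon):", estimate_breach_probability())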
ERIC Educational Resources Information Center
Boonsathorn, Wasita; Charoen, Danuvasin; Dryver, Arthur L.
2014-01-01
E-Learning brings access to a powerful but often overlooked teaching tool: random number generation. Using random number generation, a practically infinite number of quantitative problem-solution sets can be created. In addition, within the e-learning context, in the spirit of the mastery of learning, it is possible to assign online quantitative…
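As a sketch of the idea, the snippet below uses seeded random number generation to mint an effectively unlimited supply of structurally identical problem-solution sets; the problem template and number ranges are invented for illustration.

    import random

    def generate_problem(seed=None):
        """Generate one mean/variance practice problem with its solution.

        Each call (or seed) draws fresh random data, so every student can
        receive a unique but structurally identical quantitative problem.
        """
        rng = random.Random(seed)
        data = [rng.randint(10, 99) for _ in range(rng.randint(5, 8))]
        n = len(data)
        mean = sum(data) / n
        variance = sum((x - mean) ** 2 for x in data) / (n - 1)
        problem = f"Compute the sample mean and sample variance of {data}."
        solution = {"mean": round(mean, 2), "variance": round(variance, 2)}
        return problem, solution

    # Each seed yields a distinct problem-solution set for online assignment.
    for seed in range(3):
        text, answer = generate_problem(seed)
        print(text, answer)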
DSSTOX (DISTRIBUTED STRUCTURE-SEARCHABLE ...
Distributed Structure-Searchable Toxicity (DSSTox) Database Network. Major trends affecting public toxicity information resources have the potential to significantly alter the future of predictive toxicology. Chemical toxicity screening is undergoing shifts towards greater use of more fundamental information on gene/protein expression patterns and bioactivity and bioassay profiles, the latter generated with high-throughput screening technologies. Curated, systematically organized, and web-accessible toxicity and biological activity data in association with chemical structures, enabling the integration of diverse data information domains, will fuel the next frontier of advancement for QSAR (quantitative structure-activity relationship) and data mining technologies. The DSSTox project is supporting progress towards these goals on many fronts: promoting the use of formalized and structure-annotated toxicity data models, helping to interface these efforts with QSAR modelers, linking data from diverse sources, and creating a large, quality-reviewed, central chemical structure information resource linked to various toxicity data sources.
Metzger, Gregory J; Kalavagunta, Chaitanya; Spilseth, Benjamin; Bolan, Patrick J; Li, Xiufeng; Hutter, Diane; Nam, Jung W; Johnson, Andrew D; Henriksen, Jonathan C; Moench, Laura; Konety, Badrinath; Warlick, Christopher A; Schmechel, Stephen C; Koopmeiners, Joseph S
2016-06-01
Purpose: To develop multiparametric magnetic resonance (MR) imaging models to generate a quantitative, user-independent, voxel-wise composite biomarker score (CBS) for detection of prostate cancer by using coregistered correlative histopathologic results, and to compare performance of CBS-based detection with that of single quantitative MR imaging parameters. Materials and Methods: Institutional review board approval and informed consent were obtained. Patients with a diagnosis of prostate cancer underwent multiparametric MR imaging before surgery for treatment. All MR imaging voxels in the prostate were classified as cancer or noncancer on the basis of coregistered histopathologic data. Predictive models were developed by using more than one quantitative MR imaging parameter to generate CBS maps. Model development and evaluation of quantitative MR imaging parameters and CBS were performed separately for the peripheral zone and the whole gland. Model accuracy was evaluated by using the area under the receiver operating characteristic curve (AUC), and confidence intervals were calculated with the bootstrap procedure. The improvement in classification accuracy was evaluated by comparing the AUC for the multiparametric model and the single best-performing quantitative MR imaging parameter at the individual level and in aggregate. Results: Quantitative T2, apparent diffusion coefficient (ADC), volume transfer constant (Ktrans), reflux rate constant (kep), and area under the gadolinium concentration curve at 90 seconds (AUGC90) were significantly different between cancer and noncancer voxels (P < .001), with ADC showing the best accuracy (peripheral zone AUC, 0.82; whole gland AUC, 0.74). Four-parameter models demonstrated the best performance in both the peripheral zone (AUC, 0.85; P = .010 vs ADC alone) and whole gland (AUC, 0.77; P = .043 vs ADC alone). Individual-level analysis showed statistically significant improvement in AUC in 82% (23 of 28) and 71% (24 of 34) of patients with peripheral-zone and whole-gland models, respectively, compared with ADC alone. Model-based CBS maps for cancer detection showed improved visualization of cancer location and extent. Conclusion: Quantitative multiparametric MR imaging models developed by using coregistered correlative histopathologic data yielded a voxel-wise CBS that outperformed single quantitative MR imaging parameters for detection of prostate cancer, especially when the models were assessed at the individual level. © RSNA, 2016. Online supplemental material is available for this article.
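The model comparison above rests on AUC estimates with bootstrap confidence intervals. As a hedged sketch of that evaluation step (not the authors' code), the following computes an AUC via the Mann-Whitney statistic and a percentile-bootstrap interval; `scores` and `labels` are hypothetical stand-ins for voxel-wise CBS values and histopathologic cancer labels.

    import numpy as np

    def auc(scores, labels):
        """AUC via the Mann-Whitney U statistic (assumes continuous scores, no ties)."""
        order = np.argsort(scores)
        ranks = np.empty(len(scores))
        ranks[order] = np.arange(1, len(scores) + 1)
        pos = labels == 1
        n_pos, n_neg = pos.sum(), (~pos).sum()
        return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    def bootstrap_auc_ci(scores, labels, n_boot=2000, alpha=0.05, seed=0):
        """Percentile bootstrap confidence interval for the AUC."""
        rng = np.random.default_rng(seed)
        stats, n = [], len(labels)
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)              # resample voxels with replacement
            if labels[idx].min() == labels[idx].max():
                continue                             # skip degenerate resamples
            stats.append(auc(scores[idx], labels[idx]))
        lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        return lo, hi

    # Synthetic demonstration data, not study data.
    rng = np.random.default_rng(1)
    labels = rng.integers(0, 2, 500)
    scores = labels * 0.8 + rng.normal(0, 0.5, 500)
    print(auc(scores, labels), bootstrap_auc_ci(scores, labels))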
Quantitative evaluation of low-cost frame-grabber boards for personal computers.
Kofler, J M; Gray, J E; Fuelberth, J T; Taubel, J P
1995-11-01
Nine moderately priced frame-grabber boards for both Macintosh (Apple Computers, Cupertino, CA) and IBM-compatible computers were evaluated using a Society of Motion Picture and Television Engineers (SMPTE) pattern and a video signal generator for dynamic range, gray-scale reproducibility, and spatial integrity of the captured image. The degradation of the video information ranged from minor to severe. Some boards are of reasonable quality for applications in diagnostic imaging and education. However, price and quality are not necessarily directly related.
Modeling the filament winding process
NASA Technical Reports Server (NTRS)
Calius, E. P.; Springer, G. S.
1985-01-01
A model is presented which can be used to determine the appropriate values of the process variables for filament winding a cylinder. The model provides the cylinder temperature, viscosity, degree of cure, fiber position and fiber tension as functions of position and time during the filament winding and subsequent cure, and the residual stresses and strains within the cylinder during and after the cure. A computer code was developed to obtain quantitative results. Sample results are given which illustrate the information that can be generated with this code.
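The abstract notes that a computer code generates quantitative results from the model. As a rough illustration of one ingredient of such a code, the sketch below integrates a generic nth-order Arrhenius cure-kinetics model; the model form and rate constants are assumptions for illustration, not the Calius-Springer formulation.

    import math

    def simulate_cure(T_profile, dt=1.0, A=1.5e5, E=60e3, n=1.5):
        """Integrate a generic cure model da/dt = A*exp(-E/RT)*(1-a)^n.

        T_profile gives temperature (K) at each time step of width dt (s).
        The kinetic constants here are placeholders, not published values.
        """
        R = 8.314
        alpha, history = 0.0, []
        for T in T_profile:
            rate = A * math.exp(-E / (R * T)) * (1.0 - alpha) ** n
            alpha = min(1.0, alpha + rate * dt)   # forward-Euler step
            history.append(alpha)
        return history

    # Example: hold the cylinder at 400 K for one hour, sampled each second.
    cure = simulate_cure([400.0] * 3600)
    print(f"degree of cure after 1 h: {cure[-1]:.3f}")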
17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 17 Commodity and Securities Exchanges 2 2011-04-01 2011-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...
NASA Astrophysics Data System (ADS)
Birch, A. L.; Stallard, R. F.; Barnard, H. R.
2017-12-01
While relationships between land use/land cover and hydrology are well studied and understood in temperate parts of the world, little research exists in the humid tropics, where hydrologic research often lags decades behind. Specifically, quantitative information on how physical and biological differences across varying land covers influence runoff generation and hydrologic flowpaths in the humid tropics is scarce, frequently leading to poorly informed hydrologic modelling and water policy decision making. This research effort seeks to quantify how tropical land cover change may alter physical hydrologic processes in the economically important Panama Canal Watershed (Republic of Panama) by separating streamflow into its different runoff components using end-member mixing analysis. The samples collected for this project come from small headwater catchments of four varying land covers (mature tropical forest, young secondary forest, active pasture, recently clear-cut tropical forest) within the Smithsonian Tropical Research Institute's Agua Salud Project. During the past three years, samples have been collected at the four study catchments from streamflow and from a number of water sources within hillslope transects, and have been analyzed for stable water isotopes, major cations, and major anions. Major ion analysis of these samples has shown distinct geochemical differences among the potential runoff-generating end members sampled (soil moisture/preferential flow, groundwater, overland flow, throughfall, and precipitation). Based on this finding, an effort was made from May-August 2017 to intensively sample streamflow during wet season storm events, yielding a total of 5 events of varying intensity in each land cover/catchment, with sampling intensity ranging from sub-hourly to sub-daily. This poster presents the results of hydrograph separations performed on the May-August 2017 storm dataset using end-member mixing analysis. The results are expected to improve quantitative understanding of how land cover influences physical hydrologic flowpaths and runoff generation in the humid tropics.
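End-member mixing analysis of this kind reduces, for each stream sample, to a small over-determined linear system: tracer concentrations in the stream are modeled as a mixture of end-member concentrations whose fractions sum to one. The sketch below solves that system by least squares; all end members, tracer values, and names are invented for illustration, not data from the Agua Salud catchments.

    import numpy as np

    # Columns: hypothetical end members (groundwater, soil water, overland flow);
    # rows: tracers (e.g., Na+, Cl-, delta-18O). Values are illustrative only.
    E = np.array([[1.20, 0.40, 0.05],
                  [2.10, 0.80, 0.10],
                  [-5.0, -7.5, -9.0]])
    stream = np.array([0.90, 1.55, -6.2])   # tracer concentrations in a storm sample

    # Append the mass-balance constraint sum(f) = 1 as an extra equation,
    # then solve for the runoff fractions f in a least-squares sense.
    A = np.vstack([E, np.ones(E.shape[1])])
    b = np.append(stream, 1.0)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    # In practice each fraction is checked for physical plausibility (0 to 1).
    print({"groundwater": f[0], "soil water": f[1], "overland flow": f[2]})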
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-08
... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. The FHWA received no comments in...
The effect of hydrodynamic conditions on the phenotype of Pseudomonas fluorescens biofilms.
Simões, Manuel; Pereira, Maria O; Sillankorva, Sanna; Azeredo, Joana; Vieira, Maria J
2007-01-01
This study investigated the phenotypic characteristics of monoculture P. fluorescens biofilms grown under turbulent and laminar flow, using flow cell reactors with stainless steel substrata. The cellular physiology and the overall biofilm activity, structure and composition were characterized, and compared, within hydrodynamically distinct conditions. The results indicate that turbulent flow-generated biofilm cells were significantly less extensive, with decreased metabolic activity and a lower protein and polysaccharide composition per cell than those from laminar flow-generated biofilms. The effect of flow regime did not cause significantly different outer membrane protein expression. From the analysis of biofilm activity, structure and composition, turbulent flow-generated biofilms were metabolically more active, had twice the mass per cm², and higher cellular density and protein content (mainly cellular) than laminar flow-generated biofilms. Conversely, laminar flow-generated biofilms presented higher total and matrix polysaccharide contents. Direct visualisation and scanning electron microscopy analysis showed that these different flows generate structurally different biofilms, corroborating the quantitative results. The combination of applied methods provided useful information regarding a broad spectrum of biofilm parameters, which can contribute to control and model biofilm processes.
NASA Astrophysics Data System (ADS)
Barrineau, C. P.; Dobreva, I. D.; Bishop, M. P.; Houser, C.
2014-12-01
Aeolian systems are ideal natural laboratories for examining self-organization in patterned landscapes, as certain wind regimes generate certain morphologies. Topographic information and scale-dependent analysis offer the opportunity to study such systems and characterize process-form relationships. A statistically based methodology for differentiating aeolian features would enable the quantitative association of certain surface characteristics with certain morphodynamic regimes. We conducted a multi-resolution analysis of LiDAR elevation data to assess scale-dependent morphometric variations in an aeolian landscape in South Texas. For each pixel, mean elevation values are calculated along concentric circles moving outward at 100-meter intervals (e.g., 500 m, 600 m, 700 m from the pixel). The average elevation values, plotted as curves against distance from the pixel of interest, are used to differentiate multi-scalar variations in elevation across the landscape. We hypothesize that these curves can quantitatively differentiate certain morphometries from others, much as a spectral signature may be used to classify paved surfaces apart from natural vegetation. After generating multi-resolution curves for all the pixels in a selected area of interest (AOI), a Principal Components Analysis is used to highlight commonalities and singularities among the curves generated across the AOI. Our findings suggest that the resulting components could be used for identification of discrete aeolian features like open sands, trailing ridges and active dune crests, and, in particular, zones of deflation. This new approach to landscape characterization not only mitigates the bias introduced when researchers must select training pixels for morphometric investigations, but can also reveal patterning in aeolian landscapes that would not be as obvious without quantitative characterization.
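The per-pixel curve construction and the PCA step lend themselves to a compact sketch. Below, ring-averaged elevations are computed by brute force on a small synthetic grid and the resulting curves are decomposed with an SVD-based PCA; the radii, grid size, and data are illustrative assumptions (the published analysis used 100-m intervals on LiDAR tiles and would need a faster implementation).

    import numpy as np

    def ring_curves(dem, radii_px):
        """For each pixel, mean elevation on concentric rings of given radii (pixels)."""
        rows, cols = dem.shape
        yy, xx = np.mgrid[0:rows, 0:cols]
        curves = np.zeros((rows, cols, len(radii_px)))
        for k, r in enumerate(radii_px):
            for i in range(rows):
                for j in range(cols):
                    dist = np.hypot(yy - i, xx - j)
                    ring = (dist >= r - 0.5) & (dist < r + 0.5)
                    curves[i, j, k] = dem[ring].mean() if ring.any() else dem[i, j]
        return curves

    dem = np.random.rand(40, 40)                # stand-in for a LiDAR DEM tile
    curves = ring_curves(dem, radii_px=[5, 10, 15])
    X = curves.reshape(-1, curves.shape[-1])
    X -= X.mean(axis=0)
    # PCA via SVD: principal components of the per-pixel elevation curves.
    _, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt.T                           # component scores per pixel
    print(scores.shape, s[:3])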
Gray correlation analysis and prediction models of living refuse generation in Shanghai city.
Liu, Gousheng; Yu, Jianguo
2007-01-01
A better understanding of the factors that affect the generation of municipal living refuse (MLF) and the accurate prediction of its generation are crucial for municipal planning projects and city management. Up to now, most design efforts have been based on a rough prediction of MLF without any actual support. In this paper, based on published data of socioeconomic variables and MLF generation from 1990 to 2003 in the city of Shanghai, the main factors that affect MLF generation have been quantitatively studied using the gray correlation coefficient method. Several gray models, such as GM(1,1), GIM(1), GPPM(1) and GLPM(1), have been studied, and the predicted results were verified with subsequent residual tests. Results show that, among the seven selected factors, consumption of gas, water and electricity are the three largest factors affecting MLF generation, and GLPM(1) is the optimal model for predicting MLF generation. Through this model, the predicted MLF generation in 2010 in Shanghai will be 7.65 million tons. The methods and results developed in this paper can provide valuable information for MLF management and related municipal planning projects.
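Of the gray models compared above, GM(1,1) is the base formulation and is compact enough to sketch: accumulate the raw series, estimate the development coefficient a and gray input b by least squares, then difference the fitted accumulated series back to the original scale. The series values below are invented for illustration, not the Shanghai data.

    import numpy as np

    def gm11(x0, n_ahead=1):
        """Fit a GM(1,1) gray model to series x0 and forecast n_ahead steps."""
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)                              # accumulated series
        z1 = 0.5 * (x1[1:] + x1[:-1])                   # mean-generated sequence
        B = np.column_stack([-z1, np.ones(len(z1))])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        k = np.arange(len(x0) + n_ahead)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
        return x0_hat                                    # fitted values plus forecast

    # Illustrative annual refuse series (million tons), not the published data.
    series = [5.1, 5.4, 5.8, 6.1, 6.5, 6.8]
    print(gm11(series, n_ahead=2)[-2:])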
Ready for a paradigm shift? Part 1: introducing the philosophy of qualitative research.
Petty, Nicola J; Thomson, Oliver P; Stew, Graham
2012-08-01
The manual therapy professions have almost exclusively focused on the use of quantitative research to help inform their practices. This paper argues that a greater use of qualitative research will help develop a more robust and comprehensive knowledge base in manual therapy. The types of knowledge used in practice and generated from the two research paradigms are explored. It is hoped that an understanding of the philosophical and theoretical underpinnings of qualitative research may encourage more manual therapists to value and use this approach to help further inform their practice; for some, this may involve a paradigm shift in thinking. Copyright © 2012 Elsevier Ltd. All rights reserved.
21 CFR 101.62 - Nutrient content claims for fat, fatty acid, and cholesterol content of foods.
Code of Federal Regulations, 2011 CFR
2011-04-01
... fat—50 percent less fat than our regular brownies”); and (B) Quantitative information comparing the... quantitative information may be located elsewhere on the information panel in accordance with § 101.2. (iii... percent less fat per 3 oz than our regular spinach souffle”); and (B) Quantitative information comparing...
Neuroergonomics: Quantitative Modeling of Individual, Shared, and Team Neurodynamic Information.
Stevens, Ronald H; Galloway, Trysha L; Willemsen-Dunlap, Ann
2018-06-01
The aim of this study was to use the same quantitative measure and scale to directly compare the neurodynamic information/organizations of individual team members with those of the team. Team processes are difficult to separate from those of individual team members due to the lack of quantitative measures that can be applied to both process sets. Second-by-second symbolic representations were created of each team member's electroencephalographic power, and quantitative estimates of their neurodynamic organizations were calculated from the Shannon entropy of the symbolic data streams. The information in the neurodynamic data streams of health care (n = 24), submarine navigation (n = 12), and high school problem-solving (n = 13) dyads was separated into the information of each team member, the information shared by team members, and the overall team information. Most of the team information was the sum of each individual's neurodynamic information. The remaining team information was shared among the team members. This shared information averaged ~15% of the individual information, with momentary levels of 1% to 80%. Continuous quantitative estimates can be made from the shared, individual, and team neurodynamic information about the contributions of different team members to the overall neurodynamic organization of a team and the neurodynamic interdependencies among the team members. Information models provide a generalizable quantitative method for separating a team's neurodynamic organization into that of individual team members and that shared among team members.
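The quantities involved are standard Shannon measures over symbolized data streams. As a hedged sketch (not the authors' code), the snippet below computes the entropy of two symbolic streams and the information they share, using mutual information as the shared term; the streams are synthetic stand-ins for symbolized EEG power.

    import numpy as np
    from collections import Counter

    def entropy(symbols):
        """Shannon entropy (bits) of a symbolic data stream."""
        counts = np.array(list(Counter(symbols).values()), dtype=float)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def mutual_information(a, b):
        """I(A;B) = H(A) + H(B) - H(A,B): information shared by two streams."""
        joint = list(zip(a, b))
        return entropy(a) + entropy(b) - entropy(joint)

    rng = np.random.default_rng(1)
    member_a = rng.integers(0, 4, 500)                   # symbolic EEG-power stream
    member_b = (member_a + rng.integers(0, 2, 500)) % 4  # partially coupled stream
    print("H(A):", entropy(member_a), "H(B):", entropy(member_b))
    print("shared:", mutual_information(member_a, member_b))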
Improving the imaging of calcifications in CT by histogram-based selective deblurring
NASA Astrophysics Data System (ADS)
Rollano-Hijarrubia, Empar; van der Meer, Frits; van der Lugt, Add; Weinans, Harrie; Vrooman, Henry; Vossepoel, Albert; Stokking, Rik
2005-04-01
Imaging of small high-density structures, such as calcifications, with computed tomography (CT) is limited by the spatial resolution of the system. Blur causes small calcifications to be imaged with lower contrast and overestimated volume, thereby hampering the analysis of vessels. The aim of this work is to reduce the blur of calcifications by applying three-dimensional (3D) deconvolution. Unfortunately, the high-frequency amplification of the deconvolution produces edge-related ring artifacts and enhances noise and original artifacts, which degrades the imaging of low-density structures. A method, referred to as Histogram-based Selective Deblurring (HiSD), was implemented to avoid these negative effects. HiSD uses the histogram information to generate a restored image in which the low-intensity voxel information of the observed image is combined with the high-intensity voxel information of the deconvolved image. To evaluate HiSD we scanned four in-vitro atherosclerotic plaques of carotid arteries with a multislice spiral CT and with a microfocus CT (μCT), used as reference. Restored images were generated from the observed images, and qualitatively and quantitatively compared with their corresponding μCT images. Transverse views and maximum-intensity projections of restored images show the decrease of blur of the calcifications in 3D. Measurements of the areas of 27 calcifications and total volumes of calcification of 4 plaques show that the overestimation of calcification was smaller for restored images (mean-error: 90% for area; 92% for volume) than for observed images (143%; 213%, respectively). The qualitative and quantitative analyses show that the imaging of calcifications in CT can be improved considerably by applying HiSD.
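The key step of HiSD, as described, is a histogram-guided combination: low-intensity voxels keep their observed values while high-intensity voxels take the deconvolved values. A minimal sketch of that selection step follows; the threshold, volumes, and the stand-in for the 3D deconvolution are all illustrative assumptions.

    import numpy as np

    def hisd_combine(observed, deconvolved, threshold_hu):
        """Histogram-based selective deblurring, simplified.

        Keep the observed image where voxels are low-intensity (soft tissue,
        where deconvolution ringing and noise amplification would dominate)
        and substitute deconvolved voxels only above a histogram-derived
        calcification threshold. The threshold choice here is illustrative.
        """
        mask = observed >= threshold_hu
        return np.where(mask, deconvolved, observed)

    # Hypothetical CT volumes in Hounsfield units.
    rng = np.random.default_rng(0)
    observed = rng.normal(40, 20, (8, 64, 64))
    observed[:, 30:34, 30:34] = 600                 # a blurred calcification
    deconvolved = observed * 1.2                    # stand-in for the 3D deconvolution
    restored = hisd_combine(observed, deconvolved, threshold_hu=300)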
NASA Technical Reports Server (NTRS)
D'Amelio, F.; Wu, L. C.; Fox, R. A.; Daunton, N. G.; Corcoran, M. L.; Polyakov, I.
1998-01-01
Quantitative evaluation of gamma-aminobutyric acid immunoreactivity (GABA-IR) in the hindlimb representation of the rat somatosensory cortex after 14 days of exposure to hypergravity (hyper-G) was conducted by using computer-assisted image processing. The area of GABA-IR axosomatic terminals apposed to pyramidal cells of cortical layer V was reduced in rats exposed to hyper-G compared with control rats, which were exposed either to rotation alone or to vivarium conditions. Based on previous immunocytochemical and behavioral studies, we suggest that this reduction is due to changes in sensory feedback information from muscle receptors. Consequently, priorities for muscle recruitment are altered at the cortical level, and a new pattern of muscle activity is thus generated. It is proposed that the reduction observed in GABA-IR of the terminal area around pyramidal neurons is the immunocytochemical expression of changes in the activity of GABAergic cells that participate in reprogramming motor outputs to achieve effective movement control in response to alterations in the afferent information.
Quantitative Aspects of Single Molecule Microscopy
Ober, Raimund J.; Tahmasbi, Amir; Ram, Sripad; Lin, Zhiping; Ward, E. Sally
2015-01-01
Single molecule microscopy is a relatively new optical microscopy technique that allows the detection of individual molecules such as proteins in a cellular context. This technique has generated significant interest among biologists, biophysicists and biochemists, as it holds the promise to provide novel insights into subcellular processes and structures that otherwise cannot be gained through traditional experimental approaches. Single molecule experiments place stringent demands on experimental and algorithmic tools due to the low signal levels and the presence of significant extraneous noise sources. Consequently, this has necessitated the use of advanced statistical signal and image processing techniques for the design and analysis of single molecule experiments. In this tutorial paper, we provide an overview of single molecule microscopy from early works to current applications and challenges. Specific emphasis will be on the quantitative aspects of this imaging modality, in particular single molecule localization and resolvability, which will be discussed from an information theoretic perspective. We review the stochastic framework for image formation, different types of estimation techniques and expressions for the Fisher information matrix. We also discuss several open problems in the field that demand highly non-trivial signal processing algorithms. PMID:26167102
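For readers unfamiliar with the information-theoretic treatment of localization, the classical benchmark result (a textbook approximation, not a formula quoted from this tutorial) bounds the accuracy of estimating an emitter's position from N detected photons with a Gaussian-approximated point spread function:

    % Cramer-Rao style lower bound on localization accuracy
    \delta_x \;\ge\; \frac{1}{\sqrt{I(x)}} \;\approx\; \frac{\sigma_{\mathrm{PSF}}}{\sqrt{N}}

Here I(x) is the Fisher information for the position parameter; the Fisher information matrix expressions reviewed in the paper generalize this to pixelated detectors, extraneous noise sources, and multiple parameters.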
Quantitative Evaluation Method of Each Generation Margin for Power System Planning
NASA Astrophysics Data System (ADS)
Su, Su; Tanaka, Kazuyuki
As power system deregulation advances, competition among power companies intensifies, and they seek more efficient system planning using existing facilities; an efficient planning method is therefore needed. This paper proposes a quantitative evaluation method for the (N-1) generation margin considering overload and voltage stability restrictions. For the overload-related generation margin, a fast solution method that avoids recalculating the (N-1) Y-matrix is proposed. For voltage stability, an efficient method to search for the stability limit is proposed. The IEEE30 model system, which is composed of 6 generators and 14 load nodes, is employed to validate the proposed method. According to the results, the proposed method can reduce the computational cost for the overload-related generation margin under the (N-1) condition and can specify its value quantitatively.
NASA Astrophysics Data System (ADS)
Prat, O. P.; Nelson, B. R.; Stevens, S. E.; Nickl, E.; Seo, D. J.; Kim, B.; Zhang, J.; Qi, Y.
2015-12-01
The processing of radar-only precipitation via the reanalysis from the National Mosaic and Multi-Sensor Quantitative (NMQ/Q2) based on the WSR-88D Next-generation Radar (Nexrad) network over the Continental United States (CONUS) is completed for the period covering from 2002 to 2011. While this constitutes a unique opportunity to study precipitation processes at higher resolution than conventionally possible (1-km, 5-min), the long-term radar-only product needs to be merged with in-situ information in order to be suitable for hydrological, meteorological and climatological applications. The radar-gauge merging is performed by using rain gauge information at daily (Global Historical Climatology Network-Daily: GHCN-D), hourly (Hydrometeorological Automated Data System: HADS), and 5-min (Automated Surface Observing Systems: ASOS; Climate Reference Network: CRN) resolution. The challenges related to incorporating differing resolution and quality networks to generate long-term large-scale gridded estimates of precipitation are enormous. In that perspective, we are implementing techniques for merging the rain gauge datasets and the radar-only estimates such as Inverse Distance Weighting (IDW), Simple Kriging (SK), Ordinary Kriging (OK), and Conditional Bias-Penalized Kriging (CBPK). An evaluation of the different radar-gauge merging techniques is presented and we provide an estimate of uncertainty for the gridded estimates. In addition, comparisons with a suite of lower resolution QPEs derived from ground based radar measurements (Stage IV) are provided in order to give a detailed picture of the improvements and remaining challenges.
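Among the merging techniques listed, inverse distance weighting is the simplest and is sketched below: multiplicative gauge/radar biases observed at gauge locations are spread to the radar grid by distance-weighted averaging. Coordinates, biases, and grid sizes are invented for illustration; the kriging variants (SK, OK, CBPK) would replace the interpolation function.

    import numpy as np

    def idw_bias_field(gauge_xy, bias, grid_x, grid_y, power=2.0):
        """Interpolate gauge-radar multiplicative biases to a grid by IDW."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        field = np.zeros_like(gx, dtype=float)
        weights_sum = np.zeros_like(gx, dtype=float)
        for (x, y), b in zip(gauge_xy, bias):
            d = np.hypot(gx - x, gy - y)
            w = 1.0 / np.maximum(d, 1e-6) ** power   # guard against zero distance
            field += w * b
            weights_sum += w
        return field / weights_sum

    gauges = [(10.0, 12.0), (40.0, 8.0), (25.0, 30.0)]
    bias = [1.10, 0.85, 0.95]                  # gauge/radar ratios at gauge sites
    bias_grid = idw_bias_field(gauges, bias, np.arange(50), np.arange(40))
    # radar_qpe_merged = radar_qpe * bias_grid   # apply to the radar-only field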
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... and opinions, but are not statistical surveys that yield quantitative results that can be generalized... generic clearance for qualitative information will not be used for quantitative information collections... for submission for other generic mechanisms that are designed to yield quantitative results. The...
NASA Technical Reports Server (NTRS)
Backus, George E.
1999-01-01
The purpose of the grant was to study how prior information about the geomagnetic field can be used to interpret surface and satellite magnetic measurements, to generate quantitative descriptions of prior information that might be so used, and to use this prior information to obtain from satellite data a model of the core field with statistically justifiable error estimates. The need for prior information in geophysical inversion has long been recognized. Data sets are finite, and faithful descriptions of aspects of the earth almost always require infinite-dimensional model spaces. By themselves, the data can confine the correct earth model only to an infinite-dimensional subset of the model space. Earth properties other than direct functions of the observed data cannot be estimated from those data without prior information about the earth. Prior information is based on what the observer already knows before the data become available. Such information can be "hard" or "soft". Hard information is a belief that the real earth must lie in some known region of model space. For example, the total ohmic dissipation in the core is probably less that the total observed geothermal heat flow out of the earth's surface. (In principle, ohmic heat in the core can be recaptured to help drive the dynamo, but this effect is probably small.) "Soft" information is a probability distribution on the model space, a distribution that the observer accepts as a quantitative description of her/his beliefs about the earth. The probability distribution can be a subjective prior in the sense of Bayes or the objective result of a statistical study of previous data or relevant theories.
Is this the real time for genomics?
Guarnaccia, Maria; Gentile, Giulia; Alessi, Enrico; Schneider, Claudio; Petralia, Salvatore; Cavallaro, Sebastiano
2014-01-01
In the last decades, molecular biology has moved from gene-by-gene analysis to more complex studies using a genome-wide scale. Thanks to high-throughput genomic technologies, such as microarrays and next-generation sequencing, a huge amount of information has been generated, expanding our knowledge on the genetic basis of various diseases. Although some of this information could be transferred to clinical diagnostics, the technologies available are not suitable for this purpose. In this review, we will discuss the drawbacks associated with the use of traditional DNA microarrays in diagnostics, pointing out emerging platforms that could overcome these obstacles and offer a more reproducible, qualitative and quantitative multigenic analysis. New miniaturized and automated devices, called Lab-on-Chip, begin to integrate PCR and microarray on the same platform, offering integrated sample-to-result systems. The introduction of this kind of innovative devices may facilitate the transition of genome-based tests into clinical routine. Copyright © 2014. Published by Elsevier Inc.
Generating Text from Functional Brain Images
Pereira, Francisco; Detre, Greg; Botvinick, Matthew
2011-01-01
Recent work has shown that it is possible to take brain images acquired during viewing of a scene and reconstruct an approximation of the scene from those images. Here we show that it is also possible to generate text about the mental content reflected in brain images. We began with images collected as participants read names of concrete items (e.g., “Apartment’’) while also seeing line drawings of the item named. We built a model of the mental semantic representation of concrete concepts from text data and learned to map aspects of such representation to patterns of activation in the corresponding brain image. In order to validate this mapping, without accessing information about the items viewed for left-out individual brain images, we were able to generate from each one a collection of semantically pertinent words (e.g., “door,” “window” for “Apartment’’). Furthermore, we show that the ability to generate such words allows us to perform a classification task and thus validate our method quantitatively. PMID:21927602
Inferring Biological Structures from Super-Resolution Single Molecule Images Using Generative Models
Maji, Suvrajit; Bruchez, Marcel P.
2012-01-01
Localization-based super resolution imaging is presently limited by sampling requirements for dynamic measurements of biological structures. Generating an image requires serial acquisition of individual molecular positions at sufficient density to define a biological structure, increasing the acquisition time. Efficient analysis of biological structures from sparse localization data could substantially improve the dynamic imaging capabilities of these methods. Using a feature extraction technique called the Hough Transform simple biological structures are identified from both simulated and real localization data. We demonstrate that these generative models can efficiently infer biological structures in the data from far fewer localizations than are required for complete spatial sampling. Analysis at partial data densities revealed efficient recovery of clathrin vesicle size distributions and microtubule orientation angles with as little as 10% of the localization data. This approach significantly increases the temporal resolution for dynamic imaging and provides quantitatively useful biological information. PMID:22629348
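The Hough transform step described above can be sketched for the fixed-radius circle case: every localization votes for all candidate circle centers at the chosen radius, and accumulator peaks mark vesicle-like structures even at low sampling density. The grid size, radius, and simulated data below are illustrative assumptions.

    import numpy as np

    def circle_hough(points, radius, grid_size=128, extent=1.0, n_theta=90):
        """Fixed-radius circle Hough transform over sparse localizations."""
        acc = np.zeros((grid_size, grid_size))
        thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
        for x, y in points:
            # Candidate centers lie on a circle of the same radius around the point.
            cx = x - radius * np.cos(thetas)
            cy = y - radius * np.sin(thetas)
            ix = np.clip((cx / extent * grid_size).astype(int), 0, grid_size - 1)
            iy = np.clip((cy / extent * grid_size).astype(int), 0, grid_size - 1)
            np.add.at(acc, (iy, ix), 1)
        return acc

    # Simulate 20 localizations (sparse sampling of a ring) around a hidden center.
    rng = np.random.default_rng(3)
    angles = rng.uniform(0, 2 * np.pi, 20)
    pts = np.c_[0.5 + 0.08 * np.cos(angles), 0.5 + 0.08 * np.sin(angles)]
    acc = circle_hough(pts, radius=0.08)
    peak = np.unravel_index(acc.argmax(), acc.shape)
    print("recovered center (grid cells):", peak)   # expected near (64, 64)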
Ivanciuc, O; Ivanciuc, T; Klein, D J; Seitz, W A; Balaban, A T
2001-02-01
Quantitative structure-retention relationships (QSRR) represent statistical models that quantify the connection between the molecular structure and the chromatographic retention indices of organic compounds, allowing the prediction of retention indices of novel, not yet synthesized compounds, solely from their structural descriptors. Using multiple linear regression, QSRR models for the gas chromatographic Kováts retention indices of 129 alkylbenzenes are generated using molecular graph descriptors. The correlational ability of structural descriptors computed from 10 molecular matrices is investigated, showing that the novel reciprocal matrices give numerical indices with improved correlational ability. A QSRR equation with 5 graph descriptors gives the best calibration and prediction results, demonstrating the usefulness of the molecular graph descriptors in modeling chromatographic retention parameters. The sequential orthogonalization of descriptors suggests simpler QSRR models by eliminating redundant structural information.
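The modeling step named here, multiple linear regression of retention indices on a handful of graph descriptors, can be sketched compactly. The descriptor matrix and retention indices below are synthetic stand-ins (the paper's 129 alkylbenzenes and matrix-derived descriptors are not reproduced), so only the fitting and calibration-r² machinery is meaningful.

    import numpy as np

    # Hypothetical descriptor matrix (rows: compounds, cols: 5 graph descriptors)
    # and Kovats retention indices; values are illustrative only.
    rng = np.random.default_rng(7)
    X = rng.normal(size=(129, 5))
    true_coefs = np.array([120.0, -35.0, 60.0, 15.0, -8.0])
    y = 900 + X @ true_coefs + rng.normal(0, 10, 129)   # synthetic retention indices

    # Ordinary least squares with an intercept, as in a 5-descriptor QSRR model.
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    y_hat = A @ beta
    r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    print("calibration r^2:", round(r2, 3))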
Systems Biology in Immunology – A Computational Modeling Perspective
Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra; Fraser, Iain D. C.
2011-01-01
Systems biology is an emerging discipline that combines high-content, multiplexed measurements with informatic and computational modeling methods to better understand biological function at various scales. Here we present a detailed review of the methods used to create computational models and conduct simulations of immune function. We provide descriptions of the key data-gathering techniques employed to generate the quantitative and qualitative data required for such modeling and simulation and summarize the progress to date in applying these tools and techniques to questions of immunological interest, including infectious disease. We include comments on what insights modeling can provide that complement information obtained from the more familiar experimental discovery methods used by most investigators and why quantitative methods are needed to eventually produce a better understanding of immune system operation in health and disease. PMID:21219182
Weisner, Thomas S; Fiese, Barbara H
2011-12-01
Mixed methods in family psychology refer to the systematic integration of qualitative and quantitative techniques to represent family processes and settings. Over the past decade, significant advances have been made in study design, analytic strategies, and technological support (such as software) that allow for the integration of quantitative and qualitative methods and for making appropriate inferences from mixed methods. This special section of the Journal of Family Psychology illustrates how mixed methods may be used to advance knowledge in family science through identifying important cultural differences in family structure, beliefs, and practices, and revealing patterns of family relationships to generate new measurement paradigms and inform clinical practice. Guidance is offered to advance mixed methods research in family psychology through sound principles of peer review.
Aguillo, I
2000-01-01
Although the Internet is already a valuable information resource in medicine, there are important challenges to be faced before physicians and general users will have extensive access to this information. As a result of a research effort to compile a health-related Internet directory, new tools and strategies have been developed to solve key problems derived from the explosive growth of medical information on the Net and the great concern over the quality of such critical information. The current Internet search engines lack some important capabilities. We suggest using second generation tools (client-side based) able to deal with large quantities of data and to increase the usability of the records recovered. We tested the capabilities of these programs to solve health-related information problems, recognising six groups according to the kind of topics addressed: Z39.50 clients, downloaders, multisearchers, tracing agents, indexers and mappers. The evaluation of the quality of health information available on the Internet could require a large amount of human effort. A possible solution may be to use quantitative indicators based on the hypertext visibility of the Web sites. The cybermetric measures are valid for quality evaluation if they are derived from indirect peer review by experts with Web pages citing the site. The hypertext links acting as citations need to be extracted from a controlled sample of quality super-sites.
40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.
Code of Federal Regulations, 2013 CFR
2013-07-01
... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Solubilities and Octanol-Water Partition Coefficients of Hydrophobic Substances,” Journal of Research of the...
Badawi, A M; Derbala, A S; Youssef, A M
1999-08-01
Computerized ultrasound tissue characterization has become an objective means for the diagnosis of liver diseases. It is difficult to differentiate diffuse liver diseases, namely cirrhotic and fatty liver, by visual inspection of the ultrasound images. The visual criteria for differentiating diffuse diseases are rather confusing and highly dependent upon the sonographer's experience. This often introduces bias into the diagnostic procedure and limits its objectivity and reproducibility. Computerized tissue characterization that quantitatively assists the sonographer in accurate differentiation and minimizes the degree of risk is thus justified. Fuzzy logic has emerged as one of the most active areas in classification. In this paper, we present an approach that employs fuzzy reasoning techniques to automatically differentiate diffuse liver diseases using numerical quantitative features measured from the ultrasound images. Fuzzy rules were generated from over 140 cases consisting of normal, fatty, and cirrhotic livers. The input to the fuzzy system is an eight-dimensional vector of feature values: the mean gray level (MGL), the 10th percentile, the contrast (CON), the angular second moment (ASM), the entropy (ENT), the correlation (COR), the attenuation (ATTEN) and the speckle separation. The output of the fuzzy system is one of three categories: cirrhotic, fatty or normal. The steps for differentiating the pathologies are data acquisition and feature extraction, followed by division of the input spaces of the measured quantitative data into fuzzy sets. Based on expert knowledge, the fuzzy rules are generated and applied using fuzzy inference procedures to determine the pathology. Different membership functions are developed for the input spaces. This approach has resulted in very good sensitivity and specificity for classifying diffuse liver pathologies. This classification technique can be used in the diagnostic process, together with history information and laboratory, clinical, and pathological examinations.
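As an illustration of the fuzzy inference style described (membership functions, expert rules, and a winning output class), here is a toy two-feature version; all membership breakpoints and rules are invented, not the paper's eight-feature rule base.

    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function peaking at b on support [a, c]."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def classify(mgl, atten):
        """Toy fuzzy classifier over mean gray level and attenuation.

        Rule strengths use min() as the fuzzy AND; the class with the
        strongest fired rule wins. Breakpoints below are illustrative.
        """
        lo_mgl, hi_mgl = tri(mgl, 0, 25, 50), tri(mgl, 40, 75, 110)
        lo_att, hi_att = tri(atten, 0.0, 0.3, 0.7), tri(atten, 0.5, 1.0, 1.5)
        rules = {
            "normal":    min(lo_mgl, lo_att),   # low brightness, low attenuation
            "fatty":     min(hi_mgl, hi_att),   # bright, strongly attenuating
            "cirrhotic": min(hi_mgl, lo_att),   # bright but weakly attenuating
        }
        return max(rules, key=rules.get), rules

    label, strengths = classify(mgl=80.0, atten=0.9)
    print(label, strengths)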
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-25
... Comprehensive Quantitative Impact Study.'' DATES: You should submit comments by March 26, 2010. ADDRESSES... requesting approval of the following new information collection: Title: Basel Comprehensive Quantitative... quantitative impact study (QIS) to assess the impact of the proposed revisions that were published by the Basel...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-04
... not statistical surveys that yield quantitative results that can be generalized to the population of... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. No comments were received in response...
Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.
Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C
2016-07-21
Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
Quantitative myocardial perfusion from static cardiac and dynamic arterial CT
NASA Astrophysics Data System (ADS)
Bindschadler, Michael; Branch, Kelley R.; Alessio, Adam M.
2018-05-01
Quantitative myocardial blood flow (MBF) estimation by dynamic contrast enhanced cardiac computed tomography (CT) requires multi-frame acquisition of contrast transit through the blood pool and myocardium to inform the arterial input and tissue response functions. Both the input and the tissue response functions for the entire myocardium are sampled with each acquisition. However, the long breath holds and frequent sampling can result in significant motion artifacts and relatively high radiation dose. To address these limitations, we propose and evaluate a new static cardiac and dynamic arterial (SCDA) quantitative MBF approach where (1) the input function is well sampled using either prediction from pre-scan timing bolus data or measurement from dynamic thin-slice 'bolus tracking' acquisitions, and (2) the whole-heart tissue response data are limited to one contrast-enhanced CT acquisition. A perfusion model uses the dynamic arterial input function to generate a family of possible myocardial contrast enhancement curves corresponding to a range of MBF values. Combined with the timing of the single whole-heart acquisition, these curves generate a lookup table relating myocardial contrast enhancement to quantitative MBF. We tested the SCDA approach in 28 patients that underwent a full dynamic CT protocol both at rest and under vasodilator stress conditions. Using the measured input function plus a single (enhanced CT only) or double (enhanced plus contrast-free baseline CT) myocardial acquisition yielded MBF estimates with root mean square (RMS) errors of 1.2 ml/min/g and 0.35 ml/min/g, and radiation dose reductions of 90% and 83%, respectively. The prediction of the input function based on timing bolus data and the static acquisition had an RMS error of 26.0% relative to the measured input function, which led to MBF estimation errors more than threefold higher than those obtained with the measured input function. SCDA presents a new, simplified approach for quantitative perfusion imaging with an acquisition strategy offering substantial radiation dose and computational complexity savings over dynamic CT.
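The lookup-table idea can be sketched end to end: simulate tissue enhancement curves from a sampled input function under a simple kinetic model for a grid of MBF values, read off enhancement at the single acquisition time, and invert by interpolation. The one-compartment model, gamma-variate input function, and all constants below are illustrative assumptions, not the authors' perfusion model.

    import numpy as np

    def tissue_curve(aif, t, mbf, k_washout=0.02):
        """One-compartment tissue enhancement: mbf * conv(aif, exp(-k*t)).

        A simplified stand-in for the SCDA perfusion model; units and the
        washout constant are illustrative.
        """
        dt = t[1] - t[0]
        kernel = np.exp(-k_washout * t)
        return mbf * np.convolve(aif, kernel)[: len(t)] * dt

    t = np.arange(0, 60, 0.5)                          # seconds
    aif = 5.0 * (t / 8.0) * np.exp(-t / 8.0)           # gamma-variate input function

    # Build the lookup table: myocardial enhancement at the single whole-heart
    # acquisition time, for a family of candidate MBF values (ml/min/g).
    t_acq_idx = np.searchsorted(t, 25.0)
    mbf_grid = np.linspace(0.2, 5.0, 200)
    enh_grid = np.array([tissue_curve(aif, t, m)[t_acq_idx] for m in mbf_grid])

    # Invert: map a measured enhancement back to quantitative MBF.
    measured_enh = 55.0
    mbf_est = np.interp(measured_enh, enh_grid, mbf_grid)
    print(f"estimated MBF: {mbf_est:.2f} ml/min/g")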
Comparative study of the vapor analytes of trinitrotoluene (TNT)
NASA Astrophysics Data System (ADS)
Edge, Cindy C.; Gibb, Julie; Dugan, Regina E.
1998-12-01
Trinitrotoluene (TNT) is a high explosive used in most antipersonnel and antitank landmines. The Institute for Biological Detection Systems (IBDS) has developed a quantitative vapor delivery system, termed olfactometer, for conducting canine olfactory research. The research is conducted utilizing dynamic conditions, therefore, it is imperative to evaluate the headspace of TNT to ensure consistency with the dynamic generation of vapor. This study quantified the vapor headspace of military- grade TNT utilizing two different vapor generated methodologies, static and dynamic, reflecting differences between field and laboratory environments. Static vapor collection, which closely mimics conditions found during field detection, is defined as vapor collected in an open-air environment at ambient temperature. Dynamic vapor collection incorporates trapping of gases from a high flow vapor generation cell used during olfactometer operation. Analysis of samples collected by the two methodologies was performed by gas chromatography/mass spectrometry and the results provided information with regard to the constituents detected. However, constituent concentration did vary between the sampling methods. This study provides essential information regarding the vapor constituents associated with the TNT sampled using different sampling methods. These differences may be important in determining the detection signature dogs use to recognize TNT.
O'Neill, Sharon; Mathis, Magalie; Kovačič, Lidija; Zhang, Suisheng; Reinhardt, Jürgen; Scholz, Dimitri; Schopfer, Ulrich; Bouhelal, Rochdi; Knaus, Ulla G
2018-06-08
Protein-protein interactions critically regulate many biological systems, but quantifying functional assembly of multipass membrane complexes in their native context is still challenging. Here, we combined modeling-assisted protein modification and information from human disease variants with a minimal-size fusion tag, split-luciferase-based approach to probe assembly of the NADPH oxidase 4 (NOX4)-p22phox enzyme, an integral membrane complex with unresolved structure, which is required for electron transfer and generation of reactive oxygen species (ROS). Integrated analyses of heterodimerization, trafficking, and catalytic activity identified determinants of the NOX4-p22phox interaction, such as heme incorporation into NOX4 and hot spot residues in transmembrane domains 1 and 4 of p22phox. Moreover, their effect on NOX4 maturation and ROS generation was analyzed. We propose that this reversible and quantitative protein-protein interaction technique with its small split-fragment approach will provide a protein engineering and discovery tool not only for NOX research, but also for other intricate membrane protein complexes, and may thereby facilitate new drug discovery strategies for managing NOX-associated diseases. © 2018 by The American Society for Biochemistry and Molecular Biology, Inc.
Keighren, Margaret A; Flockhart, Jean; Hodson, Benjamin A; Shen, Guan-Yi; Birtley, James R; Notarnicola-Harwood, Antonio; West, John D
2015-08-01
Recent reports of a new generation of ubiquitous transgenic chimaera markers prompted us to consider the criteria used to evaluate new chimaera markers and develop more objective assessment methods. To investigate this experimentally we used several series of fetal and adult chimaeras, carrying an older, multi-copy transgenic marker. We used two additional independent markers and objective, quantitative criteria for cell selection and cell mixing to investigate quantitative and spatial aspects of developmental neutrality. We also suggest how the quantitative analysis we used could be simplified for future use with other markers. As a result, we recommend a five-step procedure for investigators to evaluate new chimaera markers based partly on criteria proposed previously but with a greater emphasis on examining the developmental neutrality of prospective new markers. These five steps comprise (1) review of published information, (2) evaluation of marker detection, (3) genetic crosses to check for effects on viability and growth, (4) comparisons of chimaeras with and without the marker and (5) analysis of chimaeras with both cell populations labelled. Finally, we review a number of different chimaera markers and evaluate them using the extended set of criteria. These comparisons indicate that, although the new generation of ubiquitous fluorescent markers are the best of those currently available and fulfil most of the criteria required of a chimaera marker, further work is required to determine whether they are developmentally neutral.
Nargotra, Amit; Sharma, Sujata; Koul, Jawahir Lal; Sangwan, Pyare Lal; Khan, Inshad Ali; Kumar, Ashwani; Taneja, Subhash Chander; Koul, Surrinder
2009-10-01
Quantitative structure activity relationship (QSAR) analysis of piperine analogs as inhibitors of the efflux pump NorA from Staphylococcus aureus has been performed in order to obtain a highly accurate model enabling prediction of inhibition of S. aureus NorA by new chemical entities from natural as well as synthetic sources. An algorithm based on the genetic function approximation method of variable selection in Cerius2 was used to generate the model. Among the several types of descriptors considered in generating the QSAR model, viz. topological, spatial, thermodynamic, information content and E-state indices, three descriptors, namely the partial negative surface area of the compounds, the area of the molecular shadow in the XZ plane and the heat of formation of the molecules, resulted in a statistically significant model with r(2)=0.962 and cross-validation parameter q(2)=0.917. The QSAR models were validated by cross-validation, leave-25%-out and external test set prediction. The theoretical approach indicates that an increase in the exposed partial negative surface area increases the inhibitory activity of a compound against NorA, whereas the area of the molecular shadow in the XZ plane is inversely proportional to the inhibitory activity. The model also explains the relationship of the heat of formation of a compound with its inhibitory activity. The model is not only able to predict the activity of new compounds but also identifies, in a quantitative manner, the important regions in the molecules.
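As a rough illustration of the validation step, the sketch below computes a leave-25%-out cross-validated q(2) for a three-descriptor linear model in Python with numpy. It is a generic stand-in, not the Cerius2 genetic function approximation workflow, and all variable names are hypothetical.

    import numpy as np

    def q2_leave_25_out(X, y, n_rounds=100, seed=0):
        # X: n_compounds x 3 descriptor matrix (e.g. partial negative surface
        # area, XZ molecular shadow area, heat of formation); y: activities.
        rng = np.random.default_rng(seed)
        n = len(y)
        press, ss_tot = 0.0, 0.0
        for _ in range(n_rounds):
            test = rng.choice(n, size=max(1, n // 4), replace=False)
            train = np.setdiff1d(np.arange(n), test)
            A = np.column_stack([X[train], np.ones(len(train))])  # intercept term
            coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
            pred = np.column_stack([X[test], np.ones(len(test))]) @ coef
            press += np.sum((y[test] - pred) ** 2)       # prediction error SS
            ss_tot += np.sum((y[test] - y[train].mean()) ** 2)
        return 1.0 - press / ss_tot                      # cross-validated q2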
Hu, Zhe-Yi; Parker, Robert B.; Herring, Vanessa L.; Laizure, S. Casey
2012-01-01
Dabigatran etexilate (DABE) is an oral prodrug that is rapidly converted by esterases to dabigatran (DAB), a direct inhibitor of thrombin. To elucidate the esterase-mediated metabolic pathway of DABE, a high-performance liquid chromatography/tandem mass spectrometry (LC-MS/MS)-based metabolite identification and semi-quantitative estimation approach was developed. To overcome the poor full-scan sensitivity of conventional triple quadrupole mass spectrometry, precursor-product ion pairs were predicted to search for potential in vitro metabolites. The detected metabolites were confirmed by the product ion scan. A dilution method was introduced to evaluate the matrix effects of tentatively identified metabolites without chemical standards. Quantitative information on detected metabolites was obtained using ‘metabolite standards’ generated from incubation samples that contain a high concentration of metabolite, in combination with a correction factor for mass spectrometry response. Two in vitro metabolites of DABE (M1 and M2) were identified and quantified by the semi-quantitative estimation approach. It is noteworthy that CES1 converts DABE to M1 while CES2 mediates the conversion of DABE to M2. M1 (or M2) was further metabolized to DAB by CES2 (or CES1). The approach presented here provides a solution to a bioanalytical need for fast identification and semi-quantitative estimation of CES metabolites in preclinical samples. PMID:23239178
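The arithmetic behind such a semi-quantitative estimate is simple, and a minimal Python sketch is given below. It assumes, hypothetically, that a 'metabolite standard' incubation of known concentration supplies the response factor and that matrix effects are checked by dilution; the function and variable names are illustrative, not from the paper.

    def estimate_concentration(peak_area, std_peak_area, std_concentration,
                               correction_factor=1.0):
        # response factor: MS peak area per unit concentration of the
        # metabolite standard, scaled by a response correction factor
        response = std_peak_area / std_concentration
        return correction_factor * peak_area / response

    def matrix_effect_by_dilution(area_neat, area_diluted, dilution_factor):
        # ratio near 1.0 suggests negligible matrix effect: a transparent
        # matrix would give area_neat ~= dilution_factor * area_diluted
        return area_neat / (dilution_factor * area_diluted)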
On the Need for Quantitative Bias Analysis in the Peer-Review Process.
Fox, Matthew P; Lash, Timothy L
2017-05-15
Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to carefully consider the validity of the proposed work or research findings by paying careful attention to the methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation on the impact of sources of error to quantification of the impact those sources of bias may have had. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
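To make the commentary concrete, the sketch below shows one of the simplest quantitative bias analyses a reviewer might request: back-correcting a 2x2 table for non-differential exposure misclassification given assumed sensitivity and specificity. The correction formula is standard; the counts and bias parameters are invented for illustration.

    def corrected_odds_ratio(a, b, c, d, se, sp):
        # a,b = exposed/unexposed cases; c,d = exposed/unexposed controls;
        # se,sp = assumed sensitivity/specificity of exposure classification
        def correct(exposed, total):
            return (exposed - (1 - sp) * total) / (se + sp - 1)
        A = correct(a, a + b)            # bias-adjusted exposed cases
        C = correct(c, c + d)            # bias-adjusted exposed controls
        B, D = (a + b) - A, (c + d) - C
        return (A * D) / (B * C)

    # observed OR = 2.45; bias-adjusted OR under Se=0.85, Sp=0.95 is ~3.0:
    print(corrected_odds_ratio(45, 55, 25, 75, 0.85, 0.95))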
ERIC Educational Resources Information Center
Rodriguez-Falces, Javier
2013-01-01
In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials that results from the propagation of the excitation source along the muscle fiber, are…
Academic Advising and First-Generation College Students: A Quantitative Study on Student Retention
ERIC Educational Resources Information Center
Swecker, Hadyn K.; Fifolt, Matthew; Searby, Linda
2014-01-01
For this quantitative study, we used a multiple logistic regression technique to investigate the relationship between the number of meetings with an academic advisor and retention of first-generation students, as represented by enrollment status and academic standing at a large, public research institution in the Southeast. Consistent with…
Use of multiple methods to determine factors affecting quality of care of patients with diabetes.
Khunti, K
1999-10-01
The process of care of patients with diabetes is complex; however, GPs are playing a greater role in its management. Despite the research evidence, the quality of care of patients with diabetes is variable. In order to improve care, information is required on the obstacles faced by practices in improving care. Qualitative and quantitative methods can be used for the formation of hypotheses and the development of survey procedures. However, to date few examples exist in general practice research of the use of multiple methods, combining quantitative and qualitative techniques, for hypothesis generation. We aimed to gather information on all factors that may be associated with delivery of care to patients with diabetes. Factors for consideration in the delivery of diabetes care were generated by multiple qualitative methods, including brainstorming with health professionals and patients, a focus group and interviews with key informants, who included GPs and practice nurses. Audit data showing variations in care of patients with diabetes were used to stimulate the brainstorming session. A systematic literature search focusing on quality of care of patients with diabetes in primary care was also conducted. Fifty-four potential factors were identified by the multiple methods. Twenty (37.0%) were practice-related factors, 14 (25.9%) were patient-related factors and 20 (37.0%) were organizational factors. A combination of brainstorming and the literature review identified 51 (94.4%) factors. Patients did not identify factors in addition to those identified by other methods. The complexity of delivery of care to patients with diabetes is reflected in the large number of potential factors identified in this study. This study shows the feasibility of using multiple methods for hypothesis generation. Each evaluation method provided unique data which could not otherwise be easily obtained. This study highlights a way of combining various traditional methods in an attempt to overcome the deficiencies and bias that may occur when using a single method. Similar methods can also be used to generate hypotheses for other exploratory research. An important responsibility of health authorities and primary care groups will be to assess the health needs of their local populations. Multiple methods could also be used to identify and commission services to meet these needs.
An attribute-driven statistics generator for use in a G.I.S. environment
NASA Technical Reports Server (NTRS)
Thomas, R. W.; Ritter, P. R.; Kaugars, A.
1984-01-01
When performing research using digital geographic information it is often useful to produce quantitative characterizations of the data, usually within some constraints. In the research environment the different combinations of required data and constraints can often become quite complex. This paper describes a technique that gives the researcher a powerful and flexible way to set up many possible combinations of data and constraints without having to perform numerous intermediate steps or create temporary data bands. This method provides an efficient way to produce descriptive statistics in such situations.
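A minimal sketch of the idea in Python with numpy follows: descriptive statistics for one data band are computed under constraints drawn from other co-registered bands, with the constraint mask built on the fly so no temporary bands are written. Function and variable names are illustrative, not from the original system.

    import numpy as np

    def constrained_stats(band, constraints):
        # constraints: list of (other_band, low, high) tuples; all bands are
        # co-registered 2-D arrays of the same shape
        mask = np.ones(band.shape, dtype=bool)
        for other, low, high in constraints:
            mask &= (other >= low) & (other <= high)
        vals = band[mask]
        return {"n": vals.size, "mean": vals.mean(), "std": vals.std(),
                "min": vals.min(), "max": vals.max()}

    # e.g. statistics of a reflectance band where elevation is 200-500 m and
    # slope is below 10 degrees:
    # constrained_stats(reflectance, [(elevation, 200, 500), (slope, 0, 10)])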
NASA Astrophysics Data System (ADS)
Gordienko, Vyacheslav M.; Kurochkin, Nikolay N.; Markov, V. N.; Panchenko, Vladislav Ya; Pogosov, G. A.; Chastukhin, E. M.
1995-02-01
A method is proposed for on-line monitoring of industrial laser processing. The method is based on optical heterodyne measurements of the Doppler backscattering signal generated in the interaction zone. Qualitative and quantitative information on hydrodynamic flows in the interaction zone can be obtained. A report is given of measurements, carried out at cw CO2 laser radiation intensities up to 1 kW cm-2, on the surfaces of a number of condensed materials irradiated in the monostatic interaction configuration.
Quantitative characterization of surface topography using spectral analysis
NASA Astrophysics Data System (ADS)
Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars
2017-03-01
Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
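As a pointer to the mathematics being reviewed, a minimal one-dimensional PSD of a topography line scan can be computed as below. Normalization conventions differ between papers; this periodogram form, C(q) = (dx/N)|FFT(h)|^2 over angular wavevectors q, is one common choice and is stated here as an assumption rather than the article's exact definition.

    import numpy as np

    def psd_1d(h, dx):
        # h: heights of a line scan sampled at N points with spacing dx
        N = len(h)
        hq = np.fft.rfft(h - h.mean())              # remove the mean height
        q = 2 * np.pi * np.fft.rfftfreq(N, d=dx)    # angular wavevectors
        C = (dx / N) * np.abs(hq) ** 2              # periodogram estimate
        return q, C

Averaging such estimates over many scan lines, and combining measurements at different scan sizes, is what allows the reliable broadband PSD reconstruction the article describes.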
Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi
2015-01-01
Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366
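Because BDML is XML-based, it can be read with any standard XML tooling. The sketch below parses a minimal BDML-like document in Python; the element and attribute names are hypothetical stand-ins, not the real BDML 0.2 schema, which is specified at http://ssbd.qbic.riken.jp/bdml/.

    import xml.etree.ElementTree as ET

    doc = """<bdml version="0.2">
      <object id="cell1">
        <measurement t="0.0" x="1.2" y="3.4" z="0.5"/>
        <measurement t="1.0" x="1.3" y="3.6" z="0.5"/>
      </object>
    </bdml>"""

    root = ET.fromstring(doc)
    for obj in root.iter("object"):           # each tracked biological object
        for m in obj.iter("measurement"):     # its spatiotemporal samples
            print(obj.get("id"), float(m.get("t")), float(m.get("x")))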
Duan, Yixiang; Jia, Quanxi; Cao, Wenqing
2010-11-23
A hydrogen sensor for detecting/quantitating hydrogen and hydrogen isotopes includes a sampling line and a microplasma generator that excites hydrogen from a gas sample and produces light emission from excited hydrogen. A power supply provides power to the microplasma generator, and a spectrometer generates an emission spectrum from the light emission. A programmable computer is adapted for determining whether or not the gas sample includes hydrogen, and for quantitating the amount of hydrogen and/or hydrogen isotopes present in the gas sample.
Cobb, Joshua N; Declerck, Genevieve; Greenberg, Anthony; Clark, Randy; McCouch, Susan
2013-04-01
More accurate and precise phenotyping strategies are necessary to empower high-resolution linkage mapping and genome-wide association studies and for training genomic selection models in plant improvement. Within this framework, the objective of modern phenotyping is to increase the accuracy, precision and throughput of phenotypic estimation at all levels of biological organization while reducing costs and minimizing labor through automation, remote sensing, improved data integration and experimental design. Much like the efforts to optimize genotyping during the 1980s and 1990s, designing effective phenotyping initiatives today requires multi-faceted collaborations between biologists, computer scientists, statisticians and engineers. Robust phenotyping systems are needed to characterize the full suite of genetic factors that contribute to quantitative phenotypic variation across cells, organs and tissues, developmental stages, years, environments, species and research programs. Next-generation phenotyping generates significantly more data than previously and requires novel data management, access and storage systems, increased use of ontologies to facilitate data integration, and new statistical tools for enhancing experimental design and extracting biologically meaningful signal from environmental and experimental noise. To ensure relevance, the implementation of efficient and informative phenotyping experiments also requires familiarity with diverse germplasm resources, population structures, and target populations of environments. Today, phenotyping is quickly emerging as the major operational bottleneck limiting the power of genetic analysis and genomic prediction. The challenge for the next generation of quantitative geneticists and plant breeders is not only to understand the genetic basis of complex trait variation, but also to use that knowledge to efficiently synthesize twenty-first century crop varieties.
Cloud computing approaches for prediction of ligand binding poses and pathways.
Lawrenz, Morgan; Shukla, Diwakar; Pande, Vijay S
2015-01-22
We describe an innovative protocol for ab initio prediction of ligand crystallographic binding poses and highly effective analysis of large datasets generated for protein-ligand dynamics. We include a procedure for setup and performance of distributed molecular dynamics simulations on cloud computing architectures, a model for efficient analysis of simulation data, and a metric for evaluation of model convergence. We give accurate binding pose predictions for five ligands ranging in affinity from 7 nM to > 200 μM for the immunophilin protein FKBP12, for expedited results in cases where experimental structures are difficult to produce. Our approach goes beyond single, low energy ligand poses to give quantitative kinetic information that can inform protein engineering and ligand design.
Molecular-level understanding of the carbonisation of polysaccharides.
Shuttleworth, Peter S; Budarin, Vitaliy; White, Robin J; Gun'ko, Vladimir M; Luque, Rafael; Clark, James H
2013-07-08
Understanding of both the textural and functionality changes occurring during (mesoporous) polysaccharide carbonisation at the molecular level provides a deeper insight into the whole spectrum of material properties, from chemical activity to pore shape and surface energy, which is crucial for the successful application of carbonaceous materials in adsorption, catalysis and chromatography. The information obtained will help to identify the most appropriate applications of the carbonaceous material generated during torrefaction and different types of pyrolysis processes, and will therefore be important for the development of cost- and energy-efficient zero-waste biorefineries. The approach presented is informative and semi-quantitative, with the potential to be extended to the formation of other biomass-derived carbonaceous materials. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Laser velocimeter application to oscillatory liquid flows
NASA Technical Reports Server (NTRS)
Gartrell, L. R.
1978-01-01
A laser velocimeter technique was used to measure the mean velocity and the frequency characteristics of an oscillatory flow component generated with a rotating flapper in a liquid flow system at Reynolds numbers approximating 93,000. The velocity information was processed in the frequency domain using a tracker whose output was used to determine the flow spectrum. This was accomplished with the use of an autocorrelator/Fourier transform analyzer and a spectrum averaging analyzer, with which induced flow oscillations up to 40 Hz were detected. Tests were conducted at a mean flow velocity of approximately 2 m/s. The experimental results show that the laser velocimeter can provide quantitative information such as liquid flow velocity and frequency spectrum, with a possible application to cryogenic fluid flows.
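The autocorrelate-then-transform chain described above can be mimicked in software. The sketch below, a generic Wiener-Khinchin style analysis rather than a model of the original analyzer hardware, locates the dominant oscillation frequency in a sampled velocity record.

    import numpy as np

    def flow_spectrum(v, fs):
        # v: velocity samples from the tracker output; fs: sampling rate (Hz)
        v = v - v.mean()
        ac = np.correlate(v, v, mode="full")[len(v) - 1:]  # autocorrelation, lags >= 0
        spec = np.abs(np.fft.rfft(ac))        # spectrum of the autocorrelation
        freqs = np.fft.rfftfreq(len(ac), d=1.0 / fs)
        return freqs, spec

    # the peak of `spec` below 40 Hz would locate the flapper-induced oscillation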
2011-01-01
Background The role of psychotherapy in the treatment of traumatic brain injury is receiving increased attention. The evaluation of psychotherapy with these patients has been conducted largely in the absence of quantitative data concerning the therapy itself. Quantitative methods for characterizing the sequence-sensitive structure of patient-therapist communication are now being developed with the objective of improving the effectiveness of psychotherapy following traumatic brain injury. Methods The content of three therapy session transcripts (sessions were separated by four months) obtained from a patient with a history of several motor vehicle accidents who was receiving dialectical behavior therapy was scored and analyzed using methods derived from the mathematical theory of symbolic dynamics. Results The analysis of symbol frequencies was largely uninformative. When repeated triples were examined a marked pattern of change in content was observed over the three sessions. The context free grammar complexity and the Lempel-Ziv complexity were calculated for each therapy session. For both measures, the rate of complexity generation, expressed as bits per minute, increased longitudinally during the course of therapy. The between-session increases in complexity generation rates are consistent with calculations of mutual information. Taken together these results indicate that there was a quantifiable increase in the variability of patient-therapist verbal behavior during the course of therapy. Comparison of complexity values against values obtained from equiprobable random surrogates established the presence of a nonrandom structure in patient-therapist dialog (P = .002). Conclusions While recognizing that only limited conclusions can be based on a case history, it can be noted that these quantitative observations are consistent with qualitative clinical observations of increases in the flexibility of discourse during therapy. These procedures can be of particular value in the examination of therapies following traumatic brain injury because, in some presentations, these therapies are complicated by deficits that result in subtle distortions of language that produce significant post-injury social impairment. Independently of the mathematical analysis applied to the investigation of therapy-generated symbol sequences, our experience suggests that the procedures presented here are of value in training therapists. PMID:21794113
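For readers unfamiliar with the complexity measures, the sketch below computes a Lempel-Ziv-style complexity rate for a coded transcript. It counts LZ78 incremental-parsing phrases, a close relative of (but not identical to) the LZ76 measure reported in the study, and normalizes to bits per minute of session time.

    import math

    def lz_phrase_count(symbols):
        # parse the symbol sequence into phrases, cutting whenever the
        # current phrase has not been seen before
        seen, phrase, count = set(), "", 0
        for s in symbols:
            phrase += s
            if phrase not in seen:
                seen.add(phrase)
                count += 1
                phrase = ""
        return count + (1 if phrase else 0)

    def complexity_rate_bits_per_minute(symbols, minutes):
        c = lz_phrase_count(symbols)
        return c * math.log2(max(c, 2)) / minutes

    # complexity_rate_bits_per_minute("abcabbcaabbbcc", 50.0)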
Neural classification of the selected family of butterflies
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Boniecki, P.; Piekarska-Boniecka, H.; Koszela, K.; Mueller, W.; Górna, K.; Okoń, P.
2017-07-01
There is growing research interest in drawing conclusions from data encoded in graphic form. Neural identification of pictorial data, with special emphasis on both quantitative and qualitative analysis, is increasingly used to gain and deepen empirical knowledge of the data. Extracting and then classifying selected picture features, such as color or surface structure, makes it possible to create computer tools that identify objects presented as, for example, digital pictures. This work presents the original computer system "Processing the image v.1.0", designed to digitize pictures on the basis of a color criterion. The system has been applied to generate a reference learning file for training an Artificial Neural Network (ANN) to identify selected kinds of butterflies from the Papilionidae family.
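A modern, generic version of such a pipeline (illustrative only, not the "Processing the image v.1.0" system) would extract per-channel color histograms and train a small neural network classifier, for example with scikit-learn:

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def color_histogram(rgb_image, bins=8):
        # rgb_image: H x W x 3 uint8 array; concatenate per-channel histograms
        feats = [np.histogram(rgb_image[..., c], bins=bins, range=(0, 256),
                              density=True)[0] for c in range(3)]
        return np.concatenate(feats)

    # X = np.stack([color_histogram(img) for img in images]); y = species labels
    # clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
    # clf.predict(np.stack([color_histogram(new_img)]))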
40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.
Code of Federal Regulations, 2012 CFR
2012-07-01
... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Research of the National Bureau of Standards, 86:361-366 (1981). (7) Fujita, T. et al. “A New Substituent...
40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.
Code of Federal Regulations, 2010 CFR
2010-07-01
... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Research of the National Bureau of Standards, 86:361-366 (1981). (7) Fujita, T. et al. “A New Substituent...
40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.
Code of Federal Regulations, 2011 CFR
2011-07-01
... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Research of the National Bureau of Standards, 86:361-366 (1981). (7) Fujita, T. et al. “A New Substituent...
NASA Astrophysics Data System (ADS)
Wang, Hong-Fei; Gan, Wei; Lu, Rong; Rao, Yi; Wu, Bao-Hua
Sum frequency generation vibrational spectroscopy (SFG-VS) has been proven to be a uniquely effective spectroscopic technique in the investigation of molecular structure and conformations, as well as the dynamics of molecular interfaces. However, the ability to apply SFG-VS to complex molecular interfaces has been limited by the difficulty of extracting quantitative information from SFG-VS experiments. In this review, we assess the limitations, issues, techniques and methodologies in quantitative orientational and spectral analysis with SFG-VS. Based on these assessments, we also summarize recent developments in methodologies for quantitative orientational and spectral analysis in SFG-VS, and their applications to detailed analysis of SFG-VS data of various vapour/neat liquid interfaces. A rigorous formulation of the polarization null angle (PNA) method is given for accurate determination of the orientational parameter D =
Kessner, Darren; Novembre, John
2015-01-01
Evolve and resequence studies combine artificial selection experiments with massively parallel sequencing technology to study the genetic basis for complex traits. In these experiments, individuals are selected for extreme values of a trait, causing alleles at quantitative trait loci (QTL) to increase or decrease in frequency in the experimental population. We present a new analysis of the power of artificial selection experiments to detect and localize quantitative trait loci. This analysis uses a simulation framework that explicitly models whole genomes of individuals, quantitative traits, and selection based on individual trait values. We find that explicitly modeling QTL provides qualitatively different insights than considering independent loci with constant selection coefficients. Specifically, we observe how interference between QTL under selection affects the trajectories and lengthens the fixation times of selected alleles. We also show that a substantial portion of the genetic variance of the trait (50–100%) can be explained by detected QTL in as little as 20 generations of selection, depending on the trait architecture and experimental design. Furthermore, we show that power depends crucially on the opportunity for recombination during the experiment. Finally, we show that an increase in power is obtained by leveraging founder haplotype information to obtain allele frequency estimates. PMID:25672748
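The flavor of the simulation framework can be conveyed in a few lines of Python. The sketch below is a deliberately minimal forward simulation, with unlinked loci, purely additive effects and truncation selection, so it omits the recombination and haplotype structure the paper shows to be crucial; all parameter values are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    n_ind, n_qtl, n_gen, top_frac = 500, 20, 20, 0.2
    effects = rng.exponential(1.0, n_qtl)            # additive QTL effect sizes
    geno = rng.binomial(2, 0.5, (n_ind, n_qtl))      # diploid allele counts 0/1/2

    for gen in range(n_gen):
        # trait = genetic value plus environmental noise
        trait = geno @ effects + rng.normal(0, 0.1 * effects.sum(), n_ind)
        parents = geno[np.argsort(trait)[-int(top_frac * n_ind):]]
        moms = parents[rng.integers(len(parents), size=n_ind)]
        dads = parents[rng.integers(len(parents), size=n_ind)]
        # Mendelian transmission: each parent passes one allele per locus
        geno = rng.binomial(1, moms / 2.0) + rng.binomial(1, dads / 2.0)

    print("allele frequencies after selection:", geno.mean(axis=0) / 2.0)

Tracking such frequency trajectories against neutral simulations is, in essence, how detection and localization power is evaluated.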
Flexible automated approach for quantitative liquid handling of complex biological samples.
Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H
2007-11-01
A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star robotic workstation, including the preparation of standards and controls from a work list generated by a Watson laboratory information management system, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample, or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.
He, Gu; Qiu, Minghua; Li, Rui; Ouyang, Liang; Wu, Fengbo; Song, Xiangrong; Cheng, Li; Xiang, Mingli; Yu, Luoting
2012-06-01
Aurora-A has been known as one of the most important targets for cancer therapy, and some Aurora-A inhibitors have entered clinical trials. In this study, a combination of ligand-based and structure-based methods is used to clarify the essential quantitative structure-activity relationship of known Aurora-A inhibitors, and a multicomplex-based pharmacophore-guided method has been suggested to generate a comprehensive pharmacophore of Aurora-A kinase based on a collection of crystal structures of Aurora-A-inhibitor complexes. This model has been successfully used to identify the bioactive conformation and align 37 structurally diverse N-substituted 2'-(aminoaryl)benzothiazole derivatives. Quantitative structure-activity relationship analyses have been performed on these Aurora-A inhibitors based on the multicomplex-based pharmacophore-guided alignment. These results may provide important information for further design and virtual screening of novel Aurora-A inhibitors. © 2012 John Wiley & Sons A/S.
Delmore, Kira E; Liedvogel, Miriam
2016-01-01
The amazing accuracy of migratory orientation performance across the animal kingdom is facilitated by the use of magnetic and celestial compass systems that provide individuals with both directional and positional information. Quantitative genetics analyses in several animal systems suggest that migratory orientation has a strong genetic component. Nevertheless, the exact identity of genes controlling orientation remains largely unknown, making it difficult to obtain an accurate understanding of this fascinating behavior on the molecular level. Here, we provide an overview of molecular genetic techniques employed thus far, highlight the pros and cons of various approaches, generalize results from species-specific studies whenever possible, and evaluate how far the field has come since early quantitative genetics studies. We emphasize the importance of examining different levels of molecular control, and outline how future studies can take advantage of high-resolution tracking and sequencing techniques to characterize the genomic architecture of migratory orientation.
Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.
Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner
2016-01-01
Sampling limitations in electron microscopy raise the question of whether the analysis of a bulk material is representative, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, subsequently stitched together to generate a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
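The registration step at the heart of any such stitching pipeline is often phase correlation between overlapping tiles. The sketch below shows the standard algorithm in Python; it is a generic building block, not the TU/e Stitcher implementation.

    import numpy as np

    def phase_correlation_shift(a, b):
        # a, b: two overlapping grayscale tiles of identical shape
        A, B = np.fft.fft2(a), np.fft.fft2(b)
        R = A * np.conj(B)
        R /= np.abs(R) + 1e-12                # normalized cross-power spectrum
        corr = np.fft.ifft2(R).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        if dy > a.shape[0] // 2:              # wrap shifts into signed range
            dy -= a.shape[0]
        if dx > a.shape[1] // 2:
            dx -= a.shape[1]
        return dy, dx   # estimated relative translation (sign depends on FFT convention)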
NASA Astrophysics Data System (ADS)
Isono, Hiroshi; Hirata, Shinnosuke; Hachiya, Hiroyuki
2015-07-01
In medical ultrasonic images of liver disease, a texture with a speckle pattern indicates a microscopic structure such as nodules surrounded by fibrous tissues in hepatitis or cirrhosis. We have been applying texture analysis based on a co-occurrence matrix to ultrasonic images of fibrotic liver for quantitative tissue characterization. A co-occurrence matrix consists of the probability distribution of brightness of pixel pairs specified with spatial parameters and gives new information on liver disease. Ultrasonic images of different types of fibrotic liver were simulated and the texture-feature contrast was calculated to quantify the co-occurrence matrices generated from the images. The results show that the contrast converges with a value that can be theoretically estimated using a multi-Rayleigh model of echo signal amplitude distribution. We also found that the contrast value increases as liver fibrosis progresses and fluctuates depending on the size of fibrotic structure.
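The contrast feature used to quantify those co-occurrence matrices has a compact definition: contrast is the sum over all grey-level pairs (i, j) of P(i, j)·(i − j)^2, where P is the normalized co-occurrence matrix for a chosen pixel offset. A minimal numpy sketch follows; the offset and quantization settings are illustrative, not the paper's exact parameters.

    import numpy as np

    def glcm_contrast(img, dy=0, dx=1, levels=32):
        # quantize brightness to `levels` grey levels
        q = np.floor(img.astype(float) / (img.max() + 1e-12) * (levels - 1)).astype(int)
        a = q[:q.shape[0] - dy, :q.shape[1] - dx]   # reference pixels
        b = q[dy:, dx:]                             # offset partner pixels
        glcm = np.zeros((levels, levels))
        np.add.at(glcm, (a.ravel(), b.ravel()), 1)  # count brightness pairs
        glcm /= glcm.sum()                          # probability distribution
        i, j = np.indices(glcm.shape)
        return np.sum(glcm * (i - j) ** 2)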
Jin, Cheng; Feng, Jianjiang; Wang, Lei; Yu, Heng; Liu, Jiang; Lu, Jiwen; Zhou, Jie
2018-05-01
In this paper, we present an approach for left atrial appendage (LAA) multi-phase fast segmentation and quantitative assisted diagnosis of atrial fibrillation (AF) based on 4D-CT data. We take full advantage of the temporal dimension information to segment the living, flailed LAA based on a parametric max-flow method and graph-cut approach to build a 3-D model of each phase. To assist the diagnosis of AF, we calculate the volumes of the 3-D models, and then generate a "volume-phase" curve to calculate the important dynamic metrics: ejection fraction, filling flux, and emptying flux of the LAA's blood by volume. This approach demonstrates more precise results than the conventional approaches that calculate metrics by area, and allows for the quick analysis of LAA-volume pattern changes in a cardiac cycle. It may also provide insight into the individual differences in the lesions of the LAA. Furthermore, we apply support vector machines (SVMs) to achieve a quantitative auto-diagnosis of AF by exploiting seven features from volume change ratios of the LAA, and perform multivariate logistic regression analysis for the risk of LAA thrombosis. The 100 cases utilized in this research were taken from the Philips 256-iCT. The experimental results demonstrate that our approach can construct the 3-D LAA geometries robustly compared to manual annotations, and reasonably infer that the LAA undergoes filling, emptying and re-filling, re-emptying in a cardiac cycle. This research provides a potential for exploring various physiological functions of the LAA and quantitatively estimating the risk of stroke in patients with AF. Copyright © 2018 Elsevier Ltd. All rights reserved.
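Once the per-phase volumes are available, the dynamic metrics follow directly from the volume-phase curve. A sketch of one plausible set of definitions (the paper's exact formulas may differ):

    import numpy as np

    def laa_metrics(volumes):
        # volumes: LAA volume at each reconstructed cardiac phase, in mL
        v = np.asarray(volumes, dtype=float)
        ejection_fraction = (v.max() - v.min()) / v.max()
        filling_flux = np.clip(np.diff(v), 0, None).sum()    # total volume gained
        emptying_flux = np.clip(-np.diff(v), 0, None).sum()  # total volume lost
        return ejection_fraction, filling_flux, emptying_flux

    # laa_metrics([8.1, 9.4, 10.2, 7.9, 6.5, 7.2, 8.0]) -> (EF, fill, empty)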
Code of Federal Regulations, 2014 CFR
2014-07-01
... productivity data, including unit cost information. This quantitative information will be reported on Forms OSM-51A and OSM-51B or OSM-51C, Quantitative Program Management Information, as applicable. (c) The...
Code of Federal Regulations, 2012 CFR
2012-07-01
... productivity data, including unit cost information. This quantitative information will be reported on Forms OSM-51A and OSM-51B or OSM-51C, Quantitative Program Management Information, as applicable. (c) The...
Code of Federal Regulations, 2011 CFR
2011-07-01
... productivity data, including unit cost information. This quantitative information will be reported on Forms OSM-51A and OSM-51B or OSM-51C, Quantitative Program Management Information, as applicable. (c) The...
NASA Astrophysics Data System (ADS)
Anderson, O. Roger
The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.
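To make the modeling concrete, the sketch below simulates a generic two-store acquisition process: presented information enters short-term memory S and transfers to long-term memory L at rate k, with a decay term standing in for forgetting. This is an illustrative stand-in for this class of models, not the author's specific formulation.

    import numpy as np

    def simulate_recall(presentation_rate, k, decay, t_max=60.0, dt=0.1):
        # k: transfer efficiency into long-term memory; decay: short-term loss
        S, L = 0.0, 0.0
        for _ in np.arange(0, t_max, dt):
            transfer = k * S
            S += (presentation_rate - transfer - decay * S) * dt
            L += transfer * dt
        return L    # predicted stable-storage content, a proxy for recall

    # compare predicted recall across transfer efficiencies:
    # [simulate_recall(1.0, k, 0.05) for k in (0.02, 0.05, 0.1)]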
Quantification of water resources uncertainties in the Luvuvhu sub-basin of the Limpopo river basin
NASA Astrophysics Data System (ADS)
Oosthuizen, N.; Hughes, D.; Kapangaziwiri, E.; Mwenge Kahinda, J.; Mvandaba, V.
2018-06-01
In the absence of historical observed data, models are generally used to describe the different hydrological processes and generate data and information that will inform management and policy decision making. Ideally, any hydrological model should be based on a sound conceptual understanding of the processes in the basin and be backed by quantitative information for the parameterization of the model. However, these data are often inadequate in many sub-basins, necessitating the incorporation of the uncertainty related to the estimation process. This paper reports on the impact of the uncertainty related to the parameterization of the Pitman monthly model and water use data on the estimates of the water resources of the Luvuvhu, a sub-basin of the Limpopo river basin. The study reviews existing information sources associated with the quantification of water balance components and gives an update of the water resources of the sub-basin. The flows generated by the model at the outlet of the basin were between 44.03 Mm3 and 45.48 Mm3 per month when incorporating +20% uncertainty into the main physical runoff-generating parameters. The total predictive uncertainty of the model increased when water use data, such as small farm dams, large reservoirs and irrigation, were included. The dam capacity data were assigned an average uncertainty of 62%, mainly as a result of the large differences between the available information in the national water resources database and that digitised from satellite imagery. Water used by irrigated crops was estimated with an average of about 50% uncertainty. The mean simulated monthly flows were between 38.57 Mm3 and 54.83 Mm3 after the water use uncertainty was added. However, it is expected that the uncertainty could be reduced by using higher resolution remote sensing imagery.
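The propagation of such parameter uncertainty is typically done by Monte Carlo sampling. The sketch below is a generic illustration, not the Pitman model: runoff parameters are perturbed within +/-20% and the water-use terms within their stated uncertainties, and the spread of the resulting monthly flow is summarized. All base values are invented.

    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000
    runoff = 50.0 * rng.uniform(0.8, 1.2, n)       # Mm3/month, +/-20% on parameters
    irrigation = 6.0 * rng.uniform(0.5, 1.5, n)    # ~50% uncertainty
    dam_losses = 2.0 * rng.uniform(0.38, 1.62, n)  # ~62% uncertainty
    flows = runoff - irrigation - dam_losses
    print(np.percentile(flows, [5, 50, 95]))       # uncertainty band on flow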
Shaheen, Shabnam; Abbas, Safdar; Hussain, Javid; Mabood, Fazal; Umair, Muhammad; Ali, Maroof; Ahmad, Mushtaq; Zafar, Muhammad; Farooq, Umar; Khan, Ajmal
2017-01-01
Medicinal plants are important treasures for the treatment of different types of diseases. The current study provides significant ethnopharmacological information, both qualitative and quantitative, on medicinal plants related to children's disorders from district Bannu, Khyber Pakhtunkhwa (KPK) province of Pakistan. The information gathered was quantitatively analyzed using the informant consensus factor, relative frequency of citation and use value methods to establish baseline data for more comprehensive investigations of bioactive compounds of indigenous medicinal plants specifically related to children's disorders. To the best of our knowledge, this is the first attempt to document ethno-botanical information on these medicinal plants using quantitative approaches. A total of 130 informants were interviewed using a questionnaire during 2014–2016 to identify the preparations and uses of the medicinal plants for the treatment of children's diseases. A total of 55 species of flowering plants belonging to 49 genera and 32 families were used as ethno-medicines in the study area. The largest numbers of species belong to the Leguminosae and Cucurbitaceae families (4 species each), followed by Apiaceae, Moraceae, Poaceae, Rosaceae, and Solanaceae (3 species each). In addition, leaves and fruits were the most used parts (28%), herbs were the most used life form (47%), decoction was the most common method of preparation (27%), and oral ingestion was the main route of application (68.5%). The highest use value was reported for the species Momordica charantia and Raphanus sativus (1 for each), and the highest informant consensus factor was observed for the cardiovascular and rheumatic disease categories (0.5 for each). Most of the species in the present study were used to cure gastrointestinal diseases (39 species). The results of the present study reveal the importance of medicinal plant species and their significant role in the health care of the inhabitants of the study area. The people of Bannu hold extensive traditional knowledge related to children's diseases. In conclusion, we recommend giving priority in further phytochemical investigation to plants that scored the highest FIC and UV values, as such values can be considered a good indicator of prospective plants for discovering new drugs and may attract future generations toward traditional healing practices. PMID:28769789
Leveraging Environmental Correlations: The Thermodynamics of Requisite Variety
NASA Astrophysics Data System (ADS)
Boyd, Alexander B.; Mandal, Dibyendu; Crutchfield, James P.
2017-06-01
Key to biological success, the requisite variety that confronts an adaptive organism is the set of detectable, accessible, and controllable states in its environment. We analyze its role in the thermodynamic functioning of information ratchets—a form of autonomous Maxwellian Demon capable of exploiting fluctuations in an external information reservoir to harvest useful work from a thermal bath. This establishes a quantitative paradigm for understanding how adaptive agents leverage structured thermal environments for their own thermodynamic benefit. General ratchets behave as memoryful communication channels, interacting with their environment sequentially and storing results to an output. The bulk of thermal ratchets analyzed to date, however, assume memoryless environments that generate input signals without temporal correlations. Employing computational mechanics and a new information-processing Second Law of Thermodynamics (IPSL) we remove these restrictions, analyzing general finite-state ratchets interacting with structured environments that generate correlated input signals. On the one hand, we demonstrate that a ratchet need not have memory to exploit an uncorrelated environment. On the other, and more appropriate to biological adaptation, we show that a ratchet must have memory to most effectively leverage structure and correlation in its environment. The lesson is that to optimally harvest work a ratchet's memory must reflect the input generator's memory. Finally, we investigate achieving the IPSL bounds on the amount of work a ratchet can extract from its environment, discovering that finite-state, optimal ratchets are unable to reach these bounds. In contrast, we show that infinite-state ratchets can go well beyond these bounds by utilizing their own infinite "negentropy". We conclude with an outline of the collective thermodynamics of information-ratchet swarms.
Dual-pathway multi-echo sequence for simultaneous frequency and T2 mapping
NASA Astrophysics Data System (ADS)
Cheng, Cheng-Chieh; Mei, Chang-Sheng; Duryea, Jeffrey; Chung, Hsiao-Wen; Chao, Tzu-Cheng; Panych, Lawrence P.; Madore, Bruno
2016-04-01
Purpose: To present a dual-pathway multi-echo steady state sequence and reconstruction algorithm to capture T2, T2∗ and field map information. Methods: Typically, pulse sequences based on spin echoes are needed for T2 mapping while gradient echoes are needed for field mapping, making it difficult to jointly acquire both types of information. A dual-pathway multi-echo pulse sequence is employed here to generate T2 and field maps from the same acquired data. The approach might be used, for example, to obtain both thermometry and tissue damage information during thermal therapies, or susceptibility and T2 information from a same head scan, or to generate bonus T2 maps during a knee scan. Results: Quantitative T2, T2∗ and field maps were generated in gel phantoms, ex vivo bovine muscle, and twelve volunteers. T2 results were validated against a spin-echo reference standard: A linear regression based on ROI analysis in phantoms provided close agreement (slope/R2 = 0.99/0.998). A pixel-wise in vivo Bland-Altman analysis of R2 = 1/T2 showed a bias of 0.034 Hz (about 0.3%), as averaged over four volunteers. Ex vivo results, with and without motion, suggested that tissue damage detection based on T2 rather than temperature-dose measurements might prove more robust to motion. Conclusion: T2, T2∗ and field maps were obtained simultaneously, from the same datasets, in thermometry, susceptibility-weighted imaging and knee-imaging contexts.
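Setting aside the dual-pathway specifics, the core T2 estimate from multi-echo data is a per-pixel exponential fit, S(TE) = S0·exp(-TE/T2). A minimal log-linear version in Python follows; it is a generic fit, not the paper's reconstruction algorithm.

    import numpy as np

    def t2_map(echoes, tes):
        # echoes: magnitude images, shape (n_echoes, H, W); tes: echo times (ms)
        y = np.log(np.clip(echoes, 1e-6, None)).reshape(len(tes), -1)
        slope, _ = np.polyfit(np.asarray(tes, dtype=float), y, 1)  # per-pixel fit
        with np.errstate(divide="ignore"):
            t2 = -1.0 / slope               # T2 in ms, from the log-linear slope
        return t2.reshape(echoes.shape[1:])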
NASA Astrophysics Data System (ADS)
Sun, Chi-Kuang; Wei, Ming-Liang; Su, Yu-Hsiang; Weng, Wei-Hung; Liao, Yi-Hua
2017-02-01
Harmonic generation microscopy is a noninvasive repetitive imaging technique that provides real-time 3D microscopic images of human skin with a sub-femtoliter resolution and high penetration down to the reticular dermis. In this talk, we show that with a strong resonance effect, the third-harmonic-generation (THG) modality provides enhanced contrast on melanin and allows not only differential diagnosis of various pigmented skin lesions but also quantitative imaging for long-term tracking. This unique capability makes THG microscopy the only label-free technique capable of identifying the active melanocytes in human skin and of imaging their different dendriticity patterns. In this talk, we will review our recent efforts to image melanin distribution in vivo and quantitatively diagnose pigmented skin lesions using label-free harmonic generation biopsy. This talk will first cover the spectroscopic study on the melanin-enhanced THG effect in human cells and the calibration strategy inside human skin for quantitative imaging. We will then review our recent clinical trials, including a differential diagnosis capability study on pigmented skin tumors, as well as a quantitative virtual biopsy study on pre- and post-treatment evaluation of melasma and solar lentigo. Our study indicates the unmatched capability of harmonic generation microscopy to perform virtual biopsy for noninvasive histopathological diagnosis of various pigmented skin tumors, as well as its unsurpassed capability to noninvasively reveal the pathological origin of different hyperpigmentary diseases on the human face and to monitor the efficacy of laser depigmentation treatments. This work is sponsored by National Health Research Institutes.
Rethinking health numeracy: a multidisciplinary literature review.
Ancker, Jessica S; Kaufman, David
2007-01-01
The purpose of this review is to organize various published conceptions of health numeracy and to discuss how health numeracy contributes to the productive use of quantitative information for health. We define health numeracy as the individual-level skills needed to understand and use quantitative health information, including basic computation skills, ability to use information in documents and non-text formats such as graphs, and ability to communicate orally. We also identify two other factors affecting whether a consumer can use quantitative health information: design of documents and other information artifacts, and health-care providers' communication skills. We draw upon the distributed cognition perspective to argue that essential ingredients for the productive use of quantitative health information include not only health numeracy but also good provider communication skills, as well as documents and devices that are designed to enhance comprehension and cognition.
ERIC Educational Resources Information Center
Chamberlain, John Martyn
2017-01-01
Against the backdrop of contemporary debates surrounding the public role of criminology, this article argues that a key barrier to ensuring that the next generation of criminologists is equipped with the skills necessary to engage in critical forms of citizenship, is the quantitative "skills gap" that undergraduate students possess as a…
Williams-Blangero, Sarah; Vandeberg, John L; Subedi, Janardan; Jha, Bharat; Dyer, Tom D; Blangero, John
2008-04-15
Whipworm (Trichuris trichiura) infection is a soil-transmitted helminth infection that affects >1 billion people. It is a serious public health problem in many developing countries and can result in deficits in growth and cognitive development. In a follow-up study of significant heritability for whipworm infection, we conducted the first genome scan for quantitative trait loci (QTL) influencing the heritability of susceptibility to this important parasitic disease. Whipworm egg counts were determined for 1,253 members of the Jirel population of eastern Nepal. All individuals in the study sample belonged to a single pedigree including >26,000 pairs of relatives that are informative for genetic analysis. Linkage analysis of genome scan data generated for the pedigree provided unambiguous evidence for 2 QTL influencing susceptibility to whipworm infection, one located on chromosome 9 (logarithm of the odds ratio [LOD] score, 3.35; genomewide P = .0138) and the other located on chromosome 18 (LOD score, 3.29; genomewide P = .0159). There was also suggestive evidence that 2 loci located on chromosomes 12 and 13 influenced whipworm infection. The results of this first genome scan for T. trichiura egg counts provides new information on the determinants of genetic predisposition to whipworm infection.
Röwer, Claudia; Vissers, Johannes P C; Koy, Cornelia; Kipping, Marc; Hecker, Michael; Reimer, Toralf; Gerber, Bernd; Thiesen, Hans-Jürgen; Glocker, Michael O
2009-12-01
As more and more alternative treatments become available for breast carcinoma, there is a need to stratify patients, and individual molecular information seems to be suitable for this purpose. In this study, we applied label-free protein quantitation by nanoscale LC-MS and investigated whether this approach could be used to define a proteome signature for invasive ductal breast carcinoma. Tissue samples from healthy breast and tumor were collected from three patients. Protein identifications were based on LC-MS peptide fragmentation data, which were obtained simultaneously with the quantitative information. In this way, an invasive ductal breast carcinoma proteome signature was generated which contains 60 protein entries. The on-column concentrations for osteoinductive factor, vimentin, GAP-DH, and NDKA are provided as examples. These proteins represent distinctive gene ontology groups of differentially expressed proteins and are discussed as risk markers for primary tumor pathogenesis. The developed methodology has been found well applicable in a clinical environment in which standard operating procedures can be kept, a prerequisite for the definition of molecular parameter sets that shall be capable of stratifying patients.
Material Properties of Human Ocular Tissue at 7-µm Resolution.
Rohrbach, Daniel; Ito, Kazuyo; Lloyd, Harriet O; Silverman, Ronald H; Yoshida, Kenji; Yamaguchi, Tadashi; Mamou, Jonathan
2017-09-01
Quantitative assessment of the material properties of ocular tissues can provide valuable information for investigating several ophthalmic diseases. Quantitative acoustic microscopy (QAM) offers a means of obtaining such information, but few QAM investigations have been conducted on human ocular tissue. We imaged the optic nerve (ON) and iridocorneal angle in 12-µm deparaffinized sections of the human eye using a custom-built acoustic microscope with a 250-MHz transducer (7-µm lateral resolution). Two-dimensional QAM maps of ultrasound attenuation (α), speed of sound (c), acoustic impedance (Z), bulk modulus (K), and mass density (ρ) were generated. Scanned samples were then stained and imaged by light microscopy for comparison with QAM maps. The spatial resolution and contrast of scanning acoustic microscopy (SAM) maps were sufficient to resolve anatomic layers of the retina (Re); anatomic features in SAM maps corresponded to those seen by light microscopy. Significant variations of the acoustic parameters were found. For example, the sclera was 220 MPa stiffer than Re, choroid, and ON tissue. To the authors' knowledge, this is the first systematic study to assess c, Z, K, ρ, and α of human ocular tissue at the high ultrasound frequencies used in this study.
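The mapped quantities are linked by the standard acoustic relations Z = ρc and K = ρc², so density and bulk modulus follow directly from the measured speed of sound and impedance, as in this small sketch (the example numbers are illustrative, not measurements from the paper):

    def density_and_bulk_modulus(c, Z):
        # c: speed of sound (m/s); Z: acoustic impedance (Pa*s/m)
        rho = Z / c      # mass density (kg/m^3), from Z = rho * c
        K = Z * c        # bulk modulus (Pa), since K = rho * c^2 = Z * c
        return rho, K

    # density_and_bulk_modulus(1580.0, 1.66e6) -> (~1051 kg/m^3, ~2.62 GPa)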
Xu, Shuoyu; Kang, Chiang Huen; Gou, Xiaoli; Peng, Qiwen; Yan, Jie; Zhuo, Shuangmu; Cheng, Chee Leong; He, Yuting; Kang, Yuzhan; Xia, Wuzheng; So, Peter T C; Welsch, Roy; Rajapakse, Jagath C; Yu, Hanry
2016-04-01
The liver surface is covered by a collagenous layer called the Glisson's capsule. The structure of the Glisson's capsule is barely seen in biopsy samples taken for histology assessment, so the changes of the collagen network in the Glisson's capsule during liver disease progression are not well studied. In this report, we investigated whether non-linear optical imaging of the Glisson's capsule at the liver surface would yield sufficient information to allow quantitative staging of liver fibrosis. In contrast to conventional tissue sections, whereby tissues are cut perpendicular to the liver surface and interior information from the liver biopsy samples is used, we established a capsule index based on significant parameters extracted from second harmonic generation (SHG) microscopy images of capsule collagen from the anterior surface of rat livers. A thioacetamide (TAA)-induced liver fibrosis animal model was used in this study. The capsule index is capable of differentiating different fibrosis stages, with an area under the receiver operating characteristic curve (AUC) of up to 0.91, making it possible to quantitatively stage liver fibrosis via liver surface imaging, potentially with endomicroscopy. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S
2015-01-16
Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
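For contrast with the Bayesian mixed-model approach, the traditional calibration the authors critique is a simple standard-curve regression: assay signal is regressed on log10 density of calibration standards and then inverted for unknowns. A sketch, with invented example values:

    import numpy as np

    def fit_standard_curve(log10_density, signal):
        slope, intercept = np.polyfit(log10_density, signal, 1)
        return slope, intercept

    def estimate_density(signal, slope, intercept):
        # invert the fitted line to recover pathogen density per unit volume
        return 10 ** ((signal - intercept) / slope)

    # standards at 10^1..10^5 parasites/mL with measured assay signals:
    s, i = fit_standard_curve(np.arange(1, 6), np.array([8.2, 14.9, 21.7, 28.8, 35.4]))
    print(estimate_density(25.0, s, i))

Because this pools no information across assays and assumes constant precision, its density estimates inherit all of the inter-assay variability that the mixed model is designed to absorb.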
Rockfall induced seismic signals: case study in Montserrat, Catalonia
NASA Astrophysics Data System (ADS)
Vilajosana, I.; Suriñach, E.; Abellán, A.; Khazaradze, G.; Garcia, D.; Llosa, J.
2008-08-01
After a rockfall event, a typical post-event survey includes qualitative volume estimation, trajectory mapping and determination of departing zones; quantitative measurements, however, are not usually made. Additional relevant quantitative information could be useful in determining the spatial occurrence of rockfall events and help us in quantifying their size. Seismic measurements could be suitable for detection purposes since they are non-invasive and relatively inexpensive. Moreover, seismic techniques could provide important information on rockfall size and the location of impacts. On 14 February 2007 the Avalanche Group of the University of Barcelona obtained the seismic data generated by an artificially triggered rockfall event, carried out in order to purge a slope, at the Montserrat massif (near Barcelona, Spain). Two three-component seismic stations were deployed in the area about 200 m from the explosion point that triggered the rockfall. Seismic signals and video images were obtained simultaneously. The initial volume of the rockfall was estimated to be 75 m³ by laser scanner data analysis. After the explosion, dozens of boulders ranging from 10⁻⁴ to 5 m³ in volume impacted on the ground at different locations. The blocks fell onto a terrace, 120 m below the release zone. The impact generated a small continuous mass movement composed of a mixture of rocks, sand and dust that ran down the slope and impacted on the road 60 m below. Time, time-frequency evolution and particle motion analysis of the seismic records and seismic energy estimation were performed. The results are as follows: (1) a rockfall event generates seismic signals with specific characteristics in the time domain; (2) the seismic signals generated by the mass movement show a time-frequency evolution different from that of other seismogenic sources (e.g. earthquakes, explosions or a single rock impact), a feature that could be used for detection purposes; (3) particle motion plot analysis shows that the procedure to locate the rock impact using two stations is feasible; and (4) the feasibility and validity of seismic methods for the detection of rockfall events, their localization and size determination are confirmed.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-01
... emissions, (applicants are encouraged to provide quantitative information regarding expected reductions in...). Applicants are encouraged to provide quantitative information that validates the existence of substantial... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...
78 FR 8113 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-05
... to collect information as part of quantitative research related to residential mortgage loan... is the subject of the disclosure. The quantitative research will involve testing the mortgage loan... quantitative research methodologies. The contractors will select participants via screening questionnaires to...
A Primer on Disseminating Applied Quantitative Research
ERIC Educational Resources Information Center
Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.
2010-01-01
Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research are often lacking in much-needed procedural information. In an effort to promote researchers dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... human health assessment program that evaluates quantitative and qualitative risk information on effects... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...
SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries.
Wu, Jemma X; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P
2016-07-01
The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
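The closing point about multiple testing corrections can be made concrete with a minimal Benjamini-Hochberg sketch over per-protein p-values from a SWATH quantitative comparison; this is a generic false-discovery-rate procedure, not necessarily the exact correction SwathXtend applies.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Boolean mask of hypotheses rejected at false discovery rate alpha."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    thresh = alpha * np.arange(1, p.size + 1) / p.size
    passed = p[order] <= thresh
    reject = np.zeros(p.size, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()   # largest rank meeting the BH bound
        reject[order[: k + 1]] = True
    return reject

# Example: five per-protein p-values; only the smallest few survive.
print(benjamini_hochberg([0.001, 0.009, 0.04, 0.2, 0.7]))
```

The larger the extended assay library, the more hypotheses are tested at once, which is why such corrections become more consequential as libraries grow.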
Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas
2017-01-01
The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF a non-steady state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle we show that the neural network implicitly encodes the dictionary and can replace the matching process.
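For context, the dictionary matching step that the CNN is trained to replace is typically an exhaustive maximum-correlation search over simulated signal evolutions; a minimal sketch with illustrative array shapes and names (not the authors' code):

```python
import numpy as np

def match_dictionary(signals, dictionary, params):
    """
    signals:    (n_voxels, n_timepoints) measured MRF time courses
    dictionary: (n_atoms, n_timepoints) simulated signal evolutions
    params:     (n_atoms, 2) e.g. the (T1, T2) used to simulate each atom
    Returns the (T1, T2) of the best-matching dictionary atom per voxel.
    """
    # Normalize rows so the inner product acts as a correlation measure.
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    s = signals / np.linalg.norm(signals, axis=1, keepdims=True)
    best = np.argmax(np.abs(s @ d.T), axis=1)   # exhaustive search per voxel
    return params[best]
```

The appeal of the CNN alternative is that it maps a measured time course to parameter values in a fixed number of operations, avoiding a search whose cost grows with dictionary size.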
Thermodynamics and Mechanics of Membrane Curvature Generation and Sensing by Proteins and Lipids
Baumgart, Tobias; Capraro, Benjamin R.; Zhu, Chen; Das, Sovan L.
2014-01-01
Research investigating lipid membrane curvature generation and sensing is a rapidly developing frontier in membrane physical chemistry and biophysics. The fast recent progress is based on the discovery of a plethora of proteins involved in coupling membrane shape to cellular membrane function, the design of new quantitative experimental techniques to study aspects of membrane curvature, and the development of analytical theories and simulation techniques that allow a mechanistic interpretation of quantitative measurements. The present review first provides an overview of important classes of membrane proteins for which function is coupled to membrane curvature. We then survey several mechanisms that are assumed to underlie membrane curvature sensing and generation. Finally, we discuss relatively simple thermodynamic/mechanical models that allow quantitative interpretation of experimental observations. PMID:21219150
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-12
... INFORMATION: Title: Comparative Effectiveness Research Inventory. Abstract: The information collection... will not be used for quantitative information collections that are designed to yield reliably... mechanisms that are designed to yield quantitative results. The Agency received no comments in response to...
Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting
2014-01-01
To compare the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps, qualitative and quantitative CEUS indicators for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The regression equation generated from the qualitative CEUS indicators contained three indicators, namely enhanced homogeneity, diameter line expansion and peak intensity grading, and demonstrated a prediction accuracy for benign versus malignant breast tumor lumps of 91.8%; the regression equation generated from the quantitative indicators contained only one indicator, the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for the qualitative and quantitative analyses were 91.3% and 75.7%, respectively, a statistically significant difference by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is therefore better than that of quantitative analysis.
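A minimal sketch of the analysis pattern described, logistic regression on CEUS indicators followed by ROC evaluation, using scikit-learn; the design matrix and labels here are synthetic placeholders, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Illustrative design matrix: three ordinal qualitative CEUS indicators per
# lesion (enhanced homogeneity, diameter-line expansion, intensity grading).
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(73, 3)).astype(float)
y = rng.integers(0, 2, size=73)          # 0 = benign, 1 = malignant (synthetic)

model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"Area under the ROC curve: {auc:.3f}")
```

Comparing two such AUCs, as the study does with a Z test, then quantifies whether the qualitative and quantitative indicator sets differ in discriminative power.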
The impact of systemic cortical alterations on perception
NASA Astrophysics Data System (ADS)
Zhang, Zheng
2011-12-01
Perception is the process of transmitting and interpreting sensory information, and the primary somatosensory (SI) area in the human cortex is the main receptive area for the sensation of touch. The elaborate neuroanatomical connectivity that subserves neuronal communication between adjacent and near-adjacent regions within sensory cortex has been widely recognized as essential to normal sensory function. As a result, systemic cortical alterations that affect cortical regional interaction, as associated with many neurological disorders, are expected to have a significant impact on sensory perception. Recently, our research group developed a novel sensory diagnostic system that employs quantitative sensory testing methods and is able to non-invasively assess the health status of the central nervous system. The intent of this study is to utilize quantitative sensory testing methods designed to generate discriminable percepts in order to objectively and quantitatively assess the impact of different conditions on human sensory information processing capacity. Correlating human perception with observations from animal research enables a better understanding of the underlying neurophysiology of human perception, and additional findings in different subject populations provide valuable insight into the underlying mechanisms of the development and maintenance of different neurological diseases. Several protocols were designed and utilized during the course of the study, and this set of sensory-based perceptual metrics was employed to study the effects of different conditions (non-noxious thermal stimulation, chronic pain stage, and normal aging) on sensory perception. These conditions were found to produce significant deviations of the subjects' tactile information processing capacities from normal values. Although the observed shift in sensory detection sensitivity could be a result of enhanced peripheral activity, the changes in the effects of adaptation most likely reflect changes in the central nervous system. The findings in this work provide valuable information for better understanding the underlying mechanisms involved in the development and maintenance of different neurological conditions.
Ceol, M; Forino, M; Gambaro, G; Sauer, U; Schleicher, E D; D'Angelo, A; Anglani, F
2001-01-01
Gene expression can be examined with different techniques, including the ribonuclease protection assay (RPA), in situ hybridisation (ISH), and quantitative reverse transcription-polymerase chain reaction (RT/PCR). These methods differ considerably in their sensitivity and precision in detecting and quantifying low-abundance mRNA. Although there is evidence that RT/PCR can be performed in a quantitative manner, the quantitative capacity of this method is generally underestimated. To demonstrate that the comparative kinetic RT/PCR strategy, which uses a housekeeping gene as an internal standard, is a quantitative method for detecting significant differences in mRNA levels between different samples, the inhibitory effect of heparin on phorbol 12-myristate 13-acetate (PMA)-induced TGF-beta1 mRNA expression was evaluated by RT/PCR and by RPA, the standard method of mRNA quantification, and the results were compared. The reproducibility of RT/PCR amplification was calculated by comparing the quantities of G3PDH and TGF-beta1 PCR products, generated during the exponential phases, estimated from two different RT/PCR runs (G3PDH, r = 0.968, P = 0.0000; TGF-beta1, r = 0.966, P = 0.0000). The quantitative capacity of comparative kinetic RT/PCR was demonstrated by comparing the results obtained from RPA and RT/PCR using linear regression analysis. Starting from the same RNA extraction, but using only 1% of the RNA for RT/PCR compared with RPA, a significant correlation was observed (r = 0.984, P = 0.0004). Moreover, morphometric analysis of the ISH signal was applied for the semi-quantitative evaluation of the expression and localisation of TGF-beta1 mRNA in the entire cell population. Our results demonstrate the close similarity of the RT/PCR and RPA methods in giving quantitative information on mRNA expression and indicate that comparative kinetic RT/PCR can be adopted as a reliable quantitative method of mRNA analysis. Copyright 2001 Wiley-Liss, Inc.
76 FR 13018 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. Total Burden Estimate for the...
75 FR 81665 - Notice of Intent to Seek Approval to Reinstate an Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... are both quantitative and descriptive. Quantitative information from the most recently completed... activities with respect to industrial collaboration [cir] Conducting a survey of all center participants to probe the participant satisfaction with center activities [cir] Compiling a set of quantitative...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-31
... quantitative information regarding expected reductions in emissions of CO 2 or fuel consumption as a result of... encouraged to provide quantitative information that validates the existence of substantial transportation... quantitative and qualitative measures. Therefore, applicants for TIGER Discretionary Grants are generally...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-01
... provide quantitative information regarding expected reductions in emissions of CO 2 or fuel consumption as... provide quantitative information that validates the existence of substantial transportation-related costs... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...
76 FR 52383 - Reports, Forms, and Recordkeeping Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-22
... OMB: Title: 49 CFR 575--Consumer Information Regulations (sections 103 and 105) Quantitative Research... research and is now requesting to conduct follow- up quantitative research with consumers to assess current.... The results of that research phase were used to inform the quantitative phase of research which this...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... insights on perceptions and opinions, but are not statistical surveys that yield quantitative results that.... This type of generic clearance for qualitative information will not be used for quantitative... for submission for other generic mechanisms that are designed to yield quantitative results. The...
Burns, Darren K; Jones, Andrew P; Suhrcke, Marc
2016-03-01
Markets throughout the world have been reducing barriers to international trade and investment in recent years. The resulting increases in levels of international trade and investment have subsequently generated research interest into the potential population health impact. We present a systematic review of quantitative studies investigating the relationship between international trade, foreign direct investment and non-nutritional health outcomes. Articles were systematically collected from the SCOPUS, PubMed, EconLit and Web of Science databases. Due to the heterogeneous nature of the evidence considered, the 16 included articles were subdivided into individual level data analyses, selected country analyses and international panel analyses. Articles were then quality assessed using a tool developed as part of the project. Nine of the studies were assessed to be high quality, six as medium quality, and one as low quality. The evidence from the quantitative literature suggests that overall, there appears to be a beneficial association between international trade and population health. There was also evidence of the importance of foreign direct investment, yet a lack of research considering the direction of causality. Taken together, quantitative research into the relationship between trade and non-nutritional health indicates trade to be beneficial, yet this body of research is still in its infancy. Future quantitative studies based on this foundation will provide a stronger basis on which to inform relevant national and international institutions about the health consequences of trade policies. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luria, Paolo; Aspinall, Peter A
2003-08-01
The aim of this paper is to describe a new approach to major industrial hazard assessment, recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, quantitative risk analysis (QRA), only provided a list of individual quantitative risk values related to single locations. The experimental model is based on a multi-criteria approach, the Analytic Hierarchy Process (AHP), which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative risk-assessment data from a series of qualitative assessments of the present situation and of three future scenarios, and the use of this information as indirect quantitative measures that could be aggregated to obtain a global risk rate. This approach is in line with the main concepts proposed by the latest European directive on major hazard accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
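A minimal sketch of the AHP computation at the heart of such an approach: derive priority weights from a pairwise comparison matrix via its principal eigenvector and check consistency. The criteria and matrix values below are illustrative, not taken from the Porto Marghera study.

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical risk criteria,
# on Saaty's 1-9 scale (A[i, j] = importance of criterion i relative to j).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                  # priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index
print(weights, "CR =", ci / ri)           # CR < 0.1 is conventionally acceptable
```

Aggregating expert judgments into such matrices, and then combining the resulting weights across scenarios, is what turns qualitative assessments into the indirect quantitative measures described above.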
Integration of PKPD relationships into benefit–risk analysis
Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar
2015-01-01
Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support the evidence synthesis as well evidence generation taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398
Proteomic Data Resources for EDRN Ovary Cancer Researchers within the EDRN — EDRN Public Portal
This project will generate a highly valuable data resource and make it available to all EDRN ovarian cancer researchers. The resource will include comprehensive proteomic (tandem mass spectrometry, MS/MS) data generated from plasma samples collected between four months and four years prior to clinical detection of ovarian cancer. These pre-clinical samples, provided by the Beta Carotene and Retinol Efficacy Trial (CARET) prospective study, will be interrogated using IPAS, the proteomic profiling method developed by the Hanash laboratory, together with the quantitative methods developed by the McIntosh laboratory. In addition, we will combine these pre-clinical data with already completed IPAS interrogations of plasma collected at the time of ovarian cancer diagnosis. Together, these data will thus provide information on both the pre-clinical and clinical behavior of a large number of proteins. Based on our preliminary work, we are able to quantify over 500 plasma proteins in each of these experiments, many of which are putative ovarian cancer biomarkers, showing that the platform is capable of providing useful information regarding biomarker candidates.
MOCASSIN-prot: a multi-objective clustering approach for protein similarity networks.
Keel, Brittney N; Deng, Bo; Moriyama, Etsuko N
2018-04-15
Proteins often include multiple conserved domains. Various evolutionary events, including duplication and loss of domains, domain shuffling, and sequence divergence, contribute to generating complexity in protein structures and, consequently, in their functions. The evolutionary history of proteins is hence best modeled through networks that incorporate information from both sequence divergence and domain content. Here, a game-theoretic approach proposed for protein network construction is adapted into the framework of multi-objective optimization and extended to incorporate a clustering refinement procedure. The new method, MOCASSIN-prot, was applied to cluster multi-domain proteins from ten genomes. The performance of MOCASSIN-prot was compared against two protein clustering methods, Markov clustering (TRIBE-MCL) and spectral clustering (SCPS). We showed that, compared to these two methods, MOCASSIN-prot, which uses both domain composition and quantitative sequence similarity information, generates fewer false positives. It achieves more functionally coherent protein clusters and better differentiates protein families. MOCASSIN-prot, implemented in Perl and Matlab, is freely available at http://bioinfolab.unl.edu/emlab/MOCASSINprot. Contact: emoriyama2@unl.edu. Supplementary data are available at Bioinformatics online.
Progress and opportunities in EELS and EDS tomography.
Collins, Sean M; Midgley, Paul A
2017-09-01
Electron tomography using energy loss and X-ray spectroscopy in the electron microscope continues to develop in rapidly evolving and diverse directions, enabling new insight into the three-dimensional chemistry and physics of nanoscale volumes. Progress has been made recently in improving reconstructions from EELS and EDS signals in electron tomography by applying compressed sensing methods, characterizing new detector technologies in detail, deriving improved models of signal generation, and exploring machine learning approaches to signal processing. These disparate threads can be brought together in a cohesive framework in terms of a model-based approach to analytical electron tomography. Models incorporate information on signal generation and detection as well as prior knowledge of structures in the spectrum image data. Many recent examples illustrate the flexibility of this approach and its feasibility for addressing challenges in non-linear or limited signals in EELS and EDS tomography. Further work in combining multiple imaging and spectroscopy modalities, developing synergistic data acquisition, processing, and reconstruction approaches, and improving the precision of quantitative spectroscopic tomography will expand the frontiers of spatial resolution, dose limits, and maximal information recovery. Copyright © 2017 Elsevier B.V. All rights reserved.
Prospects of second generation artificial intelligence tools in calibration of chemical sensors.
Braibanti, Antonio; Rao, Rupenaguntla Sambasiva; Ramam, Veluri Anantha; Rao, Gollapalli Nageswara; Rao, Vaddadi Venkata Panakala
2005-05-01
Multivariate data-driven calibration models based on neural networks (NNs) are developed for binary (Cu++ and Ca++) and quaternary (K+, Ca++, NO3- and Cl-) ion-selective electrode (ISE) data. The response profiles of ISEs with concentration are non-linear and sub-Nernstian. This task represents function approximation of multivariate, multi-response, correlated, non-linear data with unknown noise structure, i.e., multi-component calibration/prediction in chemometric parlance. Radial basis function (RBF) and Fuzzy-ARTMAP NN models implemented in the software packages TRAJAN and Professional II are employed for the calibration. The optimum NN models reported are based on residuals in concentration space. Being a data-driven information technology, an NN does not require a model, a prior or posterior distribution of the data, or a noise structure. Missing information, spikes or newer trends in different concentration ranges can be modeled through novelty detection. Two simulated data sets generated from mathematical functions are modeled as a function of the number of data points and of network parameters such as the number of neurons and nearest neighbors. The success of RBF and Fuzzy-ARTMAP NNs in developing adequate calibration models for the experimental data and function approximation models for the more complex simulated data sets establishes AI2 (artificial intelligence, 2nd generation) as a promising technology for quantitation.
Household hazardous waste management: a review.
Inglezakis, Vassilis J; Moustakas, Konstantinos
2015-03-01
This paper deals with the waste stream of household hazardous waste (HHW), presenting existing management systems, an overview of legislation, and other relevant quantitative and qualitative information. European Union legislation and international management schemes are summarized and presented in a concise manner through diagrams in order to provide crucial information on HHW. Furthermore, sources and types, numerical figures on generation and collection, and relevant management costs are within the scope of the present paper. The review shows that the term used to refer to hazardous waste generated in households is not clearly defined in legislation, and there is an absence of specific acts regulating the management of HHW. The lack of any obligation to segregate HHW from household waste, together with the differing terminology used, makes it difficult to determine the quantities and composition of this waste stream, while the amount generated is relatively small and is therefore commonly overlooked in waste statistics. The paper aims to cover the gap in the related literature on a subject that is among the crucial waste management challenges at the world level, considering that HHW can also have an impact on other waste streams by altering redox conditions or causing direct reactions with other non-hazardous waste substances. Copyright © 2014 Elsevier Ltd. All rights reserved.
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2014 CFR
2014-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2011 CFR
2011-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2010 CFR
2010-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2012 CFR
2012-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2013 CFR
2013-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
Calibrating genomic and allelic coverage bias in single-cell sequencing.
Zhang, Cheng-Zhong; Adalsteinsson, Viktor A; Francis, Joshua; Cornils, Hauke; Jung, Joonil; Maire, Cecile; Ligon, Keith L; Meyerson, Matthew; Love, J Christopher
2015-04-16
Artifacts introduced in whole-genome amplification (WGA) make it difficult to derive accurate genomic information from single-cell genomes and require different analytical strategies from bulk genome analysis. Here, we describe statistical methods to quantitatively assess the amplification bias resulting from whole-genome amplification of single-cell genomic DNA. Analysis of single-cell DNA libraries generated by different technologies revealed universal features of the genome coverage bias predominantly generated at the amplicon level (1-10 kb). The magnitude of coverage bias can be accurately calibrated from low-pass sequencing (∼0.1×) to predict the depth-of-coverage yield of single-cell DNA libraries sequenced at arbitrary depths. We further provide a benchmark comparison of single-cell libraries generated by multiple displacement amplification (MDA) and multiple annealing and looping-based amplification cycles (MALBAC). Finally, we develop statistical models to calibrate allelic bias in single-cell whole-genome amplification and demonstrate a census-based strategy for efficient and accurate variant detection from low-input biopsy samples.
77 FR 75498 - Request for Comments on a New Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-20
... statistical surveys that yield quantitative results that can be generalized to the population of study. DATES... surveys that yield quantitative results that can be generalized to the population of study. This feedback... qualitative information will not be used for quantitative information collections that are designed to yield...
78 FR 57903 - Notice of Intent To Seek Approval To Renew an Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-20
.... The indicators are both quantitative and descriptive. Quantitative information from the most recently... center activities with respect to industrial collaboration. [cir] Conducting a survey of all center... quantitative indicators determined by NSF to analyze the management and operation of the center. [cir...
An Apollo compatible cloud physics experiment.
NASA Technical Reports Server (NTRS)
Eaton, L. R.; Hollinden, A. B.; Satterblom, P. R.
1973-01-01
Consideration of the utilization of a low-gravity environment to obtain experimental information, in the area of cloud microphysics, which cannot be obtained in ground laboratories. The experiment discussed is designed to obtain quantitative answers about evaporation and breakup of salt particles from ocean spray and other sources. In addition to salt nuclei distribution mechanisms, this breakup has ecological importance in relation to the spreading of salt mists from salted highways and spreading of brine cooling tower spray from electrical power generation plants. This experiment is being submitted for consideration on the Apollo-Soyuz Test Program in 1975.
Quantitative polarized light microscopy using spectral multiplexing interferometry.
Li, Chengshuai; Zhu, Yizheng
2015-06-01
We propose an interferometric spectral multiplexing method for measuring birefringent specimens with a simple configuration and high sensitivity. The retardation and orientation of sample birefringence are simultaneously encoded onto two spectral carrier waves, generated interferometrically by a birefringent crystal through polarization mixing. A single interference spectrum hence contains sufficient information for birefringence determination, eliminating the need for mechanical rotation or electrical modulation. The technique is analyzed theoretically and validated experimentally on cellulose film. The simplicity of the system makes it possible to mitigate the system birefringence background. Further analysis demonstrates the technique's exquisite sensitivity, as high as ∼20 pm for retardation measurement.
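For reference, the retardation being measured relates birefringence and sample thickness through the standard expressions below (general polarization optics, not a formula quoted from this paper); the ∼20 pm figure refers to the optical path difference Γ.

```latex
% Retardance (optical path difference) of a birefringent layer of
% thickness d and birefringence \Delta n, and the corresponding phase:
\Gamma = \Delta n \, d, \qquad \delta = \frac{2\pi}{\lambda}\,\Delta n \, d
```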
A literature review of studies using qualitative research to explore chronic neuromuscular disease.
LaDonna, Kori A
2011-06-01
Although most neuromuscular disease research articles reflect traditional quantitative approaches, qualitative methods are becoming more prevalent in the neuromuscular literature. Arguably, qualitative research provides rich data that may be used to generate patient-centered outcome measures or influence current standards of care. The purpose of this article is to explore the qualitative literature pertaining to individuals and families living with chronic neuromuscular disease in order to suggest implications for practice. Fifty-six qualitative articles addressing seven research themes including Illness Experience; Work, Recreation, and Services; Assisted Ventilation; Caregiving; Genetics; Communication and Information Seeking; and Palliative Care were identified.
Fostering synergy between cell biology and systems biology.
Eddy, James A; Funk, Cory C; Price, Nathan D
2015-08-01
In the shared pursuit of elucidating detailed mechanisms of cell function, systems biology presents a natural complement to ongoing efforts in cell biology. Systems biology aims to characterize biological systems through integrated and quantitative modeling of cellular information. The process of model building and analysis provides value through synthesizing and cataloging information about cells and molecules, predicting mechanisms and identifying generalizable themes, generating hypotheses and guiding experimental design, and highlighting knowledge gaps and refining understanding. In turn, incorporating domain expertise and experimental data is crucial for building towards whole cell models. An iterative cycle of interaction between cell and systems biologists advances the goals of both fields and establishes a framework for mechanistic understanding of the genome-to-phenome relationship. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.
1989-01-01
A charged particle spectrometer for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two-dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high-level signals, such as SIMS signals, directly to a transient recorder, and for routing signals through a charge amplifier to the transient recorder in a low-level-signal pulse-counting mode, such as for a neutral atomic component ionization mode.
NASA Technical Reports Server (NTRS)
Leser, William P.; Yuan, Fuh-Gwo; Leser, William P.
2013-01-01
A method of numerically estimating dynamic Green's functions using the finite element method is proposed. These Green's functions are accurate in a limited frequency range dependent on the mesh size used to generate them. This range can often match or exceed the frequency sensitivity of the traditional acoustic emission sensors. An algorithm is also developed to characterize an acoustic emission source by obtaining information about its strength and temporal dependence. This information can then be used to reproduce the source in a finite element model for further analysis. Numerical examples are presented that demonstrate the ability of the band-limited Green's functions approach to determine the moment tensor coefficients of several reference signals to within seven percent, as well as accurately reproduce the source-time function.
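The source characterization step rests on the standard representation theorem, in which recorded displacement is the temporal convolution of the moment tensor source-time function with spatial derivatives of the Green's function, here with the band-limited, FEM-estimated Green's functions substituted in; this is the general relation, not a formula quoted from the paper.

```latex
% Displacement component i at sensor position x from a point source at xi
% (* denotes temporal convolution; G_{ij,k} is the derivative of the
% Green's function with respect to source coordinate xi_k):
u_i(\mathbf{x}, t) = M_{jk}(t) \ast G_{ij,k}(\mathbf{x}, \boldsymbol{\xi}, t)
```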
Ciniciato, Gustavo P. M. K.; Ng, Fong-Lee; Phang, Siew-Moi; Jaafar, Muhammad Musoddiq; Fisher, Adrian C.; Yunus, Kamran; Periasamy, Vengadesh
2016-01-01
Microbial fuel cells operating with autotrophic microorganisms are known as biophotovoltaic devices. They represent a great opportunity for environmentally friendly power generation using the energy of sunlight. The efficiency of electricity generation in this novel system is, however, low, which partly reflects the poor understanding of the bioelectrochemical mechanisms behind the electron transfer from these microorganisms to the electrode surface. In this work, we propose a combination of electrochemical and fluorescence techniques, with emphasis on pulse amplitude modulation fluorescence. The combination of these two techniques allows us to obtain information that can assist in understanding the electrical response, by relating the electricity generated to intrinsic properties of photosynthetic efficiency derived from the emitted fluorescence. This was achieved quantitatively through observed changes in four photosynthetic parameters while the bioanode was generating electricity: the maximum quantum yield (Fv/Fm), alpha (α), the light saturation coefficient (Ek), and the maximum rate of electron transfer (rETRm). The relationship between increases in the current density collected by the bioanode and decreases in rETRm values in the photosynthetic pathways of the two microorganisms is also discussed. PMID:27502051
Analysis of tonal noise generating mechanisms in low-speed axial-flow fans
NASA Astrophysics Data System (ADS)
Canepa, Edward; Cattanei, Andrea; Zecchin, Fabio Mazzocut
2016-08-01
The present paper reports a comparison of experimental SPL spectral data related to the tonal noise generated by axial-flow fans. A nine-blade rotor has been operated at free discharge conditions in four geometrical configurations in which different kinds of tonal noise generating mechanisms are present: large-scale inlet turbulent structures, tip-gap flow, turbulent wakes, and rotor-stator interaction. The measurements have been taken in a hemi-anechoic chamber at constant rotational speed and, in order to vary the acoustic source strength, during linear speed ramps with low angular acceleration. In order to avoid the erroneous quantitative evaluations that arise when acoustic propagation effects are not considered, the acoustic response functions of the different test configurations have been computed by means of the spectral decomposition method. The properties of the tonal noise generating mechanisms have then been studied by comparing the constant-Strouhal-number SPL, obtained from measurements taken during the speed ramps, with the propagation function. Finally, analysis of the phase of the acoustic pressure has made it possible to distinguish between random and deterministic tonal noise generating mechanisms and to collect information about the presence of important propagation effects.
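The constant-Strouhal-number tracking exploits the standard aeroacoustic scaling below, with D a characteristic length such as the rotor diameter and U a characteristic speed proportional to the rotational speed Ω: spectral features of aerodynamic origin shift in frequency with Ω, while propagation effects stay fixed in frequency, which is what allows the two to be separated. This is a general scaling argument, not a formula quoted from the paper.

```latex
% Strouhal scaling: at constant St, frequency tracks rotational speed.
\mathrm{St} = \frac{f D}{U}, \qquad U \propto \Omega
\;\Longrightarrow\; f\big|_{\mathrm{St}=\text{const}} \propto \Omega
```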
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
... research to explore issues of quantitative benefit information. They all described the collection of data... research will involve quantitative assessment of the comprehension of important information in the document... of experiences and varying degrees of satisfaction with information currently provided at the time...
Analysis of a document/reporting system
NASA Technical Reports Server (NTRS)
Narrow, B.
1971-01-01
An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. It is believed that this is the first documented study which utilizes quantitative measures for full-scale system analysis. The quantitative measures and the techniques for collecting and qualifying the basic data, as described, are applicable to any information system. This report is therefore considered to be of interest to any persons concerned with the management, design, analysis, or evaluation of information systems.
Motion-guided attention promotes adaptive communications during social navigation.
Lemasson, B H; Anderson, J J; Goodwin, R A
2013-03-07
Animals are capable of enhanced decision making through cooperation, whereby accurate decisions can occur quickly through decentralized consensus. These interactions often depend upon reliable social cues, which can result in highly coordinated activities in uncertain environments. Yet information within a crowd may be lost in translation, generating confusion and enhancing individual risk. As quantitative data detailing animal social interactions accumulate, the mechanisms enabling individuals to rapidly and accurately process competing social cues remain unresolved. Here, we model how motion-guided attention influences the exchange of visual information during social navigation. We also compare the performance of this mechanism to the hypothesis that robust social coordination requires individuals to numerically limit their attention to a set of n-nearest neighbours. While we find that such numerically limited attention does not generate robust social navigation across ecological contexts, several notable qualities arise from selective attention to motion cues. First, individuals can instantly become a local information hub when startled into action, without requiring changes in neighbour attention level. Second, individuals can circumvent speed-accuracy trade-offs by tuning their motion thresholds. In turn, these properties enable groups to collectively dampen or amplify social information. Lastly, the minority required to sway a group's short-term directional decisions can change substantially with social context. Our findings suggest that motion-guided attention is a fundamental and efficient mechanism underlying collaborative decision making during social navigation.
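A minimal agent-based sketch contrasting the two attention rules under comparison, motion-thresholded attention versus a fixed set of n-nearest neighbours; the update rule, parameter values, and function shape are illustrative assumptions, not the authors' model:

```python
import numpy as np

def heading_update(pos, vel, i, mode="motion", n_nearest=4, v_thresh=1.5):
    """Return agent i's new unit heading from the neighbours it attends to."""
    d = np.linalg.norm(pos - pos[i], axis=1)
    d[i] = np.inf                                  # never attend to self
    speeds = np.linalg.norm(vel, axis=1)
    if mode == "motion":
        # Attend only to neighbours whose speed exceeds a motion threshold,
        # so a startled (fast-moving) individual instantly becomes salient.
        attended = np.flatnonzero(speeds > v_thresh)
        attended = attended[attended != i]
    else:
        attended = np.argsort(d)[:n_nearest]       # numerically limited attention
    if attended.size == 0:
        return vel[i] / np.linalg.norm(vel[i])     # keep current heading
    mean_v = vel[attended].mean(axis=0)
    return mean_v / np.linalg.norm(mean_v)
```

In this reading, tuning v_thresh adjusts how readily a fast-moving neighbour captures attention, which is one way to interpret the speed-accuracy tuning described above.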
Robles, Estuardo
2017-09-01
In no vertebrate species do we possess an accurate, comprehensive tally of neuron types in the brain. This is in no small part due to the vast diversity of neuronal types that comprise complex vertebrate nervous systems. A fundamental goal of neuroscience is to construct comprehensive catalogs of cell types defined by structure, connectivity, and physiological response properties. This type of information will be invaluable for generating models of how assemblies of neurons encode and distribute sensory information and correspondingly alter behavior. This review summarizes recent efforts in the larval zebrafish to construct sensory projectomes, comprehensive analyses of axonal morphologies in sensory axon tracts. Focusing on the olfactory and optic tract, these studies revealed principles of sensory information processing in the olfactory and visual systems that could not have been directly quantified by other methods. In essence, these studies reconstructed the optic and olfactory tract in a virtual manner, providing insights into patterns of neuronal growth that underlie the formation of sensory axon tracts. Quantitative analysis of neuronal diversity revealed organizing principles that determine information flow through sensory systems in the zebrafish that are likely to be conserved across vertebrate species. The generation of comprehensive cell type classifications based on structural, physiological, and molecular features will lead to testable hypotheses on the functional role of individual sensory neuron subtypes in controlling specific sensory-evoked behaviors.
Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface
2017-02-01
Report ECBC-TR-1426, by Vipin Rastogi. Recoverable excerpt (Section 2.4, Experimental Design): "Each quantitative method was performed three times on three consecutive days. For the CD runs, three..."
Vera-Candioti, Luciana; Culzoni, María J; Olivieri, Alejandro C; Goicoechea, Héctor C
2008-11-01
Drug monitoring in serum samples was performed using second-order data generated by CE-DAD and processed with a suitable chemometric strategy. Carbamazepine could be accurately quantitated in the presence of its main metabolite (carbamazepine epoxide), other therapeutic drugs (lamotrigine, phenobarbital, phenytoin, phenylephrine, ibuprofen, acetaminophen, theophylline, caffeine, acetyl salicylic acid), and additional serum endogenous components. The analytical strategy consisted of the following steps: (i) serum sample clean-up to remove matrix interferences, (ii) data pre-processing, in order to reduce the background and to correct for electrophoretic time shifts, and (iii) resolution of fully overlapped CE peaks (corresponding to carbamazepine, its metabolite, lamotrigine and unexpected serum components) by the well-known multivariate curve resolution-alternating least squares (MCR-ALS) algorithm, which extracts quantitative information that can be uniquely ascribed to the analyte of interest. The analyte concentration in serum samples ranged from 2.00 to 8.00 mg/L. Mean recoveries were 102.6% (s=7.7) for binary samples and 94.8% (s=13.5) for spiked serum samples, while a CV (%) of 4.0 was computed for five replicates, indicating the acceptable accuracy and precision of the proposed method.
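A minimal sketch of the MCR-ALS core relied on in step (iii): alternately solve for concentration and spectral profiles under non-negativity, given the bilinear model D ≈ C Sᵀ. Initialization and constraints in real applications are more elaborate than shown here.

```python
import numpy as np

def mcr_als(D, S0, n_iter=50):
    """
    D:  (n_times, n_wavelengths) CE-DAD data matrix
    S0: (n_wavelengths, n_components) initial spectral estimates
    Returns C (electropherogram profiles) and S (spectra) with D ~= C @ S.T.
    """
    S = S0.copy()
    for _ in range(n_iter):
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T  # solve D ~= C S^T for C
        C = np.clip(C, 0, None)                       # non-negativity constraint
        S = np.linalg.lstsq(C, D, rcond=None)[0].T    # solve for S given C
        S = np.clip(S, 0, None)
    return C, S
```

Resolving the overlapped peaks into component profiles is what lets the analyte's contribution be quantified even when unexpected serum components co-migrate.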
A web-based quantitative signal detection system on adverse drug reaction in China.
Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan
2009-07-01
To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reporting to the Guangdong province drug-monitoring database in China, a web-based system with three software modules was programmed, using the Microsoft Visual Basic and Active Server Pages programming languages with SQL Server 2000, to perform data preparation and association detection and to generate reports. The information component (IC), an internationally recognized disproportionality measure for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected from 1 January 2002 to 30 June 2007 in Guangdong. A total of 2,496 associations, including known signals, were mined from the test database. Signals (e.g., cefradine-induced hematuria) were detected early using the IC analysis. In addition, 291 drug-ADR associations were alerted for the first time in the second quarter of 2007. The system can be used to detect significant associations from the Guangdong drug-monitoring database and, for the first time in China, could serve as an extremely useful adjunct to the expert assessment of very large numbers of spontaneously reported ADRs.
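The information component is a shrunk log ratio of observed to expected drug-ADR report counts; one common formulation is sketched below (the exact shrinkage constants used in the Guangdong system are not stated in the abstract).

```python
import numpy as np

def information_component(n_ij, n_i, n_j, n):
    """
    n_ij: reports containing both drug i and reaction j
    n_i:  reports containing drug i;  n_j: reports containing reaction j
    n:    total reports in the database
    """
    expected = n_i * n_j / n                          # count under independence
    return np.log2((n_ij + 0.5) / (expected + 0.5))   # +0.5 shrinks sparse cells

# Example: 12 observed drug-ADR reports where ~4.8 would be expected.
print(information_component(12, 400, 300, 25000))     # positive IC -> signal
```

A persistently positive IC (typically judged via the lower bound of its credibility interval) flags a drug-ADR pair for expert review.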
Araneda, Cristian; Díaz, Nelson F.; Gomez, Gilda; López, María Eugenia; Iturra, Patricia
2012-01-01
Spawning time in salmonids is a sex-limited quantitative trait that can be modified by selection. In rainbow trout (Oncorhynchus mykiss), various quantitative trait loci (QTL) that affect the expression of this trait have been discovered. In this study, we describe four microsatellite loci associated with two possible spawning time QTL regions in coho salmon (Oncorhynchus kisutch). The four loci were identified in females from two populations (early and late spawners) produced by divergent selection from the same base population. Three of the loci (OmyFGT34TUF, One2ASC and One19ASC) that were strongly associated with spawning time in coho salmon (p < 0.0002) were previously associated with QTL for the same trait in rainbow trout; a fourth locus (Oki10) with a suggestive association (p = 0.00035) mapped 10 cM from locus OmyFGT34TUF in rainbow trout. The changes in allelic frequency observed after three generations of selection were greater than expected from genetic drift alone. This work shows that comparing information from closely related species is a valid strategy for identifying QTLs for marker-assisted selection in species whose genomes are poorly characterized or lack a saturated genetic map. PMID:22888302
Naik, P K; Singh, T; Singh, H
2009-07-01
Quantitative structure-activity relationship (QSAR) analyses were performed independently on data sets belonging to two groups of insecticides, namely the organophosphates and carbamates. Several types of descriptors, including topological, spatial, thermodynamic, information-content, lead-likeness and E-state indices, were used to derive quantitative relationships between insecticide activities and structural properties of chemicals. A systematic search approach based on missing-value, zero-value, simple-correlation and multi-collinearity tests, as well as the use of a genetic algorithm, allowed the optimal selection of the descriptors used to generate the models. The QSAR models developed for both organophosphate and carbamate groups revealed good predictability, with r(2) values of 0.949 and 0.838 as well as [image omitted] values of 0.890 and 0.765, respectively. In addition, a linear correlation was observed between the predicted and experimental LD(50) values for the test set data, with r(2) of 0.871 and 0.788 for the organophosphate and carbamate groups, indicating that the prediction accuracy of the QSAR models was acceptable. The models were also tested successfully against external validation criteria. The QSAR models developed in this study should help the further design of novel potent insecticides.
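As context for the descriptor-screening sequence described above, here is a minimal sketch of such a filtering pipeline. The thresholds, the pandas layout, and the synthetic data are illustrative assumptions, not the study's settings.

```python
import numpy as np
import pandas as pd
from itertools import combinations

def filter_descriptors(X: pd.DataFrame, y: pd.Series,
                       max_missing=0.1, min_corr=0.1, max_inter_corr=0.9):
    """Screening of the kind described above: missing-value, zero-value,
    simple-correlation and multi-collinearity tests (illustrative thresholds)."""
    X = X.loc[:, X.isna().mean() <= max_missing]      # missing-value test
    X = X.loc[:, (X.fillna(0) != 0).any()]            # zero-value test (drop all-zero)
    corr_y = X.apply(lambda c: abs(np.corrcoef(c.fillna(c.mean()), y)[0, 1]))
    X = X.loc[:, corr_y >= min_corr]                  # simple-correlation test
    drop = set()
    for a, b in combinations(X.columns, 2):           # multi-collinearity test
        if a in drop or b in drop:
            continue
        if abs(np.corrcoef(X[a].fillna(0), X[b].fillna(0))[0, 1]) > max_inter_corr:
            drop.add(b if corr_y[a] >= corr_y[b] else a)  # keep the better predictor
    return X.drop(columns=list(drop))

# Synthetic usage: 8 descriptors, of which d0 drives the activity.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(40, 8)), columns=[f"d{i}" for i in range(8)])
y = X["d0"] * 2 + rng.normal(0, 0.5, 40)
print(filter_descriptors(X, y).columns.tolist())
```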
Branham, Sandra E; Stansell, Zachary J; Couillard, David M; Farnham, Mark W
2017-03-01
Five quantitative trait loci and one epistatic interaction were associated with heat tolerance in a doubled haploid population of broccoli evaluated in three summer field trials. Predicted rising global temperatures due to climate change have generated a demand for crops that are resistant to yield and quality losses from heat stress. Broccoli (Brassica oleracea var. italica) is a cool weather crop with high temperatures during production decreasing both head quality and yield. Breeding for heat tolerance in broccoli has potential to both expand viable production areas and extend the growing season but breeding efficiency is constrained by limited genetic information. A doubled haploid (DH) broccoli population segregating for heat tolerance was evaluated for head quality in three summer fields in Charleston, SC, USA. Multiple quantitative trait loci (QTL) mapping of 1,423 single nucleotide polymorphisms developed through genotyping-by-sequencing identified five QTL and one positive epistatic interaction that explained 62.1% of variation in heat tolerance. The QTL identified here can be used to develop markers for marker-assisted selection and to increase our understanding of the molecular mechanisms underlying plant response to heat stress.
NASA Astrophysics Data System (ADS)
Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.
2004-08-01
The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of the individual and combined effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. To facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help address JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.
75 FR 18571 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-12
... Comprehensive Quantitative Impact Study.'' The OCC has also given notice that it has sent this collection to OMB... following new information collection: Title: Basel Comprehensive Quantitative Impact Study. OMB Control No... the Basel II Capital Accord, the Basel Committee will conduct a quantitative impact study (QIS) to...
78 FR 68450 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... hours. Key informant interviews and the quantitative survey will be conducted by telephone. As telephone... qualitative and quantitative data in order to develop and refine the Tool, and assess feasibility and audience... collection will be used to help inform a quantitative stage of work to include a national sample of...
Visualizing the Critique: Integrating Quantitative Reasoning with the Design Process
ERIC Educational Resources Information Center
Weinstein, Kathryn
2017-01-01
In the age of "Big Data," information is often quantitative in nature. The ability to analyze information through the sifting of data has been identified as a core competency for success in navigating daily life and participation in the contemporary workforce. This skill, known as Quantitative Reasoning (QR), is characterized by the…
Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces
2012-03-01
with Remotely Piloted Aircraft (RPA) has resulted in the need for a platform to evaluate interface design. The Vigilant Spirit Control Station (VSCS)... Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and... time of the original VSCS interface. These results revealed the effectiveness of the tool and demonstrated in the design of future generation
openBIS: a flexible framework for managing and analyzing complex data in biology research
2011-01-01
Background Modern data generation techniques used in distributed systems biology research projects often create datasets of enormous size and diversity. We argue that in order to overcome the challenge of managing those large quantitative datasets and maximise the biological information extracted from them, a sound information system is required. Ease of integration with data analysis pipelines and other computational tools is a key requirement for it. Results We have developed openBIS, an open source software framework for constructing user-friendly, scalable and powerful information systems for data and metadata acquired in biological experiments. openBIS enables users to collect, integrate, share, and publish data, and to connect to data processing pipelines. This framework can be extended and has been customized for different data types acquired by a range of technologies. Conclusions openBIS is currently being used by several SystemsX.ch and EU projects applying mass spectrometric measurements of metabolites and proteins, High Content Screening, or Next Generation Sequencing technologies. The attributes that make it interesting to a large research community involved in systems biology projects include versatility, simplicity in deployment, scalability to very large datasets, flexibility to handle any biological data type, and extensibility to the needs of any research domain. PMID:22151573
Robinson-Papp, Jessica; George, Mary Catherine; Wongmek, Arada; Nmashie, Alexandra; Merlin, Jessica S; Ali, Yousaf; Epstein, Lawrence; Green, Mark; Serban, Stelian; Sheth, Parag; Simpson, David M
2015-09-01
The extent to which patients take chronic pain medications as prescribed is not well studied, and there are no generally agreed-upon measures. The Quantitative Analgesic Questionnaire (QAQ) is a new instrument designed to comprehensively document patient-reported medication use, generate scores to quantify it (by individual drug, class, and/or overall), and compare it (qualitatively and/or quantitatively) to the regimen as prescribed. The aim of this study was to describe the development and preliminary validation of the QAQ. The QAQ was studied in a convenience sample of 149 HIV-infected participants. We found that the QAQ scores computed for participants' chronic pain medication regimens were valid based on their correlation with 1) patient-reported pain intensity (r = 0.38; P < 0.001) and 2) experienced pain management physicians' independent quantification of the regimens (r = 0.89; P < 0.001). The QAQ also demonstrated high interrater reliability (r = 0.957; P < 0.001). Detailed examination of the QAQ data in a subset of 34 participants demonstrated that the QAQ revealed suboptimal adherence in 44% of participants and contained information that would not have been gleaned from review of the medical record alone in 94%, including use of over-the-counter medications and quantification of "as needed" dosing. The QAQ also was found to be useful in quantifying change in the medication regimen over time, capturing a change in 50% of the participants from baseline to eight week follow-up. The QAQ is a simple tool that can facilitate understanding of patient-reported chronic pain medication regimens, including calculation of percent adherence and generation of quantitative scores suitable for estimating and tracking change in medication use over time. Copyright © 2015 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
A test for selection employing quantitative trait locus and mutation accumulation data.
Rice, Daniel P; Townsend, Jeffrey P
2012-04-01
Evolutionary biologists attribute much of the phenotypic diversity observed in nature to the action of natural selection. However, for many phenotypic traits, especially quantitative phenotypic traits, it has been challenging to test for the historical action of selection. An important challenge for biologists studying quantitative traits, therefore, is to distinguish between traits that have evolved under the influence of strong selection and those that have evolved neutrally. Most existing tests for selection employ molecular data, but selection also leaves a mark on the genetic architecture underlying a trait. In particular, the distribution of quantitative trait locus (QTL) effect sizes and the distribution of mutational effects together provide information regarding the history of selection. Despite the increasing availability of QTL and mutation accumulation data, such data have not yet been effectively exploited for this purpose. We present a model of the evolution of QTL and employ it to formulate a test for historical selection. To provide a baseline for neutral evolution of the trait, we estimate the distribution of mutational effects from mutation accumulation experiments. We then apply a maximum-likelihood-based method of inference to estimate the range of selection strengths under which such a distribution of mutations could generate the observed QTL. Our test thus represents the first integration of population genetic theory and QTL data to measure the historical influence of selection.
Quantitative biomarkers of colonic dysplasia based on intrinsic second-harmonic generation signal
NASA Astrophysics Data System (ADS)
Zhuo, Shuangmu; Zhu, Xiaoqin; Wu, Guizhu; Chen, Jianxin; Xie, Shusen
2011-12-01
Most colorectal cancers arise from dysplastic lesions, such as adenomatous polyps, and these lesions are difficult to detect with current endoscopic screening approaches. Here, we present the use of an intrinsic second-harmonic generation (SHG) signal as a novel means to differentiate between normal and dysplastic human colonic tissues. We find that the SHG signal can quantitatively identify collagen change associated with colonic dysplasia that is indiscernible by conventional pathologic techniques. A comparison of normal and dysplastic mucosa showed significant differences in collagen density and collagen fiber direction, which have substantial potential to serve as quantitative intrinsic biomarkers for in vivo clinical diagnosis of colonic dysplasia.
Shachak, Aviv; Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R; Lemieux-Charles, Louise
2013-06-04
Tutorials and user manuals are important forms of impersonal support for using software applications, including electronic medical records (EMRs). Differences between user and vendor documentation may indicate support needs that are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action- and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research has compared these elements between formal and informal documents. We conducted a mixed-methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural and not declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements. These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows. The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design.
NASA Astrophysics Data System (ADS)
Lan, Ganhui; Tu, Yuhai
2016-05-01
Living systems have to constantly sense their external environment and adjust their internal state in order to survive and reproduce. Biological systems, from as complex as the brain to a single E. coli cell, have to process these data in order to make appropriate decisions. How do biological systems sense external signals? How do they process the information? How do they respond to signals? Through years of intense study by biologists, many key molecular players and their interactions have been identified in different biological machineries that carry out these signaling functions. However, an integrated, quantitative understanding of the whole system is still lacking for most cellular signaling pathways, not to mention the more complicated neural circuits. To study signaling processes in biology, the key thing to measure is the input-output relationship. The input is the signal itself, such as chemical concentration, external temperature, light (intensity and frequency), and more complex signals such as the face of a cat. The output can be protein conformational changes and covalent modifications (phosphorylation, methylation, etc), gene expression, cell growth and motility, as well as more complex output such as neuron firing patterns and behaviors of higher animals. Due to the inherent noise in biological systems, the measured input-output dependence is often noisy. These noisy data can be analysed by using powerful tools and concepts from information theory such as mutual information, channel capacity, and the maximum entropy hypothesis. This information theory approach has been successfully used to reveal the underlying correlations between key components of biological networks, to set bounds for network performance, and to understand possible network architecture in generating observed correlations. Although the information theory approach provides a general tool for analysing noisy biological data and may be used to suggest possible network architectures for preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high-throughput methods (sequencing, gene expression microarray, etc) and the development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology.
In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered as the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also study the thermodynamic costs of adaptation for cells to maintain an accurate memory. The statistical physics based approach described here should be useful in understanding design principles for cellular biochemical circuits in general.
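The input-output analysis described above can be made concrete with a small example. Below is a minimal sketch of a histogram (plug-in) mutual-information estimate for noisy paired samples; the tanh response curve and the noise level are illustrative assumptions, not a model from the review.

```python
import numpy as np

def mutual_information(x, y, bins=20):
    """Plug-in (histogram) estimate of I(X;Y) in bits from paired noisy
    input-output samples. Simple and biased for small samples; shown only
    to illustrate the information-theoretic analysis described above."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                       # joint distribution
    px = pxy.sum(axis=1, keepdims=True)         # marginal of X
    py = pxy.sum(axis=0, keepdims=True)         # marginal of Y
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

# Toy input-output relation with noise (e.g., ligand level -> pathway activity)
rng = np.random.default_rng(0)
signal = rng.uniform(0, 1, 10_000)
response = np.tanh(4 * signal) + rng.normal(0, 0.2, signal.size)
print(round(mutual_information(signal, response), 2), "bits")
```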
Contrast-enhanced endoscopic ultrasonography in digestive diseases.
Hirooka, Yoshiki; Itoh, Akihiro; Kawashima, Hiroki; Ohno, Eizaburo; Itoh, Yuya; Nakamura, Yosuke; Hiramatsu, Takeshi; Sugimoto, Hiroyuki; Sumi, Hajime; Hayashi, Daijiro; Ohmiya, Naoki; Miyahara, Ryoji; Nakamura, Masanao; Funasaka, Kohei; Ishigami, Masatoshi; Katano, Yoshiaki; Goto, Hidemi
2012-10-01
Contrast-enhanced endoscopic ultrasonography (CE-EUS) was introduced in the early 1990s. The concept of injecting carbon dioxide microbubbles into the hepatic artery as a contrast material (enhanced ultrasonography) led to "endoscopic ultrasonographic angiography". After the arrival of the first-generation contrast agent, high-frequency (12 MHz) EUS brought about the enhancement of EUS images in the diagnosis of pancreatico-biliary diseases, upper gastrointestinal (GI) cancer, and submucosal tumors. The electronic scanning endosonoscope with both radial and linear probes enabled the use of high-end ultrasound machines and depicted the enhancement of both color/power Doppler flow-based imaging and harmonic-based imaging using second-generation contrast agents. Many reports have described the usefulness of these techniques for the differential diagnosis of pancreatic diseases and other abdominal lesions. Quantitative evaluation of CE-EUS images using the time-intensity curve (TIC) provided an objective method of diagnosis, but it was limited to a region of interest. The recently developed Inflow Time Mapping™ can be generated from stored clips and used to display the pattern of signal enhancement over time after injection, offering temporal information on contrast-agent inflow and improved tumor characterization. On the other hand, three-dimensional CE-EUS images added new information to the literature, but lacked positional information. Three-dimensional CE-EUS with accurate positional information is awaited. To date, most reports have been related to pancreatic lesions or lymph nodes. Hemodynamic analysis might be of use for diseases in other organs: upper GI cancer diagnosis, submucosal tumors, and biliary disorders, and it might also provide functional information. Studies of CE-EUS in diseases of many other organs will increase in the near future.
HeatWave: the next generation of thermography devices
NASA Astrophysics Data System (ADS)
Moghadam, Peyman; Vidas, Stephen
2014-05-01
Energy sustainability is a major challenge of the 21st century. To reduce environmental impact, changes are required not only on the supply side of the energy chain, by introducing renewable energy sources, but also on the demand side, by reducing energy usage and improving energy efficiency. Currently, 2D thermal imaging is used for energy auditing; it measures the thermal radiation from the surfaces of objects and represents it as a set of color-mapped images that can be analysed for the purpose of energy efficiency monitoring. A limitation of this method for energy auditing is that it lacks information on the geometry and location of objects with reference to each other, particularly across separate images. Such a limitation prevents any quantitative analysis, for example, detecting energy performance changes before and after retrofitting. To address these limitations, we have developed a next-generation thermography device called HeatWave. HeatWave is a hand-held 3D thermography device that consists of a thermal camera, a range sensor and a color camera, and can be used to generate precise 3D models of objects with augmented temperature and visible information. As an operator holding the device smoothly waves it around the objects of interest, HeatWave continuously tracks its own pose in space and integrates new information from the range, thermal and color cameras into a single, precise 3D multi-modal model. Information from multiple viewpoints can be incorporated to improve the accuracy, reliability and robustness of the global model. The approach also makes it possible to reduce systematic errors associated with the estimation of surface temperature from the thermal images.
Quantiprot - a Python package for quantitative analysis of protein sequences.
Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold
2017-07-17
The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application for the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where large numbers of sequences generated by a model can be compared to actually observed sequences.
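To illustrate the kind of quantitative characterization involved, here is a plain-Python sketch (not the Quantiprot API) computing one physico-chemical property and a Zipf's-law coefficient from n-gram frequencies; the example sequence and the Kyte-Doolittle scale choice are illustrative assumptions.

```python
import numpy as np
from collections import Counter

KD = {  # Kyte-Doolittle hydropathy: one physico-chemical property per residue
 'A':1.8,'R':-4.5,'N':-3.5,'D':-3.5,'C':2.5,'Q':-3.5,'E':-3.5,'G':-0.4,
 'H':-3.2,'I':4.5,'L':3.8,'K':-3.9,'M':1.9,'F':2.8,'P':-1.6,'S':-0.8,
 'T':-0.7,'W':-0.9,'Y':-1.3,'V':4.2}

def quantitative_features(seq, n=2):
    """Alignment-free features in the spirit of the package described
    above (plain-Python sketch, not the Quantiprot interface)."""
    hydropathy = np.mean([KD[a] for a in seq if a in KD])
    ngrams = Counter(seq[i:i+n] for i in range(len(seq) - n + 1))
    ranks = np.arange(1, len(ngrams) + 1)
    freqs = np.array(sorted(ngrams.values(), reverse=True), float)
    # Zipf's-law coefficient: slope of log(frequency) vs log(rank)
    zipf = np.polyfit(np.log(ranks), np.log(freqs), 1)[0]
    return {"mean_hydropathy": float(hydropathy), "zipf_coefficient": float(zipf)}

print(quantitative_features("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQ"))
```

Feature vectors of this kind place each sequence as a point in a feature space, so unrelated sequences can still be compared without any alignment.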
NASA Astrophysics Data System (ADS)
Fan, X.; Chen, L.; Ma, Z.
2010-12-01
Climate downscaling has been an active research and application area in the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced numerical weather and regional climate models have emerged. The use of numerical models ensures that a full set of climate variables is generated in the downscaling process, and these variables are dynamically consistent because of the constraints of physical laws. While generating high-resolution regional climate, the large-scale climate patterns should be retained. To serve this purpose, nudging techniques, including grid analysis nudging and spectral nudging, have been used in different models. There are studies demonstrating the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged toward, and thus the conclusions are controversial. As a companion to work on developing approaches for quantitative assessment of the downscaled climate, in this study the two nudging techniques undergo extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model provides fair comparability; applying the quantitative assessments provides objectivity of comparison. Three types of downscaling experiments were performed for a selected month. The first serves as a baseline, in which large-scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. For grid analysis nudging, emphasis is given to experiments with different nudging coefficients and with nudging different variables; for spectral nudging, we focus on testing nudging coefficients and different wave numbers on different model levels.
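The relaxation at the heart of grid (analysis) nudging can be shown with a toy scalar model: the model tendency is augmented by a term g*(x_analysis - x) that pulls the state toward the driving analysis. The tendency function and coefficient values below are illustrative assumptions; WRF exposes analogous per-variable coefficients.

```python
import numpy as np

def nudged_trajectory(x0, analysis, g, dt=0.1, steps=200):
    """Toy Newtonian relaxation: dx/dt = f(x) + g*(x_analysis - x),
    the basic form of grid (analysis) nudging."""
    x = np.empty(steps)
    x[0] = x0
    for k in range(1, steps):
        f = -0.5 * np.sin(x[k-1])          # stand-in for the model's own tendency
        x[k] = x[k-1] + dt * (f + g * (analysis - x[k-1]))
    return x

for g in (0.0, 0.3, 3.0):                  # no, weak, and strong nudging
    print(f"g={g}: final state {nudged_trajectory(1.5, analysis=0.3, g=g)[-1]:.3f}")
```

Larger g forces closer adherence to the large-scale analysis but suppresses the model's own fine-scale dynamics, which is exactly the sensitivity the experiments above are designed to quantify.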
Kliner, Dustin; Wang, Li; Winger, Daniel; Follansbee, William P; Soman, Prem
2015-12-01
Gated single-photon emission computed tomography (SPECT) is widely used for myocardial perfusion imaging and provides an automated assessment of left ventricular ejection fraction (LVEF). We prospectively tested the repeatability of serial SPECT-derived LVEF. This information is essential for interpreting a change in LV function on serial testing. Consenting patients (n = 50) from among those referred for clinically indicated gated myocardial perfusion SPECT (MPS) were recruited. Following the clinical rest-stress study, patients were repositioned on the camera table for a second acquisition using identical parameters. Patient positioning, image acquisition and processing for the second scan were independently performed by a technologist blinded to the clinical scan. Quantitative LVEF was generated by Quantitative Gated SPECT and recorded as EF1 and EF2, respectively. Repeatability of serial results was assessed using the Bland-Altman method. The limits of repeatability and repeatability coefficients were generated to determine the maximum variation in LVEF that can be expected to result from test variability. Repeatability was tested across a broad range of LV systolic function and myocardial perfusion. The mean difference between EF1 and EF2 was 1.6% (EF units), with 95% limits of repeatability of +9.1% to -6.0% (repeatability coefficient 7.5%). Correlation between serial EF measurements was excellent (r = 0.9809). Similar results were obtained in subgroups based on normal or abnormal EF and myocardial perfusion. The largest repeatability coefficient, 8.1%, was seen in patients with abnormal LV systolic function. When the test protocol and acquisition parameters are kept constant, a difference of >8% EF units on serial MPS is indicative of a true change 95% of the time.
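The Bland-Altman repeatability analysis used above can be sketched as follows; the synthetic paired LVEF data and their scatter are assumptions for illustration, not the study's measurements.

```python
import numpy as np

def repeatability(ef1, ef2):
    """Bland-Altman analysis of paired serial measurements: mean difference
    (bias), 95% limits of repeatability, and the repeatability coefficient
    (1.96 x SD of the paired differences)."""
    d = np.asarray(ef1, float) - np.asarray(ef2, float)
    bias = d.mean()
    coeff = 1.96 * d.std(ddof=1)
    return bias, (bias - coeff, bias + coeff), coeff

# Synthetic example: 50 paired LVEF measurements with a few EF units of scatter
rng = np.random.default_rng(1)
true_ef = rng.uniform(30, 70, 50)
ef1 = true_ef + rng.normal(0, 2.7, 50)
ef2 = true_ef + rng.normal(0, 2.7, 50)
bias, limits, coeff = repeatability(ef1, ef2)
print(f"bias={bias:.1f}, limits={limits[0]:.1f}..{limits[1]:.1f}, coefficient={coeff:.1f}")
```

A measured change larger than the repeatability coefficient is then unlikely (at the 95% level) to be explained by test variability alone.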
76 FR 12960 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... population to which generalizations will be made, the sampling frame, the sample design (including... for submission for other generic mechanisms that are designed to yield quantitative results. The... generic clearance for qualitative information will not be used for quantitative information collections...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-01
.... Currie, Program Analyst, Office of policy for Extramural Research Administration, 6705 Rockledge Drive... perceptions and opinions, but are not statistical surveys that yield quantitative results that can be... generic clearance for qualitative information will not be used for quantitative information collections...
77 FR 72831 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... consumers to make better informed financial decisions. Together with the CFPB's Office of Research, OFE is also responsible for conducting ``research related to consumer financial education and counseling... quantitative data through in-person and telephone surveys. The information collected through quantitative...
Atherton, K; Young, B; Salmon, P
2017-11-01
Clinical practice in haematological oncology often involves difficult diagnostic and treatment decisions. In this context, understanding patients' information needs and the functions that information serves for them is particularly important. We systematically reviewed qualitative and quantitative evidence on haematological oncology patients' information needs to inform how these needs can best be addressed in clinical practice. PsycINFO, Medline and CINAHL Plus electronic databases were searched for relevant empirical papers published from January 2003 to July 2016. Synthesis of the findings drew on meta-ethnography and meta-study. Most quantitative studies used a survey design and indicated that patients are largely content with the information they receive from physicians, however much or little they actually receive, although a minority of patients are not content with information. Qualitative studies suggest that a sense of being in a caring relationship with a physician allows patients to feel content with the information they have been given, whereas patients who lack such a relationship want more information. The qualitative evidence can help explain the lack of association between the amount of information received and contentment with it in the quantitative research. Trusting relationships are integral to helping patients feel that their information needs have been met. © 2017 John Wiley & Sons Ltd.
Rusyn, Ivan; Sedykh, Alexander; Guyton, Kathryn Z.; Tropsha, Alexander
2012-01-01
Quantitative structure-activity relationship (QSAR) models are widely used for in silico prediction of in vivo toxicity of drug candidates or environmental chemicals, adding value to candidate selection in drug development or in a search for less hazardous and more sustainable alternatives for chemicals in commerce. The development of traditional QSAR models is enabled by numerical descriptors representing the inherent chemical properties that can be easily defined for any number of molecules; however, traditional QSAR models often have limited predictive power due to the lack of data and complexity of in vivo endpoints. Although it has been indeed difficult to obtain experimentally derived toxicity data on a large number of chemicals in the past, the results of quantitative in vitro screening of thousands of environmental chemicals in hundreds of experimental systems are now available and continue to accumulate. In addition, publicly accessible toxicogenomics data collected on hundreds of chemicals provide another dimension of molecular information that is potentially useful for predictive toxicity modeling. These new characteristics of molecular bioactivity arising from short-term biological assays, i.e., in vitro screening and/or in vivo toxicogenomics data can now be exploited in combination with chemical structural information to generate hybrid QSAR–like quantitative models to predict human toxicity and carcinogenicity. Using several case studies, we illustrate the benefits of a hybrid modeling approach, namely improvements in the accuracy of models, enhanced interpretation of the most predictive features, and expanded applicability domain for wider chemical space coverage. PMID:22387746
Iancu, Ovidiu D; Darakjian, Priscila; Kawane, Sunita; Bottomly, Daniel; Hitzemann, Robert; McWeeney, Shannon
2012-01-01
Complex Mus musculus crosses, e.g., heterogeneous stock (HS), provide increased resolution for quantitative trait loci detection. However, increased genetic complexity challenges detection methods, with discordant results due to low data quality or complex genetic architecture. We quantified the impact of these factors across three mouse crosses and two different detection methods, identifying procedures that greatly improve detection quality. Importantly, HS populations have complex genetic architectures not fully captured by the whole genome kinship matrix, calling for incorporating chromosome-specific relatedness information. We analyze three increasingly complex crosses, using gene expression levels as quantitative traits. The three crosses were an F(2) intercross, a HS formed by crossing four inbred strains (HS4), and a HS (HS-CC) derived from the eight lines found in the collaborative cross. Brain (striatum) gene expression and genotype data were obtained using the Illumina platform. We found large disparities between methods, with concordance varying as genetic complexity increased; this problem was more acute for probes with distant regulatory elements (trans). A suite of data filtering steps resulted in substantial increases in reproducibility. Genetic relatedness between samples generated an overabundance of detected eQTLs; an adjustment procedure that includes the kinship matrix attenuates this problem. However, we find that relatedness between individuals is not evenly distributed across the genome; information from distinct chromosomes results in relatedness structure different from the whole genome kinship matrix. Shared polymorphisms from distinct chromosomes collectively affect expression levels, confounding eQTL detection. We suggest that considering chromosome-specific relatedness can result in improved eQTL detection.
TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics
Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi
2016-01-01
Large-scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software, which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329
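As a conceptual illustration of the non-linear retention-time correction that such cross-run alignment relies on (a sketch, not the TRIC implementation), a LOWESS fit on shared anchor peptides can transfer retention times between two runs; the synthetic drift model below is an assumption for demonstration.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def rt_transfer(anchor_rt_run_a, anchor_rt_run_b, query_rt, frac=0.3):
    """Non-linear retention-time transfer between two LC-MS/MS runs, fitted
    on peptides confidently identified in both (anchors). Conceptual sketch
    of the kind of correction described above."""
    fit = lowess(anchor_rt_run_b, anchor_rt_run_a, frac=frac, return_sorted=True)
    # fit[:, 0] = run-A retention times, fit[:, 1] = smoothed run-B times
    return np.interp(query_rt, fit[:, 0], fit[:, 1])

# Anchors from run A drift non-linearly in run B; map a new peak's RT.
rt_a = np.linspace(5, 115, 40)
rt_b = rt_a + 2.0 + 0.5 * np.sin(rt_a / 15)   # synthetic non-linear drift
print(round(float(rt_transfer(rt_a, rt_b, query_rt=60.0)), 2))
```

With such a mapping, a peak group picked in one run can be searched for in a narrow, drift-corrected retention-time window in every other run, which is what makes consistent cross-run peak-picking feasible.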
Quantification of Microbial Phenotypes
Martínez, Verónica S.; Krömer, Jens O.
2016-01-01
Metabolite profiling technologies have improved to generate close-to-quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and discuss the current challenges to generating fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. Here we explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take into account Gibbs energy errors. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
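The Gibbs-energy estimation step rests on the standard transformed-energy relation dG' = dG'0 + RT ln Q, with the reaction quotient Q built from measured metabolite concentrations. The sketch below applies it to a single reaction; the dG'0 value and concentrations are illustrative assumptions, not measured data.

```python
import math

R = 8.314e-3  # gas constant, kJ mol^-1 K^-1

def reaction_gibbs_energy(dg0_prime, products_mM, substrates_mM, temp_K=310.15):
    """Transformed reaction Gibbs energy under physiological concentrations:
    dG' = dG'0 + RT ln(Q), with Q from metabolite concentrations (mM -> M)."""
    q = math.prod(c * 1e-3 for c in products_mM) / \
        math.prod(c * 1e-3 for c in substrates_mM)
    return dg0_prime + R * temp_K * math.log(q)

# Example: a reaction with dG'0 = +5 kJ/mol is still driven forward
# (dG' < 0) by low product and high substrate concentrations.
print(round(reaction_gibbs_energy(5.0, products_mM=[0.05], substrates_mM=[5.0]), 1))
```

A negative dG' constrains the reaction's feasible direction in the network, which is how quantitative metabolomics tightens flux analyses; propagating concentration and dG'0 uncertainties into dG' is the error issue the example above highlights.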
Bennett, B; Carosone-Link, P; Beeson, M; Gordon, L; Phares-Zook, N; Johnson, T E
2008-08-01
Interval-specific congenic strains (ISCS) allow fine mapping of a quantitative trait locus (QTL), narrowing its confidence interval by an order of magnitude or more. In earlier work, we mapped four QTL specifying differential ethanol sensitivity, assessed by loss of righting reflex because of ethanol (LORE), in the inbred long-sleep (ILS) and inbred short-sleep (ISS) strains, accounting for approximately 50% of the genetic variance for this trait. Subsequently, we generated reciprocal congenic strains in which each full QTL interval from ILS was bred onto the ISS background and vice versa. An earlier paper reported construction and results of the ISCS on the ISS background; here, we describe this process and report results on the ILS background. We developed multiple ISCS for each Lore QTL in which the QTL interval was broken into a number of smaller intervals. For each of the four QTL regions (chromosomes 1, 2, 11 and 15), we were successful in reducing the intervals significantly. Multiple, positive strains were overlapped to generate a single, reduced interval. Subsequently, this reduced region was overlaid on previous reductions from the ISS background congenics, resulting in substantial reductions in all QTL regions by approximately 75% from the initial mapping study. Genes with sequence or expression polymorphisms in the reduced intervals are potential candidates; evidence for these is presented. Genetic background effects can be important in detection of single QTL; combining this information with the generation of congenics on both backgrounds, as described here, is a powerful approach for fine mapping QTL.
A Modeling Approach for Burn Scar Assessment Using Natural Features and Elastic Property
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsap, L V; Zhang, Y; Goldgof, D B
2004-04-02
A modeling approach is presented for quantitative burn scar assessment. Emphasis is given to: (1) constructing a finite element model from natural image features with an adaptive mesh, and (2) quantifying the Young's modulus of scars using the finite element model and the regularization method. A set of natural point features is extracted from the images of burn patients. A Delaunay triangle mesh is then generated that adapts to the point features. A 3D finite element model is built on top of the mesh with the aid of range images providing the depth information. The Young's modulus of scars is quantified with a simplified regularization functional, assuming that knowledge of the scar's geometry is available. The consistency between the Relative Elasticity Index and the physician's rating based on the Vancouver Scale (a relative scale used to rate burn scars) indicates that the proposed modeling approach has high potential for image-based quantitative burn scar assessment.
Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.
2012-01-01
Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394
Preliminary experiments on pharmacokinetic diffuse fluorescence tomography of CT-scanning mode
NASA Astrophysics Data System (ADS)
Zhang, Yanqi; Wang, Xin; Yin, Guoyan; Li, Jiao; Zhou, Zhongxing; Zhao, Huijuan; Gao, Feng; Zhang, Limin
2016-10-01
In vivo tomographic imaging of fluorescence pharmacokinetic parameters in tissue can provide specific and quantitative physiological and pathological information beyond fluorescence concentration alone. This modality normally requires a highly sensitive diffuse fluorescence tomography (DFT) system working in a dynamic way to extract the pharmacokinetic parameters from the measured pharmacokinetics-associated, temporally varying boundary intensity. This paper is devoted to preliminary experimental validation of our proposed direct reconstruction scheme for instantaneous-sampling-based pharmacokinetic DFT: a highly sensitive DFT system of CT-scanning mode working with four parallel photomultiplier-tube photon-counting channels is developed to generate an instantaneous sampling dataset, and a direct reconstruction scheme then extracts images of the pharmacokinetic parameters using the adaptive-EKF strategy. We design a dynamic phantom that can simulate agent metabolism in living tissue. The results of the dynamic phantom experiments verify the validity of the experimental system and reconstruction algorithms, and demonstrate that the system provides good resolution, high sensitivity and quantitative accuracy at different pump speeds.
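To give a flavor of EKF-based parameter extraction (a toy sketch, not the paper's adaptive-EKF tomographic reconstruction), the filter below estimates a single elimination rate k from noisy intensity samples of a one-compartment decay c' = -k*c; the model, noise level, and initial guesses are illustrative assumptions.

```python
import numpy as np

def ekf_decay_rate(measurements, dt=1.0, r_meas=0.05**2):
    """Minimal extended Kalman filter estimating an elimination rate k
    from noisy intensity data, with state x = [concentration c, rate k]."""
    x = np.array([1.0, 0.1])                  # initial state guess
    P = np.diag([0.5, 0.5])                   # initial state covariance
    Q = np.diag([1e-6, 1e-6])                 # small process noise
    H = np.array([[1.0, 0.0]])                # we observe c only
    for z in measurements:
        c, k = x
        x = np.array([c * np.exp(-k * dt), k])                       # predict
        F = np.array([[np.exp(-k*dt), -dt*c*np.exp(-k*dt)],          # Jacobian
                      [0.0, 1.0]])
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r_meas                                     # update
        K = P @ H.T / S
        x = x + (K * (z - x[0])).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x[1]                                # estimated elimination rate

rng = np.random.default_rng(2)
t = np.arange(1, 40)
noisy = np.exp(-0.25 * t) + rng.normal(0, 0.05, t.size)   # true rate 0.25
print(round(float(ekf_decay_rate(noisy)), 3))
```

The tomographic version replaces this scalar observation with the full boundary-intensity forward model, so each EKF update refines spatial maps of the kinetic parameters rather than a single rate.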
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-18
... Genevieve deAlmeida-Morris, Health Research Evaluator, Office of Science Policy and Communications, National... opinions, but are not statistical surveys that yield quantitative results that can be generalized to the... generic clearance for qualitative information will not be used for quantitative information collections...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-10
... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...), Office of Research and Development, U.S. Environmental Protection Agency, Washington, DC 20460; telephone... human health assessment program that evaluates quantitative and qualitative risk information on effects...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-06
... quantitative and qualitative risk information on effects that may result from exposure to specific chemical... Deputy Director, National Center for Environmental Assessment, (mail code: 8601D), Office of Research and... program that evaluates quantitative and qualitative risk information on effects that may result from...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-16
... health assessment program that evaluates quantitative and qualitative risk information on effects that..., National Center for Environmental Assessment, (mail code: 8601P), Office of Research and Development, U.S... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...
Gruen, D.M.; Young, C.E.; Pellin, M.J.
1989-12-26
A charged particle spectrometer is described for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode. 12 figs.
Tugnoli, Alessandro; Gubinelli, Gianfilippo; Landucci, Gabriele; Cozzani, Valerio
2014-08-30
The evaluation of the initial direction and velocity of the fragments generated in the fragmentation of a vessel due to internal pressure is important information in the assessment of damage caused by fragments, in particular within the quantitative risk assessment (QRA) of chemical and process plants. In the present study an approach is proposed for the identification and validation of probability density functions (pdfs) for the initial direction of the fragments. A detailed review of a large number of past accidents provided the background information for the validation procedure. A specific method was developed for the validation of the proposed pdfs. Validated pdfs were obtained for both the vertical and horizontal angles of projection and for the initial velocity of the fragments. Copyright © 2014 Elsevier B.V. All rights reserved.
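A fragment-impact QRA typically consumes such pdfs by Monte Carlo sampling. The sketch below draws projection angles and initial speeds from stand-in distributions; these distributional choices are illustrative assumptions, not the validated pdfs of the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_fragment_launches(n, vertical_spread=0.5):
    """Monte Carlo draw of fragment launch conditions for a QRA model.
    Uniform azimuth, a compressed sine-based elevation, and a lognormal
    speed are stand-ins for the validated pdfs described above."""
    horizontal = rng.uniform(-np.pi, np.pi, n)                       # azimuth angle
    vertical = np.arcsin(rng.uniform(-1, 1, n)) * vertical_spread    # elevation angle
    speed = rng.lognormal(mean=np.log(150.0), sigma=0.3, size=n)     # m/s
    return horizontal, vertical, speed

h, v, s = sample_fragment_launches(10_000)
print(f"mean speed {s.mean():.0f} m/s, "
      f"95th percentile elevation {np.degrees(np.quantile(v, 0.95)):.1f} deg")
```

Each sampled triple feeds a ballistic trajectory model, and the fraction of trajectories striking a target defines the fragment impact probability used in the QRA.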
Shape and Size of Microfine Aggregates: X-ray Microcomputed Tomography vs. Laser Diffraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erdogan,S.; Garboczi, E.; Fowler, D.
Microfine rock aggregates, formed naturally or in a crushing process, pass a No. 200 ASTM sieve, so have at least two orthogonal principal dimensions less than 75 µm, the sieve opening size. In this paper, for the first time, we capture true 3-D shape and size data of several different types of microfine aggregates, using X-ray microcomputed tomography (µCT) with a voxel size of 2 µm. This information is used to generate shape analyses of various kinds. Particle size distributions are also generated from the µCT data and quantitatively compared to the results of laser diffraction, which is the leading method for measuring particle size distributions of sub-millimeter size particles. By taking into account the actual particle shape, the differences between µCT and laser diffraction can be qualitatively explained.
Halo-free Phase Contrast Microscopy
NASA Astrophysics Data System (ADS)
Nguyen, Tan H.; Kandel, Mikhail; Shakir, Haadi M.; Best-Popescu, Catherine; Arikkath, Jyothi; Do, Minh N.; Popescu, Gabriel
2017-03-01
We present a new approach for retrieving halo-free phase contrast microscopy (hfPC) images by upgrading the conventional PC microscope with an external interferometric module, which generates sufficient data for reversing the halo artifact. Acquiring four independent intensity images, our approach first measures haloed phase maps of the sample. We solve for the halo-free sample transmission function by using a physical model of the image formation under partial spatial coherence. Using this halo-free sample transmission, we can numerically generate artifact-free PC images. Furthermore, this transmission can be used to obtain quantitative information about the sample, e.g., thickness (given known refractive indices) and the dry mass of live cells during their cell cycles. We tested our hfPC method on various control samples, e.g., beads and pillars, and validated its potential for biological investigation by imaging live HeLa cells, red blood cells, and neurons.
McGarty, Arlene M; Melville, Craig A
2018-02-01
There is a need to increase our understanding of the factors that affect physical activity participation in children with intellectual disabilities (ID) and to develop effective methods to overcome barriers and increase activity levels. This study aimed to systematically review parental perceptions of facilitators and barriers to physical activity for children with ID. A systematic search of Embase, Medline, ERIC, Web of Science, and PsycINFO was conducted (up to and including August 2017) to identify relevant papers. A meta-ethnography approach was used to synthesise qualitative and quantitative results through the generation of third-order themes and a theoretical model. Ten studies were included, which ranged from weak to strong quality. Seventy-one second-order themes and 12 quantitative results were extracted. Five third-order themes were developed: family, child factors, inclusive programmes and facilities, social motivation, and the child's experiences of physical activity. It is theorised that these factors can be facilitators or barriers to physical activity, depending on the information and education of relevant others, e.g. parents and coaches. Parents have an important role in supporting activity in children with ID. Increasing the information and education given to relevant others could be an important method of turning barriers into facilitators. Copyright © 2017 Elsevier Ltd. All rights reserved.
Wu, Pei-Wen; Mason, Katelyn E; Durbin-Johnson, Blythe P; Salemi, Michelle; Phinney, Brett S; Rocke, David M; Parker, Glendon J; Rice, Robert H
2017-07-01
Forensic association of hair shaft evidence with individuals is currently assessed by comparing mitochondrial DNA haplotypes of reference and casework samples, primarily for exclusionary purposes. Present work tests and validates more recent proteomic approaches to extract quantitative transcriptional and genetic information from hair samples of monozygotic twin pairs, which would be predicted to partition away from unrelated individuals if the datasets contain identifying information. Protein expression profiles and polymorphic, genetically variant hair peptides were generated from ten pairs of monozygotic twins. Profiling using the protein tryptic digests revealed that samples from identical twins had typically an order of magnitude fewer protein expression differences than unrelated individuals. The data did not indicate that the degree of difference within twin pairs increased with age. In parallel, data from the digests were used to detect genetically variant peptides that result from common nonsynonymous single nucleotide polymorphisms in genes expressed in the hair follicle. Compilation of the variants permitted sorting of the samples by hierarchical clustering, permitting accurate matching of twin pairs. The results demonstrate that genetic differences are detectable by proteomic methods and provide a framework for developing quantitative statistical estimates of personal identification that increase the value of hair shaft evidence. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
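A minimal sketch of the final clustering step, assuming a binary sample-by-peptide matrix of detected genetically variant peptides; the toy profiles and parameter choices are hypothetical, not the study's data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# hypothetical binary matrix: rows = hair samples, cols = genetically
# variant peptides (1 = detected); twin pairs should share profiles
profiles = np.array([
    [1, 0, 1, 1, 0, 0],   # twin 1a
    [1, 0, 1, 1, 0, 0],   # twin 1b
    [0, 1, 0, 0, 1, 1],   # twin 2a
    [0, 1, 0, 1, 1, 1],   # twin 2b
])

# Jaccard distance on variant presence, average-linkage clustering
dist = pdist(profiles, metric="jaccard")
tree = linkage(dist, method="average")
print(fcluster(tree, t=2, criterion="maxclust"))  # e.g. [1 1 2 2]
```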
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
Mehmood, Irfan; Ejaz, Naveed; Sajjad, Muhammad; Baik, Sung Wook
2013-10-01
The objective of the present study is to explore prioritization methods in diagnostic imaging modalities to automatically determine the contents of medical images. In this paper, we propose an efficient prioritization of brain MRI. First, the visual perception of radiologists is adapted to identify salient regions. This saliency information is then used as an automatic label for accurate segmentation of brain lesions, to determine the scientific value of each image. The qualitative and quantitative results show that the rankings generated by the proposed method are close to the rankings created by radiologists. Copyright © 2013 Elsevier Ltd. All rights reserved.
Seat pressure measurement technologies: considerations for their evaluation.
Gyi, D E; Porter, J M; Robertson, N K
1998-04-01
Interface pressure measurement has generated interest in the automotive industry as a technique which could be used in the prediction of driver discomfort for various car seat designs, and provide designers and manufacturers with rapid information early on in the design process. It is therefore essential that the data obtained are of the highest quality, relevant and have some quantitative meaning. Exploratory experimental work carried out with the commercially available Talley Pressure Monitor is outlined. This led to a better understanding of the strengths and weaknesses of this system and the re-design of the sensor matrix. Such evaluation, in the context of the actual experimental environment, is considered essential.
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
Expanding frontiers in plant transcriptomics in aid of functional genomics and molecular breeding.
Agarwal, Pinky; Parida, Swarup K; Mahto, Arunima; Das, Sweta; Mathew, Iny Elizebeth; Malik, Naveen; Tyagi, Akhilesh K
2014-12-01
The transcript pool of a plant part, under any given condition, is a collection of mRNAs that will pave the way for the biochemical reaction of the plant to stimuli. Over the past decades, transcriptome study has advanced from Northern blotting to RNA sequencing (RNA-seq), through other techniques, of which real-time quantitative polymerase chain reaction (PCR) and microarray are the most significant. The questions being addressed by such studies have also matured, from single processes to expression atlases and marker-assisted genetic enhancement. Not only have genes and their networks involved in various developmental processes of plant parts been elucidated, but stress-tolerance genes have also been highlighted. The transcriptome of a plant with altered expression of a target gene has given information about downstream genes. Marker information has been used for breeding improved varieties. Fortunately, the data generated by transcriptome analysis have been made freely available for ample utilization and comparison. This review discusses the wide variety of transcriptome data being generated in plants, covering developmental stages, abiotic and biotic stress, effects of altered gene expression, and comparative transcriptomics, with a special emphasis on microarray and RNA-seq. Such data can be used to determine regulatory gene networks, which can subsequently be utilized for generating improved plant varieties. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Material Separation Using Dual-Energy CT: Current and Emerging Applications.
Patino, Manuel; Prochowski, Andrea; Agrawal, Mukta D; Simeone, Frank J; Gupta, Rajiv; Hahn, Peter F; Sahani, Dushyant V
2016-01-01
Dual-energy (DE) computed tomography (CT) offers the opportunity to generate material-specific images on the basis of the atomic number Z and the unique mass attenuation coefficient of a particular material at different x-ray energies. Material-specific images provide qualitative and quantitative information about tissue composition and contrast media distribution. The most significant contribution of DE CT-based material characterization comes from the capability to assess iodine distribution through the creation of an image that exclusively shows iodine. These iodine-specific images increase tissue contrast and amplify subtle differences in attenuation between normal and abnormal tissues, improving lesion detection and characterization in the abdomen. In addition, DE CT enables computational removal of iodine influence from a CT image, generating virtual noncontrast images. Several additional materials, including calcium, fat, and uric acid, can be separated, permitting imaging assessment of metabolic imbalances, elemental deficiencies, and abnormal deposition of materials within tissues. The ability to obtain material-specific images from a single, contrast-enhanced CT acquisition can complement the anatomic knowledge with functional information, and may be used to reduce the radiation dose by decreasing the number of phases in a multiphasic CT examination. DE CT also enables generation of energy-specific and virtual monochromatic images. Clinical applications of DE CT leverage both material-specific images and virtual monochromatic images to expand the current role of CT and overcome several limitations of single-energy CT. (©)RSNA, 2016.
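A minimal sketch of per-voxel two-material decomposition under the stated principle, assuming known basis attenuation values at the two tube energies; the numbers below are illustrative placeholders, not calibrated values.

```python
import numpy as np

# Hypothetical mass attenuation coefficients (cm^2/g) at the two tube
# energies; real values come from NIST tables for the spectra used.
MU = np.array([[4.9, 0.25],    # 80 kVp:  [iodine, water]
               [1.9, 0.19]])   # 140 kVp: [iodine, water]

def two_material_decomposition(mu_80, mu_140):
    """Solve the 2x2 basis-material system for one voxel:

        mu(E) = a_iodine * mu_iodine(E) + a_water * mu_water(E)

    returning the (iodine, water) basis coefficients."""
    return np.linalg.solve(MU, np.array([mu_80, mu_140]))

print(two_material_decomposition(0.74, 0.38))
```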
Declercq, Jana; Tulkens, Stéphan; Van Leuven, Sarah
2018-03-01
This article examines the Twitter and Facebook uptake of health messages from an infotainment TV show on food, as broadcast on Belgium's Dutch-language public broadcaster. The amount of, and interest in, health-related media coverage is rising, and this coverage is an important source of information for laypeople, affecting their health behaviours and therapy compliance. However, the role of the audience has also changed: consumers of media content increasingly are produsers and, in the case of health, expert consumers. To explore how current audiences react to health claims, we conducted a quantitative and qualitative content analysis of Twitter and Facebook reactions to an infotainment show about food and nutrition. We examine (1) which elements in the show the audience reacts to, to gain insight into the traction the nutrition-related content generates, and (2) whether audience members accept or resist the health information in the show. Our findings show that the information on health and production elicits the most reactions, and that health information incites a lot of refutation, low acceptance and many suggestions of new information or new angles to complement the show's information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-11-01
NREL's new imaging tool could provide manufacturers with insight into their processes. Scientists at the National Renewable Energy Laboratory (NREL) have used capabilities within the Process Development and Integration Laboratory (PDIL) to generate quantitative minority-carrier lifetime maps of multicrystalline silicon (mc-Si) bricks. This was accomplished by using the PDIL's photoluminescence (PL) imaging system in conjunction with transient lifetime measurements obtained using a custom NREL-designed resonance-coupled photoconductive decay (RCPCD) system. PL imaging can obtain rapid high-resolution images that provide a qualitative assessment of the material lifetime, with the lifetime proportional to the pixel intensity. In contrast, the RCPCD technique provides a fast quantitative measure of the lifetime at lower resolution and penetrates millimeters into the mc-Si brick, providing information on bulk lifetimes and material quality. This contrasts with commercially available minority-carrier lifetime mapping systems that use microwave conductivity measurements; such measurements are dominated by surface recombination and lack information on the material quality within the bulk of the brick. By combining these two complementary techniques, researchers are not hindered by surface recombination and obtain high-resolution bulk lifetime maps at very fast data acquisition times, attributes necessary for a production-based diagnostic tool. These bulk lifetime measurements provide manufacturers with invaluable feedback on their silicon ingot casting processes. NREL has been applying the PL-based lifetime maps of mc-Si bricks in collaboration with a U.S. photovoltaic industry partner through Recovery Act Funded Project ARRA T24.
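A minimal sketch of how a qualitative PL image could be anchored to a quantitative lifetime map using a single RCPCD point measurement, under the stated proportionality between lifetime and pixel intensity; the function and data are illustrative, not NREL's tool.

```python
import numpy as np

def calibrate_pl_to_lifetime(pl_image, tau_ref_us, ref_pixel):
    """Scale a qualitative PL intensity image to a lifetime map.

    Assumes lifetime is proportional to PL pixel intensity (as stated in
    the abstract) and uses one RCPCD point measurement `tau_ref_us` at
    `ref_pixel` as the quantitative anchor. A real calibration would use
    several RCPCD points and check linearity."""
    scale = tau_ref_us / pl_image[ref_pixel]
    return pl_image * scale  # microseconds

pl = np.random.default_rng(0).uniform(0.2, 1.0, (4, 4))
tau_map = calibrate_pl_to_lifetime(pl, tau_ref_us=25.0, ref_pixel=(2, 2))
print(tau_map.round(1))
```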
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-02
... quantitative studies. Focus groups serve the narrowly defined need for direct and informal opinion on a specific topic and as a qualitative research tool have three major purposes: To obtain information that is useful for developing variables and measures for quantitative studies, To better understand people's...
ERIC Educational Resources Information Center
Yu, Wei
2013-01-01
This dissertation applied the quantitative approach to the data gathered from online survey questionnaires regarding the three objects: Information Technology (IT) Portfolio Management, IT-Business Alignment, and IT Project Deliverables. By studying this data, this dissertation uncovered the underlying relationships that exist between the…
A kinetic study of 3-chlorophenol enhanced hydroxyl radical generation during ozonation.
Utsumi, Hideo; Han, Youn-Hee; Ichikawa, Kazuhiro
2003-12-01
Hydroxyl (OH) radical is proposed as an important factor in the ozonation of water. In the present study, the enhancing effect of 3-chlorophenol on OH radical generation was mathematically evaluated using the electron spin resonance (ESR)/spin-trapping technique. OH radical was trapped with 5,5-dimethyl-1-pyrroline-N-oxide (DMPO) as a stable adduct, DMPO-OH. The initial velocity of DMPO-OH generation in ozonated water containing 3-chlorophenol was quantitatively measured using a combined system of ESR spectroscopy and a stopped-flow apparatus controlled by home-made software. The initial velocity of DMPO-OH generation increased as a function of ozone concentration and, more strongly, of 3-chlorophenol concentration. The relation among ozone concentration, amount of 3-chlorophenol and the initial velocity ν0 of DMPO-OH generation was analyzed mathematically, yielding the equation ν0 (10^-6 M/s) = [9.7 x [3-chlorophenol (10^-9 M)] + 0.0005] exp(57 x [ozone (10^-9 M)]). The equation fitted the experimental results very well, with a correlation coefficient larger than 0.99. The equation for the enhancing effect of 3-chlorophenol should provide useful information for optimizing the conditions in the ozone treatment of water containing phenolic pollutants.
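The fitted rate equation transcribes directly into code; units follow the abstract (concentrations in 10^-9 M, ν0 in 10^-6 M/s). The example concentrations are illustrative.

```python
import math

def dmpo_oh_initial_velocity(chlorophenol_nM, ozone_nM):
    """Fitted rate equation from the abstract:

        v0 [1e-6 M/s] = (9.7 * [3-CP in 1e-9 M] + 0.0005)
                        * exp(57 * [O3 in 1e-9 M])
    """
    return (9.7 * chlorophenol_nM + 0.0005) * math.exp(57.0 * ozone_nM)

# e.g. 0.02 (1e-9 M) 3-chlorophenol, 0.01 (1e-9 M) ozone
print(f"{dmpo_oh_initial_velocity(0.02, 0.01):.3f} x 1e-6 M/s")
```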
Calibration of Heat Stress Monitor and its Measurement Uncertainty
NASA Astrophysics Data System (ADS)
Ekici, Can
2017-07-01
Wet-bulb globe temperature (WBGT) is a heat stress index that gives information for workers in industrial areas. The WBGT equation is described in ISO Standard 7243 (ISO 7243 in Hot environments—estimation of the heat stress on working man, based on the WBGT index, ISO, Geneva, 1982). WBGT combines the quantitative effects of the natural wet-bulb temperature, globe temperature, and air temperature; it is a calculated parameter whose input quantities are measured by a heat stress monitor. In this study, the calibration method of a heat stress monitor is described, and the model function for measurement uncertainty is given. Sensitivity coefficients were derived according to the GUM. A two-pressure humidity generator was used to generate a controlled environment, and the heat stress monitor was calibrated inside the generator. The two-pressure humidity generator located at the Turkish Standards Institution was used as the reference device; it is traceable to national standards and includes reference Pt-100 temperature sensors. The reference sensor was sheltered with a wet wick for the calibration of the natural wet-bulb thermometer, and was centred in a black globe with a diameter of 150 mm for the calibration of the black globe thermometer.
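For reference, the ISO 7243 WBGT combinations compute directly from the measured temperatures; a minimal sketch:

```python
def wbgt_indoor(t_nw, t_g):
    """ISO 7243 WBGT without solar load: 0.7*Tnw + 0.3*Tg (deg C)."""
    return 0.7 * t_nw + 0.3 * t_g

def wbgt_outdoor(t_nw, t_g, t_a):
    """ISO 7243 WBGT with solar load: 0.7*Tnw + 0.2*Tg + 0.1*Ta (deg C)."""
    return 0.7 * t_nw + 0.2 * t_g + 0.1 * t_a

print(wbgt_indoor(22.0, 35.0))          # 25.9
print(wbgt_outdoor(22.0, 35.0, 30.0))   # 25.4
```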
Richardson, Hugh H; Hickman, Zackary N; Govorov, Alexander O; Thomas, Alyssa C; Zhang, Wei; Kordesch, Martin E
2006-04-01
We investigate the system of optically excited gold NPs in an ice matrix aiming to understand heat generation and melting processes at the nanoscale level. Along with the traditional fluorescence method, we introduce thermooptical spectroscopy based on phase transformation of a matrix. With this, we can not only measure optical response but also thermal response, that is, heat generation. After several recrystallization cycles, the nanoparticles are embedded into the ice film where the optical and thermal properties of the nanoparticles are probed. Spatial fluorescence mapping shows the locations of Au nanoparticles, whereas the time-resolved Raman signal of ice reveals the melting process. From the time-dependent Raman signals, we determine the critical light intensities at which the laser beam is able to melt ice around the nanoparticles. The melting intensity depends strongly on temperature and position. The position-dependence is especially strong and reflects a mesoscopic character of heat generation. We think that it comes from the fact that nanoparticles form small complexes of different geometry and each complex has a unique thermal response. Theoretical calculations and experimental data are combined to make a quantitative measure of the amount of heat generated by optically excited Au nanoparticles and agglomerates. The information obtained in this study can be used to design nanoscale heaters and actuators.
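For scale, the steady-state surface temperature rise of a single heated nanosphere in a uniform medium follows the textbook estimate dT = sigma_abs * I / (4 * pi * k * R); a minimal sketch under that assumption, not the authors' calculation for particle complexes, with illustrative parameter values.

```python
import math

def surface_temperature_rise(sigma_abs_m2, intensity_W_m2,
                             radius_m, k_medium=0.6):
    """Steady-state temperature rise at the surface of a heated
    nanosphere: dT = sigma_abs * I / (4 * pi * k * R).

    k_medium ~0.6 W/(m K) for water; ice differs, so this only
    illustrates the order of magnitude."""
    power = sigma_abs_m2 * intensity_W_m2   # absorbed heat, W
    return power / (4.0 * math.pi * k_medium * radius_m)

# 30 nm radius Au sphere, sigma_abs ~ 3e-15 m^2, I = 1e9 W/m^2
print(f"{surface_temperature_rise(3e-15, 1e9, 30e-9):.1f} K")
```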
Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol
2011-02-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.
Zhang, Zhen; Shang, Haihong; Shi, Yuzhen; Huang, Long; Li, Junwen; Ge, Qun; Gong, Juwu; Liu, Aiying; Chen, Tingting; Wang, Dan; Wang, Yanling; Palanga, Koffi Kibalou; Muhammad, Jamshed; Li, Weijie; Lu, Quanwei; Deng, Xiaoying; Tan, Yunna; Song, Weiwu; Cai, Juan; Li, Pengtao; Rashid, Harun or; Gong, Wankui; Yuan, Youlu
2016-04-11
Upland cotton (Gossypium hirsutum) is one of the most important crops worldwide; it provides natural high-quality fiber for industrial production and everyday use. Next-generation sequencing is a powerful method to identify single nucleotide polymorphism markers on a large scale for the construction of a high-density genetic map for quantitative trait loci mapping. In this research, a recombinant inbred line population developed from two upland cotton cultivars, 0-153 and sGK9708, was used to construct a high-density genetic map through the specific locus amplified fragment sequencing method. The map harbored 5521 single nucleotide polymorphism markers covering a total distance of 3259.37 cM, with an average marker interval of 0.78 cM and no gaps larger than 10 cM. In total, 18 quantitative trait loci for boll weight were identified as stable, being detected in at least three of 11 environments and explaining 4.15-16.70% of the observed phenotypic variation. In total, 344 candidate genes were identified within the confidence intervals of these stable quantitative trait loci based on the cotton genome sequence. These genes were categorized by function through gene ontology analysis, Kyoto Encyclopedia of Genes and Genomes analysis and eukaryotic orthologous groups analysis. This research reports the first high-density genetic map for upland cotton constructed with a recombinant inbred line population using single nucleotide polymorphism markers developed by specific locus amplified fragment sequencing. We also identified quantitative trait loci for boll weight across 11 environments and candidate genes within the quantitative trait loci confidence intervals. These results provide useful information for next steps including fine mapping, gene functional analysis, pyramiding breeding of functional genes, and marker-assisted selection.
Modeling the Information Age Combat Model: An Agent-Based Simulation of Network Centric Operations
NASA Technical Reports Server (NTRS)
Deller, Sean; Rabadi, Ghaith A.; Bell, Michael I.; Bowling, Shannon R.; Tolk, Andreas
2010-01-01
The Information Age Combat Model (IACM) was introduced by Cares in 2005 to contribute to an understanding of the influence of connectivity on force effectiveness that can eventually lead to quantitative prediction and guidelines for design and employment. The structure of the IACM makes it clear that the Perron-Frobenius eigenvalue is a quantifiable metric with which to measure the organization of a networked force. The results of recent experiments presented in Deller et al. (2009) indicate that the value of the Perron-Frobenius eigenvalue is a significant measurement of the performance of an Information Age combat force. This was accomplished through the innovative use of an agent-based simulation to model the IACM, and represents an initial contribution towards a new generation of combat models that are net-centric instead of following the current platform-centric approach. This paper describes the intent, challenges, design, and initial results of this agent-based simulation model.
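A minimal sketch of the metric itself: the Perron-Frobenius eigenvalue of a nonnegative adjacency matrix, computed here with numpy on a toy network (the matrix is hypothetical, not an IACM configuration from the paper).

```python
import numpy as np

def perron_frobenius_eigenvalue(adjacency):
    """Largest real eigenvalue of a nonnegative adjacency matrix.

    For an irreducible nonnegative matrix this is the real, positive
    Perron-Frobenius eigenvalue used as an organization metric."""
    eigenvalues = np.linalg.eigvals(np.asarray(adjacency, dtype=float))
    return float(np.max(eigenvalues.real))

# toy 4-node sensor -> decider -> influencer -> target cycle plus a shortcut
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 1, 0]])
print(perron_frobenius_eigenvalue(A))
```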
New techniques for positron emission tomography in the study of human neurological disorders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.
1992-07-01
The general goals of the physics and kinetic modeling projects are to: (1) improve the quantitative information extractable from PET images, and (2) develop, implement and optimize tracer kinetic models for new PET neurotransmitter/receptor ligands aided by computer simulations. Work towards improving PET quantification has included projects evaluating: (1) iterative reconstruction algorithms using supplemental boundary information, (2) automated registration of dynamic PET emission and transmission data using sinogram edge detection, and (3) automated registration of multiple subjects to a common coordinate system, including the use of non-linear warping methods. Simulation routines have been developed providing more accurate representation of data generated from neurotransmitter/receptor studies. Routines consider data generated from complex compartmental models, high or low specific activity administrations, non-specific binding, pre- or post-injection of cold or competing ligands, temporal resolution of the data, and radiolabeled metabolites. Computer simulations and human PET studies have been performed to optimize kinetic models for four new neurotransmitter/receptor ligands: [11C]TRB (muscarinic), [11C]flumazenil (benzodiazepine), [18F]GBR12909 (dopamine), and [11C]NMPB (muscarinic).
New techniques for positron emission tomography in the study of human neurological disorders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.
1992-01-01
The general goals of the physics and kinetic modeling projects are to: (1) improve the quantitative information extractable from PET images, and (2) develop, implement and optimize tracer kinetic models for new PET neurotransmitter/receptor ligands aided by computer simulations. Work towards improving PET quantification has included projects evaluating: (1) iterative reconstruction algorithms using supplemental boundary information, (2) automated registration of dynamic PET emission and transmission data using sinogram edge detection, and (3) automated registration of multiple subjects to a common coordinate system, including the use of non-linear warping methods. Simulation routines have been developed providing more accurate representation of data generated from neurotransmitter/receptor studies. Routines consider data generated from complex compartmental models, high or low specific activity administrations, non-specific binding, pre- or post-injection of cold or competing ligands, temporal resolution of the data, and radiolabeled metabolites. Computer simulations and human PET studies have been performed to optimize kinetic models for four new neurotransmitter/receptor ligands: [11C]TRB (muscarinic), [11C]flumazenil (benzodiazepine), [18F]GBR12909 (dopamine), and [11C]NMPB (muscarinic).
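As context for the kinetic modeling described above, a minimal sketch of a generic one-tissue compartment tracer model; the rate constants and plasma input are illustrative, not the ligand-specific models of the project.

```python
import numpy as np
from scipy.integrate import odeint

def one_tissue_model(K1=0.1, k2=0.05, t_end=60.0):
    """Generic one-tissue compartment tracer model:

        dCt/dt = K1*Cp(t) - k2*Ct(t)

    with a simple mono-exponential plasma input Cp(t) = exp(-0.1 t)."""
    t = np.linspace(0.0, t_end, 200)
    cp = lambda tt: np.exp(-0.1 * tt)
    dct = lambda ct, tt: K1 * cp(tt) - k2 * ct
    ct = odeint(dct, 0.0, t).ravel()
    return t, ct

t, ct = one_tissue_model()
print(f"peak tissue activity: {ct.max():.3f} at t = {t[ct.argmax()]:.1f} min")
```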
Electric organ discharges and electric images during electrolocation
NASA Technical Reports Server (NTRS)
Assad, C.; Rasnow, B.; Stoddard, P. K.
1999-01-01
Weakly electric fish use active electrolocation - the generation and detection of electric currents - to explore their surroundings. Although electrosensory systems include some of the most extensively understood circuits in the vertebrate central nervous system, relatively little is known quantitatively about how fish electrolocate objects. We believe a prerequisite to understanding electrolocation and its underlying neural substrates is to quantify and visualize the peripheral electrosensory information measured by the electroreceptors. We have therefore focused on reconstructing both the electric organ discharges (EODs) and the electric images resulting from nearby objects and the fish's exploratory behaviors. Here, we review results from a combination of techniques, including field measurements, numerical and semi-analytical simulations, and video imaging of behaviors. EOD maps are presented and interpreted for six gymnotiform species. They reveal diverse electric field patterns that have significant implications for both the electrosensory and electromotor systems. Our simulations generated predictions of the electric images from nearby objects as well as sequences of electric images during exploratory behaviors. These methods are leading to the identification of image features and computational algorithms that could reliably encode electrosensory information and may help guide electrophysiological experiments exploring the neural basis of electrolocation.
NASA Astrophysics Data System (ADS)
Zainudin, W. N. R. A.; Ishak, W. W. M.
2017-09-01
In 2009, the government of Malaysia announced a National Renewable Energy Policy and Action Plan as part of its commitment to accelerate the growth of renewable energies (RE). However, the adoption of RE as a main source of energy is still at an early stage due to lack of public awareness and acceptance of RE. To date, there are insufficient studies on the reasons behind this lack of awareness and acceptance. Therefore, this paper investigates public acceptance of RE development by measuring respondents' willingness to pay slightly more for energy generated from RE sources (denoted the willingness level), and by testing whether the importance placed on electricity being supplied at the lowest possible cost regardless of source and environmental impact (denoted the importance level) and other socio-economic factors can explain the willingness level. Both qualitative and quantitative research methods are used to achieve the research objectives. A total of 164 respondents from local universities in Malaysia participated in a survey to collect the relevant information. Using an Ordered Probit model, the study shows that, among the relevant socio-economic factors, age appears to be an important influence on the willingness level of respondents. This paper concludes that the younger generation is more willing to pay slightly more for energy generated from RE sources than the older generation. One possible reason may be the younger generation's better access to information on RE issues and their positive implications for the world. The findings are useful to help policy makers design RE advocacy programmes that can secure public participation. These efforts are important to ensure the future success of the RE policy.
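A minimal sketch of the estimation approach on hypothetical survey data; it uses the OrderedModel class available in statsmodels (distr='probit'). Variable names and coefficients are invented for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# hypothetical survey data: willingness level on a low/medium/high
# ordinal scale, explained by age and an 'importance of lowest cost' score
rng = np.random.default_rng(42)
n = 164
df = pd.DataFrame({
    "age": rng.integers(19, 60, n),
    "importance": rng.integers(1, 6, n),
})
latent = -0.04 * df["age"] - 0.3 * df["importance"] + rng.normal(0, 1, n)
df["willingness"] = pd.cut(latent, bins=[-np.inf, -3, -2, np.inf],
                           labels=["low", "medium", "high"])

model = OrderedModel(df["willingness"], df[["age", "importance"]],
                     distr="probit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```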
The spectrum of genomic signatures: from dinucleotides to chaos game representation.
Wang, Yingwei; Hill, Kathleen; Singh, Shiva; Kari, Lila
2005-02-14
In the post genomic era, access to complete genome sequence data for numerous diverse species has opened multiple avenues for examining and comparing primary DNA sequence organization of entire genomes. Previously, the concept of a genomic signature was introduced with the observation of species-type specific Dinucleotide Relative Abundance Profiles (DRAPs); dinucleotides were identified as the subsequences with the greatest bias in representation in a majority of genomes. Herein, we demonstrate that DRAP is one particular genomic signature contained within a broader spectrum of signatures. Within this spectrum, an alternative genomic signature, Chaos Game Representation (CGR), provides a unique visualization of patterns in sequence organization. A genomic signature is associated with a particular integer order or subsequence length that represents a measure of the resolution or granularity in the analysis of primary DNA sequence organization. We quantitatively explore the organizational information provided by genomic signatures of different orders through different distance measures, including a novel Image Distance. The Image Distance and other existing distance measures are evaluated by comparing the phylogenetic trees they generate for 26 complete mitochondrial genomes from a diversity of species. The phylogenetic tree generated by the Image Distance is compatible with the known relatedness of species. Quantitative evaluation of the spectrum of genomic signatures may be used to ultimately gain insight into the determinants and biological relevance of the genome signatures.
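A minimal sketch of the CGR construction itself, which maps a DNA sequence to a point cloud in the unit square by repeatedly moving halfway toward the corner of each successive base.

```python
import numpy as np

CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0),
           "G": (1.0, 1.0), "T": (1.0, 0.0)}

def chaos_game_representation(sequence):
    """Chaos Game Representation of a DNA sequence: starting at the
    centre of the unit square, move halfway toward the corner of each
    successive base. The resulting point cloud is the CGR signature."""
    points = np.empty((len(sequence), 2))
    x, y = 0.5, 0.5
    for i, base in enumerate(sequence.upper()):
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        points[i] = (x, y)
    return points

print(chaos_game_representation("ACGTTGCA")[:3])
```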
Pouch, Alison M; Aly, Ahmed H; Lai, Eric K; Yushkevich, Natalie; Stoffers, Rutger H; Gorman, Joseph H; Cheung, Albert T; Gorman, Joseph H; Gorman, Robert C; Yushkevich, Paul A
2017-09-01
Transesophageal echocardiography is the primary imaging modality for preoperative assessment of mitral valves with ischemic mitral regurgitation (IMR). While there are well known echocardiographic insights into the 3D morphology of mitral valves with IMR, such as annular dilation and leaflet tethering, less is understood about how quantification of valve dynamics can inform surgical treatment of IMR or predict short-term recurrence of the disease. As a step towards filling this knowledge gap, we present a novel framework for 4D segmentation and geometric modeling of the mitral valve in real-time 3D echocardiography (rt-3DE). The framework integrates multi-atlas label fusion and template-based medial modeling to generate quantitatively descriptive models of valve dynamics. The novelty of this work is that temporal consistency in the rt-3DE segmentations is enforced during both the segmentation and modeling stages with the use of groupwise label fusion and Kalman filtering. The algorithm is evaluated on rt-3DE data series from 10 patients: five with normal mitral valve morphology and five with severe IMR. In these 10 data series that total 207 individual 3DE images, each 3DE segmentation is validated against manual tracing and temporal consistency between segmentations is demonstrated. The ultimate goal is to generate accurate and consistent representations of valve dynamics that can both visually and quantitatively provide insight into normal and pathological valve function.
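A minimal sketch of the temporal-consistency idea, applying a constant-velocity Kalman filter to one noisy landmark coordinate across frames; this is illustrative only, not the paper's groupwise label-fusion pipeline.

```python
import numpy as np

def kalman_smooth_1d(measurements, q=1e-3, r=1e-2):
    """Constant-velocity Kalman filter for one valve-landmark coordinate
    across frames, enforcing temporal consistency on frame-by-frame
    estimates."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])               # observe position only
    Q, R = q * np.eye(2), np.array([[r]])
    x, P = np.array([measurements[0], 0.0]), np.eye(2)
    out = []
    for z in measurements:
        x, P = F @ x, F @ P @ F.T + Q                   # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
        x = x + K @ (np.array([z]) - H @ x)             # update state
        P = (np.eye(2) - K @ H) @ P                     # update covariance
        out.append(x[0])
    return np.array(out)

t = np.linspace(0, 2 * np.pi, 50)
noisy = np.sin(t) + np.random.default_rng(0).normal(0, 0.1, 50)
print(kalman_smooth_1d(noisy)[:5])
```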
RILES, a novel method for temporal analysis of the in vivo regulation of miRNA expression
Ezzine, Safia; Vassaux, Georges; Pitard, Bruno; Barteau, Benoit; Malinge, Jean-Marc; Midoux, Patrick; Pichon, Chantal; Baril, Patrick
2013-01-01
Novel methods are required to investigate the complexity of microRNA (miRNA) biology and particularly their dynamic regulation under physiopathological conditions. Herein, a novel plasmid-based RNAi-Inducible Luciferase Expression System (RILES) was engineered to monitor the activity of the endogenous RNAi machinery. When RILES is transfected into a target cell, the miRNA of interest suppresses the expression of a transcriptional repressor and consequently switches ON the expression of the luciferase reporter gene. Hence, miRNA expression in cells is signalled by the emission of bioluminescence, which can be monitored using standard bioluminescence equipment. We validated this approach by monitoring in mice the expression of myomiRs-133, -206 and -1 in skeletal muscles and miRNA-122 in liver. Bioluminescence experiments demonstrated robust qualitative and quantitative data that correlate with the miRNA expression pattern detected by quantitative RT-PCR (qPCR). We further demonstrated that the regulation of miRNA-206 expression during the development of muscular atrophy is individual-dependent, time-regulated and more complex than the information generated by qPCR. As RILES is simple and versatile, we believe that this methodology will contribute to a better understanding of miRNA biology and could serve as a rationale for the development of a novel generation of regulatable gene expression systems with potential therapeutic applications. PMID:24013565
RILES, a novel method for temporal analysis of the in vivo regulation of miRNA expression.
Ezzine, Safia; Vassaux, Georges; Pitard, Bruno; Barteau, Benoit; Malinge, Jean-Marc; Midoux, Patrick; Pichon, Chantal; Baril, Patrick
2013-11-01
Novel methods are required to investigate the complexity of microRNA (miRNA) biology and particularly their dynamic regulation under physiopathological conditions. Herein, a novel plasmid-based RNAi-Inducible Luciferase Expression System (RILES) was engineered to monitor the activity of the endogenous RNAi machinery. When RILES is transfected into a target cell, the miRNA of interest suppresses the expression of a transcriptional repressor and consequently switches ON the expression of the luciferase reporter gene. Hence, miRNA expression in cells is signalled by the emission of bioluminescence, which can be monitored using standard bioluminescence equipment. We validated this approach by monitoring in mice the expression of myomiRs-133, -206 and -1 in skeletal muscles and miRNA-122 in liver. Bioluminescence experiments demonstrated robust qualitative and quantitative data that correlate with the miRNA expression pattern detected by quantitative RT-PCR (qPCR). We further demonstrated that the regulation of miRNA-206 expression during the development of muscular atrophy is individual-dependent, time-regulated and more complex than the information generated by qPCR. As RILES is simple and versatile, we believe that this methodology will contribute to a better understanding of miRNA biology and could serve as a rationale for the development of a novel generation of regulatable gene expression systems with potential therapeutic applications.
Functional anatomy of the gibbon forelimb: adaptations to a brachiating lifestyle
Michilsens, Fana; Vereecke, Evie E; D'Août, Kristiaan; Aerts, Peter
2009-01-01
It has been shown that gibbons are able to brachiate with very low mechanical costs. The conversion of muscle activity into smooth, purposeful movement of the limb depends on the morphometry of muscles and their mechanical action on the skeleton. Despite the gibbon's reputation for excellence in brachiation, little information is available regarding either its gross musculoskeletal anatomy or its more detailed muscle–tendon architecture. We provide quantitative anatomical data on the muscle–tendon architecture (muscle mass, physiological cross-sectional area, fascicle length and tendon length) of the forelimb of four gibbon species, collected by detailed dissections of unfixed cadavers. Data are compared between different gibbon species and with similar published data of non-brachiating primates such as macaques, chimpanzees and humans. No quantitative differences are found between the studied gibbon species. Both their forelimb anatomy and muscle dimensions are comparable when normalized to the same body mass. Gibbons have shoulder flexors, extensors, rotator muscles and elbow flexors with a high power or work-generating capacity and their wrist flexors have a high force-generating capacity. Compared with other primates, the elbow flexors of gibbons are particularly powerful, suggesting that these muscles are particularly important for a brachiating lifestyle. Based on this anatomical study, the shoulder flexors, extensors, rotator muscles, elbow flexors and wrist flexors are expected to contribute the most to brachiation. PMID:19519640
Synthesising quantitative and qualitative research in evidence-based patient information.
Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan
2007-03-01
Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (e.g. "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review.
Wang, Haiqin; Liu, Wenlong; He, Fuyuan; Chen, Zuohong; Zhang, Xili; Xie, Xianggui; Zeng, Jiaoli; Duan, Xiaopeng
2012-02-01
To explore the minimum single sampling quantity of Houttuynia cordata by means of the information entropy carried by its DNA polymorphic bands, as another form of expressing the polymorphism of traditional Chinese medicine, namely genetic polymorphism. The inter simple sequence repeat (ISSR) technique was applied to analyze the genetic polymorphism of H. cordata samples from the same GAP producing area; the DNA bands were transformed into information entropy, and the minimum single sampling quantity was estimated with a mathematical model. One hundred and thirty-four DNA bands were obtained by using 9 screened ISSR primers to amplify DNA samples from 46 strains of H. cordata from the same GAP area. The information entropy was H = 0.3656-0.9786, with an RSD of 14.75%. The minimum single sampling quantity was W = 11.22 kg (863 strains). The minimum single sampling quantity was thus calculated from the standpoint of the genetic polymorphism of H. cordata, and a great difference was found between this quantity and the amount derived from the standpoint of fingerprinting.
ERIC Educational Resources Information Center
Ling, Chris D.; Bridgeman, Adam J.
2011-01-01
Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…
Functional linear models for association analysis of quantitative traits.
Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao
2013-11-01
Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY PERIODICALS, INC.
Modelling the heart as a communication system.
Ashikaga, Hiroshi; Aguilar-Rodríguez, José; Gorsky, Shai; Lusczek, Elizabeth; Marquitti, Flávia Maria Darcie; Thompson, Brian; Wu, Degang; Garland, Joshua
2015-04-06
Electrical communication between cardiomyocytes can be perturbed during arrhythmia, but these perturbations are not captured by conventional electrocardiographic metrics. We developed a theoretical framework to quantify electrical communication using information theory metrics in two-dimensional cell lattice models of cardiac excitation propagation. The time series generated by each cell was coarse-grained to 1 when excited or 0 when resting. The Shannon entropy for each cell was calculated from the time series during four clinically important heart rhythms: normal heartbeat, anatomical reentry, spiral reentry and multiple reentry. We also used mutual information to perform spatial profiling of communication during these cardiac arrhythmias. We found that information sharing between cells was spatially heterogeneous. In addition, cardiac arrhythmia significantly impacted information sharing within the heart. Entropy localized the path of the drifting core of spiral reentry, which could be an optimal target of therapeutic ablation. We conclude that information theory metrics can quantitatively assess electrical communication among cardiomyocytes. The traditional concept of the heart as a functional syncytium sharing electrical information cannot predict altered entropy and information sharing during complex arrhythmia. Information theory metrics may find clinical application in the identification of rhythm-specific treatments which are currently unmet by traditional electrocardiographic techniques. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
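A minimal sketch of the two metrics on coarse-grained binary series, with hypothetical coupled cells standing in for the lattice model.

```python
import numpy as np

def entropy(bits):
    """Shannon entropy (bits) of a binary time series."""
    p = bits.mean()
    if p in (0.0, 1.0):
        return 0.0
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def mutual_information(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B) for two binary series."""
    joint = 2 * a + b  # joint symbol in {0,1,2,3}
    pj = np.bincount(joint, minlength=4) / len(joint)
    h_joint = -sum(p * np.log2(p) for p in pj if p > 0)
    return entropy(a) + entropy(b) - h_joint

rng = np.random.default_rng(7)
cell_a = rng.integers(0, 2, 1000)                              # excited/resting
cell_b = np.where(rng.random(1000) < 0.9, cell_a, 1 - cell_a)  # coupled cell
print(entropy(cell_a), mutual_information(cell_a, cell_b))
```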
Quantum Yield of Single Surface Plasmons Generated by a Quantum Dot Coupled with a Silver Nanowire.
Li, Qiang; Wei, Hong; Xu, Hongxing
2015-12-09
The interactions between surface plasmons (SPs) in metal nanostructures and excitons in quantum emitters (QEs) lead to many interesting phenomena and potential applications that are strongly dependent on the quantum yield of SPs. The difficulty in distinguishing all the possible exciton recombination channels hinders the experimental determination of SP quantum yield. Here, we experimentally measured for the first time the quantum yield of single SPs generated by the exciton-plasmon coupling in a system composed of a single quantum dot and a silver nanowire (NW). By utilizing the SP guiding property of the NW, the decay rates of all the exciton recombination channels, i.e., direct free space radiation channel, SP generation channel, and nonradiative damping channel, are quantitatively obtained. It is determined that the optimum emitter-NW coupling distance for the largest SP quantum yield is about 10 nm, resulting from the different distance-dependent decay rates of the three channels. These results are important for manipulating the coupling between plasmonic nanostructures and QEs and developing on-chip quantum plasmonic devices for potential nanophotonic and quantum information applications.
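The quantum yield follows directly once the three decay rates are separated; a minimal sketch with illustrative rates.

```python
def sp_quantum_yield(gamma_sp, gamma_rad, gamma_nonrad):
    """Quantum yield of single-surface-plasmon generation as the fraction
    of exciton decays that go into the SP channel:

        eta_SP = G_sp / (G_sp + G_rad + G_nonrad)

    Rates in arbitrary but common units (e.g. 1/ns)."""
    return gamma_sp / (gamma_sp + gamma_rad + gamma_nonrad)

# illustrative rates for a QD ~10 nm from a silver nanowire
print(f"{sp_quantum_yield(0.6, 0.3, 0.4):.2f}")  # 0.46
```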
Tracking Expected Improvements of Decadal Prediction in Climate Services
NASA Astrophysics Data System (ADS)
Suckling, E.; Thompson, E.; Smith, L. A.
2013-12-01
Physics-based simulation models are ultimately expected to provide the best available (decision-relevant) probabilistic climate predictions, as they can capture the dynamics of the Earth System across a range of situations, including situations for which observations for the construction of empirical models are scant if not nonexistent. This fact in itself provides neither evidence that predictions from today's Earth System Models will outperform today's empirical models, nor a guide to the space and time scales on which today's model predictions are adequate for a given purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales. The skill of these forecasts is contrasted with that of state-of-the-art climate models, and the challenges faced by each approach are discussed. The focus is on providing decision-relevant probability forecasts for decision support. An empirical model known as Dynamic Climatology is shown to be competitive with CMIP5 climate models on decadal-scale probability forecasts. Contrasting the skill of simulation models not only with each other but also with empirical models can reveal the space and time scales on which a generation of simulation models exploits its physical basis effectively, and can quantify its ability to add information in the formation of operational forecasts. Difficulties (i) of information contamination, (ii) of the interpretation of probabilistic skill, and (iii) of artificial skill complicate each modelling approach, and are discussed. "Physics-free" empirical models provide fixed, quantitative benchmarks for the evaluation of ever more complex climate models that are not available from (inter)comparisons restricted to complex models alone. At present, empirical models can also provide a background term for blending in the formation of probability forecasts from ensembles of simulation models; in weather forecasting this role is filled by the climatological distribution, and it can significantly enhance the value of longer lead-time weather forecasts to those who use them. It is suggested that the direct comparison of simulation models with empirical models become a regular component of large model forecast intercomparison and evaluation. This would clarify the extent to which a given generation of state-of-the-art simulation models provides information beyond that available from simpler empirical models, and would clarify current limitations in using simulation forecasting for decision support. No model-based probability forecast is complete without a quantitative estimate of its own irrelevance; this estimate is likely to increase as a function of lead time. A lack of decision-relevant quantitative skill would not bring the science-based foundation of anthropogenic warming into doubt. Similar levels of skill between empirical and simulation models do, however, suggest a clear quantification of limits, as a function of lead time, on the spatial and temporal scales at which decisions based on such model output are expected to prove maladaptive. Failing to state such weaknesses of a given generation of simulation models clearly, while stating their strengths and their foundation, risks the credibility of science in support of policy in the long term.
NASA Astrophysics Data System (ADS)
Cabello, Violeta
2017-04-01
This communication will present the development of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate Nexus termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools in an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may be labelled a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST, and a case study focussed on agricultural production in a semi-arid region of Southern Spain. Data requirements for this case study, and the limitations in finding, accessing or estimating them, will be presented alongside a reflection on the relation between analytical scales and data availability.
PERCH: A Unified Framework for Disease Gene Prioritization.
Feng, Bing-Jian
2017-03-01
To interpret genetic variants discovered through next-generation sequencing, integration of heterogeneous information is vital for success. This article describes a framework named PERCH (Polymorphism Evaluation, Ranking, and Classification for a Heritable trait), available at http://BJFengLab.org/. It can prioritize disease genes by quantitatively unifying a new deleteriousness measure called BayesDel, an improved assessment of the biological relevance of genes to the disease, a modified linkage analysis, a novel rare-variant association test, and a converted variant call quality score. It supports data that contain various combinations of extended pedigrees, trios, and case-controls, and allows for reduced penetrance, an elevated phenocopy rate, liability classes, and covariates. BayesDel is more accurate than PolyPhen2, SIFT, FATHMM, LRT, Mutation Taster, Mutation Assessor, PhyloP, GERP++, SiPhy, CADD, MetaLR, and MetaSVM. The overall approach is faster and more powerful than the existing quantitative method pVAAST, as shown by simulations of challenging situations in finding the missing heritability of a complex disease. This framework can also classify variants of uncertain significance (VUS) by quantitatively integrating allele frequencies, deleteriousness, association, and co-segregation. PERCH is a versatile tool for gene prioritization in gene discovery research and variant classification in clinical genetic testing. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.
FRET-based genetically-encoded sensors for quantitative monitoring of metabolites.
Mohsin, Mohd; Ahmad, Altaf; Iqbal, Muhammad
2015-10-01
Neighboring cells in the same tissue can exist in different states of dynamic activity. After genomics, proteomics and metabolomics, fluxomics is now equally important for generating accurate quantitative information on the cellular and sub-cellular dynamics of ions and metabolites, which is critical for a functional understanding of organisms. Various spectrometry techniques are used for monitoring ions and metabolites, although their temporal and spatial resolutions are limited. The discovery of fluorescent proteins and their variants has revolutionized cell biology. Therefore, novel tools and methods targeting sub-cellular compartments need to be deployed in specific cells and targeted to sub-cellular compartments in order to quantify target-molecule dynamics directly. We require tools that can measure cellular activities and protein dynamics with sub-cellular resolution. Biosensors based on fluorescence resonance energy transfer (FRET) are genetically encoded and hence can specifically target sub-cellular organelles by fusion to proteins or targeting sequences. Over the last decade, FRET-based genetically encoded sensors for molecules involved in energy production, reactive oxygen species and secondary messengers have helped to unravel key aspects of cellular physiology. This review, describing the design and principles of such sensors, presents a database of sensors for different analytes/processes, and illustrates examples of application in quantitative live-cell imaging.
NASA Astrophysics Data System (ADS)
Hou, Yafei; Wang, Kan; Xiao, Kun; Qin, Weijian; Lu, Wenting; Tao, Wei; Cui, Daxiang
2017-04-01
Nowadays, lateral flow immunochromatographic assays are increasingly popular as a diagnostic tool for point-of-care (POC) testing, owing to their simplicity, specificity, and sensitivity. Quantitative detection and broader application are therefore urgently needed in medical examination. In this study, a smartphone-based dual-modality imaging system was developed for quantitative detection of color or fluorescent lateral flow test strips, which can be operated anywhere at any time. In this system, the white and ultra-violet (UV) illumination of the optical device was designed to be tunable for different strips, and the Sobel operator algorithm was used in the software to enhance recognition of the test area against the background using boundary information. Moreover, extracting the RGB components (red, green, and blue) of color strips, or only the red component of fluorescent strips, can markedly improve signal intensity and sensitivity. Fifty samples were used to evaluate the accuracy of the system, and detection limits were calculated separately for human chorionic gonadotropin (HCG) and carcinoembryonic antigen (CEA). The results indicate that this smartphone-controlled dual-modality imaging system can support various POC diagnoses, making it a promising technology for the next generation of portable diagnostic systems.
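A minimal sketch of the strip-analysis idea, combining a Sobel edge response to locate the test line with single-channel intensity extraction; the function, window size and synthetic strip are hypothetical, not the authors' software.

```python
import numpy as np
from scipy import ndimage

def test_line_signal(strip_rgb, channel=0):
    """Locate the test line on a lateral-flow strip image and return its
    mean intensity in one colour channel (channel 0 = red, as used for
    fluorescent strips in the abstract). Sobel gradients along the strip
    axis highlight the line boundaries."""
    gray = strip_rgb.mean(axis=2)
    edges = np.abs(ndimage.sobel(gray, axis=0))   # gradients along strip
    row = int(np.argmax(edges.sum(axis=1)))       # strongest boundary row
    band = strip_rgb[max(row - 3, 0): row + 4, :, channel]
    return float(band.mean())

# synthetic strip: light background with a darker test line at rows 38-42
strip = np.full((80, 30, 3), 220.0)
strip[38:43, :, :] = 90.0
print(test_line_signal(strip))
```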
Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.
Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C
2016-01-01
Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.
Exploiting induced variation to dissect quantitative traits in barley.
Druka, Arnis; Franckowiak, Jerome; Lundqvist, Udda; Bonar, Nicola; Alexander, Jill; Guzy-Wrobelska, Justyna; Ramsay, Luke; Druka, Ilze; Grant, Iain; Macaulay, Malcolm; Vendramin, Vera; Shahinnia, Fahimeh; Radovic, Slobodanka; Houston, Kelly; Harrap, David; Cardle, Linda; Marshall, David; Morgante, Michele; Stein, Nils; Waugh, Robbie
2010-04-01
The identification of genes underlying complex quantitative traits such as grain yield by means of conventional genetic analysis (positional cloning) requires the development of several large mapping populations. However, it is possible that phenotypically related, but more extreme, allelic variants generated by mutational studies could provide a means for more efficient cloning of QTLs (quantitative trait loci). In barley (Hordeum vulgare), with the development of high-throughput genome analysis tools, efficient genome-wide identification of genetic loci harbouring mutant alleles has recently become possible. Genotypic data from NILs (near-isogenic lines) that carry induced or natural variants of genes that control aspects of plant development can be compared with the location of QTLs to potentially identify candidate genes for development-related traits such as grain yield. As yield itself can be divided into a number of allometric component traits such as tillers per plant, kernels per spike and kernel size, mutant alleles that both affect these traits and are located within the confidence intervals for major yield QTLs may represent extreme variants of the underlying genes. In addition, the development of detailed comparative genomic models based on the alignment of a high-density barley gene map with the rice and sorghum physical maps has enabled an informed prioritization of 'known function' genes as candidates for both QTLs and induced mutant genes.
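The co-location logic reduces to a simple interval check; a toy sketch in which the map positions (cM) are invented for illustration:

```python
# Flag induced-mutant loci that fall inside the confidence interval of a
# yield QTL on the same chromosome (all names and positions invented).
qtl_intervals = {"2H": (55.0, 72.0), "3H": (10.0, 24.5)}
mutant_loci = [("mutant_a", "2H", 61.3), ("mutant_b", "3H", 48.0)]

for name, chrom, pos in mutant_loci:
    lo, hi = qtl_intervals.get(chrom, (float("inf"), float("-inf")))
    if lo <= pos <= hi:
        print(f"{name} on {chrom} at {pos} cM lies within a yield-QTL interval")
```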
Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol
2011-01-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032
Profiling mild steel welding processes to reduce fume emissions and costs in the workplace.
Keane, Michael J; Siert, Arlen; Chen, Bean T; Stone, Samuel G
2014-05-01
To provide quantitative information for choosing welding processes that minimize workplace emissions, nine gas metal arc welding (GMAW) processes for mild steel were assessed for fume generation rates, normalized fume generation rates (milligram fume per gram of electrode consumed), and normalized generation rates for elemental manganese, nickel, and iron. Shielded metal arc welding (SMAW) and flux-cored arc welding (FCAW) processes were also profiled. The fumes were collected quantitatively in an American Welding Society-type fume chamber and weighed, recovered, homogenized, and analyzed by inductively coupled plasma atomic emission spectroscopy for total metals. The processes included GMAW in short circuit, globular transfer, axial spray, pulsed spray, Surface Tension Transfer™, Regulated Metal Deposition™, and Cold Metal Transfer™ (CMT) modes. Flux-cored welding was gas shielded, and SMAW used a single rod type. Results indicate a wide range of fume emission factors for the process variations studied. Fume emission rates per gram of electrode consumed were highest for SMAW (~13 mg fume g−1 electrode) and lowest for GMAW processes such as pulsed spray (~1.5 mg g−1) and CMT (~1 mg g−1). Manganese emission rates per gram of electrode consumed ranged from 0.45 mg g−1 (SMAW) to 0.08 mg g−1 (CMT). Nickel emission rates were generally low and ranged from ~0.09 mg g−1 (GMAW short circuit) to 0.004 mg g−1 (CMT). Iron emission rates ranged from 3.7 mg g−1 (spray-mode GMAW) to 0.49 mg g−1 (CMT). The processes studied have significantly different costs, and cost factors are presented based on a case study to allow comparisons between processes in specific cost categories. Costs per linear meter of weld were $31.07 (SMAW), $12.37 (GMAW short circuit), and $10.89 (FCAW). Although no single process is best for minimizing fume emissions and costs while satisfying the weld requirements, several processes can minimize emissions. This study provides information to aid in those choices. Suggestions for overcoming barriers to adopting new and less hazardous welding processes are also discussed.
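For orientation, the quoted per-process numbers can be tabulated and ranked directly; this sketch uses only values stated above, with entries the abstract does not give left as None.

```python
# Normalized fume emission (mg fume per g electrode) and cost per linear
# meter of weld, as quoted in the abstract; None = not quoted there.
processes = {
    "SMAW":               {"fume_mg_per_g": 13.0, "cost_per_m": 31.07},
    "GMAW short circuit": {"fume_mg_per_g": None, "cost_per_m": 12.37},
    "GMAW pulsed spray":  {"fume_mg_per_g": 1.5,  "cost_per_m": None},
    "CMT":                {"fume_mg_per_g": 1.0,  "cost_per_m": None},
    "FCAW":               {"fume_mg_per_g": None, "cost_per_m": 10.89},
}

ranked = sorted((p for p in processes.items() if p[1]["fume_mg_per_g"]),
                key=lambda p: p[1]["fume_mg_per_g"])
for name, d in ranked:                     # CMT and pulsed spray come first
    print(f"{name}: {d['fume_mg_per_g']} mg fume per g electrode")
```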
Smith, Blair H; Campbell, Harry; Blackwood, Douglas; Connell, John; Connor, Mike; Deary, Ian J; Dominiczak, Anna F; Fitzpatrick, Bridie; Ford, Ian; Jackson, Cathy; Haddow, Gillian; Kerr, Shona; Lindsay, Robert; McGilchrist, Mark; Morton, Robin; Murray, Graeme; Palmer, Colin N A; Pell, Jill P; Ralston, Stuart H; St Clair, David; Sullivan, Frank; Watt, Graham; Wolf, Roland; Wright, Alan; Porteous, David; Morris, Andrew D
2006-10-02
Generation Scotland: the Scottish Family Health Study aims to identify genetic variants accounting for variation in levels of quantitative traits underlying the major common complex diseases (such as cardiovascular disease, cognitive decline, mental illness) in Scotland. Generation Scotland will recruit a family-based cohort of up to 50,000 individuals (comprising siblings and parent-offspring groups) across Scotland. It will be a six-year programme, beginning in Glasgow and Tayside in the first two years (Phase 1) before extending to other parts of Scotland in the remaining four years (Phase 2). In Phase 1, individuals aged between 35 and 55 years, living in the East and West of Scotland, will be invited to participate, along with at least one (and preferably more) siblings and any other first-degree relatives aged 18 or over. The total initial sample size will be 15,000, and it is planned that this will increase to 50,000 in Phase 2. All participants will be asked to contribute blood samples from which DNA will be extracted and stored for future investigation. The information from the DNA, along with answers to a lifestyle and medical history questionnaire, clinical and biochemical measurements taken at the time of donation, and subsequent health developments over the life course (traced through electronic health records), will be stored and used for research purposes. In addition, a detailed public consultation process will begin that will allow respondents' views to shape and develop the study. This is an important aspect of the research, and forms the continuation of a long-term parallel engagement process. As well as gene identification, the family-based study design will allow measurement of the heritability and familial aggregation of relevant quantitative traits, and the study of how genetic effects may vary by parent of origin. Long-term potential outcomes of this research include the targeting of disease prevention and treatment, and the development of screening tools based on the new genetic information. This study approach is complementary to other population-based genetic epidemiology studies, such as UK Biobank, which are established primarily to characterise genes and genetic risk in the population.
Colloquium: Mechanical formalisms for tissue dynamics.
Tlili, Sham; Gay, Cyprien; Graner, François; Marcq, Philippe; Molino, François; Saramito, Pierre
2015-05-01
The understanding of morphogenesis in living organisms has been renewed by tremendous progress in experimental techniques that provide access to cell-scale, quantitative information both on the shapes of cells within tissues and on the genes being expressed. This information suggests that our understanding of the respective contributions of gene expression and mechanics, and of their crucial entanglement, will soon leap forward. Biomechanics increasingly benefits from models, which assist the design and interpretation of experiments, point out the main ingredients and assumptions, and ultimately lead to predictions. The newly accessible local information thus calls for a reflection on how to select suitable classes of mechanical models. We review both the mechanical ingredients suggested by the current knowledge of tissue behaviour and the modelling methods that can help generate a rheological diagram or a constitutive equation. We distinguish cell-scale ("intra-cell") and tissue-scale ("inter-cell") contributions. We recall the mathematical framework developed for continuum materials and explain how to transform a constitutive equation into a set of partial differential equations amenable to numerical solution. We show that when plastic behaviour is relevant, the dissipation function formalism appears appropriate for generating constitutive equations; its variational nature facilitates numerical implementation, and we discuss the adaptations needed in the case of large deformations. The present article gathers theoretical methods that can readily enhance the significance of the data to be extracted from recent or future high-throughput biomechanical experiments.
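As one concrete instance of that final step (a textbook example, not taken from the article): a Maxwell-type viscoelastic constitutive law, combined with momentum balance at low Reynolds number and incompressibility, closes into a PDE system ready for numerical solution.

```latex
% Constitutive law for the stress \sigma, with relaxation time \tau and
% viscosity \eta; \varepsilon(\mathbf{v}) is the strain-rate tensor:
\sigma + \tau\,\overset{\nabla}{\sigma} = 2\eta\,\varepsilon(\mathbf{v}),
\qquad
\varepsilon(\mathbf{v}) = \tfrac{1}{2}\bigl(\nabla\mathbf{v} + \nabla\mathbf{v}^{\mathsf{T}}\bigr)
% Closure: momentum balance and incompressibility
\nabla\cdot\sigma - \nabla p = \mathbf{0},
\qquad
\nabla\cdot\mathbf{v} = 0
```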
Reanimating patients: cardio-respiratory CT and MR motion phantoms based on clinical CT patient data
NASA Astrophysics Data System (ADS)
Mayer, Johannes; Sauppe, Sebastian; Rank, Christopher M.; Sawall, Stefan; Kachelrieß, Marc
2017-03-01
To date, several algorithms have been developed that reduce or avoid artifacts caused by cardiac and respiratory motion in computed tomography (CT). The motion information is converted into so-called motion vector fields (MVFs) and used for motion compensation (MoCo) during image reconstruction. To analyze these algorithms quantitatively, ground-truth patient data displaying realistic motion are needed. We developed a method to generate a digital ground truth displaying realistic cardiac and respiratory motion that can be used as a tool to assess MoCo algorithms. Using available MoCo methods, we measured the motion in CT scans with high spatial and temporal resolution and transferred the motion information onto patient data with a different anatomy or imaging modality, thereby virtually reanimating the patient. In addition to these images, the ground-truth motion information in the form of MVFs is available and can be used to benchmark the MVF estimation of MoCo algorithms. Here we applied the method to generate 20 CT volumes displaying detailed cardiac motion that can be used for cone-beam CT (CBCT) simulations, and a set of 8 MR volumes displaying respiratory motion. Our method is able to virtually reanimate patient data. In combination with the MVFs, it serves as a digital ground truth and provides an improved framework for assessing MoCo algorithms.
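The core warping step can be sketched in a few lines; a minimal version assuming the MVF is given as per-voxel displacements, with interpolation order and boundary handling simplified.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_volume(volume, mvf):
    """Deform a (Z, Y, X) volume with an MVF of shape (3, Z, Y, X):
    sample the static volume at positions shifted by the displacements."""
    grid = np.indices(volume.shape).astype(float)
    coords = grid + mvf                    # pull-back sampling positions
    return map_coordinates(volume, coords, order=1, mode="nearest")

# Toy example: shift a random volume by one voxel along z
vol = np.random.rand(8, 8, 8)
mvf = np.zeros((3, 8, 8, 8))
mvf[0] = 1.0
warped = warp_volume(vol, mvf)
```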
Frikha, Youssef; Fellner, Johann; Zairi, Moncef
2017-09-01
Despite initiatives for enhanced recycling and waste utilization, landfill still represents the dominant disposal path for municipal solid waste (MSW). The environmental impacts of landfills depend on several factors, including waste composition, technical barriers, landfill operation and climatic conditions. A profound evaluation of all factors and their impact is necessary in order to evaluate the environmental hazards emanating from landfills. The present paper investigates a sanitary landfill located in a semi-arid climate (Tunisia) and highlights major differences in quantitative and qualitative leachate characteristics compared to landfills situated in moderate climates. Besides the qualitative analysis of leachate samples, a quantitative analysis including the simulation of leachate generation (using the HELP model) has been conducted. The results of the analysis indicate a high load of salts (Cl, Na, inorganic nitrogen) in the leachate compared to other landfills. Furthermore, the simulations with the HELP model highlight that a major part of the leachate generated originates from the water content of the waste itself.
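Schematically, such simulations rest on a site water balance of the following form (a simplified statement; the HELP model resolves these terms layer by layer):

```latex
% L: leachate, P: precipitation, R: surface runoff,
% ET: evapotranspiration, \Delta S: change in moisture storage,
% W_{waste}: water released from the waste body itself
L = P - R - ET - \Delta S + W_{\mathrm{waste}}
```

The last term is what makes the semi-arid case distinctive: with little percolating rainwater, the water content of the waste itself dominates leachate generation.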
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-19
... panel will be employed to collect this information, which serves the need for direct and quantitative measurement of our target population, and which, as a quantitative research tool, has some major benefits: To...
Analysis, Thematic Maps and Data Mining from Point Cloud to Ontology for Software Development
NASA Astrophysics Data System (ADS)
Nespeca, R.; De Luca, L.
2016-06-01
The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. Here, the advantages of remote sensing systems that generate dense point clouds (range-based or image-based) are not limited to the acquired data alone. This paper shows that it is possible to extract very useful diagnostic information using spatial annotation, with algorithms already implemented in open-source software. Generally, the drawing of degradation maps is manual work, and thus dependent on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the point cloud in ASCII format. The first result is the extraction of new geometric descriptors. First, we create digital maps of the calculated quantities. Subsequently, we move to semi-quantitative analyses that transform the new data into useful information. We have written algorithms for accurate selection, for segmentation of the point cloud, and for automatic calculation of the real surface area and volume. Furthermore, we have created graphs of the spatial distribution of the descriptors. This work shows that, by working at the data-processing stage, we can transform the point cloud into an enriched database whose use, management and mining are easy, fast and effective for everyone involved in the restoration process.
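One example of a geometric descriptor computable directly from the x, y, z rows of such an ASCII cloud is local roughness, the distance of each point to a best-fit neighborhood plane; the descriptors used in the paper itself may differ.

```python
import numpy as np
from scipy.spatial import cKDTree

def roughness(points, k=16):
    """Per-point distance to the local best-fit plane of its k neighbours."""
    tree = cKDTree(points)
    out = np.empty(len(points))
    for i, p in enumerate(points):
        _, idx = tree.query(p, k=k)
        centroid = points[idx].mean(axis=0)
        # direction of least variance = local surface normal
        normal = np.linalg.svd(points[idx] - centroid)[2][-1]
        out[i] = abs(np.dot(p - centroid, normal))
    return out

pts = np.random.rand(500, 3)
pts[:, 2] *= 0.01                 # a noisy, nearly planar patch
print(roughness(pts).mean())      # small values for a flat surface
```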
Zhuo, Shuangmu; Chen, Jianxin; Luo, Tianshu; Zou, Dingsong
2006-08-21
A multimode nonlinear optical imaging technique based on the combination of a multichannel mode and a Lambda mode is developed to investigate human dermis. Our findings show that this technique not only improves the image contrast of the structural proteins of the extracellular matrix (ECM) but also provides an image-guided spectral analysis method to identify both cellular and ECM intrinsic components, including collagen, elastin, NAD(P)H and flavin. Through the combined use of the multichannel and Lambda modes in tandem, the depth-resolved two-photon-excited fluorescence (TPEF) and second-harmonic generation (SHG) images, together with the depth-dependent decay of the TPEF/SHG signals, offer a sensitive tool for obtaining quantitative tissue structural and biochemical information. These results suggest that the technique has the potential to provide more accurate information for determining tissue physiological and pathological states.
NASA Astrophysics Data System (ADS)
Adabi, Saba; Conforto, Silvia; Hosseinzadeh, Matin; Noe, Shahryar; Daveluy, Steven; Mehregan, Darius; Nasiriavanaki, Mohammadreza
2017-02-01
Optical Coherence Tomography (OCT) offers real-time high-resolution three-dimensional images of tissue microstructures. In this study, we used OCT skin images acquired from ten volunteers, none of whom had any skin condition, covering the features of several anatomic locations. Segmented OCT images are analyzed based on their optical properties (attenuation coefficient) and textural image features, e.g., contrast, correlation, homogeneity, energy and entropy. Utilizing this information, together with clinical insight, we aim to build a comprehensive computational model of healthy skin. The derived parameters represent the OCT microstructural morphology and might provide biological information for generating an atlas of normal skin from different anatomic sites, and may allow for identification of cellular microstructural changes in cancer patients. We then compared the parameters of healthy samples with those of abnormal skin and classified them using a linear Support Vector Machine (SVM) with 82% accuracy.
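A sketch of the texture-feature stage in modern scikit-image terms; the GLCM settings and the pairing with attenuation-coefficient features are assumptions, not the authors' exact pipeline.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def texture_features(layer_u8):
    """GLCM features of one segmented OCT layer (8-bit grayscale)."""
    glcm = graycomatrix(layer_u8, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "correlation", "homogeneity", "energy")]

# Toy training set: two classes of synthetic textures
rng = np.random.default_rng(0)
smooth = [texture_features(rng.integers(100, 110, (32, 32), dtype=np.uint8))
          for _ in range(10)]
noisy = [texture_features(rng.integers(0, 256, (32, 32), dtype=np.uint8))
         for _ in range(10)]
clf = SVC(kernel="linear").fit(smooth + noisy, [0] * 10 + [1] * 10)
```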
Urinary cell-free DNA is a versatile analyte for monitoring infections of the urinary tract.
Burnham, Philip; Dadhania, Darshana; Heyang, Michael; Chen, Fanny; Westblade, Lars F; Suthanthiran, Manikkam; Lee, John Richard; De Vlaminck, Iwijn
2018-06-20
Urinary tract infections are one of the most common infections in humans. Here we tested the utility of urinary cell-free DNA (cfDNA) to comprehensively monitor host and pathogen dynamics in bacterial and viral urinary tract infections. We isolated cfDNA from 141 urine samples from a cohort of 82 kidney transplant recipients and performed next-generation sequencing. We found that urinary cfDNA is highly informative about bacterial and viral composition of the microbiome, antimicrobial susceptibility, bacterial growth dynamics, kidney allograft injury, and host response to infection. These different layers of information are accessible from a single assay and individually agree with corresponding clinical tests based on quantitative PCR, conventional bacterial culture, and urinalysis. In addition, cfDNA reveals the frequent occurrence of pathologies that remain undiagnosed with conventional diagnostic protocols. Our work identifies urinary cfDNA as a highly versatile analyte to monitor infections of the urinary tract.
Relatedness-based Multi-Entity Summarization
Gunaratna, Kalpa; Yazdavar, Amir Hossein; Thirunarayan, Krishnaprasad; Sheth, Amit; Cheng, Gong
2017-01-01
Representing world knowledge in a machine-processable format is important as entities and their descriptions have fueled tremendous growth in knowledge-rich information processing platforms, services, and systems. Prominent applications of knowledge graphs include search engines (e.g., Google Search and Microsoft Bing), email clients (e.g., Gmail), and intelligent personal assistants (e.g., Google Now, Amazon Echo, and Apple's Siri). In this paper, we present an approach that summarizes facts about a collection of entities by analyzing their relatedness, in preference to summarizing each entity in isolation. Specifically, we generate informative entity summaries by selecting: (i) inter-entity facts that are similar and (ii) intra-entity facts that are important and diverse. We employ a constrained knapsack formulation to efficiently compute entity summaries. We perform both qualitative and quantitative experiments and demonstrate that our approach yields promising results compared to two other stand-alone state-of-the-art entity summarization approaches. PMID:29051696
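The selection step itself is a classic 0/1 knapsack; a plain dynamic-programming stand-in (the paper's scoring of inter-entity similarity, importance, and diversity is not reproduced, and all facts and scores below are invented):

```python
def summarize(facts, budget):
    """Pick the fact subset maximizing total score within a size budget.
    facts: list of (name, score, cost) with integer costs."""
    best = [(0.0, [])] * (budget + 1)
    for name, score, cost in facts:
        for b in range(budget, cost - 1, -1):
            cand = best[b - cost][0] + score
            if cand > best[b][0]:
                best[b] = (cand, best[b - cost][1] + [name])
    return best[budget]

facts = [("birthPlace", 0.9, 2), ("spouse", 0.7, 1),
         ("knownFor", 0.8, 2), ("height", 0.2, 1)]
print(summarize(facts, budget=4))
# ~ (1.8, ['birthPlace', 'spouse', 'height'])
```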
Assessment of Health Effects of Exogenous Urea: Summary and Key Findings.
Dickerson, Aisha S; Lee, Janice S; Keshava, Channa; Hotchkiss, Andrew; Persad, Amanda S
2018-05-01
Urea has been utilized as a reductant in diesel fuels to lower emissions of nitrogen oxides, raising interest in the potential human health hazards associated with exposure to exogenous urea. Here, we summarize and update key findings on potential health effects of exogenous urea, including carcinogenicity. No definitive target organs for oral exposure were identified; however, results from animal studies suggest that the liver and kidney could be potential target organs of urea toxicity. The available human-subject literature suggests that the impact on lung function is minimal. Based on the literature on exogenous urea, we concluded that there was inadequate information to assess the carcinogenic potential of urea or to perform a quantitative assessment to derive reference values. Given the limited information on exogenous urea, additional research to address the gaps should include long-term cancer bioassays, two-generation reproductive toxicity studies, and mode-of-action investigations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boehm, Céline; Degrande, Céline; Mattelaer, Olivier
The study of anomalous electromagnetic emission in the sky is the basis of indirect searches for dark matter. It is also a powerful tool to constrain the radiative decay of active neutrinos. Until now, quantitative analyses have focused on the flux and energy spectrum of such an emission; polarisation has never been considered. Here we show that we could be missing out on an essential piece of information. The radiative decay of neutrinos, as well as the interactions of dark matter and neutrinos with Standard Model particles, can generate a circular polarisation signal in X-rays or γ-rays. If observed, this could reveal important information about their spatial distribution and particle-antiparticle ratio, and could even reveal the nature of the high-energy particle physics processes taking place in astrophysical sites. The question of the observability of these polarised signatures and their separation from background astrophysical sources is left for future work.
Aquatics Systems Branch: transdisciplinary research to address water-related environmental problems
Dong, Quan; Walters, Katie D.
2015-01-01
The Aquatic Systems Branch at the Fort Collins Science Center is a group of scientists dedicated to advancing interdisciplinary science and providing science support to solve water-related environmental issues. Natural resource managers have an increasing need for scientific information and stakeholders face enormous challenges of increasing and competing demands for water. Our scientists are leaders in ecological flows, riparian ecology, hydroscape ecology, ecosystem management, and contaminant biology. The Aquatic Systems Branch employs and develops state-of-the-science approaches in field investigations, laboratory experiments, remote sensing, simulation and predictive modeling, and decision support tools. We use the aquatic experimental laboratory, the greenhouse, the botanical garden and other advanced facilities to conduct unique research. Our scientists pursue research on the ground, in the rivers, and in the skies, generating and testing hypotheses and collecting quantitative information to support planning and design in natural resource management and aquatic restoration.
Zhou, Xiaotong; Meng, Xiangjun; Cheng, Longmei; Su, Chong; Sun, Yantong; Sun, Lingxia; Tang, Zhaohui; Fawcett, John Paul; Yang, Yan; Gu, Jingkai
2017-05-16
Polyethylene glycols (PEGs) are synthetic polymers composed of repeating ethylene oxide subunits. They display excellent biocompatibility and are widely used as pharmaceutical excipients. Fully understanding the biological fate of PEGs requires accurate and sensitive analytical methods for their quantitation. Application of conventional liquid chromatography-tandem mass spectrometry (LC-MS/MS) is difficult because PEGs have polydisperse molecular weights (MWs) and tend to produce multicharged ions in-source, resulting in innumerable precursor ions. As a result, multiple reaction monitoring (MRM) cannot scan all ion pairs, so information on the fate of unselected ions is missed. This article addresses the problem by applying liquid chromatography-triple-quadrupole/time-of-flight mass spectrometry (LC-Q-TOF MS) based on the MS ALL technique. This technique performs information-independent acquisition by allowing all PEG precursor ions to enter the collision cell (Q2). In-quadrupole collision-induced dissociation (CID) in Q2 then effectively generates several fragments from all PEGs owing to the high collision energy (CE). A particular PEG product ion (m/z 133.08592) was found to be common to all linear PEGs and allowed their total quantitation in rat plasma with high sensitivity, excellent linearity and reproducibility. Assay validation showed the method was linear for all linear PEGs over the concentration range 0.05-5.0 μg/mL. The assay was successfully applied to a pharmacokinetic study in rats involving intravenous administration of linear PEG 600, PEG 4000, and PEG 20000. It is anticipated that the method will have wide-ranging applications and will stimulate the development of assays for other pharmaceutical polymers.
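As a consistency check (our reading, not stated in the abstract), the reported m/z matches a protonated three-unit ethylene oxide fragment, [(C2H4O)3 + H]+:

```python
# Monoisotopic masses (u); electron mass subtracted to form the cation.
H, C, O = 1.0078250319, 12.0, 15.9949146221
ELECTRON = 0.00054858

eo_unit = 2 * C + 4 * H + O        # ethylene oxide repeat unit, C2H4O
mz = 3 * eo_unit + H - ELECTRON    # add H, remove one electron
print(f"{mz:.5f}")                 # 133.08592, matching the reported ion
```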
Suresh, Niraj; Stephens, Sean A; Adams, Lexor; Beck, Anthon N; McKinney, Adriana L; Varga, Tamas
2016-04-26
Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and crop management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving plants. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. We aimed to develop a free and efficient tool that approximates the surface area and volume of a root, regardless of its shape, from three-dimensional (3D) tomography data. The root structure of a Prairie dropseed (Sporobolus heterolepis) specimen was imaged using XCT. The root was reconstructed, and the primary root structure was extracted from the data using a combination of licensed and open-source software. An isosurface polygonal mesh was then created for ease of analysis. We have developed the standalone application imeshJ, written in MATLAB, to calculate root volume and surface area from the mesh. The outputs of imeshJ are the surface area (in mm²) and the volume (in mm³). The process, utilizing a unique combination of tools from imaging to quantitative root analysis, is described. The combination of XCT and open-source software proved powerful for noninvasively imaging plant root samples, segmenting root data, and extracting quantitative information from the 3D data. This methodology should be applicable to other material/sample systems where there is connectivity between components of similar X-ray attenuation and segmentation is difficult.
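Both outputs can be computed generically from any closed triangle mesh, using cross products for the area and signed tetrahedra (divergence theorem) for the volume; this sketch is not imeshJ's actual source.

```python
import numpy as np

def mesh_area_volume(vertices, faces):
    """Surface area and enclosed volume of a closed, consistently
    oriented triangle mesh."""
    v = vertices[faces]                              # (n_faces, 3, 3)
    cross = np.cross(v[:, 1] - v[:, 0], v[:, 2] - v[:, 0])
    area = 0.5 * np.linalg.norm(cross, axis=1).sum()
    # divergence theorem: sum of signed tetrahedron volumes w.r.t. origin
    volume = abs(np.einsum('ij,ij->i', v[:, 0],
                           np.cross(v[:, 1], v[:, 2])).sum()) / 6.0
    return area, volume

# Sanity check: a unit cube (12 outward-oriented triangles)
vtx = np.array([[0,0,0],[1,0,0],[1,1,0],[0,1,0],
                [0,0,1],[1,0,1],[1,1,1],[0,1,1]], float)
fcs = np.array([[0,2,1],[0,3,2],[4,5,6],[4,6,7],[0,1,5],[0,5,4],
                [1,2,6],[1,6,5],[2,3,7],[2,7,6],[3,0,4],[3,4,7]])
print(mesh_area_volume(vtx, fcs))                    # ~ (6.0, 1.0)
```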
Estimating unknown parameters in haemophilia using expert judgement elicitation.
Fischer, K; Lewandowski, D; Janssen, M P
2013-09-01
The increasing attention to healthcare costs and treatment efficiency has led to an increasing demand for quantitative data concerning patient and treatment characteristics in haemophilia. However, most of these data are difficult to obtain. The aim of this study was to use expert judgement elicitation (EJE) to estimate currently unavailable key parameters for treatment models in severe haemophilia A. Using a formal expert elicitation procedure, 19 international experts provided information on (i) natural bleeding frequency according to age and onset of bleeding, (ii) treatment of bleeds, (iii) time needed to control bleeding after starting secondary prophylaxis, (iv) dose requirements for secondary prophylaxis according to onset of bleeding, and (v) life expectancy. For each parameter, experts provided their quantitative estimates (median, P10, P90), which were combined using a graphical method. In addition, information was obtained concerning key decision parameters of haemophilia treatment. There was most agreement between experts regarding bleeding frequencies for patients treated on demand with an average onset of joint bleeding (1.7 years): median 12 joint bleeds per year (95% confidence interval 0.9-36) for patients ≤ 18, and 11 (0.8-61) for adult patients. Less agreement was observed concerning the estimated effective dose for secondary prophylaxis in adults: median 2000 IU every other day. The majority (63%) of experts expected that a single minor joint bleed could cause irreversible damage, and would accept up to three minor joint bleeds or one trauma-related joint bleed annually on prophylaxis. Expert judgement elicitation allowed structured capturing of quantitative expert estimates. It generated novel data to be used in computer modelling, clinical care, and trial design. © 2013 John Wiley & Sons Ltd.
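One simple way to combine per-expert quantile triplets is to average matching quantiles across experts (Vincent-style pooling); this is a stand-in for the study's graphical method, and two of the three rows below are invented.

```python
import numpy as np

experts = np.array([      # rows: experts; columns: P10, P50, P90
    [0.9, 12.0, 36.0],    # echoes the on-demand joint-bleed estimate above
    [2.0, 10.0, 30.0],    # invented
    [1.5, 14.0, 40.0],    # invented
])
pooled = experts.mean(axis=0)
print(dict(zip(["P10", "P50", "P90"], pooled.round(1))))
```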
Quantitative effects of cyanogenesis on an adapted herbivore.
Ballhorn, D J; Heil, M; Pietrowski, A; Lieberei, R
2007-12-01
Plant cyanogenesis is the release of gaseous hydrogen cyanide (HCN) in response to cell damage and is considered an effective defense against generalist herbivores. In contrast, specialists are generally believed not to be affected negatively by this trait. However, quantitative data on the long-term effects of cyanogenesis on specialists are rare. In this study, we used lima bean accessions (Fabaceae: Phaseolus lunatus L.) with high quantitative variability of cyanogenic features, comprising cyanogenic potential (HCNp; concentration of cyanogenic precursors) and cyanogenic capacity (HCNc; release of gaseous HCN per unit time). In feeding trials, we analyzed the performance of the herbivorous Mexican bean beetle (Coleoptera: Coccinellidae: Epilachna varivestis Mulsant) on selected lines characterized by high (HC-plants) and low HCNp (LC-plants). Larval and adult stages of this herbivore feed on a narrow range of legumes and prefer cyanogenic lima bean as a host plant. Nevertheless, we found that the performance of beetles (larval weight gain per unit time and body mass of adult beetles) was significantly affected by lima bean HCNp: body weight decreased and the developmental period of larvae and pupae increased on HC-plants during the first generation of beetles and then remained constant for four consecutive generations. In addition, we found continuously decreasing numbers of eggs and larval hatching as inter-generational effects on HC-plants. In contrast, constantly high performance was observed across four generations on LC-plants. Our results demonstrate that the Mexican bean beetle, although preferentially feeding on lima bean, is quantitatively affected by the HCNp of its host plant. The effects can only be detected when considering more than one generation. Thus, cyanide-containing precursors can have negative effects even on herbivores adapted to feeding on cyanogenic plants.
NASA Astrophysics Data System (ADS)
Lim, Hongki; Fessler, Jeffrey A.; Wilderman, Scott J.; Brooks, Allen F.; Dewaraja, Yuni K.
2018-06-01
While the yield of positrons used in Y-90 PET is independent of tissue media, Y-90 SPECT imaging is complicated by the tissue dependence of bremsstrahlung photon generation. The probability of bremsstrahlung production is proportional to the square of the atomic number of the medium. Hence, the same amount of activity in different tissue regions of the body will produce different numbers of bremsstrahlung photons. Existing reconstruction methods disregard this tissue-dependency, potentially impacting both qualitative and quantitative imaging of heterogeneous regions of the body such as bone with marrow cavities. In this proof-of-concept study, we propose a new maximum-likelihood method that incorporates bremsstrahlung generation probabilities into the system matrix, enabling images of the desired Y-90 distribution to be reconstructed instead of the ‘bremsstrahlung distribution’ that is obtained with existing methods. The tissue-dependent probabilities are generated by Monte Carlo simulation while bone volume fractions for each SPECT voxel are obtained from co-registered CT. First, we demonstrate the tissue dependency in a SPECT/CT imaging experiment with Y-90 in bone equivalent solution and water. Visually, the proposed reconstruction approach better matched the true image and the Y-90 PET image than the standard bremsstrahlung reconstruction approach. An XCAT phantom simulation including bone and marrow regions also demonstrated better agreement with the true image using the proposed reconstruction method. Quantitatively, compared with the standard reconstruction, the new method improved estimation of the liquid bone:water activity concentration ratio by 40% in the SPECT measurement and the cortical bone:marrow activity concentration ratio by 58% in the XCAT simulation.
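Schematically, our condensed reading of the idea (the paper's Monte-Carlo-derived probabilities are richer than a single per-voxel factor): fold a tissue-dependent bremsstrahlung yield into the system matrix, then run the usual ML-EM update on the activity.

```latex
% b_j: bremsstrahlung yield of voxel j, roughly \propto Z_{\mathrm{eff}}^2
% from co-registered CT; a_{ij}: geometric system matrix; y_i: counts.
\tilde{a}_{ij} = a_{ij}\, b_j,
\qquad
x_j^{(n+1)} = \frac{x_j^{(n)}}{\sum_i \tilde{a}_{ij}}
\sum_i \tilde{a}_{ij}\,\frac{y_i}{\sum_k \tilde{a}_{ik}\, x_k^{(n)}}
```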
Schadt, Simone; Bister, Bojan; Chowdhury, Swapan K; Funk, Christoph; Hop, Cornelis E C A; Humphreys, W Griffith; Igarashi, Fumihiko; James, Alexander D; Kagan, Mark; Khojasteh, S Cyrus; Nedderman, Angus N R; Prakash, Chandra; Runge, Frank; Scheible, Holger; Spracklin, Douglas K; Swart, Piet; Tse, Susanna; Yuan, Josh; Obach, R Scott
2018-06-01
Since the introduction of the metabolites in safety testing (MIST) guidance by the Food and Drug Administration in 2008, major changes have occurred in the experimental methods for the identification and quantification of metabolites, the ways to evaluate coverage of metabolites, and the timing of critical clinical and nonclinical studies to generate this information. In this cross-industry review, we discuss how the increased focus on human drug metabolites and their potential contribution to safety and drug-drug interactions has influenced the approaches taken by industry for the identification and quantitation of human drug metabolites. Before the MIST guidance was issued, the method of choice for generating a comprehensive metabolite profile was radiochromatography. The MIST guidance increased the focus on human drug metabolites and their potential contribution to safety and drug-drug interactions, and led to changes in the practices of drug metabolism scientists. In addition, the guidance suggested that human metabolism studies should be accelerated, which has led to more frequent determination of human metabolite profiles from multiple-ascending-dose clinical studies. Generating a comprehensive and quantitative profile of human metabolites has become a more urgent task. Together with technological advances, these events have led to a general shift of focus toward earlier human metabolism studies using high-resolution mass spectrometry and to a reduction in animal radiolabel absorption/distribution/metabolism/excretion studies. The changes induced by the MIST guidance are highlighted by six case studies included herein, reflecting different stages of implementation of the MIST guidance within the pharmaceutical industry. Copyright © 2018 by The American Society for Pharmacology and Experimental Therapeutics.
NASA Astrophysics Data System (ADS)
Tang, Chaoqing; Tian, Gui Yun; Chen, Xiaotian; Wu, Jianbo; Li, Kongjing; Meng, Hongying
2017-12-01
Active thermography provides infrared images that contain sub-surface defect information, while visible images reveal only surface information. Mapping infrared information onto visible images offers more comprehensive visualization for decision-making in rail inspection. However, the common information available for registration is limited because the modalities differ at both the local and global levels. For example, the rail track, which has low temperature contrast, reveals rich detail in visible images but appears blurry in the infrared counterparts. This paper proposes a registration algorithm called Edge-Guided Speeded-Up-Robust-Features (EG-SURF) to address this issue. Rather than sequentially integrating local and global information in the matching stage, which suffers from a bucket effect, this algorithm adaptively integrates local and global information into a descriptor to gather more common information before matching. This adaptivity has two facets: an adaptive weighting factor between local and global information, and an adaptive main-direction accuracy. The local information is extracted using SURF, while the global information is represented by shape context derived from edges. Meanwhile, in the shape-context generation process, edges are weighted according to local scale and decomposed into bins in a vector-decomposition manner to provide a more accurate descriptor. The proposed algorithm is qualitatively and quantitatively validated on eddy-current pulsed thermography scenes in the experiments. In comparison with other algorithms, better performance has been achieved.
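The weighted local-plus-global descriptor can be sketched as a simple concatenation (illustrative only; the paper's adaptive choice of the weight and its shape-context construction are not reproduced):

```python
import numpy as np

def eg_descriptor(surf_vec, shape_ctx_vec, w):
    """Concatenate a local SURF vector with a global shape-context
    histogram under a weighting factor w in [0, 1]."""
    local = surf_vec / (np.linalg.norm(surf_vec) + 1e-12)
    glob = shape_ctx_vec / (np.linalg.norm(shape_ctx_vec) + 1e-12)
    return np.concatenate([(1.0 - w) * local, w * glob])

d1 = eg_descriptor(np.random.rand(64), np.random.rand(60), w=0.3)
d2 = eg_descriptor(np.random.rand(64), np.random.rand(60), w=0.3)
print(np.linalg.norm(d1 - d2))    # match by Euclidean distance
```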
Generation 1.5 Written Error Patterns: A Comparative Study
ERIC Educational Resources Information Center
Doolan, Stephen M.; Miller, Donald
2012-01-01
In an attempt to contribute to existing research on Generation 1.5 students, the current study uses quantitative and qualitative methods to compare error patterns in a corpus of Generation 1.5, L1, and L2 community college student writing. This error analysis provides one important way to determine if error patterns in Generation 1.5 student…
Optical Ptychographic Microscope for Quantitative Bio-Mechanical Imaging
NASA Astrophysics Data System (ADS)
Anthony, Nicholas; Cadenazzi, Guido; Nugent, Keith; Abbey, Brian
The role that mechanical forces play in biological processes such as cell movement and death is becoming of significant interest as we further develop our understanding of the inner workings of cells. The most common method used to obtain stress information is photoelasticity, which maps a sample's birefringence, i.e., its direction-dependent refractive indices, using polarized light. However, this method provides only qualitative data, whereas useful stress information requires quantitative data. Ptychography is a method for quantitatively determining the phase of a sample's complex transmission function. The technique relies upon the collection of multiple overlapping coherent diffraction patterns from laterally displaced points on the sample. The overlap of measurement points provides complementary information that significantly aids the reconstruction of the complex wavefield exiting the sample and allows for quantitative imaging of weakly interacting specimens. Here we describe recent advances at La Trobe University, Melbourne, on achieving quantitative birefringence mapping using polarized-light ptychography, with applications in cell mechanics. Australian Synchrotron, ARC Centre of Excellence for Advanced Molecular Imaging.
Testing the Community-Based Learning Collaborative (CBLC) implementation model: a study protocol.
Hanson, Rochelle F; Schoenwald, Sonja; Saunders, Benjamin E; Chapman, Jason; Palinkas, Lawrence A; Moreland, Angela D; Dopp, Alex
2016-01-01
High rates of youth exposure to violence, either through direct victimization or witnessing, result in significant health/mental health consequences and high associated lifetime costs. Evidence-based treatments (EBTs), such as Trauma-Focused Cognitive Behavioral Therapy (TF-CBT), can prevent and/or reduce these negative effects, yet these treatments are not standard practice for therapists working with children identified by child welfare or mental health systems as needing services. While research indicates that collaboration between the child welfare and mental health services sectors improves availability and sustainment of EBTs for children, few implementation strategies designed specifically to promote and sustain inter-professional collaboration (IC) and inter-organizational relationships (IOR) have undergone empirical investigation. A potential candidate for evaluation is the Community-Based Learning Collaborative (CBLC) implementation model, an adaptation of the Learning Collaborative which includes strategies designed to develop and strengthen inter-professional relationships between brokers and providers of mental health services, to promote IC and IOR, and to achieve sustained implementation of EBTs for children within a community. This non-experimental, mixed-methods study involves two phases: (1) analysis of existing prospective quantitative and qualitative quality improvement and project evaluation data collected pre and post, weekly, and monthly from 998 participants in one of seven CBLCs conducted as part of a statewide initiative; and (2) collection of new quantitative and qualitative (key informant interview) data during the funded study period to evaluate changes in relations among IC, IOR, social networks and the penetration and sustainment of TF-CBT in targeted communities. Recruitment for Phase 2 is from the pool of 998 CBLC participants to achieve a targeted enrollment of n = 150. Study aims include: (1) use existing quality improvement (weekly/monthly online surveys; pre-post surveys; interviews) and newly collected quantitative (monthly surveys) and qualitative (key informant interviews) data and social network analysis to test whether CBLC strategies are associated with penetration and sustainment of TF-CBT; and (2) use existing quantitative quality improvement (weekly/monthly online surveys; pre/post surveys) and newly collected qualitative (key informant interviews) data and social network analysis to test whether CBLC strategies are associated with increased IOR and IC intensity. The proposed research leverages an ongoing, statewide implementation initiative to generate evidence about the implementation strategies needed to make trauma-focused EBTs more accessible to children. This study also provides feasibility data to inform an effectiveness trial that will use a time-series design to rigorously evaluate the CBLC model as a mechanism to improve access to and sustained use of EBTs for children.
Sender–receiver systems and applying information theory for quantitative synthetic biology
Barcena Menendez, Diego; Senthivel, Vivek Raj; Isalan, Mark
2015-01-01
Sender–receiver (S–R) systems abound in biology, with communication systems sending information in various forms. Information theory provides a quantitative basis for analysing these processes and is being applied to study natural genetic, enzymatic and neural networks. Recent advances in synthetic biology are providing us with a wealth of artificial S–R systems, giving us quantitative control over networks with a finite number of well-characterised components. Combining the two approaches can help to predict how to maximise signalling robustness, and will allow us to make increasingly complex biological computers. Ultimately, pushing the boundaries of synthetic biology will require moving beyond engineering the flow of information and towards building more sophisticated circuits that interpret biological meaning. PMID:25282688
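The central quantity behind such analyses (standard information theory, not specific to this review) is the mutual information between the sent signal S and the received response R; its maximum over input distributions is the channel capacity.

```latex
I(S;R) = \sum_{s,r} p(s,r)\,\log_2\frac{p(s,r)}{p(s)\,p(r)},
\qquad
C = \max_{p(s)} I(S;R)
```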
Rodriguez-Falces, Javier
2013-12-01
In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials that results from the propagation of the excitation source along the muscle fiber, are difficult to describe and conceptualize. In addition, most traditional approaches to describing extracellular potentials rely on complex mathematical machinery that leaves little room for physical interpretation. The aim of the present study is to present a new method for teaching the formation of extracellular potentials around a muscle fiber from both a descriptive and a quantitative perspective. The implementation of this method was tested through a written exam and a satisfaction survey. The new method enhanced the ability of students to visualize the generation of bioelectrical potentials. In addition, the new approach improved students' understanding of how changes in the fiber-to-electrode distance and in the shape of the excitation source translate into changes in the extracellular potential. The survey results show that combining general principles of electric fields with accurate graphic imagery gives students an intuitive, yet quantitative, feel for electrophysiological signals and enhances their motivation to continue their studies in the biomedical engineering field.
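For reference, the quantitative backbone of most such descriptions is the classic volume-conductor line-source model (general background, not the article's specific formulation): the extracellular potential is a distance-weighted superposition of the transmembrane currents along the fiber.

```latex
% \sigma_e: extracellular conductivity; I_m(s,t): transmembrane current
% per unit length at axial position s; r: fiber-to-electrode distance.
\phi_e(z, r, t) = \frac{1}{4\pi\sigma_e}
\int \frac{I_m(s, t)}{\sqrt{(z - s)^2 + r^2}}\,\mathrm{d}s
```

Increasing r enlarges the denominator for all source positions, which attenuates and smooths the recorded waveform, exactly the dependence the teaching method asks students to grasp.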
Stable operation of a Secure QKD system in the real-world setting
NASA Astrophysics Data System (ADS)
Tomita, Akihisa
2007-06-01
Quantum Key Distribution (QKD) is now stepping forward from proof of principle to validation of practical feasibility. Nevertheless, QKD technology must respond to real-world challenges such as stable operation in a fluctuating environment and security proofs under practical settings. We report our recent progress on the stable operation of a QKD system and on key generation with security assurance. A QKD system should be robust to temperature fluctuations in a common office environment. We developed a loop mirror, a substitute for a Faraday mirror, to allow easy compensation for the temperature dependence of the device. Phase-locking was also employed to synchronize the system clock to the quantum signals. This technique is indispensable for transmission systems based on installed fiber cables, which stretch and shrink with temperature changes. Security proofs of QKD, however, have assumed ideal conditions, such as the use of a genuine single-photon source and/or unlimited computational resources. It has been highly desirable to give an assurance of security for practical systems, where the ideal conditions are no longer satisfied. We have constructed a theory to estimate the leakage of information on the transmitted key under practically attainable conditions, and have developed a QKD system equipped with software for secure key distillation. The QKD system generates the final key at a rate of 2000 bps after 20 km of fiber transmission. The eavesdropper's information on the final key is guaranteed to be less than 2^-7 per bit. This is the first successful generation of a secure key with a quantitative assurance of the upper bound on leaked information. It will advance the realization of highly secure metropolitan optical communication networks resistant to all types of eavesdropping.
Winterbourn, Christine C
2014-02-01
Small molecule fluorescent probes are vital tools for monitoring reactive oxygen species in cells. The types of probe available, the extent to which they are specific or quantitative and complications in interpreting results are discussed. Most commonly used probes (e.g. dihydrodichlorofluorescein, dihydrorhodamine) have some value in providing information on changes to the redox environment of the cell, but they are not specific for any one oxidant and the response is affected by numerous chemical interactions and not just increased oxidant generation. These probes generate the fluorescent end product by a free radical mechanism, and to react with hydrogen peroxide they require a metal catalyst. Probe radicals can react with oxygen, superoxide, and various antioxidant molecules, all of which influence the signal. Newer generation probes such as boronates act by a different mechanism in which nucleophilic attack by the oxidant on a blocking group releases masked fluorescence. Boronates react with hydrogen peroxide, peroxynitrite, hypochlorous acid and in some cases superoxide, so are selective but not specific. They react with hydrogen peroxide very slowly, and kinetic considerations raise questions about how the reaction could occur in cells. Data from oxidant-sensitive fluorescent probes can provide some information on cellular redox activity but is widely misinterpreted. Recently developed non-redox probes show promise but are not generally available and more information on specificity and cellular reactions is needed. We do not yet have probes that can quantify cellular production of specific oxidants. This article is part of a Special Issue entitled Current methods to study reactive oxygen species - pros and cons and biophysics of membrane proteins. Guest Editor: Christine Winterbourn. Copyright © 2013 Elsevier B.V. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-05
... includes both quantitative and non-quantitative limits on benefits. Examples of quantitative limits include... duration of treatment. Examples of non-quantitative limits include prior authorization and step therapy... relevant issuers would submit data and descriptive information on the...
Chatterjee, Shiladitya; Major, George H; Paull, Brett; Rodriguez, Estrella Sanz; Kaykhaii, Massoud; Linford, Matthew R
2018-04-21
The total ion current chromatogram (TICC) obtained by liquid-chromatography-mass spectrometry (LC-MS) is often extremely complex and 'noisy' in appearance, particularly when an electrospray ionization source is used. Accordingly, meaningful qualitative and quantitative information can be obtained in LC-MS by data mining processes. Here, one or more higher-quality mass chromatograms can be identified/extracted/isolated and combined to form a TICC, wherein much of the background mass noise is eliminated, and quantitative data for chromatographic peaks can be obtained. Pattern Recognition Entropy (PRE) is a new application of Shannon's statistical concept of entropy. PRE is both a pattern recognition tool and a summary statistic that can be used to identify information-containing mass chromatograms, where higher quality data (higher signal-to-noise mass chromatograms) usually have lower PRE values. Reduced TICCs are obtained by first calculating the PRE values of the component mass chromatograms. A plot of PRE value vs. m/z for the mass chromatograms is then generated, and the resulting band of PRE values is fit to a piecewise spline polynomial. The distribution of the differences between the individual PRE values and the spline fit is then used to select 'good' mass chromatograms. For the data set considered herein, best results were obtained with a threshold of 0.5 standard deviations below the average value (value of the spline). PRE reduces the number of component mass chromatograms significantly (by an order of magnitude) and at the same time preserves most of the chemical information that is collectively in them. It can also distinguish between mass chromatograms of chemically similar species. PRE is arguably a less computationally intensive alternative to the widely used CODA algorithm for variable reduction. It produces reduced TICCs of comparable if not higher quality, and it requires only a single user input for variable selection. Reduced TICCs generated by PRE can be smoothed to further improve their signal-to-noise ratios. Copyright © 2018 Elsevier B.V. All rights reserved.
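Read as the Shannon entropy of the normalized intensity vector, PRE behaves as described: sharp chromatographic peaks give low values, featureless noise gives high ones. A minimal sketch (the normalization details of the original method may differ):

```python
import numpy as np

def pre(intensities):
    """Shannon entropy of a normalized mass chromatogram."""
    p = np.asarray(intensities, float)
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

t = np.linspace(0, 10, 500)
peak = np.exp(-0.5 * ((t - 5) / 0.1) ** 2) + 0.01   # sharp peak, low noise
noise = np.random.rand(500) + 0.5                   # featureless noise
print(pre(peak) < pre(noise))                       # True: peaks give low PRE
```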
Quantitative comparison of 3D third harmonic generation and fluorescence microscopy images.
Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C
2018-01-01
Third harmonic generation (THG) microscopy is a label-free imaging technique that shows great potential for rapid pathology of brain tissue during brain tumor surgery. However, the interpretation of THG brain images should be quantitatively linked to images from more standard imaging techniques, which so far has been done only qualitatively. We establish here such a quantitative link between THG images of mouse brain tissue and all-nuclei-highlighted fluorescence images acquired simultaneously from the same tissue area. For quantitative comparison of a substantial set of image pairs, we present a segmentation workflow applicable to both THG and fluorescence images, achieving precisions of 91.3% and 95.8%, respectively. We find that the correspondence between the main features of the two imaging modalities amounts to 88.9%, providing quantitative evidence for the interpretation of dark holes as brain cells. Moreover, 80% of the bright objects in THG images overlap with nuclei highlighted in the fluorescence images, and they are half the size of the dark holes, showing that cells of different morphologies can be recognized in THG images. We expect the described quantitative comparison to be applicable to other types of brain tissue and, with more specific staining experiments, to cell-type identification. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Quantifying Information Gain from Dynamic Downscaling Experiments
NASA Astrophysics Data System (ADS)
Tian, Y.; Peters-Lidard, C. D.
2015-12-01
Dynamic climate downscaling experiments are designed to produce information at higher spatial and temporal resolutions. Such additional information is generated from the low-resolution initial and boundary conditions via the predictive power of the physical laws. However, errors and uncertainties in the initial and boundary conditions can be propagated, and even amplified, in the downscaled simulations. Additionally, the limit of predictability in nonlinear dynamical systems will dampen the information gain even if the initial and boundary conditions were error-free. Thus, it is critical to quantitatively define and measure the amount of information gained from dynamic downscaling experiments, to better understand and appreciate their potential and limitations. We present a scheme to objectively measure the information gain from such experiments. The scheme is based on information theory, and we argue that if a downscaling experiment is to exhibit value, it has to produce more information than can simply be inferred from information sources already available. These information sources include the initial and boundary conditions, the coarse-resolution model in which the higher-resolution models are embedded, and the same set of physical laws. These existing information sources define an "information threshold" as a function of the spatial and temporal resolution, and this threshold serves as a benchmark for quantifying the information gain from the downscaling experiments, or any other approach. For a downscaling experiment to show any value, the information it produces has to lie above this threshold. A recent NASA-supported downscaling experiment is used as an example to illustrate the application of this scheme.
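The abstract does not spell out its estimator, but one plausible, heavily simplified instantiation of the "information threshold" idea is to ask whether the downscaled field shares more information with high-resolution reference observations than a trivially interpolated coarse field does. A hedged sketch:

```python
# Illustrative only: mutual information of discretized fields as a proxy
# for information content; the binning and MI estimator are assumptions.
import numpy as np
from sklearn.metrics import mutual_info_score

def binned(x, bins=32):
    return np.digitize(x.ravel(), np.histogram_bin_edges(x, bins))

def information_gain(obs, downscaled, interp_coarse, bins=32):
    baseline = mutual_info_score(binned(obs, bins), binned(interp_coarse, bins))
    model = mutual_info_score(binned(obs, bins), binned(downscaled, bins))
    return model - baseline  # > 0: the run exceeds the "information threshold"
```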
WO{sub 3} thin film based multiple sensor array for electronic nose application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramgir, Niranjan S., E-mail: niranjanpr@yahoo.com, E-mail: deepakcct1991@gmail.com; Goyal, C. P.; Datta, N.
2015-06-24
A multiple sensor array comprising 16 x 2 sensing elements was realized using RF-sputtered WO{sub 3} thin films. The sensor films were modified with a thin layer of sensitizers, namely Au, Ni, Cu, Al, Pd, Ti and Pt. The resulting sensor array was tested for its response towards different gases, namely H{sub 2}S, NH{sub 3}, NO and C{sub 2}H{sub 5}OH. The sensor response values measured from the response curves indicate that the array generates a unique signature pattern (bar chart) for each gas. These response values can be used to obtain both qualitative and quantitative information about the gas.
eQTL Mapping Using RNA-seq Data
Hu, Yijuan
2012-01-01
As RNA-seq is replacing gene expression microarrays to assess genome-wide transcription abundance, gene expression Quantitative Trait Locus (eQTL) studies using RNA-seq have emerged. RNA-seq delivers two novel features that are important for eQTL studies. First, it provides information on allele-specific expression (ASE), which is not available from gene expression microarrays. Second, it generates unprecedentedly rich data to study RNA-isoform expression. In this paper, we review current methods for eQTL mapping using ASE and discuss some future directions. We also review existing works that use RNA-seq data to study RNA-isoform expression and we discuss the gaps between these works and isoform-specific eQTL mapping. PMID:23667399
Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R.; Lemieux-Charles, Louise
2015-01-01
Research problem Tutorials and user manuals are important forms of impersonal support for using software applications, including electronic medical records (EMRs). Differences between user- and vendor-generated documentation may indicate support needs which are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. Research question What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Literature review Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action- and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research has compared these elements between formal and informal documents. Methodology We conducted a mixed-methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. Results and discussion The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural and not declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements. These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows. The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design. PMID:26190888
Values in Qualitative and Quantitative Research
ERIC Educational Resources Information Center
Duffy, Maureen; Chenail, Ronald J.
2008-01-01
The authors identify the philosophical underpinnings and value-ladenness of major research paradigms. They argue that useful and meaningful research findings for counseling can be generated from both qualitative and quantitative research methodologies, provided that the researcher has an appreciation of the importance of philosophical coherence in…
Generating Linear Equations Based on Quantitative Reasoning
ERIC Educational Resources Information Center
Lee, Mi Yeon
2017-01-01
The Common Core's Standards for Mathematical Practice encourage teachers to develop their students' ability to reason abstractly and quantitatively by helping students make sense of quantities and their relationships within problem situations. The seventh-grade content standards include objectives pertaining to developing linear equations in…
Li, Weiyong; Worosila, Gregory D
2005-05-13
This research note demonstrates the simultaneous quantitation of a pharmaceutical active ingredient and three excipients in a simulated powder blend containing acetaminophen, Prosolv, Crospovidone and magnesium stearate. An experimental design approach was used to generate a 5-level (%, w/w) calibration sample set that included 125 samples. The samples were prepared by weighing suitable amounts of powder into separate 20-mL scintillation vials and mixing manually. Partial least squares (PLS) regression was used in calibration model development. The models generated accurate results for quantitation of Crospovidone (at 5%, w/w) and magnesium stearate (at 0.5%, w/w). Further testing of the models demonstrated that the 2-level models were as effective as the 5-level ones, which reduced the calibration sample number to 50. The models had a small bias for quantitation of acetaminophen (at 30%, w/w) and Prosolv (at 64.5%, w/w) in the blend. The implication of the bias is discussed.
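The calibration workflow reads naturally as a standard chemometrics script. A minimal sketch using scikit-learn is given below; the random arrays stand in for the measured spectra and the 5-level composition design, and the component count is an illustrative choice rather than the study's optimized value.

```python
# Hedged sketch of PLS calibration for a four-component powder blend.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(125, 700))   # stand-in spectra: 125 calibration samples
Y = rng.uniform(size=(125, 4))    # stand-in %w/w: APAP, Prosolv, Crospovidone, MgSt

pls = PLSRegression(n_components=5)               # component count: assumed
Y_cv = cross_val_predict(pls, X, Y, cv=10)
rmsecv = np.sqrt(((Y - Y_cv) ** 2).mean(axis=0))  # per-analyte cross-validated RMSE
print(dict(zip(["APAP", "Prosolv", "Crospovidone", "MgSt"], rmsecv.round(3))))
```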
Low rank magnetic resonance fingerprinting.
Mazor, Gal; Weizman, Lior; Tal, Assaf; Eldar, Yonina C
2016-08-01
Magnetic Resonance Fingerprinting (MRF) is a relatively new approach that provides quantitative MRI using randomized acquisition. Extraction of physical quantitative tissue values is performed off-line, based on acquisition with varying parameters and a dictionary generated according to the Bloch equations. MRF uses hundreds of radio frequency (RF) excitation pulses for acquisition, and therefore a high under-sampling ratio in the sampling domain (k-space) is required. This under-sampling causes spatial artifacts that hamper the ability to accurately estimate the quantitative tissue values. In this work, we introduce a new approach for quantitative MRI using MRF, called Low Rank MRF. We exploit the low-rank property of the temporal domain, on top of the well-known sparsity of the MRF signal in the generated dictionary domain. We present an iterative scheme that consists of a gradient step followed by a low-rank projection using the singular value decomposition. Experiments on real MRI data demonstrate superior results compared to a conventional implementation of compressed sensing for MRF at a 15% sampling ratio.
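The iterative scheme lends itself to a compact sketch: alternate a gradient step enforcing consistency with the acquired k-space samples and a rank-r projection of the voxel-by-time matrix via truncated SVD. In this simplified illustration a 1-D FFT stands in for the full spatial encoding operator, and the rank and step size are placeholders, not the authors' settings.

```python
# Hedged sketch of the Low Rank MRF iteration (not the authors' code).
import numpy as np

def low_rank_mrf(X, k_data, mask, rank, n_iter=50, step=1.0):
    """X: (n_voxels, n_timepoints) complex image series; k_data: sampled k-space."""
    for _ in range(n_iter):
        # data-consistency gradient step on the sampled k-space locations
        resid = mask * (np.fft.fft(X, axis=0) - k_data)
        X = X - step * np.fft.ifft(resid, axis=0)
        # low-rank projection: keep the `rank` largest singular values over time
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s[rank:] = 0
        X = (U * s) @ Vt
    return X
```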
Jolley, K A; Wilson, D J; Kriz, P; McVean, G; Maiden, M C J
2005-03-01
Patterns of genetic diversity within populations of human pathogens, shaped by the ecology of host-microbe interactions, contain important information about the epidemiological history of infectious disease. Exploiting this information, however, requires a systematic approach that distinguishes the genetic signal generated by epidemiological processes from the effects of other forces, such as recombination, mutation, and population history. Here, a variety of quantitative techniques were employed to investigate multilocus sequence information from isolate collections of Neisseria meningitidis, a major cause of meningitis and septicemia worldwide. This allowed quantitative evaluation of alternative explanations for the observed population structure. A coalescent-based approach was employed to estimate the rate of mutation, the rate of recombination, and the size distribution of recombination fragments from samples of disease-associated and carried meningococci obtained in the Czech Republic in 1993 and from a collection of disease-associated isolates gathered globally from 1937 to 1996. The parameter estimates were used to reject a model in which genetic structure arose by chance in small populations, and analysis of molecular variation showed that geographically restricted gene flow was unlikely to be the cause of the genetic structure. The genetic differentiation between disease and carriage isolate collections indicated that, whereas certain genotypes were overrepresented among the disease-isolate collections (the "hyperinvasive" lineages), disease-associated and carried meningococci exhibited remarkably little differentiation at the level of individual nucleotide polymorphisms. In combination, these results indicated the repeated action of natural selection on meningococcal populations, possibly arising from the coevolutionary dynamic of host-pathogen interactions.
Automated tracking and quantification of angiogenic vessel formation in 3D microfluidic devices.
Wang, Mengmeng; Ong, Lee-Ling Sharon; Dauwels, Justin; Asada, H Harry
2017-01-01
Angiogenesis, the growth of new blood vessels from pre-existing vessels, is a critical step in cancer invasion. Better understanding of the angiogenic mechanisms is required to develop effective antiangiogenic therapies for cancer treatment. We culture angiogenic vessels in 3D microfluidic devices under different Sphingosine-1-phosphate (S1P) conditions and develop an automated vessel formation tracking system (AVFTS) to track angiogenic vessel formation and extract quantitative vessel information from the experimental time-lapse phase contrast images. The proposed AVFTS first preprocesses the experimental images, then applies a distance transform and an augmented fast marching method in skeletonization, and finally implements the Hungarian method in branch tracking. When applying the AVFTS to our experimental data, we achieve 97.3% precision and 93.9% recall compared with the ground truth obtained from manual tracking by visual inspection. This system enables biologists to quantitatively compare the influence of different growth factors. Specifically, we conclude that a positive S1P gradient increases cell migration and vessel elongation, leading to a higher probability of branching. The AVFTS is also able to distinguish tip and stalk cells by considering the relative cell locations in a branch. Moreover, we generate a novel type of cell lineage plot, which not only provides cell migration and proliferation histories but also demonstrates cell phenotypic changes and branch information.
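The branch-tracking step names the Hungarian method, which maps directly onto an assignment problem between branch tips in consecutive frames. A minimal sketch using scipy follows; the Euclidean cost and the gating distance for branch births/terminations are assumptions, not the authors' exact cost model.

```python
# Hedged sketch of frame-to-frame branch matching via the Hungarian method.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def match_branches(tips_prev, tips_curr, max_dist=20.0):
    """tips_*: (n, 2) arrays of branch-tip pixel coordinates in two frames."""
    cost = cdist(tips_prev, tips_curr)        # Euclidean distances
    rows, cols = linear_sum_assignment(cost)  # optimal one-to-one assignment
    # pairs further apart than max_dist are treated as births/terminations
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]
```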
IT: An Effective Pedagogic Tool in the Teaching of Quantitative Methods in Management.
ERIC Educational Resources Information Center
Nadkami, Sanjay M.
1998-01-01
Examines the possibility of supplementing conventional pedagogic methods with information technology-based teaching aids in the instruction of quantitative methods to undergraduate students. Considers the case for a problem-based learning approach, and discusses the role of information technology. (Author/LRW)
78 FR 70074 - Agency Information Collection Activities: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-22
... after the award expires for their fiscal year of activity. The indicators are both quantitative and descriptive. Quantitative information from the most recently completed fiscal year such as: [cir] Number and... respect to industrial collaboration [cir] Conducting a survey of all center participants to probe the...
76 FR 13674 - Agency Information Collection Activities: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-14
... expires for their fiscal year of activity. The indicators are both quantitative and descriptive. Quantitative information from the most recently completed fiscal year such as: [cir] Number and diversity of... report of center activities with respect to industrial collaboration [cir] Conducting a survey of all...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
...-depth understanding of individuals' attitudes, beliefs, motivations, and feelings than do quantitative... and as a qualitative research tool have three major purposes: To obtain information that is useful for developing variables and measures for quantitative studies, To better understand people's attitudes and...
40 CFR 799.6786 - TSCA water solubility: Generator column method.
Code of Federal Regulations, 2010 CFR
2010-07-01
... quantitative) analysis of solvent extract in paragraph (c)(3)(iv) of this section. The design of the generator... partition coefficients of hydrophobic substances. Journal of Research, National Bureau of Standards, 86:361...
Near-field photometry for organic light-emitting diodes
NASA Astrophysics Data System (ADS)
Li, Rui; Harikumar, Krishnan; Isphording, Alexandar; Venkataramanan, Venkat
2013-03-01
Organic Light Emitting Diode (OLED) technology is rapidly maturing and is ready to serve as the next generation of light sources for general lighting. The current standard test methods for solid-state lighting have evolved for semiconductor sources with point-like emission characteristics. However, OLED devices are extended surface emitters, where spatial uniformity and angular variation of brightness and colour are important. This necessitates advanced test methods to obtain meaningful data for fundamental understanding, lighting product development and deployment. In this work, a near-field imaging goniophotometer was used to characterize lighting-class white OLED devices, where luminance and colour information of the pixels on the light sources was measured at a near-field distance over various angles. Analysis was performed to obtain angle-dependent luminous intensity, CIE chromaticity coordinates and correlated colour temperature (CCT) in the far field. Furthermore, a complete ray set with chromaticity information was generated, so that illuminance at any distance and angle from the light source can be determined. The generated ray set is needed for optical modeling and design of OLED luminaires. Our results show that luminance non-uniformity could potentially affect luminaire aesthetics, and that CCT can vary with angle by more than 2000 K. This leads to the same source being perceived as warm or cool depending on the viewing angle. As OLEDs become commercially available, this could be a major challenge for lighting designers. Near-field measurement can provide detailed specifications and quantitative comparison between OLED products for performance improvement.
Synthesising quantitative and qualitative research in evidence‐based patient information
Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan
2007-01-01
Background Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence‐based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. Aims This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Methods Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non‐quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg “explain what the test involves”) was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. Results 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. Conclusions A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review. PMID:17325406
An eQTL Analysis of Partial Resistance to Puccinia hordei in Barley
Chen, Xinwei; Hackett, Christine A.; Niks, Rients E.; Hedley, Peter E.; Booth, Clare; Druka, Arnis; Marcel, Thierry C.; Vels, Anton; Bayer, Micha; Milne, Iain; Morris, Jenny; Ramsay, Luke; Marshall, David; Cardle, Linda; Waugh, Robbie
2010-01-01
Background Genetic resistance to barley leaf rust caused by Puccinia hordei involves both R genes and quantitative trait loci. The R genes provide higher but less durable resistance than the quantitative trait loci. Consequently, exploring quantitative or partial resistance has become a favorable alternative for controlling disease. Four quantitative trait loci for partial resistance to leaf rust have been identified in the doubled haploid Steptoe (St)/Morex (Mx) mapping population. Further investigations are required to study the molecular mechanisms underpinning partial resistance and ultimately identify the causal genes. Methodology/Principal Findings We explored partial resistance to barley leaf rust using a genetical genomics approach. We recorded RNA transcript abundance corresponding to each probe on a 15K Agilent custom barley microarray in seedlings from St and Mx and 144 doubled haploid lines of the St/Mx population. A total of 1154 and 1037 genes were, respectively, identified as being P. hordei-responsive among the St and Mx and differentially expressed between P. hordei-infected St and Mx. Normalized ratios from 72 distant-pair hybridisations were used to map the genetic determinants of variation in transcript abundance by expression quantitative trait locus (eQTL) mapping generating 15685 eQTL from 9557 genes. Correlation analysis identified 128 genes that were correlated with resistance, of which 89 had eQTL co-locating with the phenotypic quantitative trait loci (pQTL). Transcript abundance in the parents and conservation of synteny with rice allowed us to prioritise six genes as candidates for Rphq11, the pQTL of largest effect, and highlight one, a phospholipid hydroperoxide glutathione peroxidase (HvPHGPx) for detailed analysis. Conclusions/Significance The eQTL approach yielded information that led to the identification of strong candidate genes underlying pQTL for resistance to leaf rust in barley and on the general pathogen response pathway. The dataset will facilitate a systems appraisal of this host-pathogen interaction and, potentially, for other traits measured in this population. PMID:20066049
Andrzejak, Ralph G.; Hauf, Martinus; Pollo, Claudio; Müller, Markus; Weisstanner, Christian; Wiest, Roland; Schindler, Kaspar
2015-01-01
Background Epilepsy surgery is a potentially curative treatment option for pharmacoresistant patients. If non-invasive methods alone do not suffice to delineate the epileptogenic brain areas, the surgical candidates undergo long-term monitoring with intracranial EEG. Visual EEG analysis is then used to identify the seizure onset zone for targeted resection as a standard procedure. Methods Despite its great potential to assess the epileptogenicity of brain tissue, quantitative EEG analysis has not yet found its way into routine clinical practice. To demonstrate that quantitative EEG may yield clinically highly relevant information, we retrospectively investigated how post-operative seizure control is associated with four selected EEG measures evaluated in the resected brain tissue and the seizure onset zone. Importantly, the exact spatial location of the intracranial electrodes was determined by coregistration of pre-operative MRI and post-implantation CT, and coregistration with post-resection MRI was used to delineate the extent of tissue resection. Using data-driven thresholding, quantitative EEG results were separated into normally contributing and salient channels. Results In patients with favorable post-surgical seizure control, a significantly larger fraction of salient channels in three of the four quantitative EEG measures was resected than in patients with unfavorable outcome in terms of seizure control (median over the whole peri-ictal recordings). The same statistics revealed no association with post-operative seizure control when EEG channels contributing to the seizure onset zone were studied. Conclusions We conclude that quantitative EEG measures provide clinically relevant and objective markers of target tissue, which may be used to optimize epilepsy surgery. The finding that differentiation between favorable and unfavorable outcome was better for the fraction of salient values in the resected brain tissue than in the seizure onset zone is consistent with growing evidence that spatially extended networks might be more relevant for seizure generation, evolution and termination than a single highly localized brain region (i.e. a “focus”) where seizures start. PMID:26513359
Thiele, Kristina; Quinting, Jana Marie; Stenneken, Prisca
2016-09-01
The investigation of word generation performance is an accepted, widely used, and well-established method for examining cognitive, language, or communication impairment due to brain damage. The performance measure traditionally applied in the investigation of word generation is the number of correct responses. Previous studies, however, have suggested that this measure does not capture all potentially relevant aspects of word generation performance, and hence of its underlying processes, so that its power to analyze and explain word generation performance might be rather limited. Therefore, additional qualitative or quantitative performance measures have been introduced to gain information that goes beyond the deficit and allows for therapeutic implications. We undertook a systematic review and meta-analysis of original research that focused on the application of additional measures of word generation performance in adult clinical populations with acquired brain injury. Word generation tasks are an integral part of many different tests, but only a few use performance measures in addition to the number of correct responses in the analysis of word generation performance. Additional measures that showed increased or similar diagnostic utility relative to the traditional performance measure concerned clustering and switching, error types, and temporal characteristics. The potential of additional performance measures is not yet fully exploited in patients with brain injury. The temporal measure of response latencies in particular is not adequately represented, though it may be a reliable measure, especially for identifying subtle impairments. Unfortunately, there is no general consensus as yet on which additional measures are best suited to characterizing word generation performance. Further research is needed to specify the additional parameters that are best qualified for identifying and characterizing impaired word generation performance.
ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling
Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf
2012-01-01
Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if–then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated, and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270
Quantitative holographic interferometry applied to combustion and compressible flow research
NASA Astrophysics Data System (ADS)
Bryanston-Cross, Peter J.; Towers, D. P.
1993-03-01
The application of holographic interferometry to phase object analysis is described. Emphasis has been given to a method of extracting quantitative information automatically from the interferometric fringe data. To achieve this, a carrier frequency has been added to the holographic data. This has made it possible, firstly, to form a phase map using a fast Fourier transform (FFT) algorithm, and then to 'solve', or unwrap, this image to give a contiguous density map using a minimum-weight spanning tree (MST) noise-immune algorithm known as fringe analysis (FRAN). Applications of this work to a burner flame and a compressible flow are presented. In both cases the spatial frequency of the fringes exceeds the resolvable limit of conventional digital framestores. Therefore, a flatbed scanner with a resolution of 3200 x 2400 pixels has been used to produce very high resolution digital images from photographs. This approach has allowed the processing of data despite the presence of caustics generated by strong thermal gradients at the edge of the combustion field. A similar example is presented from the analysis of a compressible transonic flow in the shock wave and trailing edge regions.
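The carrier-frequency/FFT step described above amounts to sideband demodulation. A rough Python sketch is given below, in which the rectangular sideband window and carrier location are simplifying assumptions; the MST-based unwrapping of FRAN is a separate step not shown (a generic substitute would be skimage.restoration.unwrap_phase).

```python
# Hedged sketch of FFT fringe demodulation to a wrapped phase map.
import numpy as np

def wrapped_phase(fringes, carrier_col):
    """fringes: 2-D interferogram with a roughly horizontal carrier frequency."""
    F = np.fft.fftshift(np.fft.fft2(fringes))
    ny, nx = F.shape
    side = np.zeros_like(F)
    w = nx // 8                                   # crude rectangular sideband window
    side[:, carrier_col - w:carrier_col + w] = F[:, carrier_col - w:carrier_col + w]
    side = np.roll(side, nx // 2 - carrier_col, axis=1)  # shift sideband to DC
    return np.angle(np.fft.ifft2(np.fft.ifftshift(side)))
```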
Volcano collapse promoted by hydrothermal alteration and edifice shape, Mount Rainier, Washington
Reid, M.E.; Sisson, T.W.; Brien, D.L.
2001-01-01
Catastrophic collapses of steep volcano flanks threaten many populated regions, and understanding factors that promote collapse could save lives and property. Large collapses of hydrothermally altered parts of Mount Rainier have generated far-traveled debris flows; future flows would threaten densely populated parts of the Puget Sound region. We evaluate edifice collapse hazards at Mount Rainier using a new three-dimensional slope stability method incorporating detailed geologic mapping and subsurface geophysical imaging to determine distributions of strong (fresh) and weak (altered) rock. Quantitative three-dimensional slope stability calculations reveal that sizeable flank collapse (>0.1 km3) is promoted by voluminous, weak, hydrothermally altered rock situated high on steep slopes. These conditions exist only on Mount Rainier's upper west slope, consistent with the Holocene debris-flow history. Widespread alteration on lower flanks or concealed in regions of gentle slope high on the edifice does not greatly facilitate collapse. Our quantitative stability assessment method can also provide useful hazard predictions using reconnaissance geologic information and is a potentially rapid and inexpensive new tool for aiding volcano hazard assessments.
Use of remote-sensing techniques to survey the physical habitat of large rivers
Edsall, Thomas A.; Behrendt, Thomas E.; Cholwek, Gary; Frey, Jeffery W.; Kennedy, Gregory W.; Smith, Stephen B.; Edsall, Thomas A.; Behrendt, Thomas E.; Cholwek, Gary; Frey, Jeffrey W.; Kennedy, Gregory W.; Smith, Stephen B.
1997-01-01
This report describes remote-sensing techniques that can be used to quantitatively characterize the physical habitat of large rivers in the United States, where traditional survey approaches typically used in small- and medium-sized streams and rivers would be ineffective or impossible to apply. The state-of-the-art remote-sensing technologies that we discuss here include side-scan sonar, RoxAnn, acoustic Doppler current profilers, remotely operated vehicles and camera systems, global positioning systems, and laser level survey systems. The use of these technologies will permit the collection of information needed to create computer visualizations and hard-copy maps, and to generate quantitative databases that can be used in real-time mode in the field to characterize the physical habitat at a study location of interest and to guide the distribution of sampling effort needed to address other habitat-related study objectives. This report augments habitat sampling and characterization guidance provided by Meador et al. (1993) and is intended for use primarily by U.S. Geological Survey National Water Quality Assessment program managers and scientists who are documenting water quality in streams and rivers of the United States.
Computational technique for stepwise quantitative assessment of equation correctness
NASA Astrophysics Data System (ADS)
Othman, Nuru'l Izzah; Bakar, Zainab Abu
2017-04-01
Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions on solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype, which was developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
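To make the Multiset idea concrete, a toy sketch of structural (rather than mathematical) equivalence checking is given below; the tokenization rules and the Jaccard-style multiset score are illustrative assumptions, not the SCCS implementation.

```python
# Illustrative multiset comparison of two equation steps.
import re
from collections import Counter

def tokens(equation):
    """Multiset of numbers, symbols and operators in an equation string."""
    return Counter(re.findall(r"\d+|[a-zA-Z]+|[+\-*/=()]", equation.replace(" ", "")))

def multiset_similarity(eq_a, eq_b):
    a, b = tokens(eq_a), tokens(eq_b)
    union = sum((a | b).values())
    return sum((a & b).values()) / union if union else 1.0

# e.g. multiset_similarity("2x+3=7", "3+2x=7") -> 1.0 (same tokens, any order)
```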
Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations
Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.
2013-01-01
Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359
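As one concrete example of the peptide-to-protein post-processing the study evaluates, a generic "top-N" rollup (mean of the N most intense peptides per protein) can be sketched as follows; this is a common variant, not necessarily the method the authors recommend.

```python
# Hedged sketch of a top-N peptide-to-protein quantification rollup.
import pandas as pd

def topn_protein_quant(peptides: pd.DataFrame, n=3):
    """peptides: columns ['protein', 'peptide', 'intensity'] for one sample."""
    top = (peptides.sort_values("intensity", ascending=False)
                   .groupby("protein").head(n))        # N most intense peptides
    return top.groupby("protein")["intensity"].mean()  # protein-level estimate
```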
Measuring Quantum Coherence with Entanglement.
Streltsov, Alexander; Singh, Uttam; Dhar, Himadri Shekhar; Bera, Manabendra Nath; Adesso, Gerardo
2015-07-10
Quantum coherence is an essential ingredient in quantum information processing and plays a central role in emergent fields such as nanoscale thermodynamics and quantum biology. However, our understanding and quantitative characterization of coherence as an operational resource are still very limited. Here we show that any degree of coherence with respect to some reference basis can be converted to entanglement via incoherent operations. This finding allows us to define a novel general class of measures of coherence for a quantum system of arbitrary dimension, in terms of the maximum bipartite entanglement that can be generated via incoherent operations applied to the system and an incoherent ancilla. The resulting measures are proven to be valid coherence monotones satisfying all the requirements dictated by the resource theory of quantum coherence. We demonstrate the usefulness of our approach by proving that the fidelity-based geometric measure of coherence is a full convex coherence monotone, and deriving a closed formula for it on arbitrary single-qubit states. Our work provides a clear quantitative and operational connection between coherence and entanglement, two landmark manifestations of quantum theory and both key enablers for quantum technologies.
Quantitative mapping of solute accumulation in a soil-root system by magnetic resonance imaging
NASA Astrophysics Data System (ADS)
Haber-Pohlmeier, S.; Vanderborght, J.; Pohlmeier, A.
2017-08-01
Differential uptake of water and solutes by plant roots generates heterogeneous concentration distributions in soils. Noninvasive observations of root system architecture and concentration patterns therefore provide information about root water and solute uptake. We present the application of magnetic resonance imaging (MRI) to noninvasively image and monitor root architecture and the distribution of a tracer, GdDTPA2- (gadolinium diethylenetriaminepentaacetate), during an infiltration experiment in a soil column planted with white lupin. We show that inversion recovery preparation within the MRI imaging sequence can quantitatively map concentrations of a tracer in a complex root-soil system. Instead of a simple T1 weighting, the procedure is extended over a wide range of inversion times to precisely map T1 and subsequently cover a much broader concentration range of the solute. The derived concentration patterns were consistent with mass balances and showed that the GdDTPA2- tracer represents a solute that is excluded by roots. Monitoring and imaging the accumulation of the tracer in the root zone therefore offers the potential to determine where, and by which roots, water is taken up.
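The quantification chain implied above (inversion times → per-voxel T1 → tracer concentration) can be sketched with a textbook inversion-recovery signal model and a literature relaxivity for GdDTPA; both the model and the r1 value are assumptions for illustration, not values from the paper.

```python
# Hedged sketch: fit T1 from an inversion-recovery series, then convert
# the relaxation-rate change to concentration via 1/T1 = 1/T1_0 + r1*C.
import numpy as np
from scipy.optimize import curve_fit

def ir_signal(ti, s0, t1):
    return np.abs(s0 * (1.0 - 2.0 * np.exp(-ti / t1)))  # magnitude IR model

def concentration(ti, signals, t1_pre, r1=4.0):  # r1 in 1/(mM s): assumed
    (s0, t1), _ = curve_fit(ir_signal, ti, signals, p0=[signals.max(), 1.0])
    return (1.0 / t1 - 1.0 / t1_pre) / r1        # tracer concentration in mM
```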
Conversion of evanescent Lamb waves into propagating waves via a narrow aperture edge.
Yan, Xiang; Yuan, Fuh-Gwo
2015-06-01
This paper presents a quantitative study of the conversion of evanescent Lamb waves into propagating waves in isotropic plates. The conversion is substantiated by prescribing time-harmonic Lamb displacements/tractions through a narrow aperture at an edge of a semi-infinite plate. Complex-valued dispersion and group velocity curves are employed to characterize the conversion process. The amplitude coefficient of the propagating Lamb modes converted from evanescent modes is quantified based on the complex reciprocity theorem via a finite element analysis. The power flow generated into the plate can be separated into radiative and reactive parts on the basis of propagating and evanescent Lamb waves, where propagating Lamb waves are theoretically proved to radiate purely real power flow, and evanescent Lamb waves carry purely imaginary reactive power flow. The propagating power conversion efficiency is then defined to quantitatively describe the conversion. The conversion efficiency is strongly frequency dependent and can be significant. With the propagating waves converted from evanescent ones, sensors in the far field can recapture some localized damage information that is generally carried by evanescent waves, which may have potential application in structural health monitoring.
Benefit-cost methodology study with example application of the use of wind generators
NASA Technical Reports Server (NTRS)
Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.
1975-01-01
An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.
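The plant factor quoted above follows directly from a wind-speed distribution and a turbine power curve. A minimal sketch, assuming a Rayleigh speed distribution and an idealized cubic power curve with made-up cut-in/rated/cut-out speeds:

```python
# Hedged sketch of plant factor = mean output power / rated power.
import numpy as np

def plant_factor(mean_speed, v_in=4.0, v_rated=12.0, v_out=25.0, n=200_000):
    # Rayleigh-distributed speeds with the given long-term mean (m/s)
    v = np.random.rayleigh(scale=mean_speed * np.sqrt(2.0 / np.pi), size=n)
    frac = np.clip((v**3 - v_in**3) / (v_rated**3 - v_in**3), 0.0, 1.0)
    frac[(v < v_in) | (v > v_out)] = 0.0   # below cut-in / above cut-out
    return frac.mean()

# e.g. plant_factor(7.0) estimates the annual plant factor of a 7 m/s mean-wind site
```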
Subjective Quantitative Studies of Human Agency
ERIC Educational Resources Information Center
Alkire, Sabina
2005-01-01
Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator... use the application forms and procedures specified by OSM in accordance with Office of Management and...
Quantitative Analysis of Qualitative Information from Interviews: A Systematic Literature Review
ERIC Educational Resources Information Center
Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen; Townend, Michael
2014-01-01
Background: A systematic literature review was conducted on mixed methods area. Objectives: The overall aim was to explore how qualitative information from interviews has been analyzed using quantitative methods. Methods: A contemporary review was undertaken and based on a predefined protocol. The references were identified using inclusion and…
ERIC Educational Resources Information Center
Hendon, Michalina
2016-01-01
This quantitative non-experimental correlational research analyzes the relationship between emotional intelligence and communication due to the lack of this research on information technology professionals in the U.S. One hundred and eleven (111) participants completed a survey that measures both the emotional intelligence and communication…
78 FR 2480 - Reports, Forms, and Record Keeping Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-11
... research, will also help to inform a quantitative survey that will explore potential redesigns for the New... hours. Number of Respondents: 80. The results of this research will be used to inform a quantitative survey that will explore potential redesigns for the New Car Assessment Program's Government 5-Star...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-18
... Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity: Comment Request AGENCY... techniques or the use of other forms of information technology. Titles a. Veterans Online Survey, VA Form 10-0513: b. Veterans Family Online Survey, VA Form 10-0513a. c. Veterans Primary Care Provider Online...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-04
... provides useful insights on perceptions and opinions, but are not statistical surveys that yield quantitative results that can be generalized to the population of study. This feedback will provide insights... used for quantitative information collections that are designed to yield reliably actionable results...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-17
... collecting qualitative and quantitative information. To support the qualitative analysis, HRSA will conduct... sites in order to gain a deeper understanding of the program's implementation. Finally, quantitative... forms; and 3. Client satisfaction surveys. ORHP is seeking approval from OMB for the three methods of...
Ponsford, Ruth; Ford, Jennifer; Korjonen, Helena; Hughes, Emma; Keswani, Asha; Pliakas, Triantafyllos; Egan, Matt
2017-07-21
Improving mechanisms for knowledge translation (KT) and connecting decision-makers to each other and to the information and evidence they consider relevant to their work remains a priority for public health. Virtual communities of practice (CoPs) potentially offer an affordable and flexible means of encouraging connection and sharing of evidence, information and learning among the public health community in ways that transgress traditional geographical, professional, institutional and time boundaries. The suitability of online CoPs in public health, however, has rarely been tested. This paper explores the reasons why a particular online CoP for alcohol harm reduction, hosted by the UK Health Forum, failed to generate sufficient interest from the group of public health professionals at which it was aimed. The study utilises online web-metrics demonstrating a lack of online activity on the CoP. One hundred and twenty-seven responses to an online questionnaire were used to explore whether the lack of activity could be explained by the target audience's existing information and evidence practices and needs. Qualitative interviews with 10 members describe in more detail the factors that shape and inhibit use of the virtual CoP by those at whom it was targeted. Quantitative and qualitative data confirm that the target audience had an interest in the kind of information and evidence the CoP was set up to share and generate discussion about, but also that participants considered themselves to already have relatively good access to the information and evidence they needed to inform their work. Qualitative data revealed that the main barriers to using the CoP were a proliferation of information sources, which meant that participants preferred to utilise trusted sources already established within their daily routines, and a lack of time to engage with new online tools that required any significant commitment. Specialist online CoPs are competing for space in an already crowded market. A target audience that regards itself as busy and over-supplied is unlikely to commit to a new service without assurance that the service will provide unique, valuable and well-summarised information, reducing the need to spend time accessing competing resources.
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, a process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and a risk priority evaluation of failure modes. The factor used for the quantitative reassessment of criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation into the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
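The reassessment factor named above is simple to compute: the coefficient of variation of the per-tablet coating mass from each simulated run. A minimal sketch follows; the simulate() call is hypothetical.

```python
# Coefficient of variation of coating mass = the uniformity metric above.
import numpy as np

def coating_cv(coating_mass_per_tablet):
    m = np.asarray(coating_mass_per_tablet, dtype=float)
    return m.std(ddof=1) / m.mean()  # lower CV = better inter-tablet uniformity

# e.g. rank simulated parameter settings to feed the FMEA risk-priority update:
# ranked = sorted(settings, key=lambda s: coating_cv(simulate(s)))  # simulate(): hypothetical
```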
From themes to hypotheses: following up with quantitative methods.
Morgan, David L
2015-06-01
One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.
Advances in Genetical Genomics of Plants
Joosen, R.V.L.; Ligterink, W.; Hilhorst, H.W.M.; Keurentjes, J.J.B.
2009-01-01
Natural variation provides a valuable resource to study the genetic regulation of quantitative traits. In quantitative trait locus (QTL) analyses this variation, captured in segregating mapping populations, is used to identify the genomic regions affecting these traits. The identification of the causal genes underlying QTLs is a major challenge, for which the detection of gene expression differences is of major importance. By combining genetics with large-scale expression profiling (i.e. genetical genomics), resulting in expression QTLs (eQTLs), great progress can be made in connecting phenotypic variation to genotypic diversity. In this review we discuss examples from human, mouse, Drosophila, yeast and plant research to illustrate the advances in genetical genomics, with a focus on understanding the regulatory mechanisms underlying natural variation. With their tolerance to inbreeding, short generation time and the ease of generating large families, plants are ideal subjects to test new concepts in genetics. The comprehensive resources available for Arabidopsis make it a favorite model plant, but genetical genomics has also found its way to important crop species like rice, barley and wheat. We discuss eQTL profiling with respect to cis and trans regulation and show how combined studies with other 'omics' technologies, such as metabolomics and proteomics, may further augment current information on transcriptional, translational and metabolomic signaling pathways and enable reconstruction of detailed regulatory networks. The fast developments in the 'omics' area will offer great potential for genetical genomics to elucidate genotype-phenotype relationships in both fundamental and applied research. PMID:20514216
Urine Sample Preparation in 96-Well Filter Plates for Quantitative Clinical Proteomics
2015-01-01
Urine is an important, noninvasively collected body fluid source for the diagnosis and prognosis of human diseases. Liquid chromatography mass spectrometry (LC-MS) based shotgun proteomics has evolved as a sensitive and informative technique to discover candidate disease biomarkers from urine specimens. Filter-aided sample preparation (FASP) generates peptide samples from protein mixtures of cell lysate or body fluid origin. Here, we describe a FASP method adapted to 96-well filter plates, named 96FASP. Soluble urine concentrates containing ∼10 μg of total protein were processed by 96FASP and LC-MS resulting in 700–900 protein identifications at a 1% false discovery rate (FDR). The experimental repeatability, as assessed by label-free quantification and Pearson correlation analysis for shared proteins among replicates, was high (R ≥ 0.97). Application to urinary pellet lysates which is of particular interest in the context of urinary tract infection analysis was also demonstrated. On average, 1700 proteins (±398) were identified in five experiments. In a pilot study using 96FASP for analysis of eight soluble urine samples, we demonstrated that protein profiles of technical replicates invariably clustered; the protein profiles for distinct urine donors were very different from each other. Robust, highly parallel methods to generate peptide mixtures from urine and other body fluids are critical to increase cost-effectiveness in clinical proteomics projects. This 96FASP method has potential to become a gold standard for high-throughput quantitative clinical proteomics. PMID:24797144
ERIC Educational Resources Information Center
Sawicki, Charles A.
1996-01-01
Describes a simple, inexpensive system that allows students to have hands-on contact with simple experiments involving forces generated by induced currents. Discusses the use of a dynamic force sensor in making quantitative measurements of the forces generated. (JRH)
NASA Astrophysics Data System (ADS)
Koma, Zsófia; Székely, Balázs; Dorninger, Peter; Rasztovits, Sascha; Roncat, Andreas; Zámolyi, András; Krawczyk, Dominik; Pfeifer, Norbert
2014-05-01
Aerial imagery derivatives collected by Unmanned Aerial Vehicle (UAV) technology can be used, alongside the Terrestrial Laser Scanning (TLS) method, as input for the generation of high-resolution digital terrain model (DTM) data. Both types of dataset are suitable for detailed geological and geomorphometric analysis, because the data provide micro-topographical and structural geological information. Our study compares the geological information that can be extracted from the two kinds of high-resolution DTM and attempts to determine which technology is more effective for geological and geomorphological analysis. The measurements were taken at the Doren landslide (Vorarlberg, Austria), a complex rotational landslide situated in the Alpine molasse foreland. Several formations (Kojen Formation, Würmian glacial moraine sediments, Weissach Formation) were tectonized there in the course of the Alpine orogeny (Oberhauser et al., 2007). The typical fault direction is WSW-ENE. The UAV measurements, carried out simultaneously with the TLS campaign, focused on the landslide scarp. The original image resolution was 4 mm/pixel. Image matching was implemented at pyramid level 2, and the achieved resolution of the DTM was 0.05 m. The TLS dataset includes 18 scan positions and more than 300 million points for the whole landslide area; the resulting DTM has a resolution of 0.2 m. The steps of the geological and geomorphological analysis were: (1) visual interpretation based on field work and geological maps; (2) quantitative DTM analysis. In the quantitative analysis, the different DTMs provided the input data for further parameter calculations (e.g., slope, aspect, sigmaZ), as sketched below. In the next step, an automatic classification method was used to detect faults and to classify different parts of the landslide. The conclusion was that UAV datasets are better for visual geological interpretation, because the high-resolution texture information allows the extraction of digital geomorphological indicators. For quantitative analysis both datasets are informative, but the TLS DTM has the advantage of providing additional information on faults beneath the vegetation cover. These studies were carried out partly in the framework of the Hybrid 3D project financed by the Austrian Research Promotion Agency (FFG) and by Von-Oben and 4D-IT; the contribution of ZsK was partly funded by Campus Hungary Internship TÁMOP-424B1; BSz contributed partly as an Alexander von Humboldt Research Fellow.
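As an illustration of the parameter calculations mentioned in the quantitative analysis step, slope and aspect grids can be derived from either DTM by finite differences; the cell size, array layout and aspect convention below are assumptions for illustration.

```python
# Hedged sketch of slope/aspect derivation from a raster DTM.
import numpy as np

def slope_aspect(dtm, cell_size=0.2):
    """dtm: 2-D elevation array (rows = y); cell_size in metres."""
    dzdy, dzdx = np.gradient(dtm, cell_size)
    slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    aspect = (np.degrees(np.arctan2(dzdx, dzdy)) + 360.0) % 360.0  # one common convention
    return slope, aspect
```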
Robert Wood Johnson Foundation Nurse Faculty Scholars program leadership training.
Campbell, Jacquelyn C; McBride, Angela Barron; Etcher, LuAnn; Deming, Katie
The Robert Wood Johnson Foundation Nurse Faculty Scholars program was created to address the nursing shortage via development of the next generation of national leaders in academic nursing. The leadership training combined development at the scholar's home institution with in-person didactic and interactive sessions with notable leaders in nursing and other disciplines. A curriculum matrix, organized by six domains, was evaluated quantitatively and qualitatively. What set this program apart was that it immersed junior faculty in concerted leadership development with regard to all aspects of the faculty role, so that teaching interactively, making use of the latest in information technology, giving testimony before a policy-making group, participating in strategic planning, and figuring out how to reduce the budget without jeopardizing quality were all envisioned as part of the faculty role. The domains covered by this program could easily be used as the framework to plan other leadership-development programs for the next generation of academic leaders. Copyright © 2017 Elsevier Inc. All rights reserved.
Decision support systems for clinical radiological practice — towards the next generation
Stivaros, S M; Gledson, A; Nenadic, G; Zeng, X-J; Keane, J; Jackson, A
2010-01-01
The huge amount of information that needs to be assimilated in order to keep pace with the continued advances in modern medical practice can form an insurmountable obstacle to the individual clinician. Within radiology, the recent development of quantitative imaging techniques, such as perfusion imaging, and the development of imaging-based biomarkers in modern therapeutic assessment has highlighted the need for computer systems to provide the radiological community with support for academic as well as clinical/translational applications. This article provides an overview of the underlying design and functionality of radiological decision support systems with examples tracing the development and evolution of such systems over the past 40 years. More importantly, we discuss the specific design, performance and usage characteristics that previous systems have highlighted as being necessary for clinical uptake and routine use. Additionally, we have identified particular failings in our current methodologies for data dissemination within the medical domain that must be overcome if the next generation of decision support systems is to be implemented successfully. PMID:20965900
Fujimori, Shigeo; Hirai, Naoya; Ohashi, Hiroyuki; Masuoka, Kazuyo; Nishikimi, Akihiko; Fukui, Yoshinori; Washio, Takanori; Oshikubo, Tomohiro; Yamashita, Tatsuhiro; Miyamoto-Sato, Etsuko
2012-01-01
Next-generation sequencing (NGS) has been applied to various kinds of omics studies, resulting in many biological and medical discoveries. However, high-throughput protein-protein interactome datasets derived from detection by sequencing are scarce, because protein-protein interaction analysis requires many cell manipulations to examine the interactions. The low reliability of the high-throughput data is also a problem. Here, we describe a cell-free display technology combined with NGS that can improve both the coverage and reliability of interactome datasets. The completely cell-free method gives high throughput and a large detection space, testing the interactions without using clones. The quantitative information provided by NGS reduces the number of false positives. The method is suitable for the in vitro detection of proteins that interact not only with the bait protein, but also with DNA, RNA and chemical compounds. Thus, it could become a universal approach for exploring the large space of protein sequences and interactome networks. PMID:23056904
Second harmonic generation microscopy for quantitative analysis of collagen fibrillar structure
Chen, Xiyi; Nadiarynkh, Oleg; Plotnikov, Sergey; Campagnola, Paul J
2013-01-01
Second-harmonic generation (SHG) microscopy has emerged as a powerful modality for imaging fibrillar collagen in a diverse range of tissues. Because of its underlying physical origin, it is highly sensitive to the collagen fibril/fiber structure, and, importantly, to changes that occur in diseases such as cancer, fibrosis and connective tissue disorders. We discuss how SHG can be used to obtain more structural information on the assembly of collagen in tissues than is possible by other microscopy techniques. We first provide an overview of the state of the art and the physical background of SHG microscopy, and then describe the optical modifications that need to be made to a laser-scanning microscope to enable the measurements. Crucial aspects for biomedical applications are the capabilities and limitations of the different experimental configurations. We estimate that the setup and calibration of the SHG instrument from its component parts will require 2–4 weeks, depending on the level of the user’s experience. PMID:22402635
Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji
2017-01-01
In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and the evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed new open-source software, “Kongoh,” for interpreting DNA mixtures based on a quantitative continuous model. The model uses the quantitative information of peak heights in the DNA profile and considers the effects of artifacts and allelic drop-out. Using this software, the likelihoods of 1–4 persons' contributions are calculated, and the optimal number of contributors is automatically determined; this differs from other open-source software and eliminates the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2–4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution, even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy in estimating the number of contributors than other software based on a quantitative continuous model. Kongoh is therefore useful for accurately interpreting DNA evidence such as mixtures and small amounts of degraded DNA. PMID:29149210
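The contributor-number selection step can be illustrated schematically. The sketch below uses toy base-10 log-likelihood values, not Kongoh's actual model (which integrates peak heights, stutter artifacts, and drop-out); it simply picks the best-supported number of contributors and reports a log10 likelihood ratio for a POI:

```python
# Hypothetical log10-likelihoods of the observed peak-height data under
# hypotheses of 1-4 contributors (illustrative numbers only).
log_lik = {1: -152.4, 2: -98.7, 3: -96.2, 4: -97.9}
n_best = max(log_lik, key=log_lik.get)  # most supported contributor count

# Likelihood ratio for a person of interest (POI): P(E|Hp) / P(E|Hd),
# with Hp = "POI + unknowns" and Hd = "unknowns only" (toy values).
log10_lr = (-96.2) - (-102.5)
print(f"Estimated contributors: {n_best}, log10(LR) = {log10_lr:.1f}")
```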
A comparison of cosegregation analysis methods for the clinical setting.
Rañola, John Michael O; Liu, Quanhui; Rosenthal, Elisabeth A; Shirts, Brian H
2018-04-01
Quantitative cosegregation analysis can help evaluate the pathogenicity of genetic variants. However, genetics professionals without statistical training often use simple methods, reporting only qualitative findings. We evaluate the potential utility of quantitative cosegregation in the clinical setting by comparing three methods. One thousand pedigrees each were simulated for benign and pathogenic variants in BRCA1 and MLH1 using United States historical demographic data to produce pedigrees similar to those seen in the clinic. These pedigrees were analyzed using two robust methods, full likelihood Bayes factors (FLB) and cosegregation likelihood ratios (CSLR), and a simpler method, counting meioses. Both FLB and CSLR outperform counting meioses when dealing with pathogenic variants, though counting meioses is not far behind. For benign variants, FLB and CSLR greatly outperform counting meioses, which is unable to generate evidence in favor of benign classification. Comparing FLB and CSLR, we find that the two methods perform similarly, indicating that quantitative results from either could be combined in multifactorial calculations. Combining quantitative information will be important, as isolated use of cosegregation in single families will yield classification for less than 1% of variants. To encourage wider use of robust cosegregation analysis, we present a website ( http://www.analyze.myvariant.org ), which implements the CSLR, FLB, and counting meioses methods for ATM, BRCA1, BRCA2, CHEK2, MEN1, MLH1, MSH2, MSH6, and PMS2. We also present an R package, CoSeg, which performs the CSLR analysis on any gene with user-supplied parameters. Future variant classification guidelines should allow nuanced inclusion of cosegregation evidence against pathogenicity.
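The simplest of the three methods lends itself to a one-line illustration. Under the usual simplifying assumptions (fully informative meioses, full penetrance, no phenocopies; FLB and CSLR model these factors properly), m cosegregating meioses yield a likelihood ratio of roughly 2^m in favor of pathogenicity:

```python
# A minimal sketch of the "counting meioses" heuristic: under the null
# hypothesis (variant unlinked to disease), each informative meiosis in
# which the variant cosegregates with the phenotype has probability 1/2,
# so m such meioses give an approximate likelihood ratio of 2**m.

def counting_meioses_lr(informative_meioses: int) -> float:
    return 2.0 ** informative_meioses

for m in (3, 5, 7):
    print(f"{m} informative meioses -> LR ~ {counting_meioses_lr(m):.0f}")
```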
Quantitative microbiome profiling links gut community variation to microbial load.
Vandeputte, Doris; Kathagen, Gunter; D'hoe, Kevin; Vieira-Silva, Sara; Valles-Colomer, Mireia; Sabino, João; Wang, Jun; Tito, Raul Y; De Commer, Lindsey; Darzi, Youssef; Vermeire, Séverine; Falony, Gwen; Raes, Jeroen
2017-11-23
Current sequencing-based analyses of faecal microbiota quantify microbial taxa and metabolic pathways as fractions of the sample sequence library generated by each analysis. Although these relative approaches permit detection of disease-associated microbiome variation, they are limited in their ability to reveal the interplay between microbiota and host health. Comparative analyses of relative microbiome data cannot provide information about the extent or directionality of changes in taxa abundance or metabolic potential. If microbial load varies substantially between samples, relative profiling will hamper attempts to link microbiome features to quantitative data such as physiological parameters or metabolite concentrations. Saliently, relative approaches ignore the possibility that altered overall microbiota abundance itself could be a key identifier of a disease-associated ecosystem configuration. To enable genuine characterization of host-microbiota interactions, microbiome research must exchange ratios for counts. Here we build a workflow for the quantitative microbiome profiling of faecal material, through parallelization of amplicon sequencing and flow cytometric enumeration of microbial cells. We observe up to tenfold differences in the microbial loads of healthy individuals and relate this variation to enterotype differentiation. We show how microbial abundances underpin both microbiota variation between individuals and covariation with host phenotype. Quantitative profiling bypasses compositionality effects in the reconstruction of gut microbiota interaction networks and reveals that the taxonomic trade-off between Bacteroides and Prevotella is an artefact of relative microbiome analyses. Finally, we identify microbial load as a key driver of observed microbiota alterations in a cohort of patients with Crohn's disease, here associated with a low-cell-count Bacteroides enterotype (as defined through relative profiling).
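The central arithmetic of the workflow is a per-sample rescaling of relative abundances by measured cell counts. Here is a minimal sketch with invented numbers; the published workflow also corrects for 16S copy number and rarefies to even sampling depth, which this sketch omits:

```python
import numpy as np

# Quantitative microbiome profiling (QMP), schematically: scale each
# sample's relative taxon abundances by its flow-cytometry cell count
# to obtain absolute abundances (values are illustrative).
relative_abundance = np.array([0.60, 0.30, 0.10])   # fractions per taxon
cells_per_gram = 1.2e11                              # flow-cytometry count

absolute_abundance = relative_abundance * cells_per_gram
print(absolute_abundance)  # cells per gram of faecal material, per taxon
```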
Google glass based immunochromatographic diagnostic test analysis
NASA Astrophysics Data System (ADS)
Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan
2015-03-01
Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.
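The quantitative arm of the system rests on a calibration curve relating line intensity to analyte concentration. A minimal sketch follows, assuming a linear response with hypothetical intensity readings (the actual image-processing pipeline is more involved):

```python
import numpy as np

# Fit line-intensity readings against known PSA concentrations, then
# invert the fit to quantify an unknown sample (numbers are invented).
psa_ng_ml = np.array([0.0, 12.5, 25.0, 50.0, 100.0, 200.0])
line_intensity = np.array([0.02, 0.11, 0.21, 0.39, 0.74, 1.41])

slope, intercept = np.polyfit(psa_ng_ml, line_intensity, 1)

unknown_intensity = 0.55
estimated_psa = (unknown_intensity - intercept) / slope
print(f"Estimated PSA: {estimated_psa:.1f} ng/mL")
```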
Decoding 2D-PAGE complex maps: relevance to proteomics.
Pietrogrande, Maria Chiara; Marchetti, Nicola; Dondi, Francesco; Righetti, Pier Giorgio
2006-03-20
This review describes two mathematical approaches useful for decoding the complex signal of 2D-PAGE maps of protein mixtures. These methods are helpful for interpreting the large amount of data in each 2D-PAGE map by extracting the analytical information hidden therein by spot overlapping. Here the basic theory and its application to 2D-PAGE maps are reviewed: the means of extracting information from the experimental data and their relevance to proteomics are discussed. One method is based on the quantitative theory of the statistical model of peak overlapping (SMO), using the spot experimental data (intensity and spatial coordinates). The second method is based on the study of the 2D-autocovariance function (2D-ACVF) computed on the experimental digitised map. They are two independent methods able to extract equal and complementary information from the 2D-PAGE map. Both methods make it possible to obtain fundamental information on sample complexity and separation performance and to single out ordered patterns present in spot positions: the availability of two independent procedures to compute the same separation parameters is a powerful tool for estimating the reliability of the obtained results. The SMO procedure is a unique tool for quantitatively estimating the degree of spot overlapping present in the map, while the 2D-ACVF method is particularly powerful in singling out the presence of order in spot positions from the complexity of the whole 2D map, i.e., spot trains. The procedures were validated by extensive numerical computation on computer-generated maps describing experimental 2D-PAGE gels of protein mixtures. Their applicability to real samples was tested on reference maps obtained from literature sources. The review describes the most relevant information for proteomics: sample complexity, separation performance, overlapping extent, and identification of spot trains related to post-translational modifications (PTMs).
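A 2D autocovariance function of this kind can be computed efficiently via the Fourier transform. The sketch below assumes the Wiener-Khinchin route (the published procedure may differ in normalization); periodic spot patterns such as spot trains appear as repeated peaks in the resulting ACVF:

```python
import numpy as np

# 2D autocovariance function (2D-ACVF) of a digitized map: the ACVF is
# the inverse FFT of the power spectrum of the mean-subtracted image.
def acvf_2d(image: np.ndarray) -> np.ndarray:
    centered = image - image.mean()
    power = np.abs(np.fft.fft2(centered)) ** 2
    acvf = np.fft.ifft2(power).real / centered.size
    return np.fft.fftshift(acvf)  # zero-lag peak moved to the centre

demo_map = np.random.default_rng(0).random((128, 128))
print(acvf_2d(demo_map).shape)
```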
ERIC Educational Resources Information Center
Wagner, Richard K.
2005-01-01
The transition from first-generation to second-generation studies of genetic and environmental influences on the development of reading is underway. The first generation of quantitative genetic studies yielded an extraordinary conclusion: Fifty percent or more of the variance in most constructs, including reading, is attributable to genetic…
ERIC Educational Resources Information Center
DePinto, Ross M.
2013-01-01
Much of the relevant literature in the domains of leadership development, succession planning, and cross-generational issues that discusses learning paradigms associated with emerging generational cohorts has been based on qualitative research and anecdotal evidence. In contrast, this study employed quantitative research methods using a validated…
Quantitative Literacy for Undergraduate Business Students in the 21st Century
ERIC Educational Resources Information Center
McClure, Richard; Sircar, Sumit
2008-01-01
The current business environment is awash in vast amounts of data that ongoing transactions continually generate. Leading-edge corporations are using business analytics to achieve competitive advantage. However, educators are not adequately preparing business school students in quantitative methods to meet this challenge. For more than half a…
ERIC Educational Resources Information Center
Caglayan, Günhan
2013-01-01
This study is about prospective secondary mathematics teachers' understanding and sense making of representational quantities generated by algebra tiles, the quantitative units (linear vs. areal) inherent in the nature of these quantities, and the quantitative addition and multiplication operations--referent preserving versus referent…
NASA Astrophysics Data System (ADS)
Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin
2016-03-01
How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit to EOC patients, with or without bevacizumab-based chemotherapy, using multivariate statistical models built on quantitative adiposity image features. A dataset of CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied respectively to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that, for all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while there was no significant association for either PFS or OS in the group of patients not receiving maintenance bevacizumab. This study therefore demonstrates the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.
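The survival-modeling step can be sketched with the `lifelines` package (an assumption; the authors do not name their software), using invented column names and toy values for the adiposity features and outcomes:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Sketch of the Cox proportional hazards step: relate hypothetical
# adiposity features to progression-free survival in the bevacizumab
# group. Requires the `lifelines` package; all values are invented.
df = pd.DataFrame({
    "pfs_months": [10.2, 23.5, 7.8, 30.1, 15.6, 12.0, 26.3, 9.4],
    "progressed": [1, 0, 1, 0, 1, 1, 0, 1],
    "sfa_cm2":    [180, 95, 210, 120, 160, 175, 100, 220],
    "vfa_cm2":    [140, 60, 190, 85, 130, 150, 70, 200],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="pfs_months", event_col="progressed")
cph.print_summary()
```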
NASA Astrophysics Data System (ADS)
Privat-Maldonado, Angela; O'Connell, Deborah; Welch, Emma; Vann, Roddy; van der Woude, Marjan W.
2016-10-01
Low temperature plasmas (LTPs) generate a cocktail of reactive nitrogen and oxygen species (RNOS) with bactericidal activity. The RNOS however are spatially unevenly distributed in the plasma. Here we test the hypothesis that this distribution will affect the mechanisms underpinning plasma bactericidal activity focussing on the level of DNA damage in situ. For the first time, a quantitative, single cell approach was applied to assess the level of DNA damage in bacteria as a function of the radial distance from the centre of the plasma jet. Salmonella enterica on a solid, dry surface was treated with two types of LTP: an atmospheric-pressure dielectric barrier discharge plasma jet (charged and neutral species) and a radio-frequency atmospheric-pressure plasma jet (neutral species). In both cases, there was an inverse correlation between the degree of DNA damage and the radial distance from the centre of the plasma, with the highest DNA damage occurring directly under the plasma. This trend was also observed with Staphylococcus aureus. LTP-generated UV radiation was eliminated as a contributing factor. Thus valuable mechanistic information can be obtained from assays on biological material, which can inform the development of LTP as a complementary or alternative therapy for (topical) bacterial infections.
Nesvizhskii, Alexey I.
2013-01-01
Analysis of protein interaction networks and protein complexes using affinity purification and mass spectrometry (AP/MS) is among the most commonly used and successful applications of proteomics technologies. One of the foremost challenges of AP/MS data is the large number of false positive protein interactions present in unfiltered datasets. Here we review computational and informatics strategies for detecting specific protein interaction partners in AP/MS experiments, with a focus on incomplete (as opposed to genome-wide) interactome mapping studies. These strategies range from standard statistical approaches, to empirical scoring schemes optimized for a particular type of data, to advanced computational frameworks. The common denominator among these methods is the use of label-free quantitative information such as spectral counts or integrated peptide intensities that can be extracted from AP/MS data. We also discuss related issues such as combining multiple biological or technical replicates, and dealing with data generated using different tagging strategies. Computational approaches for benchmarking of scoring methods are discussed, and the need for generation of reference AP/MS datasets is highlighted. Finally, we discuss the possibility of more extended modeling of experimental AP/MS data, including integration with external information such as protein interaction predictions based on functional genomics data. PMID:22611043
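A crude spectral-count score of the kind these methods build on can be sketched as a fold-change with pseudocounts; published frameworks such as SAINT use full probabilistic models, and the counts below are illustrative:

```python
import numpy as np

# Compare a prey protein's spectral counts in bait purifications
# against negative controls (a simple label-free enrichment score).
bait_counts = np.array([14, 11, 17])   # replicates with tagged bait
control_counts = np.array([0, 1, 0])   # mock/control purifications

pseudo = 0.5  # avoids division by zero for preys absent in controls
fold_change = (bait_counts.mean() + pseudo) / (control_counts.mean() + pseudo)
print(f"Fold enrichment over control: {fold_change:.1f}")
```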
Mallak, Shadi Kafi; Bakri Ishak, Mohd; Mohamed, Ahmad Fariz
2016-09-13
Malaysia is facing an increasing trend in industrial solid waste generation due to industrial development. Thus there is a paramount need to take serious action toward sustainable industrial waste management. The main aim of this study is to assess the practice of solid waste minimization by manufacturing firms in the Shah Alam industrial estate, Malaysia. This paper presents a series of descriptive and inferential statistical analyses regarding the level and effects of practicing waste minimization methods, and the seriousness of the barriers preventing industries from practicing them. For this purpose the survey was designed such that both quantitative (questionnaire) and qualitative (semi-structured interview) data were collected concurrently. Analysis showed that the majority of firms (92%) dispose of their wastes rather than practicing other, more sustainable waste management options. Waste minimization methods such as segregation of wastes, on-site recycling and reuse, improved housekeeping and equipment modification were found to make significant contributions to waste reduction (p<0.05). Lack of expertise (M=3.50), lack of information (M=3.54), lack of equipment modification (M=3.16) and lack of specific waste minimization guidelines (M=3.49) had higher mean scores than other barriers in different categories. These data were used to elaborate SWOT and TOWS matrices highlighting strengths, weaknesses, opportunities and threats. Accordingly, ten policies were recommended for improving the practice of waste minimization by manufacturing firms, the main aim of this research. Implications: This manuscript critically analyses waste minimization practices by manufacturing firms in Malaysia. Both qualitative and quantitative data collection and analysis were conducted to formulate SWOT and TOWS matrices in order to recommend policies and strategies for improving solid waste minimization by manufacturing industries. The results contribute to the knowledge base, and the findings provide useful baseline information and data on industrial solid waste generation and waste minimization practice.
NASA Astrophysics Data System (ADS)
Sidorov, Pavel; Gaspar, Helena; Marcou, Gilles; Varnek, Alexandre; Horvath, Dragos
2015-12-01
Intuitive, visual rendering—mapping—of high-dimensional chemical spaces (CS) is an important topic in chemoinformatics. Such maps have so far been dedicated to specific compound collections—either limited series of known activities, or large, even exhaustive enumerations of molecules, but without associated property data. Typically, they were challenged to answer some classification problem with respect to those same molecules, admired for their aesthetical virtues and then forgotten—because they were set-specific constructs. This work addresses the question of whether a general, compound set-independent map can be generated, and whether the claim of "universality" can be quantitatively justified, with respect to all the structure-activity information available so far—or, more realistically, an exploitable but significant fraction thereof. The "universal" CS map is expected to project molecules from the initial CS into a lower-dimensional space that is neighborhood behavior-compliant with respect to a large panel of ligand properties. Such a map should be able to discriminate actives from inactives, or even support quantitative neighborhood-based, parameter-free property prediction (regression) models, for a wide panel of targets and target families. It should be polypharmacologically competent, without requiring any target-specific parameter fitting. This work describes an evolutionary growth procedure of such maps, based on generative topographic mapping, followed by the validation of their polypharmacological competence. Validation was achieved with respect to a maximum of exploitable structure-activity information, covering all of Homo sapiens proteins of the ChEMBL database, antiparasitic and antiviral data, etc. Five evolved maps satisfactorily solved hundreds of activity-based ligand classification challenges for targets, and even in vivo properties independent from training data. They also stood chemogenomics-related challenges, as cumulated responsibility vectors obtained by mapping of target-specific ligand collections were shown to represent validated target descriptors, complying with currently accepted target classification in biology. Therefore, they represent, in our opinion, a robust and well documented answer to the key question "What is a good CS map?"
Wilson, Shelby E.; Morris, Saul S.; Gilbert, Sarah Skye; Mosites, Emily; Hackleman, Rob; Weum, Kristoffer L.M.; Pintye, Jillian; Manhart, Lisa E.; Hawes, Stephen E.
2013-01-01
Aim: This paper aims to identify factors that systematically predict why some countries that have tried to scale up oral rehydration solution (ORS) have succeeded, and others have not. Methods: We examined ORS coverage over time, across countries, and through case studies. We conducted expert interviews and literature and data searches to better understand the history of ORS scale-up efforts and why they failed or succeeded in nine countries. We used qualitative, pairwise (or three-country) comparisons of geographically or otherwise similar countries that had different outcomes in terms of ORS scale-up. An algorithm was developed which scored country performance across key supply, demand and financing activities to quantitatively assess the scale-up efforts in each country. Results: The vast majority of countries have neither particularly low nor encouragingly high ORS use rates. We observed three clearly identifiable contrasts between countries that achieved and sustained high ORS coverage and those that did not. Key partners across sectors have critical roles to play to effectively address supply- and demand-side barriers. Efforts must synchronize demand generation, private provider outreach and public sector work. Many donor funds are either suspended or redirected in the event of political instability, exacerbating the health challenges faced by countries in these contexts. We found little information on the cost of scale-up efforts. Conclusions: We identified a number of characteristics of successful ORS scale-up programs, including involvement of a broad range of key players, addressing supply and demand generation together, and working with both public and private sectors. Dedicated efforts are needed to launch and sustain success, including monitoring and evaluation plans to track program costs and impacts. These case studies were designed to inform programmatic decision-making; thus, rigorous academic methods to qualitatively and quantitatively evaluate country ORS scale-up programs might yield additional, critical insights and confirm our conclusions. PMID:23826508
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
2002-06-01
Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, the shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability in providing indispensable quantitative information for the effective development and optimization of MEMS devices.
NASA Astrophysics Data System (ADS)
Cofré, Aarón; Vargas, Asticio; Torres-Ruiz, Fabián A.; Campos, Juan; Lizana, Angel; del Mar Sánchez-López, María; Moreno, Ignacio
2017-11-01
We present a quantitative analysis of the performance of a complete snapshot polarimeter based on a polarization diffraction grating (PDGr). The PDGr is generated in a common path polarization interferometer with a Z optical architecture that uses two liquid-crystal on silicon (LCoS) displays to imprint two different phase-only diffraction gratings onto two orthogonal linear states of polarization. As a result, we obtain a programmable PDGr capable of acting as a simultaneous polarization state generator (PSG), yielding diffraction orders with different states of polarization. The same system is also shown to operate as a polarization state analyzer (PSA), and is therefore useful for the realization of a snapshot polarimeter. We analyze its performance using quantitative metrics such as the condition number, and verify its reliability for the detection of states of polarization.
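The condition-number metric can be sketched directly. The example below uses an ideal four-state tetrahedral analyzer matrix rather than the paper's PDGr states; for this arrangement the value is √3 ≈ 1.73, often cited as the optimum for four measurements:

```python
import numpy as np

# Rows are the Stokes vectors of the analyzer states (ideal tetrahedral
# PSA). A lower condition number means measurement noise is amplified
# less when reconstructing the Stokes parameters.
s = 1 / np.sqrt(3)
W = np.array([
    [1,  s,  s,  s],
    [1,  s, -s, -s],
    [1, -s,  s, -s],
    [1, -s, -s,  s],
])
print(f"Condition number: {np.linalg.cond(W, 2):.3f}")
```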
23 CFR 1200.22 - State traffic safety information system improvements grants.
Code of Federal Regulations, 2013 CFR
2013-04-01
... measures to be used to demonstrate quantitative progress in the accuracy, completeness, timeliness... to implement, provides an explanation. (d) Requirement for quantitative improvement. A State shall demonstrate quantitative improvement in the data attributes of accuracy, completeness, timeliness, uniformity...
23 CFR 1200.22 - State traffic safety information system improvements grants.
Code of Federal Regulations, 2014 CFR
2014-04-01
... measures to be used to demonstrate quantitative progress in the accuracy, completeness, timeliness... to implement, provides an explanation. (d) Requirement for quantitative improvement. A State shall demonstrate quantitative improvement in the data attributes of accuracy, completeness, timeliness, uniformity...
77 FR 18793 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-28
... customary compensation for their participation. For the quantitative research, the Bureau plans to contract with a consumer research firm to formulate a quantitative testing plan, recruit respondents, as well as... soliciting comments concerning the information collection efforts relating to Quantitative Testing of...
Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C
2016-01-01
The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases, including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool, incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Multiplexed MRM-based assays for the quantitation of proteins in mouse plasma and heart tissue.
Percy, Andrew J; Michaud, Sarah A; Jardim, Armando; Sinclair, Nicholas J; Zhang, Suping; Mohammed, Yassene; Palmer, Andrea L; Hardie, Darryl B; Yang, Juncong; LeBlanc, Andre M; Borchers, Christoph H
2017-04-01
The mouse is the most commonly used laboratory animal, with more than 14 million mice being used for research each year in North America alone. The number and diversity of mouse models is increasing rapidly through genetic engineering strategies, but detailed characterization of these models is still challenging because most phenotypic information is derived from time-consuming histological and biochemical analyses. To expand the biochemists' toolkit, we generated a set of targeted proteomic assays for mouse plasma and heart tissue, utilizing bottom-up LC/MRM-MS with isotope-labeled peptides as internal standards. Protein quantitation was performed using reverse standard curves, with LC-MS platform and curve performance evaluated by quality control standards. The assays comprising the final panel (101 peptides for 81 proteins in plasma; 227 peptides for 159 proteins in heart tissue) have been rigorously developed under a fit-for-purpose approach and utilize stable-isotope labeled peptides for every analyte to provide high-quality, precise relative quantitation. In addition, the peptides have been tested to be interference-free and the assay is highly multiplexed, with reproducibly determined protein concentrations spanning >4 orders of magnitude. The developed assays have been used in a small pilot study to demonstrate their application to molecular phenotyping or biomarker discovery/verification studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
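Quantitation against a standard curve of this kind reduces to a regression and its inverse. The sketch below assumes one common arrangement (light/heavy peak-area ratios regressed on known standard amounts); the authors' reverse-curve design may differ in detail, and all numbers are illustrative:

```python
import numpy as np

# Regress measured light/heavy peak-area ratios of calibration
# standards on their known amounts, then quantify a sample from its
# observed ratio.
conc_fmol = np.array([1, 5, 10, 50, 100, 500])      # standard amounts
area_ratio = np.array([0.011, 0.052, 0.104, 0.49, 1.02, 4.95])

slope, intercept = np.polyfit(conc_fmol, area_ratio, 1)
sample_ratio = 0.75
print(f"Estimated amount: {(sample_ratio - intercept) / slope:.1f} fmol")
```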
NASA Astrophysics Data System (ADS)
Ye, Qimiao; Chen, Lin; Qiu, Wenqi; Lin, Liangjie; Sun, Huijun; Cai, Shuhui; Wei, Zhiliang; Chen, Zhong
2017-01-01
Nuclear magnetic resonance (NMR) spectroscopy serves as an important tool for both qualitative and quantitative analyses of various systems in chemistry, biology, and medicine. However, applications of one-dimensional 1H NMR are often limited by severe overlap among different resonances. The advent of two-dimensional (2D) 1H NMR constitutes a promising alternative, extending the crowded resonances into a plane and thereby alleviating spectral congestion. However, this enhanced ability to discriminate resonances comes at the cost of extended experimental duration, owing to the multiple scans with progressive delays needed to construct the indirect dimension. Therefore, in this study, we propose a selective coherence transfer (SECOT) method to accelerate the acquisition of 2D correlation spectroscopy by converting chemical shifts into spatial positions within the effective sample length and then performing an echo planar spectroscopic imaging module to record the spatial and spectral information, which yields the 2D correlation spectrum after 2D Fourier transformation. The feasibility and effectiveness of SECOT have been verified by a set of experiments under both homogeneous and inhomogeneous magnetic fields. Moreover, evaluations of SECOT for quantitative analyses were carried out on samples with a series of different concentrations. Based on these experimental results, SECOT may open important perspectives for fast, accurate, and stable investigations of various chemical systems, both qualitatively and quantitatively.
Kernel-based whole-genome prediction of complex traits: a review.
Morota, Gota; Gianola, Daniel
2014-01-01
Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics.
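The flavor of kernel-based genomic prediction can be conveyed by a small sketch. This uses a Gaussian (RBF) kernel with ridge regression on toy genotype data as a simple stand-in for the RKHS regression methods discussed in the review; bandwidth and ridge parameters are arbitrary choices here:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(100, 500)).astype(float)  # SNPs coded 0/1/2
y = X[:, :10].sum(axis=1) + rng.normal(0, 1.0, 100)    # toy phenotype

def rbf_kernel(A, B, bandwidth):
    # Squared Euclidean distances between all row pairs, then Gaussian.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / bandwidth)

K = rbf_kernel(X, X, bandwidth=X.shape[1])
alpha = np.linalg.solve(K + 1.0 * np.eye(len(y)), y)   # ridge = 1.0
y_hat = K @ alpha
print(f"Training correlation: {np.corrcoef(y, y_hat)[0, 1]:.2f}")
```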
NASA Astrophysics Data System (ADS)
Marchand, Paul J.; Bouwens, Arno; Shamaei, Vincent; Nguyen, David; Extermann, Jerome; Bolmont, Tristan; Lasser, Theo
2016-03-01
Magnetic Resonance Imaging has revolutionised our understanding of brain function through its ability to image human cerebral structures non-invasively over the entire brain. By exploiting the different magnetic properties of oxygenated and deoxygenated blood, functional MRI can indirectly map areas undergoing neural activation. Alongside the development of fMRI, powerful statistical tools have been developed in an effort to shed light on the neural pathways involved in the processing of sensory and cognitive information. In spite of the major improvements made in fMRI technology, its spatial resolution of hundreds of microns prevents MRI from resolving and monitoring processes occurring at the cellular level. In this regard, Optical Coherence Microscopy is an ideal instrument, as it can image at high spatio-temporal resolution. Moreover, by measuring the mean and the width of the Doppler spectra of light scattered by moving particles, OCM allows extraction of the axial and lateral velocity components of red blood cells. The ability to assess total blood velocity quantitatively, as opposed to the axial velocity of classical Doppler OCM, is of paramount importance in brain imaging, as a large proportion of the cortical vasculature is oriented perpendicularly to the optical axis. Here we combine quantitative blood flow imaging using extended-focus Optical Coherence Microscopy with Statistical Parametric Mapping tools to generate maps of stimulus-evoked cortical hemodynamics at the capillary level.
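The velocity extraction can be sketched as follows. The axial relation is the standard Doppler formula; the lateral component's scale factor depends on the detection beam geometry and is a placeholder here, not the instrument's calibrated value, and all signal values are hypothetical:

```python
# Extract velocity components from a Doppler spectrum in OCM: the mean
# Doppler shift gives the axial velocity, while spectral broadening
# scales with the lateral velocity through the beam geometry.
wavelength_m = 1.3e-6      # centre wavelength (assumed)
n_tissue = 1.38            # refractive index (assumed)
f_mean_hz = 2.0e3          # mean Doppler frequency (hypothetical)
f_width_hz = 1.5e3         # Doppler bandwidth (hypothetical)
broadening_scale = 4.0e-7  # m per Hz, set by beam NA (placeholder)

v_axial = f_mean_hz * wavelength_m / (2 * n_tissue)
v_lateral = f_width_hz * broadening_scale
print(f"v_axial = {v_axial*1e3:.2f} mm/s, v_lateral = {v_lateral*1e3:.2f} mm/s")
```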
Polak, Louisa; Green, Judith
2015-04-01
A large literature informs guidance for GPs about communicating quantitative risk information so as to facilitate shared decision making. However, relatively little has been written about how patients utilise such information in practice. To understand the role of quantitative risk information in patients' accounts of decisions about taking statins. This was a qualitative study, with participants recruited and interviewed in community settings. Semi-structured interviews were conducted with 34 participants aged >50 years, all of whom had been offered statins. Data were analysed thematically, using elements of the constant comparative method. Interviewees drew frequently on numerical test results to explain their decisions about preventive medication. In contrast, they seldom mentioned quantitative risk information, and never offered it as a rationale for action. Test results were spoken of as objects of concern despite an often-explicit absence of understanding, so lack of understanding seems unlikely to explain the non-use of risk estimates. Preventive medication was seen as 'necessary' either to treat test results, or because of personalised, unequivocal advice from a doctor. This study's findings call into question the assumption that people will heed and use numerical risk information once they understand it; these data highlight the need to consider the ways in which different kinds of knowledge are used in practice in everyday contexts. There was little evidence from this study that understanding probabilistic risk information was a necessary or valued condition for making decisions about statin use. © British Journal of General Practice 2015.
Sharpening advanced land imager multispectral data using a sensor model
Lemeshewsky, G.P.
2005-01-01
The Advanced Land Imager (ALI) instrument on NASA's Earth Observing One (EO-1) satellite provides nine spectral bands at 30 m ground sample distance (GSD) and a 10 m GSD panchromatic (Pan) band. This report describes an image sharpening technique in which the higher spatial resolution information of the panchromatic band is used to increase the spatial resolution of ALI multispectral (MS) data. To preserve the spectral characteristics, this technique combines reported deconvolution deblurring methods for the MS data with highpass filter-based fusion methods for the Pan data. The deblurring process uses the point spread function (PSF) model of the ALI sensor, including calculation of the PSF from pre-launch calibration data. Performance was evaluated using simulated ALI MS data generated by degrading the spatial resolution of high-resolution IKONOS satellite MS data. A quantitative measure of performance was the error between the sharpened MS data and the high-resolution reference. This report also compares performance with that of a previously reported method that includes PSF information. Preliminary results indicate improved sharpening with the method reported here.
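One ingredient of the reported method, highpass-filter fusion, can be sketched as follows. The PSF-based deconvolution step is omitted, and the filter size and interpolation order are illustrative choices, not the report's settings:

```python
import numpy as np
from scipy.ndimage import uniform_filter, zoom

# Highpass-filter (HPF) fusion, schematically: upsample each 30 m MS
# band to the 10 m pan grid and add the pan band's highpass detail.
def hpf_sharpen(ms_band_30m: np.ndarray, pan_10m: np.ndarray) -> np.ndarray:
    ms_up = zoom(ms_band_30m, 3, order=1)            # 30 m -> 10 m grid
    pan_high = pan_10m - uniform_filter(pan_10m, 9)  # highpass detail
    return ms_up + pan_high

ms = np.random.default_rng(2).random((64, 64))
pan = np.random.default_rng(3).random((192, 192))
print(hpf_sharpen(ms, pan).shape)  # (192, 192)
```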
The role of informatics in patient-centered care and personalized medicine.
Hanna, Matthew G; Pantanowitz, Liron
2017-06-01
The practice of cytopathology has dramatically changed due to advances in genomics and information technology. Cytology laboratories have accordingly become increasingly dependent on pathology informatics support to meet the emerging demands of precision medicine. Pathology informatics deals with information technology in the laboratory, and the impact of this technology on workflow processes and the staff who interact with these tools. This article covers the critical role that laboratory information systems, electronic medical records, and digital imaging play in patient-centered personalized medicine. The value of integrated diagnostic reports, clinical decision support, and the use of whole-slide imaging to better evaluate cytology samples destined for molecular testing is discussed. Image analysis that offers more precise and quantitative measurements in cytology is addressed, as well as the role of bioinformatics tools to cope with Big Data from next-generation sequencing. This article also highlights the barriers to the widespread adoption of these disruptive technologies due to regulatory obstacles, limited commercial solutions, poor interoperability, and lack of standardization. Cancer Cytopathol 2017;125(6 suppl):494-501. © 2017 American Cancer Society.
Rapid Exploitation and Analysis of Documents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buttler, D J; Andrzejewski, D; Stevens, K D
Analysts are overwhelmed with information. They have large archives of historical data, both structured and unstructured, and continuous streams of relevant messages and documents that they need to match to current tasks, digest, and incorporate into their analysis. The purpose of the READ project is to develop technologies to make it easier to catalog, classify, and locate relevant information. We approached this task from multiple angles. First, we tackle the issue of processing large quantities of information in reasonable time. Second, we provide mechanisms that allow users to customize their queries based on latent topics exposed from corpus statistics. Third, we assist users in organizing query results, adding localized expert structure over results. Fourth, we use word sense disambiguation techniques to increase the precision of matching user-generated keyword lists with terms and concepts in the corpus. Fifth, we enhance co-occurrence statistics with latent topic attribution, to aid entity relationship discovery. Finally, we quantitatively analyze the quality of three popular latent modeling techniques to examine under which circumstances each is useful.
Information entropy of humpback whale songs.
Suzuki, Ryuji; Buck, John R; Tyack, Peter L
2006-03-01
The structure of humpback whale (Megaptera novaeangliae) songs was examined using information theory techniques. The song is an ordered sequence of individual sound elements separated by gaps of silence. Song samples were converted into sequences of discrete symbols by both human and automated classifiers. This paper analyzes the song structure in these symbol sequences using information entropy estimators and autocorrelation estimators. Both parametric and nonparametric entropy estimators are applied to the symbol sequences representing the songs. The results provide quantitative evidence consistent with the hierarchical structure proposed for these songs by Payne and McVay [Science 173, 587-597 (1971)]. Specifically, this analysis demonstrates that: (1) There is a strong structural constraint, or syntax, in the generation of the songs, and (2) the structural constraints exhibit periodicities with periods of 6-8 and 180-400 units. This implies that no empirical Markov model is capable of representing the songs' structure. The results are robust to the choice of either human or automated song-to-symbol classifiers. In addition, the entropy estimates indicate that the maximum amount of information that could be communicated by the sequence of sounds made is less than 1 bit per second.
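A plug-in (maximum-likelihood) entropy estimate for a symbol sequence is the baseline on which the parametric and nonparametric estimators used in such analyses build; a minimal sketch with a toy sequence of song units:

```python
import numpy as np
from collections import Counter

# Zeroth-order Shannon entropy of a discrete symbol sequence. Higher-
# order (conditional) entropies and bias-corrected estimators, as used
# in the study, are not shown here.
def entropy_bits(symbols):
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

song = list("ABABCABABDABABC")  # toy sequence of song units
print(f"Zeroth-order entropy: {entropy_bits(song):.2f} bits/symbol")
```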
Wimmer, Marina C; Howe, Mark L
2010-09-01
In two experiments, we investigated the robustness and automaticity of adults' and children's generation of false memories by using a levels-of-processing paradigm (Experiment 1) and a divided attention paradigm (Experiment 2). The first experiment revealed that when information was encoded at a shallow level, true recognition rates decreased for all ages. For false recognition, when information was encoded on a shallow level, we found a different pattern for young children compared with that for older children and adults. False recognition rates were related to the overall amount of correctly remembered information for 7-year-olds, whereas no such association was found for the other age groups. In the second experiment, divided attention decreased true recognition for all ages. In contrast, children's (7- and 11-year-olds) false recognition rates were again dependent on the overall amount of correctly remembered information, whereas adults' false recognition was left unaffected. Overall, children's false recognition rates changed when levels of processing or divided attention was manipulated in comparison with adults. Together, these results suggest that there may be both quantitative and qualitative changes in false memory rates with age. Copyright 2010 Elsevier Inc. All rights reserved.
Sherrouse, Benson C.; Semmens, Darius J.; Clement, Jessica M.
2014-01-01
Despite widespread recognition that social-value information is needed to inform stakeholders and decision makers regarding trade-offs in environmental management, it too often remains absent from ecosystem service assessments. Although quantitative indicators of social values need to be explicitly accounted for in the decision-making process, they need not be monetary. Ongoing efforts to map such values demonstrate how they can also be made spatially explicit and relatable to underlying ecological information. We originally developed Social Values for Ecosystem Services (SolVES) as a tool to assess, map, and quantify nonmarket values perceived by various groups of ecosystem stakeholders. With SolVES 2.0 we have extended the functionality by integrating SolVES with Maxent maximum entropy modeling software to generate more complete social-value maps from available value and preference survey data and to produce more robust models describing the relationship between social values and ecosystems. The current study has two objectives: (1) evaluate how effectively the value index, a quantitative, nonmonetary social-value indicator calculated by SolVES, reproduces results from more common statistical methods of social-survey data analysis and (2) examine how the spatial results produced by SolVES provide additional information that could be used by managers and stakeholders to better understand more complex relationships among stakeholder values, attitudes, and preferences. To achieve these objectives, we applied SolVES to value and preference survey data collected for three national forests, the Pike and San Isabel in Colorado and the Bridger–Teton and the Shoshone in Wyoming. Value index results were generally consistent with results found through more common statistical analyses of the survey data such as frequency, discriminant function, and correlation analyses. In addition, spatial analysis of the social-value maps produced by SolVES provided information that was useful for explaining relationships between stakeholder values and forest uses. Our results suggest that SolVES can effectively reproduce information derived from traditional statistical analyses while adding spatially explicit, social-value information that can contribute to integrated resource assessment, planning, and management of forests and other ecosystems.
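As a loose illustration of what a nonmonetary, relative value index can look like, the sketch below rescales a density surface of mapped value points to a bounded integer index. This is not SolVES source code: the grid values, the 0-10 range, and the use of a simple maximum-normalization are all assumptions made for illustration.

```python
# Hypothetical sketch: scaling a social-value density surface to a
# relative 0-10 index, in the spirit of a nonmonetary value indicator.
# NOT SolVES code; the density grid below is invented.
import numpy as np

# Hypothetical per-cell density of mapped value points (e.g., from a
# kernel density surface over survey respondents' mapped locations).
density = np.array([
    [0.0, 0.2, 0.5],
    [0.1, 0.9, 0.4],
    [0.0, 0.3, 0.1],
])

# Rescale so the highest-value cell scores 10 and empty cells score 0,
# giving a relative indicator comparable across stakeholder groups.
value_index = np.rint(10 * density / density.max()).astype(int)
print(value_index)
```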
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-11
... Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys) Under OMB Review AGENCY... techniques or the use of other forms of information technology. Titles a. Veterans Online Survey, VA Form 10-0513. b. Veterans Family Online Survey, VA Form 10-0513a. c. Veterans Primary Care Provider Online...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-30
..., motivations, and feelings than do quantitative studies. Focus groups serve the narrowly defined need for direct and informal opinion on a specific topic and as a qualitative research tool have three major... quantitative studies, To better understand consumers' attitudes and emotions in response to topics and concepts...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-28
...' attitudes, beliefs, motivations, and feelings than do quantitative studies. Focus groups serve the narrowly defined need for direct and informal opinion on a specific topic and as a qualitative research tool have... quantitative studies, To better understand people's attitudes and emotions in response to topics and concepts...
The Use of Quantitative SPECT/CT Imaging to Assess Residual Limb Health
2016-10-01
AWARD NUMBER: W81XWH-15-1-0669. TITLE: The Use of Quantitative SPECT/CT Imaging to Assess Residual Limb Health. DATES COVERED: 30 Sep 2015 - 29 Sep 2016. ...amputation and subsequently evaluate the utility of non-invasive imaging for evaluating the impact of next-generation socket technologies on the health of...
Doubleday, Alison F; Wille, Sarah J
2014-01-01
Video and photography are often used for delivering content within the anatomical sciences. However, instructors typically produce these resources to provide instructional or procedural information. Although the benefits of learner-generated content have been explored within educational research, virtually no studies have investigated the use of learner-generated video and photograph content within anatomy dissection laboratories. This study outlines an activity involving learner-generated video diaries and learner-generated photograph assignments produced during anatomy laboratory sessions. The learner-generated photographs and videos provided instructors with a means of formative assessment and allowed instructors to identify evidence of collaborative behavior in the laboratory. Student questionnaires (n = 21) and interviews (n = 5), as well as in-class observations, were conducted to examine student perspectives on the laboratory activities. The quantitative and qualitative data were examined using the framework of activity theory to identify contradictions between student expectations of, and engagement with, the activity and the actual experiences of the students. Results indicate that learner-generated photograph and video content can act as a rich source of data on student learning processes and can be used for formative assessment, for observing collaborative behavior, and as a starting point for class discussions. This study stresses the idea that technology choice for activities must align with instructional goals. This research also highlights the utility of activity theory as a framework for assessing classroom and laboratory activities, demonstrating that this approach can guide the development of laboratory activities. © 2014 American Association of Anatomists.
78 FR 70059 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-22
...utilization of qualitative interviews (as opposed to quantitative statistical methods). In consultation with research experts, we have...
NASA Astrophysics Data System (ADS)
Zhang, Gaoming; Hung, David L. S.; Xu, Min
2014-08-01
Flash boiling sprays of liquid injection under superheated conditions offer novel solutions for fast vaporization and better air-fuel mixture formation in internal combustion engines. However, the physical mechanisms of flash boiling spray vaporization are more complicated than droplet surface vaporization because of the unique bubble generation and boiling process inside a superheated bulk liquid, and they are not well understood. In this study, the vaporization of flash boiling sprays was investigated experimentally through quantitative measurements of vapor concentration and liquid temperature. Specifically, the laser-induced exciplex fluorescence technique was applied to distinguish the liquid and vapor distributions. Quantitative vapor concentration was obtained by correlating the intensity of vapor-phase fluorescence with vapor concentration through systematic corrections and calibrations. The intensities at two wavelengths were captured simultaneously from the liquid-phase fluorescence spectra, and their intensity ratios were correlated with liquid temperature. The results show that both the liquid and vapor phases of multi-hole sprays collapse toward the centerline of the spray, with different mass distributions, under flash boiling conditions. A large amount of vapor aggregates along the centerline of the spray to form a "gas jet" structure, whereas the liquid is distributed more uniformly, with large vortices formed in the vicinity of the spray tip. The vaporization process under flash boiling conditions is greatly enhanced by the intense bubble generation and burst. The liquid temperature measurements show strong temperature variations inside the flash boiling sprays, with hot zones present in the "gas jet" structure and vortex region. In addition, high vapor concentration and closed vortex motion appear to have inhibited heat and mass transfer in these regions. In summary, the vapor concentration and liquid temperature measurements provide detailed information on the heat and mass transfer inside flash boiling sprays, which is important for understanding their unique vaporization process.
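The two-wavelength intensity-ratio step lends itself to a short illustration. The sketch below converts a pair of band intensities to liquid temperature via a calibration fit, the general principle of two-color ratio thermometry. The calibration points, the linear fit, and the pixel intensities are invented, since the study's actual calibration data are not given in the abstract.

```python
# Minimal sketch of two-color ratio thermometry: map the ratio of
# fluorescence intensities at two wavelengths to liquid temperature
# using a calibration fit. All numbers here are hypothetical.
import numpy as np

# Hypothetical calibration: temperature (K) vs. intensity ratio I1/I2,
# as would be measured in a temperature-controlled cell. A real study
# would fit its own curve from calibration data.
cal_ratio = np.array([0.80, 0.95, 1.10, 1.25, 1.40])
cal_temp_K = np.array([300.0, 320.0, 340.0, 360.0, 380.0])
fit = np.polyfit(cal_ratio, cal_temp_K, deg=1)  # linear fit T(ratio)

def temperature_from_ratio(i_band1, i_band2):
    """Convert per-pixel band intensities to temperature (K)."""
    ratio = i_band1 / i_band2
    return np.polyval(fit, ratio)

# Example: hypothetical pixel intensities from the two spectral bands.
print(temperature_from_ratio(np.array([120.0]), np.array([100.0])))
```

Because the ratio cancels common-mode effects such as dye concentration and laser-sheet intensity, only the relative band response needs calibrating.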
First-year success in a nursing baccalaureate plan of study: A descriptive research study.
Ott, Vivian; Thomas, Jessica A; Fernando, Harshini
2018-08-01
Predicting students' aptitude for post-secondary success remains a widely studied topic. This descriptive study explored demographic variables contributing to success in quantitative courses required by the nursing degree plan. Identification of an "at risk" student profile may inform interventions with which to support attainment of an academic degree. The purpose of this study was to examine the associations between demographic characteristics and successful completion of baccalaureate nursing courses thought to enhance quantitative reasoning skills: first-year math, first-year chemistry, and second-year pathopharmacology nursing. This retrospective analysis accessed 4521 academic records of students who took these three courses at a United States university sometime between Fall 2008 and Fall 2015. De-identified student data included course grades, gender, full-time study, income, marital status, first generation, secondary school (also known as high school) location, dual credit, and high school and university grade point averages. Descriptive statistical analysis was conducted to describe the important features of the data. Of the 4521 records, 2556 undergraduates (57%) passed the courses in which they were enrolled. Among successful students, females outnumbered males (66%), ages ranged from 20 to 24 years, 86% were classified as low income, 54% fit the designation of first generation, and 12% earned dual credit (university credit during secondary school). Our data demonstrate a positive relationship between dual credit and success, with the strongest correlation (0.62) noted for students in pathopharmacology. In the baccalaureate-nursing plan of study, courses thought to enhance students' quantitative reasoning skills remain difficult for some to successfully complete. We conclude that the more successful students tend to be older, have a higher income, and a higher high school grade point average, while those less successful are directly out of high school and have not earned dual credit. Copyright © 2018 Elsevier Ltd. All rights reserved.
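To illustrate the kind of descriptive breakdown and dual-credit/success correlation the study reports, here is a minimal sketch. The records, column names, and values are invented stand-ins for the de-identified student data; only the analysis pattern is meant to match.

```python
# Minimal sketch: pass-rate breakdown by dual-credit status and the
# correlation between two 0/1 variables (a phi / point-biserial
# coefficient, the form the study's 0.62 figure resembles).
import pandas as pd

records = pd.DataFrame({  # hypothetical stand-in data
    "passed":      [1, 0, 1, 1, 0, 1, 1, 0],
    "dual_credit": [1, 0, 1, 1, 0, 0, 1, 0],
    "hs_gpa":      [3.8, 2.9, 3.6, 3.9, 3.0, 3.2, 3.7, 2.8],
})

print(records["passed"].mean())                          # overall pass rate
print(records.groupby("dual_credit")["passed"].mean())   # by dual credit
print(records["passed"].corr(records["dual_credit"]))    # correlation
```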
Dulin-Keita, Akilah; Clay, Olivio; Whittaker, Shannon; Hannon, Lonnie; Adams, Ingrid K; Rogers, Michelle; Gans, Kim
2015-08-01
This study uses a mixed-methods approach to (1) identify surrounding residents' expectations of how Housing Opportunities for People Everywhere (HOPE VI) policy would influence physical activity outcomes and (2) quantitatively examine the odds of neighborhood-based physical activity pre-/post-HOPE VI in a low socioeconomic status, predominantly African American community in Birmingham, Alabama. To address aim one, we used group concept mapping, a structured approach to data collection and analysis that produces pictures/maps of ideas. Fifty-eight residents developed statements about potential influences of HOPE VI on neighborhood-based physical activity. In the quantitative study, we examined whether these potential influences increased the odds of neighborhood walking/jogging. We computed block-entry logistic regression models with a larger cohort of residents at baseline (n = 184) and six months (n = 142, 77% retention; n = 120 with all informative variables). We examined perceived neighborhood disorder (perceived neighborhood disorder scale), walkability and aesthetics (Neighborhood Environment Walkability Scale), and HOPE VI-related community safety and safety for physical activity as predictors. During concept mapping, residents generated statements that clustered into three distinct concepts: "Increased Leisure Physical Activity," "Safe Play Areas," and "Generating Health Promoting Resources." The quantitative analyses indicated that changes in neighborhood walkability increased the odds of neighborhood-based physical activity (p = 0.04). When HOPE VI-related safety for physical activity was entered into the model, it was associated with increased odds of physical activity (p = 0.04), and walkability was no longer statistically significant. These results suggest that housing policies that create walkable neighborhoods and that improve perceptions of safety for physical activity may increase neighborhood-based physical activity. However, the longer-term impacts of neighborhood-level policies on physical activity require more longitudinal evidence to determine whether increased participation in physical activity is sustained. Copyright © 2015 Elsevier Ltd. All rights reserved.
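Block-entry logistic regression, as used above, adds predictors in predefined blocks and compares model fit at each step. The following is a minimal sketch of that pattern with statsmodels; the variable names, block grouping, and simulated data are hypothetical stand-ins for the study's survey measures.

```python
# Minimal sketch of block-entry (hierarchical) logistic regression:
# predictors enter in blocks and fit is compared across steps.
# Data and variable names are invented stand-ins for the survey items.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "active":      rng.integers(0, 2, n),  # walked/jogged in neighborhood
    "walkability": rng.normal(0, 1, n),    # walkability score (standardized)
    "disorder":    rng.normal(0, 1, n),    # perceived disorder (standardized)
    "safety_pa":   rng.normal(0, 1, n),    # safety for physical activity
})

blocks = [["walkability", "disorder"], ["safety_pa"]]
cols = []
for block in blocks:
    cols += block
    X = sm.add_constant(df[cols])
    model = sm.Logit(df["active"], X).fit(disp=0)
    # Comparing log-likelihoods across steps shows each block's added value.
    print(cols, round(model.llf, 2))
```

In the study's reported result, the safety block entering at the second step absorbed the walkability effect, which is exactly the kind of shift this stepwise comparison is designed to reveal.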