Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.
Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan
2017-01-01
Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
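The quantitation step described above, taking the ratio of extracted ion chromatogram peak areas for the light- and heavy-labeled glycan, can be sketched as follows. This is a minimal illustration with synthetic Gaussian peaks; the function names and data are our own, not taken from the paper:

```python
import numpy as np

def xic_area(intensities, times):
    """Peak area of an extracted ion chromatogram (XIC) by trapezoidal integration."""
    return float(np.sum((intensities[1:] + intensities[:-1]) * np.diff(times)) / 2.0)

def relative_quant(light, heavy, times):
    """Light/heavy XIC area ratio for one glycan; a ratio far from 1 indicates
    a relative abundance difference between the two compared samples."""
    return xic_area(light, times) / xic_area(heavy, times)

# Synthetic co-eluting Gaussian peaks (illustration only): the light-labeled
# glycan is twice as abundant as the heavy-labeled one.
t = np.linspace(0.0, 60.0, 601)
light = 2.0 * np.exp(-((t - 30.0) / 2.0) ** 2)
heavy = 1.0 * np.exp(-((t - 30.0) / 2.0) ** 2)
ratio = relative_quant(light, heavy, t)  # ≈ 2.0
```

Because the two isotopologues co-elute, the area ratio is insensitive to the incomplete chromatographic resolution that complicates optical-detection quantitation.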
USDA-ARS's Scientific Manuscript database
Agricultural research is increasingly expected to provide precise, quantitative information with explicit geographic coverage. Limited availability of continuous daily meteorological records often constrains efforts to provide such information through integrated use of simulation models, spatial ...
Comparison of satellite-derived dynamical quantities for the stratosphere of the Southern Hemisphere
NASA Technical Reports Server (NTRS)
Miles, Thomas (Editor); Oneill, Alan (Editor)
1989-01-01
As part of the international Middle Atmosphere Program (MAP), a project was instituted to study the dynamics of the Middle Atmosphere in the Southern Hemisphere (MASH). A pre-MASH workshop was held with two aims: comparison of Southern Hemisphere dynamical quantities derived from various archives of satellite data; and assessing the impact of different base-level height information on such derived quantities. The dynamical quantities examined included geopotential height, zonal wind, potential vorticity, eddy heat and momentum fluxes, and Eliassen-Palm fluxes. It was found that while there was usually qualitative agreement between the different sets of fields, substantial quantitative differences were evident, particularly in high latitudes. The fidelity of the base-level analysis was found to be of prime importance in calculating derived quantities - especially the Eliassen-Palm flux divergence and potential vorticity. Improvements in base-level analyses are recommended. In particular, quality controls should be introduced to remove spurious localized features from analyses, and information from all Antarctic radiosondes should be utilized where possible. Caution in drawing quantitative inferences from satellite data for the middle atmosphere of the Southern Hemisphere is advised.
The redoubtable ecological periodic table
Ecological periodic tables are repositories of reliable information on quantitative, predictably recurring (periodic) habitat–community patterns and their uncertainty, scaling and transferability. Their reliability derives from their grounding in sound ecological principle...
Code of Federal Regulations, 2012 CFR
2012-07-01
... the SIM mode at a scan rate of 1.5 scans/second to maximize the linear quantitative range and... Research Group, Texas A&M University, 833 Graham Rd., College Station, TX, 77845, (409) 690-0095. 8... following information is contained in the detailed quantitative reports: average RRF derived from the...
Quantitative imaging methods in osteoporosis.
Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G
2016-12-01
Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.
Scientific and Technical Information in Canada, Part II, Chapter 7: Economics.
ERIC Educational Resources Information Center
Science Council of Canada, Ottawa (Ontario).
In this report the various economic indicators of the extent and value of scientific and technical information transfer in Canada in government, industry, and education are described and expressed in quantitative terms derived from available statistical figures. The main thesis of the report is that the transfer of scientific and technical…
Evaluation of the capabilities of satellite imagery for monitoring regional air pollution episodes
NASA Technical Reports Server (NTRS)
Barnes, J. C.; Bowley, C. J.; Burke, H. H. K.
1979-01-01
A comparative analysis of satellite visible-channel imagery and ground-based aerosol measurements is carried out for three cases representing significant pollution episodes, identified by low surface visibility and high sulfate levels. The feasibility of detecting pollution episodes from space is also investigated using a simulation model. The model results are compared to quantitative information derived from digitized satellite data. The results show that when sulfate levels are greater than or equal to 30 micrograms/cu m, a haze pattern that correlates closely with the area of reported low surface visibilities and high sulfate levels can be detected in satellite visible-channel imagery. The model simulation demonstrates the potential of the satellite to monitor the magnitude and areal extent of pollution episodes. Quantitative information on total aerosol amount derived from the satellite digitized data using the atmospheric radiative transfer model agrees well with the results obtained from the ground-based measurements.
Transforming Verbal Counts in Reports of Qualitative Descriptive Studies Into Numbers
Chang, YunKyung; Voils, Corrine I.; Sandelowski, Margarete; Hasselblad, Vic; Crandell, Jamie L.
2009-01-01
Reports of qualitative studies typically do not offer much information on the numbers of respondents linked to any one finding. This information may be especially useful in reports of basic, or minimally interpretive, qualitative descriptive studies focused on surveying a range of experiences in a target domain, and its lack may limit the ability to synthesize the results of such studies with quantitative results in systematic reviews. Accordingly, the authors illustrate strategies for deriving plausible ranges of respondents expressing a finding in a set of reports of basic qualitative descriptive studies on antiretroviral adherence and suggest how the results might be used. These strategies have limitations and are never appropriate for use with findings from interpretive qualitative studies. Yet they offer a temporary workaround for preserving and maximizing the value of information from basic qualitative descriptive studies for systematic reviews. They show also why quantitizing is never simply quantitative. PMID:19448052
Comprehensive chlorophyll composition in the main edible seaweeds.
Chen, Kewei; Ríos, José Julián; Pérez-Gálvez, Antonio; Roca, María
2017-08-01
Natural chlorophylls present in seaweeds have been studied regarding their biological activities and health-benefit effects. However, detailed studies characterizing the complete chlorophyll profile, both qualitatively and quantitatively, are scarce. This work deals with the comprehensive spectrometric study of the chlorophyll derivatives present in the five main coloured edible seaweeds. The novel complete MS2 characterization of five chlorophyll derivatives (chlorophyll c2, chlorophyll c1, purpurin-18a, pheophytin d and phytyl-purpurin-18a) has made it possible to obtain fragmentation patterns associated with their different structural features. New chlorophyll derivatives have been identified and quantified for the first time in red, green and brown seaweeds, including some oxidative structures. The quantitative data on chlorophyll content provide significant information for food composition databases on bioactive compounds. Copyright © 2017 Elsevier Ltd. All rights reserved.
Development of LANDSAT Derived Forest Cover Information for Integration into Adirondack Park GIS
NASA Technical Reports Server (NTRS)
Curran, R. P.; Banta, J. S.
1982-01-01
Based upon observed changes in timber harvest practices, partially attributable to forest biomass removal for energy supply purposes, the Adirondack Park Agency began a multi-year project in 1979 to implement a digital geographic information system (GIS). An initial developmental task was an inventory of forest cover information and analysis of forest resource change and availability. While developing the GIS, a pilot project was undertaken to evaluate the usefulness of LANDSAT-derived land cover information for this purpose, and to explore the integration of LANDSAT data into the GIS. The prototype LANDSAT analysis project involved: (1) the use of both recent and historic data to derive land cover information for two dates; and (2) comparison of land cover over time to determine quantitative and geographic changes. The "recent data," 1978 full-foliage data over portions of four LANDSAT scenes, were classified using ground-truth-derived training samples in various forested and non-forested categories. The forested categories include northern hardwoods, pine, spruce-fir, and pine plantation, while the non-forested categories include wet-conifer, pasture, grassland, urban, exposed soil, agriculture, and water.
Quantitative approaches to information recovery from black holes
NASA Astrophysics Data System (ADS)
Balasubramanian, Vijay; Czech, Bartłomiej
2011-08-01
The evaporation of black holes into apparently thermal radiation poses a serious conundrum for theoretical physics: at face value, it appears that in the presence of a black hole, quantum evolution is non-unitary and destroys information. This information loss paradox has its seed in the presence of a horizon causally separating the interior and asymptotic regions in a black hole spacetime. A quantitative resolution of the paradox could take several forms: (a) a precise argument that the underlying quantum theory is unitary, and that information loss must be an artifact of approximations in the derivation of black hole evaporation, (b) an explicit construction showing how information can be recovered by the asymptotic observer, (c) a demonstration that the causal disconnection of the black hole interior from infinity is an artifact of the semiclassical approximation. This review summarizes progress on all these fronts.
Investigation of BOLD fMRI Resonance Frequency Shifts and Quantitative Susceptibility Changes at 7 T
Bianciardi, Marta; van Gelderen, Peter; Duyn, Jeff H.
2013-01-01
Although blood oxygenation level dependent (BOLD) functional magnetic resonance imaging (fMRI) experiments of brain activity generally rely on the magnitude of the signal, they also provide frequency information that can be derived from the phase of the signal. However, because of confounding effects of instrumental and physiological origin, BOLD related frequency information is difficult to extract and therefore rarely used. Here, we explored the use of high field (7 T) and dedicated signal processing methods to extract frequency information and use it to quantify and interpret blood oxygenation and blood volume changes. We found that optimized preprocessing improves detection of task-evoked and spontaneous changes in phase signals and resonance frequency shifts over large areas of the cortex with sensitivity comparable to that of magnitude signals. Moreover, our results suggest the feasibility of mapping BOLD quantitative susceptibility changes in at least part of the activated area and its largest draining veins. Comparison with magnitude data suggests that the observed susceptibility changes originate from neuronal activity through induced blood volume and oxygenation changes in pial and intracortical veins. Further, from frequency shifts and susceptibility values, we estimated that, relative to baseline, the fractional oxygen saturation in large vessels increased by 0.02–0.05 during stimulation, which is consistent with previously published estimates. Together, these findings demonstrate that valuable information can be derived from fMRI imaging of BOLD frequency shifts and quantitative susceptibility changes. PMID:23897623
Rappaz, Benjamin; Cano, Elena; Colomb, Tristan; Kühn, Jonas; Depeursinge, Christian; Simanis, Viesturs; Magistretti, Pierre J; Marquet, Pierre
2009-01-01
Digital holography microscopy (DHM) is an optical technique which provides phase images yielding quantitative information about cell structure and cellular dynamics. Furthermore, the quantitative phase images allow the derivation of other parameters, including dry mass production, density, and spatial distribution. We have applied DHM to study the dry mass production rate and the dry mass surface density in wild-type and mutant fission yeast cells. Our study demonstrates the applicability of DHM as a tool for label-free quantitative analysis of the cell cycle and opens the possibility for its use in high-throughput screening.
Heijtel, D F R; Petersen, E T; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; van Bavel, E T; Boellaard, R; Lammertsma, A A; Nederveen, A J
2016-04-01
The purpose of this study was to assess whether there was an agreement between quantitative cerebral blood flow (CBF) and arterial cerebral blood volume (CBVA) measurements by [15O]H2O positron emission tomography (PET) and model-free QUASAR MRI. Twelve healthy subjects were scanned within a week in separate MRI and PET imaging sessions, after which quantitative and qualitative agreement between both modalities was assessed for gray matter, white matter and whole-brain regions of interest (ROI). The correlation between CBF measurements obtained with both modalities was moderate to high (r²: 0.28-0.60, P < 0.05), although QUASAR significantly underestimated CBF by 30% (P < 0.001). CBVA was moderately correlated (r²: 0.28-0.43, P < 0.05), with QUASAR yielding values that were only 27% of the [15O]H2O-derived values (P < 0.001). Group-wise voxel statistics identified minor areas with significant contrast differences between [15O]H2O PET and QUASAR MRI, indicating similar qualitative CBVA and CBF information by both modalities. In conclusion, the results of this study demonstrate that QUASAR MRI and [15O]H2O PET provide similar CBF and CBVA information, but with systematic quantitative discrepancies. Copyright © 2016 John Wiley & Sons, Ltd.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-09
... External Review Draft of the Guidance for Applying Quantitative Data To Develop Data-Derived Extrapolation Factors for Interspecies and Intraspecies Extrapolation ...
A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi
Hodgkinson, A.
1971-01-01
A better understanding of the physico-chemical principles underlying the formation of calculus has led to a need for more precise information on the chemical composition of stones. A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi which is suitable for routine use is presented. The procedure involves five simple qualitative tests followed by the quantitative determination of calcium, magnesium, inorganic phosphate, and oxalate. These data are used to calculate the composition of the stone in terms of calcium oxalate, apatite, and magnesium ammonium phosphate. Analytical results and derived values for five representative types of calculi are presented. PMID:5551382
An evidential reasoning extension to quantitative model-based failure diagnosis
NASA Technical Reports Server (NTRS)
Gertler, Janos J.; Anderson, Kenneth C.
1992-01-01
The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
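Dempster's rule of combination, used above to fuse potentially conflicting inferences from parallel diagnostic models, can be sketched as follows. The failure labels and mass values are hypothetical, chosen only to illustrate the rule:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (mass functions) whose focal
    elements are frozensets of hypotheses, using Dempster's rule: multiply
    masses of intersecting focal elements, then renormalize by 1 - conflict."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass that would go to the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

# Two diagnostic models assign belief to candidate failure hypotheses f1, f2.
m_model1 = {frozenset({"f1"}): 0.6, frozenset({"f1", "f2"}): 0.4}
m_model2 = {frozenset({"f2"}): 0.3, frozenset({"f1", "f2"}): 0.7}
fused = dempster_combine(m_model1, m_model2)
```

After combination the fused masses again sum to one, and singleton hypotheses supported by both models accumulate belief at the expense of the ambiguous set.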
CLICK: The new USGS center for LIDAR information coordination and knowledge
Stoker, Jason M.; Greenlee, Susan K.; Gesch, Dean B.; Menig, Jordan C.
2006-01-01
Elevation data is rapidly becoming an important tool for the visualization and analysis of geographic information. The creation and display of three-dimensional models representing bare earth, vegetation, and structures have become major requirements for geographic research in the past few years. Light Detection and Ranging (lidar) has been increasingly accepted as an effective and accurate technology for acquiring high-resolution elevation data for bare earth, vegetation, and structures. Lidar is an active remote sensing system that records the distance, or range, of a laser fired from an airborne or spaceborne platform such as an airplane, helicopter or satellite to objects or features on the Earth's surface. By converting lidar data into bare ground topography and vegetation or structural morphologic information, extremely accurate, high-resolution elevation models can be derived to visualize and quantitatively represent scenes in three dimensions. In addition to high-resolution digital elevation models (Evans et al., 2001), other lidar-derived products include quantitative estimates of vegetative features such as canopy height, canopy closure, and biomass (Lefsky et al., 2002), and models of urban areas such as building footprints and three-dimensional city models (Maas, 2001).
A statistical framework for protein quantitation in bottom-up MS-based proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu
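The censoring-missingness idea can be illustrated with a Tobit-style left-censored normal log-likelihood. This is a generic sketch of the censoring mechanism, not the authors' exact model: peptides observed above a detection limit contribute density terms, while peptides missing below the limit contribute the probability mass of falling under it.

```python
import math

def norm_logpdf(x, mu, sigma):
    """Log-density of a normal distribution at x."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def norm_logcdf(x, mu, sigma):
    """Log of the normal cumulative distribution function at x."""
    return math.log(0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2)))))

def censored_loglik(mu, sigma, observed, n_missing, limit):
    """Left-censored normal log-likelihood: each observed log-intensity adds
    a density term; each missing peptide adds log P(intensity < limit)."""
    ll = sum(norm_logpdf(y, mu, sigma) for y in observed)
    ll += n_missing * norm_logcdf(limit, mu, sigma)
    return ll

observed = [5.1, 6.0, 6.8]  # hypothetical log-intensities above the limit
ll = censored_loglik(mu=6.0, sigma=1.0, observed=observed, n_missing=2, limit=4.0)
```

Treating the missing peptides this way keeps them informative: a mean estimate far above the detection limit is penalized for predicting that the missing values should have been observed.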
Application of automatic image analysis in wood science
Charles W. McMillin
1982-01-01
In this paper I describe an image analysis system and illustrate with examples the application of automatic quantitative measurement to wood science. Automatic image analysis, a powerful and relatively new technology, uses optical, video, electronic, and computer components to rapidly derive information from images with minimal operator interaction. Such instruments...
Huang, Xiao Yan; Shan, Zhi Jie; Zhai, Hong Lin; Li, Li Na; Zhang, Xiao Yun
2011-08-22
Heat shock protein 90 (Hsp90) takes part in the development of several cancers. Novobiocin, a typical C-terminal inhibitor of Hsp90, will probably be used as an important anticancer drug in the future. In this work, we explored the valuable information and designed new novobiocin derivatives based on a three-dimensional quantitative structure-activity relationship (3D QSAR). Comparative molecular field analysis and comparative molecular similarity indices analysis models with high predictive capability were established, and their reliability is supported by the statistical parameters. Based on several important influence factors obtained from these models, six new novobiocin derivatives with higher inhibitory activities were designed and confirmed by molecular simulation with our models, providing potential anticancer drug leads for further research.
NASA Astrophysics Data System (ADS)
Consonni, Viviana; Todeschini, Roberto
In the last decades, much scientific research has focused on how to encompass and convert - by a theoretical pathway - the information encoded in the molecular structure into one or more numbers used to establish quantitative relationships between structures and properties, biological activities, or other experimental properties. Molecular descriptors are formally mathematical representations of a molecule obtained by a well-specified algorithm applied to a defined molecular representation or a well-specified experimental procedure. They play a fundamental role in chemistry, pharmaceutical sciences, environmental protection policy, toxicology, ecotoxicology, health research, and quality control. Evidence of the interest of the scientific community in molecular descriptors is provided by the huge number of descriptors proposed to date: more than 5000 descriptors derived from different theories and approaches are defined in the literature, and most of them can be calculated by means of dedicated software applications. Molecular descriptors are of outstanding importance in the research fields of quantitative structure-activity relationships (QSARs) and quantitative structure-property relationships (QSPRs), where they are the independent chemical information used to predict the properties of interest. Along with the definition of appropriate molecular descriptors, the molecular structure representation and the mathematical tools for deriving and assessing models are the other fundamental components of the QSAR/QSPR approach. The remarkable progress during the last few years in chemometrics and chemoinformatics has led to new strategies for finding mathematically meaningful relationships between the molecular structure and the biological activities, physico-chemical, toxicological, and environmental properties of chemicals.
Different approaches for deriving molecular descriptors are reviewed here, and some of the most relevant descriptors are presented in detail with numerical examples.
Valverde, Juan; This, Hervé
2008-01-23
Using 1H nuclear magnetic resonance (NMR) spectroscopy (1D and 2D), the two types of photosynthetic pigments (chlorophylls and their derivatives, and carotenoids) of "green beans" (immature pods of Phaseolus vulgaris L.) were analyzed. Compared to other analytical methods (light spectroscopy or chromatography), 1H NMR spectroscopy is a fast analytical method that provides more information on chlorophyll derivatives (allomers and epimers) than ultraviolet-visible spectroscopy. Moreover, it gives a large amount of data without prior chromatographic separation.
Ivens, Katherine O; Baumert, Joseph L; Taylor, Steve L
2016-07-01
Numerous commercial enzyme-linked immunosorbent assay (ELISA) kits exist to quantitatively detect bovine milk residues in foods. Milk contains many proteins that can serve as ELISA targets including caseins (α-, β-, or κ-casein) and whey proteins (α-lactalbumin or β-lactoglobulin). Nine commercially-available milk ELISA kits were selected to compare the specificity and sensitivity with 5 purified milk proteins and 3 milk-derived ingredients. All of the milk kits were capable of quantifying nonfat dry milk (NFDM), but did not necessarily detect all individual protein fractions. While milk-derived ingredients were detected by the kits, their quantitation may be inaccurate due to the use of different calibrators, reference materials, and antibodies in kit development. The establishment of a standard reference material for the calibration of milk ELISA kits is increasingly important. The appropriate selection and understanding of milk ELISA kits for food analysis is critical to accurate quantification of milk residues and informed risk management decisions. © 2016 Institute of Food Technologists®
NASA Astrophysics Data System (ADS)
Ghikas, Demetris P. K.; Oikonomou, Fotios D.
2018-04-01
Using the generalized entropies which depend on two parameters we propose a set of quantitative characteristics derived from the Information Geometry based on these entropies. Our aim, at this stage, is to construct first some fundamental geometric objects which will be used in the development of our geometrical framework. We first establish the existence of a two-parameter family of probability distributions. Then using this family we derive the associated metric and we state a generalized Cramer-Rao Inequality. This gives a first two-parameter classification of complex systems. Finally computing the scalar curvature of the information manifold we obtain a further discrimination of the corresponding classes. Our analysis is based on the two-parameter family of generalized entropies of Hanel and Thurner (2011).
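For reference, the classical one-parameter-family objects that this construction generalizes are the Fisher information metric and the Cramér-Rao inequality (standard definitions, not the paper's two-parameter versions):

```latex
% Fisher information metric on a parametric family p(x;\theta):
g_{ij}(\theta) = \mathbb{E}\left[ \partial_i \log p(x;\theta)\,\partial_j \log p(x;\theta) \right]
% Cramér-Rao inequality for an unbiased estimator \hat{\theta} of \theta:
\mathrm{Cov}(\hat{\theta}) \succeq g(\theta)^{-1}
```

The paper's program replaces the Shannon-derived metric with one induced by the two-parameter entropy family, and the scalar curvature of the resulting information manifold then serves as the classifying quantity.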
In situ spectroradiometric quantification of ERTS data. [Prescott and Phoenix, Arizona
NASA Technical Reports Server (NTRS)
Yost, E. F. (Principal Investigator)
1975-01-01
The author has identified the following significant results. Analyses of ERTS-1 photographic data were made to quantitatively relate ground reflectance measurements to photometric characteristics of the images. Digital image processing of photographic data resulted in a nomograph to correct for atmospheric effects over arid terrain. Optimum processing techniques to derive maximum geologic information from desert areas were established. Additive color techniques providing quantitative measurements of surface water between different orbits were developed and accepted as the standard flood-mapping technique using ERTS.
Multiplexed, quantitative, and targeted metabolite profiling by LC-MS/MRM.
Wei, Ru; Li, Guodong; Seymour, Albert B
2014-01-01
Targeted metabolomics, which focuses on a subset of known metabolites representative of biologically relevant metabolic pathways, is a valuable tool to discover biomarkers and link disease phenotypes to underlying mechanisms or therapeutic modes of action. A key advantage of targeted metabolomics, compared to discovery metabolomics, is its immediate readiness for extracting biological information derived from known metabolites and quantitative measurements. However, simultaneously analyzing hundreds of endogenous metabolites presents a challenge due to their diverse chemical structures and properties. Here we report a method which combines different chromatographic separation conditions, optimal ionization polarities, and the most sensitive triple-quadrupole MS-based data acquisition mode, multiple reaction monitoring (MRM), to quantitatively profile 205 endogenous metabolites in 10 min.
Background: Quantitative high-throughput screening (qHTS) assays are increasingly being employed to inform chemical hazard identification. Hundreds of chemicals have been tested in dozens of cell lines across extensive concentration ranges by the National Toxicology Program in co...
Algebra for Everyone? Student Perceptions of Tracking in Mathematics
ERIC Educational Resources Information Center
Spielhagen, Frances R.
2010-01-01
This research explored the experiences of students in a school district that limited early access to the study of algebra, in order to inform education policymakers of the impact of such tracking policies on the lives and futures of students. Quantitative analysis had already yielded a snapshot of inequities deriving from the policies surrounding…
Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio
2017-01-01
Traumatic conditions of peripheral nerves and plexus have been classically evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of the central nervous system, and are now being applied, with promising results, to peripheral nerve and plexus evaluation. DWI and DTI allow qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about the functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several parameters derived from DWI and DTI studies, such as the apparent diffusion coefficient (ADC) or fractional anisotropy (FA), can be used as potential biomarkers of neural damage, providing information about fiber organization, axonal flow or myelin integrity. A proper knowledge of the physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed with a focus on traumatic conditions, including the main nerve entrapment syndromes in both peripheral nerves and the brachial or lumbar plexus. PMID:28932698
Protein Signaling Networks from Single Cell Fluctuations and Information Theory Profiling
Shin, Young Shik; Remacle, F.; Fan, Rong; Hwang, Kiwook; Wei, Wei; Ahmad, Habib; Levine, R.D.; Heath, James R.
2011-01-01
Protein signaling networks among cells play critical roles in a host of pathophysiological processes, from inflammation to tumorigenesis. We report on an approach that integrates microfluidic cell handling, in situ protein secretion profiling, and information theory to determine an extracellular protein-signaling network and the role of perturbations. We assayed 12 proteins secreted from human macrophages that were subjected to lipopolysaccharide challenge, which emulates the macrophage-based innate immune responses against Gram-negative bacteria. We characterize the fluctuations in protein secretion of single cells, and of small cell colonies (n = 2, 3,···), as a function of colony size. Measuring the fluctuations permits a validation of the conditions required for the application of a quantitative version of the Le Chatelier's principle, as derived using information theory. This principle provides a quantitative prediction of the role of perturbations and allows a characterization of a protein-protein interaction network. PMID:21575571
Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter
2017-02-01
Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans have been shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.
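The dependency of radiomic features on extraction settings described above can be illustrated with a minimal sketch: a single first-order feature (histogram entropy) computed from the same region of interest under different bin counts yields different values, which is one reason calibration across settings is needed before pooling studies. The ROI and settings below are synthetic, not taken from any study in this record.

```python
import numpy as np

def histogram_entropy(roi, bins):
    """First-order radiomic feature: Shannon entropy of the ROI intensity histogram."""
    counts, _ = np.histogram(roi, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
roi = rng.normal(loc=100, scale=20, size=(32, 32))  # synthetic tumour ROI

# The same ROI yields different feature values under different extraction
# settings (bin counts), illustrating the standardization problem.
for bins in (16, 64, 256):
    print(bins, round(histogram_entropy(roi, bins), 3))
```

The effect generalizes to texture- and filter-based features, where the sensitivity to extraction parameters is typically even stronger.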
Carreno-Quintero, Natalia; Acharjee, Animesh; Maliepaard, Chris; Bachem, Christian W.B.; Mumm, Roland; Bouwmeester, Harro; Visser, Richard G.F.; Keurentjes, Joost J.B.
2012-01-01
Recent advances in -omics technologies such as transcriptomics, metabolomics, and proteomics along with genotypic profiling have permitted dissection of the genetics of complex traits represented by molecular phenotypes in nonmodel species. To identify the genetic factors underlying variation in primary metabolism in potato (Solanum tuberosum), we have profiled primary metabolite content in a diploid potato mapping population, derived from crosses between S. tuberosum and wild relatives, using gas chromatography-time of flight-mass spectrometry. In total, 139 polar metabolites were detected, of which we identified metabolite quantitative trait loci for approximately 72% of the detected compounds. In order to obtain an insight into the relationships between metabolic traits and classical phenotypic traits, we also analyzed statistical associations between them. The combined analysis of genetic information through quantitative trait locus coincidence and the application of statistical learning methods provide information on putative indicators associated with the alterations in metabolic networks that affect complex phenotypic traits. PMID:22223596
Marquet, Pierre; Depeursinge, Christian; Magistretti, Pierre J.
2014-01-01
Quantitative phase microscopy (QPM) has recently emerged as a powerful quantitative imaging technique well suited to noninvasively exploring a transparent specimen with nanometric axial sensitivity. In this review, we present the recent developments of quantitative phase-digital holographic microscopy (QP-DHM), which represents an important and efficient quantitative phase method to explore cell structure and dynamics. In the second part, the most relevant QPM applications in the field of cell biology are summarized. Particular emphasis is placed on the original biological information that can be derived from the quantitative phase signal. In the third part, recent applications obtained with QP-DHM in the field of cellular neuroscience, namely the possibility to optically resolve neuronal network activity and spine dynamics, are presented. Furthermore, potential applications of QPM related to psychiatry, through the identification of new and original cell biomarkers that, when combined with a range of other biomarkers, could significantly contribute to the determination of high-risk developmental trajectories for psychiatric disorders, are discussed. PMID:26157976
Marquet, Pierre; Depeursinge, Christian; Magistretti, Pierre J
2014-10-01
Quantitative phase microscopy (QPM) has recently emerged as a powerful quantitative imaging technique well suited to noninvasively exploring a transparent specimen with nanometric axial sensitivity. In this review, we present the recent developments of quantitative phase-digital holographic microscopy (QP-DHM), which represents an important and efficient quantitative phase method to explore cell structure and dynamics. In the second part, the most relevant QPM applications in the field of cell biology are summarized. Particular emphasis is placed on the original biological information that can be derived from the quantitative phase signal. In the third part, recent applications obtained with QP-DHM in the field of cellular neuroscience, namely the possibility to optically resolve neuronal network activity and spine dynamics, are presented. Furthermore, potential applications of QPM related to psychiatry, through the identification of new and original cell biomarkers that, when combined with a range of other biomarkers, could significantly contribute to the determination of high-risk developmental trajectories for psychiatric disorders, are discussed.
NASA Astrophysics Data System (ADS)
Gramaccioni, C.; Procopio, A.; Farruggia, G.; Malucelli, E.; Iotti, S.; Notargiacomo, A.; Fratini, M.; Yang, Y.; Pacureanu, A.; Cloetens, P.; Bohic, S.; Massimi, L.; Cutone, A.; Valenti, P.; Rosa, L.; Berlutti, F.; Lagomarsino, S.
2017-06-01
X-ray fluorescence microscopy (XRFM) is a powerful technique to detect and localize elements in cells. To derive information useful for biology and medicine, it is essential not only to localize, but also to map quantitatively, the element concentration. Here we applied quantitative XRFM to iron in phagocytic cells. Iron, a primary component of living cells, can become toxic when present in excess. In human fluids, free iron is maintained at a concentration of about 10⁻¹⁸ M by iron-binding proteins such as lactoferrin (Lf). Iron homeostasis, involving the physiological ratio of iron between tissues/secretions and blood, is strictly regulated by ferroportin, the sole protein able to export iron from cells to blood. Inflammatory processes induced by lipopolysaccharide (LPS) or bacterial pathogens inhibit ferroportin synthesis in epithelial and phagocytic cells, thus hindering iron export and increasing intracellular iron and bacterial multiplication. In this respect, Lf is emerging as an important regulator of both iron and inflammatory homeostasis. Here we studied phagocytic cells inflamed by bacterial LPS and either untreated or treated with milk-derived bovine Lf. Quantitative mapping of iron concentration and mass fraction at high spatial resolution is obtained by combining X-ray fluorescence microscopy, atomic force microscopy and synchrotron phase contrast imaging.
Li, Yuqin; You, Guirong; Jia, Baoxiu; Si, Hongzong; Yao, Xiaojun
2014-01-01
Quantitative structure-activity relationship (QSAR) models were developed to predict the inhibition ratio of pyrrolidine derivatives on matrix metalloproteinase via the heuristic method (HM) and gene expression programming (GEP). The descriptors of 33 pyrrolidine derivatives were calculated by the software CODESSA, which can calculate quantum chemical, topological, geometrical, constitutional, and electrostatic descriptors. HM was also used for the preselection of 5 appropriate molecular descriptors. Linear and nonlinear QSAR models were developed based on HM and GEP separately, and the two prediction models yielded good correlation coefficients (R²) of 0.93 and 0.94, respectively. The two QSAR models are useful in predicting the inhibition ratio of pyrrolidine derivatives on matrix metalloproteinase during the discovery of new anticancer drugs and in providing theoretical information for the study of new drugs.
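A linear QSAR model of the kind fitted with HM can be sketched as an ordinary least-squares fit of activity against a descriptor matrix, with R² reported on the fit. The descriptors and "inhibition ratios" below are simulated for illustration, not the CODESSA descriptors or measured activities from this study.

```python
import numpy as np

def fit_linear_qsar(X, y):
    """Least-squares fit y ≈ X·w + b; returns coefficients and R² on the training set."""
    A = np.column_stack([X, np.ones(len(X))])   # append an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return coef, r2

rng = np.random.default_rng(1)
X = rng.normal(size=(33, 5))                    # 33 compounds, 5 preselected descriptors
true_w = np.array([0.8, -0.5, 0.3, 0.0, 1.1])   # hypothetical underlying relation
y = X @ true_w + 0.1 * rng.normal(size=33)      # simulated inhibition ratios

coef, r2 = fit_linear_qsar(X, y)
print(round(r2, 3))  # high R² here only because the simulated relation is nearly linear
```

With as few as 33 compounds, training-set R² alone can be optimistic; external validation or cross-validation is the usual safeguard.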
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed; these are examined with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Quantitative assessment of AOD from 17 CMIP5 models based on satellite-derived AOD over India
DOE Office of Scientific and Technical Information (OSTI.GOV)
Misra, Amit; Kanawade, Vijay P.; Tripathi, Sachchida Nand
Aerosol optical depth (AOD) values from 17 CMIP5 models are compared with Moderate Resolution Imaging Spectroradiometer (MODIS) and Multiangle Imaging Spectroradiometer (MISR) derived AODs over India. The objective is to identify the cases of successful AOD simulation by CMIP5 models, considering satellite-derived AOD as a benchmark. Six years of AOD data (2000–2005) from MISR and MODIS are processed to create quality-assured gridded AOD maps over India, which are compared with corresponding maps of 17 CMIP5 models at the same grid resolution. Intercomparison of model and satellite data shows that model-AOD is better correlated with MISR-derived AOD than with MODIS. The correlation between model-AOD and MISR-AOD is used to segregate the models into three categories identifying their performance in simulating the AOD over India. Maps of correlation between model-AOD and MISR-/MODIS-AOD are generated to provide quantitative information about the intercomparison. The two sets of data are examined for different seasons and years to examine the seasonal and interannual variation in the correlation coefficients. Finally, latitudinal and longitudinal variations in AOD as simulated by the models are also examined and compared with corresponding variations observed by the satellites.
Quantitative assessment of AOD from 17 CMIP5 models based on satellite-derived AOD over India
Misra, Amit; Kanawade, Vijay P.; Tripathi, Sachchida Nand
2016-08-03
Aerosol optical depth (AOD) values from 17 CMIP5 models are compared with Moderate Resolution Imaging Spectroradiometer (MODIS) and Multiangle Imaging Spectroradiometer (MISR) derived AODs over India. The objective is to identify the cases of successful AOD simulation by CMIP5 models, considering satellite-derived AOD as a benchmark. Six years of AOD data (2000–2005) from MISR and MODIS are processed to create quality-assured gridded AOD maps over India, which are compared with corresponding maps of 17 CMIP5 models at the same grid resolution. Intercomparison of model and satellite data shows that model-AOD is better correlated with MISR-derived AOD than with MODIS. The correlation between model-AOD and MISR-AOD is used to segregate the models into three categories identifying their performance in simulating the AOD over India. Maps of correlation between model-AOD and MISR-/MODIS-AOD are generated to provide quantitative information about the intercomparison. The two sets of data are examined for different seasons and years to examine the seasonal and interannual variation in the correlation coefficients. Finally, latitudinal and longitudinal variations in AOD as simulated by the models are also examined and compared with corresponding variations observed by the satellites.
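The core of the intercomparison above is, per grid and period, a correlation between co-located model and satellite AOD maps with missing retrievals masked out. A minimal sketch with synthetic maps (the grid, values, and NaN pattern are illustrative only, not MISR/MODIS data):

```python
import numpy as np

def gridded_correlation(model_aod, satellite_aod):
    """Pearson correlation between two AOD maps on the same grid,
    ignoring cells that are missing (NaN) in either product."""
    m = model_aod.ravel()
    s = satellite_aod.ravel()
    valid = ~np.isnan(m) & ~np.isnan(s)
    return np.corrcoef(m[valid], s[valid])[0, 1]

rng = np.random.default_rng(2)
misr = rng.uniform(0.1, 0.8, size=(20, 20))            # synthetic satellite AOD map
model = 0.9 * misr + 0.05 * rng.normal(size=(20, 20))  # a model tracking the satellite closely
misr[0, :5] = np.nan                                   # cells with failed retrievals

r = gridded_correlation(model, misr)
print(round(r, 2))
```

Repeating this per model, season, and year yields the correlation maps and the three performance categories described in the abstract.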
Utility of qualitative research findings in evidence-based public health practice.
Jack, Susan M
2006-01-01
Epidemiological data, derived from quantitative studies, provide important information about the causes, prevalence, risk correlates, treatment and prevention of diseases, and health issues at a population level. However, public health issues are complex in nature and quantitative research findings are insufficient to support practitioners and administrators in making evidence-informed decisions. Upshur's Synthetic Model of Evidence (2001) situates qualitative research findings as a credible source of evidence for public health practice. This article answers the following questions: (1) where does qualitative research fit within the paradigm of evidence-based practice and (2) how can qualitative research be used by public health professionals? Strategies for using qualitative research findings instrumentally, conceptually, and symbolically are identified by applying Estabrooks' (1999) conceptual structure of research utilization. Different research utilization strategies are illustrated through the use of research examples from the field of work on intimate partner violence against women. Recommendations for qualitative researchers disseminating findings and for public health practitioners/policy makers considering the use of qualitative findings as evidence to inform decisions are provided.
Uncovering the end uses of the rare earth elements.
Du, Xiaoyue; Graedel, T E
2013-09-01
The rare earth elements (REE) are a group of fifteen elements with unique properties that make them indispensable for a wide variety of emerging and established technologies. However, quantitative knowledge of REE remains sparse, despite the current heightened interest in the future availability of these resources. Mining is heavily concentrated in China, whose monopoly position and potential restriction of exports render primary supply vulnerable to short-term disruption. We have drawn upon the published literature and unpublished materials in different languages to derive the first quantitative estimates of annual domestic production by end use of individual rare earth elements from 1995 to 2007. The information is illustrated in Sankey diagrams for the years 1995 and 2007. Other years are available in the supporting information. Comparing 1995 and 2007, the production of the rare earth elements in China, Japan, and the US changed dramatically in quantity and structure. This information can provide a solid foundation for industries, academic institutions and governments to make decisions and develop strategies. Copyright © 2013 Elsevier B.V. All rights reserved.
Influence of study goals on study design and execution.
Kirklin, J W; Blackstone, E H; Naftel, D C; Turner, M E
1997-12-01
From the viewpoint of a clinician who makes recommendations to patients about choosing from the multiple possible management schemes, quantitative information derived from statistical analyses of observational studies is useful. Although random assignment of therapy is optimal, appropriately performed studies in which therapy has been nonrandomly "assigned" are considered acceptable, albeit occasionally with limitations in inferences. The analyses are considered most useful when they generate multivariable equations suitable for predicting time-related outcomes in individual patients. Graphic presentations improve communication with patients and facilitate truly informed consent.
Pharmacometabolomics Informs Quantitative Radiomics for Glioblastoma Diagnostic Innovation.
Katsila, Theodora; Matsoukas, Minos-Timotheos; Patrinos, George P; Kardamakis, Dimitrios
2017-08-01
Applications of omics systems biology technologies have enormous promise for radiology and diagnostics in surgical fields. In this context, the emerging fields of radiomics (a systems scale approach to radiology using a host of technologies, including omics) and pharmacometabolomics (use of metabolomics for patient and disease stratification and guiding precision medicine) offer much synergy for diagnostic innovation in surgery, particularly in neurosurgery. This synthesis of omics fields and applications is timely because diagnostic accuracy in central nervous system tumors still challenges decision-making. Considering the vast heterogeneity in brain tumors, disease phenotypes, and interindividual variability in surgical and chemotherapy outcomes, we believe that diagnostic accuracy can be markedly improved by quantitative radiomics coupled to pharmacometabolomics and related health information technologies while optimizing economic costs of traditional diagnostics. In this expert review, we present an innovation analysis on a systems-level multi-omics approach toward diagnostic accuracy in central nervous system tumors. For this, we suggest that glioblastomas serve as a useful application paradigm. We performed a literature search on PubMed for articles published in English between 2006 and 2016. We used the search terms "radiomics," "glioblastoma," "biomarkers," "pharmacogenomics," "pharmacometabolomics," "pharmacometabonomics/pharmacometabolomics," "collaborative informatics," and "precision medicine." A list of the top 4 insights we derived from this literature analysis is presented in this study. For example, we found that (i) tumor grading needs further refinement, (ii) diagnostic precision should be improved, (iii) standardization in radiomics is lacking, and (iv) quantitative radiomics has yet to be implemented clinically.
We conclude with an interdisciplinary call to the metabolomics, pharmacy/pharmacology, radiology, and surgery communities that pharmacometabolomics coupled to information technologies (chemoinformatics tools, databases, collaborative systems) can inform quantitative radiomics, thus translating Big Data and information growth to knowledge growth, rational drug development and diagnostics innovation for glioblastomas, and possibly in other brain tumors.
Zecchin, Chiara; Facchinetti, Andrea; Sparacino, Giovanni; Dalla Man, Chiara; Manohar, Chinmay; Levine, James A; Basu, Ananda; Kudva, Yogish C; Cobelli, Claudio
2013-10-01
In type 1 diabetes mellitus (T1DM), physical activity (PA) lowers the risk of cardiovascular complications but hinders the achievement of optimal glycemic control, transiently boosting insulin action and increasing hypoglycemia risk. Quantitative investigation of the relationships between PA-related signals and glucose dynamics, tracked using, for example, continuous glucose monitoring (CGM) sensors, has barely been explored. In the clinic, 20 control and 19 T1DM subjects were studied for 4 consecutive days. They underwent low-intensity PA sessions daily. PA was tracked by the PA monitoring system (PAMS), a system comprising accelerometers and inclinometers. Variations in glucose dynamics were tracked by estimating first- and second-order time derivatives of glucose concentration from CGM via Bayesian smoothing. Short-time effects of PA on glucose dynamics were quantified through the partial correlation function in the interval (0, 60 min) after starting PA. Correlation of PA with the glucose time derivatives is evident. In T1DM, the negative correlation with the first-order glucose time derivative is maximal (in absolute value) after 15 min of PA, whereas the positive correlation is maximal after 40-45 min. The negative correlation between the second-order time derivative and PA is maximal after 5 min, whereas the positive correlation is maximal after 35-40 min. Control subjects provided similar results, but with positive and negative correlation peaks occurring 5 min earlier. Quantitative information on the correlation between mild PA and short-term glucose dynamics was obtained. This represents a preliminary important step toward incorporation of PA information in more realistic physiological models of the glucose-insulin system usable in T1DM simulators, in the development of closed-loop artificial pancreas control algorithms, and in CGM-based prediction algorithms for the generation of hypoglycemic alerts.
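The short-time relationship described above can be sketched as a lagged correlation between an activity signal and the glucose time derivative. The sketch below uses a plain finite-difference derivative on synthetic 5-min CGM samples, whereas the study estimated derivatives by Bayesian smoothing and used partial correlations; all signals and lags here are illustrative.

```python
import numpy as np

def lagged_corr(x, y, lag):
    """Correlation of x with y shifted forward by `lag` samples."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

# Synthetic 5-min CGM series: glucose starts falling ~15 min after activity begins.
t = np.arange(0, 240, 5.0)                        # minutes
pa = (t >= 60).astype(float)                      # activity switched on at t = 60 min
glucose = 140 - 20 * np.clip((t - 75) / 60, 0, 1)

# Simple finite-difference derivative (the study used Bayesian smoothing instead).
dglucose = np.gradient(glucose, t)

# Negative correlation between activity and the first-order glucose derivative
# at a 3-sample (15-min) lag, mirroring the delayed insulin-sensitising effect.
r = lagged_corr(pa, dglucose, lag=3)
print(round(r, 2))
```

Scanning the lag over (0, 60 min) and locating the extremum of the (partial) correlation is what yields the 15-min and 40-45-min peaks reported in the abstract.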
Fast quantitative optical detection of heat dissipation by surface plasmon polaritons.
Möller, Thomas B; Ganser, Andreas; Kratt, Martina; Dickreuter, Simon; Waitz, Reimar; Scheer, Elke; Boneberg, Johannes; Leiderer, Paul
2018-06-13
Heat management at the nanoscale is an issue of increasing importance. In optoelectronic devices the transport and decay of plasmons contribute to the dissipation of heat. By comparison of experimental data and simulations we demonstrate that it is possible to gain quantitative information about excitation, propagation and decay of surface plasmon polaritons (SPPs) in a thin gold stripe supported by a silicon membrane. The temperature-dependent optical transmissivity of the membrane is used to determine the temperature distribution around the metal stripe with high spatial and temporal resolution. This method is complementary to techniques where the propagation of SPPs is monitored optically, and provides additional information which is not readily accessible by other means. In particular, we demonstrate that the thermal conductivity of the membrane can also be derived from our analysis. The results presented here show the high potential of this tool for heat management studies in nanoscale devices.
NASA Astrophysics Data System (ADS)
Jukić, Marijana; Rastija, Vesna; Opačak-Bernardi, Teuta; Stolić, Ivana; Krstulović, Luka; Bajić, Miroslav; Glavaš-Obrovac, Ljubica
2017-04-01
The aim of this study was to evaluate nine newly synthesized amidine derivatives of 3,4-ethylenedioxythiophene (3,4-EDOT) for their cytotoxic activity against a panel of human cancer cell lines and to perform a quantitative structure-activity relationship (QSAR) analysis for the antitumor activity of a total of 27 3,4-ethylenedioxythiophene derivatives. Induction of apoptosis was investigated for the selected compounds, along with delivery options for the optimization of activity. The best QSAR models obtained include the following groups of descriptors: BCUT, WHIM, 2D autocorrelations, 3D-MoRSE, GETAWAY descriptors, 2D frequency fingerprint and information indices. The obtained QSAR models should aid in elucidating the physicochemical and structural requirements important for this biological activity. Highly potent molecules have a symmetrical arrangement of substituents along the x axis, a high frequency of distance between N and O atoms at topological distance 9, as well as between C and N atoms at topological distance 10, and more C atoms located at topological distances 6 and 3. Based on the conclusions of the QSAR analysis, a new compound with potentially high activity was proposed.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
... ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-ORD-2009-0694; FRL-9442-8] Notice of Availability of the External Review Draft of the Guidance for Applying Quantitative Data to Develop Data-Derived Extrapolation... Quantitative Data to Develop Data-Derived Extrapolation Factors for Interspecies and Intraspecies Extrapolation...
Tallarida, Ronald J.; Raffa, Robert B.
2014-01-01
In this review we show that the concept of dose equivalence for two drugs, the theoretical basis of the isobologram, has a wider use in the analysis of pharmacological data derived from single and combination drug use. Both in its application to drug combination analysis with isoboles and in certain other actions, listed below, the determination of doses, or receptor occupancies, that yield equal effects provides useful metrics that can be used to obtain quantitative information on drug actions without postulating any intimate mechanism of action. These other drug actions discussed here include (1) combinations of agonists that produce opposite effects, (2) analysis of inverted U-shaped dose-effect curves of single agents, (3) analysis on the effect scale as an alternative to isoboles and (4) the use of occupation isoboles to examine competitive antagonism in the dual-receptor case. New formulas derived to assess the statistical variance for additive combinations are included, and the more detailed mathematical topics are included in the appendix. PMID:20546783
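The dose-equivalence idea behind the isobologram can be sketched numerically: for two drugs with individually equi-effective doses A and B, combinations on the additive (Loewe) isobole satisfy a/A + b/B = 1, and an observed combination is scored by that sum, often called the interaction index. The doses below are hypothetical, chosen only to illustrate the arithmetic.

```python
def interaction_index(a, b, A, B):
    """Loewe additivity index for a combination (a, b) of two drugs whose
    individual equi-effective doses are A and B:
    1 = additive, <1 = synergistic, >1 = sub-additive (antagonistic)."""
    return a / A + b / B

# Hypothetical equi-effective doses (e.g., each alone produces 50% of maximal effect).
A, B = 10.0, 40.0

print(interaction_index(5.0, 20.0, A, B))   # 1.0  -> lies on the additive isobole
print(interaction_index(2.0, 10.0, A, B))   # 0.45 -> below the line, synergy
```

The statistical treatment in the review (variance formulas for additive combinations) addresses how uncertain estimates of A and B propagate into this index; the sketch above ignores that uncertainty.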
Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering
NASA Astrophysics Data System (ADS)
Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.
2004-05-01
Tissue engineering attempts to address the ever widening gap between the demand and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials is dependent on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phase is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.
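One of the evaluation criteria listed above, misclassification error, can be sketched as the fraction of voxels assigned to the wrong phase (polymer vs. pore) relative to a ground-truth mask. The scaffold image below is synthetic, and the fixed global threshold stands in for any of the algorithms under comparison.

```python
import numpy as np

def misclassification_error(segmented, ground_truth):
    """Fraction of voxels assigned to the wrong phase; one of the region-based
    criteria used to score thresholding algorithms."""
    return np.mean(segmented != ground_truth)

# Synthetic ground truth: a bright polymeric region on a dark porous background.
truth = np.zeros((64, 64), dtype=bool)
truth[16:48, 16:48] = True
noise = np.random.default_rng(3).normal(0, 30, (64, 64))
image = np.where(truth, 180.0, 60.0) + noise

segmented = image > 120.0          # a fixed global threshold between the two modes
err = misclassification_error(segmented, truth)
print(round(err, 3))               # low error when the threshold separates the modes
```

The other criteria in the study (edge mismatch, relative foreground error, region non-uniformity) are computed from the same segmented/ground-truth pair and can be combined into an overall score per algorithm.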
ERIC Educational Resources Information Center
Kelly, James D.
A study tested the data-ink ratio theory, which holds that a reader's recall of quantitative data displayed in a graph containing a substantial amount of non-data-ink will be significantly less than recall from a graph containing little non-data-ink, as it might apply to graphics used in mass circulation newspapers. The experiment employed a…
The effect of social interactions in the primary consumption life cycle of motion pictures
NASA Astrophysics Data System (ADS)
Hidalgo R, César A.; Castro, Alejandra; Rodriguez-Sickert, Carlos
2006-04-01
We develop a 'basic principles' model which accounts for the primary life cycle consumption of films as a social coordination problem in which information transmission is governed by word of mouth. We fit the analytical solution of such a model to aggregated consumption data from the film industry and derive a quantitative estimator of its quality based on the structure of the life cycle.
PEITH(Θ): perfecting experiments with information theory in Python with GPU support.
Dony, Leander; Mackerodt, Jonas; Ward, Scott; Filippi, Sarah; Stumpf, Michael P H; Liepe, Juliane
2018-04-01
Different experiments provide differing levels of information about a biological system. This makes it difficult, a priori, to select one of them beyond mere speculation and/or belief, especially when resources are limited. With the increasing diversity of experimental approaches and general advances in quantitative systems biology, methods that inform us about the information content that a given experiment carries about the question we want to answer become crucial. PEITH(Θ) is a general-purpose Python framework for experimental design in systems biology. PEITH(Θ) uses Bayesian inference and information theory in order to derive which experiments are most informative for estimating all model parameters and/or performing model predictions. https://github.com/MichaelPHStumpf/Peitho. m.stumpf@imperial.ac.uk or juliane.liepe@mpibpc.mpg.de.
Wang, Yi; Peng, Hsin-Chieh; Liu, Jingyue; Huang, Cheng Zhi; Xia, Younan
2015-02-11
Kinetic control is a powerful means of maneuvering the twin structure and shape of metal nanocrystals and thus optimizing their performance in a variety of applications. However, there is only a vague understanding of the explicit roles played by reaction kinetics, owing to the lack of quantitative information about the kinetic parameters. With Pd as an example, here we demonstrate that kinetic parameters, including the rate constant and activation energy, can be derived from spectroscopic measurements and then used to calculate the initial reduction rate, which can in turn be quantitatively correlated with the twin structure of a seed and nanocrystal. On a quantitative basis, we were able to determine the ranges of initial reduction rates required for the formation of nanocrystals with a specific twin structure, including single-crystal, multiply twinned, and stacking fault-lined. This work represents a major step forward toward the deterministic synthesis of colloidal noble-metal nanocrystals with specific twin structures and shapes.
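Deriving an activation energy from rate constants measured at two temperatures follows the Arrhenius relation ln(k2/k1) = -(Ea/R)(1/T2 - 1/T1). The rate constants and temperatures below are hypothetical, not the values reported for the Pd system in this record.

```python
import math

def activation_energy(k1, T1, k2, T2):
    """Arrhenius estimate of Ea (J/mol) from rate constants at two temperatures:
    ln(k2/k1) = -(Ea/R) * (1/T2 - 1/T1)."""
    R = 8.314  # gas constant, J mol^-1 K^-1
    return -R * math.log(k2 / k1) / (1.0 / T2 - 1.0 / T1)

# Hypothetical reduction-rate constants from absorbance-vs-time measurements.
k1, T1 = 1.2e-3, 295.0   # s^-1 at 22 °C
k2, T2 = 4.8e-3, 315.0   # s^-1 at 42 °C

Ea = activation_energy(k1, T1, k2, T2)
print(round(Ea / 1000, 1), "kJ/mol")
```

With Ea and the pre-exponential factor in hand, the initial reduction rate at any synthesis temperature follows from the same relation, which is the quantity correlated with twin structure in the study.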
NASA Astrophysics Data System (ADS)
Spada, M.; Bianchi, I.; Kissling, E.; Agostinetti, N. Piana; Wiemer, S.
2013-08-01
The accurate definition of 3-D crustal structure and, first and foremost, of Moho depth is the most important requirement for seismological, geophysical and geodynamic modelling in complex tectonic regions. In such areas, like the Mediterranean region, various active and passive seismic experiments have been performed, locally revealing information on Moho depth, average and gradient crustal Vp velocity and average Vp/Vs velocity ratios. Until now, the most reliable information on crustal structure stems from controlled-source seismology experiments. In most parts of the Alpine region, a relatively large amount of controlled-source seismology information is available, though the overall coverage in the central Mediterranean area is still sparse due to the high costs of such experiments. Thus, results from other seismic methodologies, such as local earthquake tomography, receiver functions and ambient noise tomography, can be used to complement the controlled-source seismology information to increase coverage and thus the quality of 3-D crustal models. In this paper, we introduce a methodology to directly combine controlled-source seismology and receiver function information, relying on the strengths of each method and on quantitative uncertainty estimates for all data, to derive a well-resolved Moho map for Italy. To obtain a homogeneous elaboration of controlled-source seismology and receiver function results, we introduce a new classification/weighting scheme based on uncertainty assessment for receiver function data. In order to tune the receiver function information quality, we compare local receiver function Moho depths and uncertainties with a recently derived, well-resolved local earthquake tomography Moho map and with controlled-source seismology information. We find an excellent correlation in the Moho information obtained by these three methodologies in Italy.
In the final step, we interpolate the controlled-source seismology and receiver functions information to derive the map of Moho topography in Italy and surrounding regions. Our results show high-frequency undulation in the Moho topography of three different Moho interfaces, the European, the Adriatic-Ionian, and the Liguria-Corsica-Sardinia-Tyrrhenia, reflecting the complexity of geodynamical evolution.
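The final interpolation step, combining controlled-source and receiver-function Moho picks with their uncertainty estimates, can be sketched as an inverse-distance weighting in which each datum is additionally down-weighted by its variance. The coordinates, depths, and uncertainties below are invented for illustration; the study's actual classification/weighting scheme is more elaborate.

```python
import numpy as np

def weighted_idw(points, depths, sigmas, query, power=2.0):
    """Interpolate Moho depth at `query` from scattered estimates, weighting by
    inverse distance and down-weighting data with large uncertainty (sigma)."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d == 0):
        return depths[np.argmin(d)]        # query coincides with a datum
    w = 1.0 / (d ** power * sigmas ** 2)   # combined distance and quality weighting
    return np.sum(w * depths) / np.sum(w)

# Hypothetical Moho picks: (x, y) in km, depth in km, 1-sigma uncertainty in km.
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
z = np.array([30.0, 34.0, 42.0, 50.0])
sig = np.array([1.0, 1.0, 4.0, 4.0])       # the deeper picks assumed less certain here

moho = weighted_idw(pts, z, sig, np.array([50.0, 50.0]))
print(round(moho, 1))  # pulled toward the better-constrained shallow picks
```

Interpolating on separate point sets per interface (European, Adriatic-Ionian, Liguria-Corsica-Sardinia-Tyrrhenia) preserves the discontinuities between the three Moho surfaces described above.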
Analog to digital workflow improvement: a quantitative study.
Wideman, Catherine; Gallet, Jacqueline
2006-01-01
This study tracked a radiology department's conversion from a Kodak Amber analog system to a Kodak DirectView DR 5100 digital system. Through the use of ProModel Optimization Suite, a workflow simulation software package, significant quantitative information was derived from workflow process data measured before and after the change to a digital system. Once the digital room was fully operational and the radiology staff comfortable with the new system, average patient examination time was reduced from 9.24 to 5.28 min, indicating that a higher patient throughput could be achieved. Compared to the analog system, chest examination time for modality-specific activities was reduced by 43%. The repeat examination rate also decreased, from 9.5% with the analog system to 8% with the digital system. The study indicated that it is possible to quantitatively study clinical workflow and productivity using commercially available software.
Pesavento, James J; Mizzen, Craig A; Kelleher, Neil L
2006-07-01
Here we show that fragment ion abundances from dissociation of ions created from mixtures of multiply modified histone H4 (11 kDa) or of N-terminal synthetic peptides (2 kDa) correspond to their respective intact ion abundances measured by Fourier transform mass spectrometry. Isomeric mixtures of modified forms of the same protein are resolved and quantitated with a precision of ≤5% using the relative ratios of their fragment ions, with intact protein ions created by electrospray greatly easing many of the systematic biases that more strongly affect small peptides (e.g., differences in ionization efficiency and ion m/z values). The ion fragmentation methods validated here are directly extensible to intact human proteins to derive quantitative information on the highly related and often isomeric protein forms created by combinatorial arrays of posttranslational modifications.
Information Content of Aerosol Retrievals in the Sunglint Region
NASA Technical Reports Server (NTRS)
Ottaviani, M.; Knobelspiesse, K.; Cairns, B.; Mishchenko, M.
2013-01-01
We exploit quantitative metrics to investigate the information content of retrievals of atmospheric aerosol parameters (with a focus on single-scattering albedo) contained in multi-angle and multi-spectral measurements with sufficient dynamic range in the sunglint region. The simulations are performed for two classes of maritime aerosols with optical and microphysical properties compiled from measurements of the Aerosol Robotic Network. The information content is assessed using the inverse formalism and is compared to that derived from observations not affected by sunglint. We find that there is indeed additional information in measurements containing sunglint, not just for single-scattering albedo but also for aerosol optical thickness and the complex refractive index of the fine aerosol size mode, although the amount of additional information varies with aerosol type.
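In the standard optimal-estimation ("inverse formalism") treatment, information content is often summarized by the degrees of freedom for signal, the trace of the averaging kernel. The following is a minimal generic sketch of that calculation; the Jacobian and covariance matrices here are arbitrary illustrative placeholders, not values from the study:

```python
import numpy as np

def degrees_of_freedom_for_signal(K, S_e, S_a):
    """Degrees of freedom for signal, trace(A), where A is the averaging
    kernel of a linear optimal-estimation retrieval:
        A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K
    K: Jacobian (measurements x state), S_e: measurement error covariance,
    S_a: a priori covariance."""
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    gain = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv) @ K.T @ Se_inv
    A = gain @ K                      # averaging kernel
    return float(np.trace(A))

# toy case: 3 perfectly observed state elements with tiny measurement noise
dofs = degrees_of_freedom_for_signal(np.eye(3), 1e-6 * np.eye(3), np.eye(3))
```

In this toy case the retrieval recovers essentially all three state elements, so the value is close to 3; adding sunglint observations would, in the paper's framing, raise this number for parameters such as single-scattering albedo.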
NASA Astrophysics Data System (ADS)
Wang, Hong-Fei; Gan, Wei; Lu, Rong; Rao, Yi; Wu, Bao-Hua
Sum frequency generation vibrational spectroscopy (SFG-VS) has proven to be a uniquely effective spectroscopic technique for investigating the structure and conformation of molecules, as well as the dynamics of molecular interfaces. However, the application of SFG-VS to complex molecular interfaces has been limited by the ability to extract quantitative information from SFG-VS experiments. In this review, we assess the limitations, issues, techniques and methodologies in quantitative orientational and spectral analysis with SFG-VS. Based on these assessments, we also summarize recent developments in methodologies for quantitative orientational and spectral analysis in SFG-VS, and their applications to detailed analysis of SFG-VS data of various vapour/neat liquid interfaces. A rigorous formulation of the polarization null angle (PNA) method is given for accurate determination of the orientational parameter D =
NASA Astrophysics Data System (ADS)
Koch, Franziska; Schmid, Lino; Prasch, Monika; Heilig, Achim; Eisen, Olaf; Schweizer, Jürg; Mauser, Wolfram
2015-04-01
The temporal evolution of Alpine snowpacks is important for assessing water supply, hydropower generation, flood predictions and avalanche forecasts. Especially in high mountain regions with extremely varying topography, it has so far often been difficult to derive continuous and non-destructive information on snow parameters. Since autumn 2012, we have been running a new low-cost GPS (Global Positioning System) snow measurement experiment at the high alpine study site Weissfluhjoch (2450 m a.s.l.) in Switzerland. The globally and freely broadcast GPS L1-band (1.57542 GHz) signal was continuously recorded with GPS antennas installed at the ground surface underneath the snowpack. GPS raw data, containing the carrier-to-noise power density ratio (C/N0) as well as elevation and azimuth angle information at a time step of 1 s, were stored and analyzed for all 32 GPS satellites. Since the dielectric permittivity of an overlying wet snowpack influences microwave radiation, the bulk volumetric liquid water content as well as daily melt-freeze cycles can be derived non-destructively from GPS signal strength losses and external snow height information. This liquid water content information is qualitatively in good accordance with meteorological and snow-hydrological data and agrees closely with continuous data derived from an upward-looking ground-penetrating radar (upGPR) operating in a similar frequency range. As a promising novelty, we combined the GPS signal strength data with upGPR travel-time information from impulse radar rays travelling to the snow surface and back from underneath the snow cover. This combination allows liquid water content, snow height and snow water equivalent to be determined from beneath the snow cover without using any other external information. The snow parameters derived by combining upGPR and GPS data are in good agreement with conventional sensors such as laser distance gauges or snow pillows.
As the GPS sensors are cheap, they can easily be installed in parallel with further upGPR systems or as sensor networks to monitor the snowpack evolution in avalanche paths or at a larger scale in an entire hydrological basin to derive distributed melt-water runoff information.
NASA Technical Reports Server (NTRS)
Arvidson, R. E.
1992-01-01
Magellan synthetic aperture radar (SAR) and altimetry data were analyzed to determine the nature and extent of surface modification for venusian plains in the Sedna Planitia, Alpha Regio, and western Ovda Regio areas. Specific cross sections derived from the SAR data were also compared to similar data for dry terrestrial basaltic lava flows (Lunar Crater and Cima volcanic fields) and playas (Lunar and Lavic Lakes) for which microtopographic profiles (i.e., quantitative roughness information) were available.
The perception of three-dimensionality across continuous surfaces
NASA Technical Reports Server (NTRS)
Stevens, Kent A.
1989-01-01
The apparent three-dimensionality of a viewed surface presumably corresponds to several internal perceptual quantities, such as surface curvature, local surface orientation, and depth. These quantities are mathematically related for points within the silhouette bounds of a smooth, continuous surface. For instance, surface curvature is related to the rate of change of local surface orientation, and surface orientation is related to the local gradient of distance. It is not clear to what extent these 3D quantities are determined directly from image information rather than indirectly from mathematically related forms, by differentiation or by integration within boundary constraints. An open empirical question, for example, is to what extent surface curvature is perceived directly, and to what extent it is quantitative rather than qualitative. In addition to surface orientation and curvature, one derives an impression of depth, i.e., variations in apparent egocentric distance. A static orthographic image is essentially devoid of depth information, and any quantitative depth impression must be inferred from surface orientation and other sources. Such conversion of orientation to depth does appear to occur, and even to prevail over stereoscopic depth information under some circumstances.
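The relations the abstract alludes to are the standard differential-geometry ones for a viewer-centered depth map z(x, y): orientation is given by the depth gradient, and curvature by its second derivatives, so differentiation runs depth → orientation → curvature and integration (with boundary constraints) runs the other way. A sketch:

```latex
% orientation (slant/tilt) from the local gradient of distance
(p, q) = \left(\frac{\partial z}{\partial x},\; \frac{\partial z}{\partial y}\right),
\qquad \mathbf{n} \;\propto\; (-p,\, -q,\, 1)

% curvature from the rate of change of orientation (second derivatives)
\mathrm{II} \;=\;
\begin{pmatrix} z_{xx} & z_{xy} \\ z_{xy} & z_{yy} \end{pmatrix}
```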
NASA Astrophysics Data System (ADS)
Li, Xuan; Liu, Zhiping; Jiang, Xiaoli; Lodewijks, Gabriel
2018-01-01
Eddy current pulsed thermography (ECPT) is well established for non-destructive testing of electrically conductive materials, featuring the advantages of contactless operation, intuitive defect visualization and efficient heating. The concept of divergence characterization of the damage rate of carbon fibre-reinforced plastic (CFRP)-steel structures can be extended to ECPT thermal pattern characterization. It was found in this study that the use of ECPT technology on CFRP-steel structures generated a sizeable amount of valuable information for comprehensive material diagnostics. The relationship between divergence and transient thermal patterns can be identified and analysed by deploying mathematical models to analyse the information about fibre texture-like orientations, gaps and undulations in these multi-layered materials. The developed algorithm enabled the removal of fibre-texture information and the extraction of damage features. A model of the CFRP-glue-steel structure with damage was established using COMSOL Multiphysics® software, and a quantitative non-destructive damage evaluation was derived from the ECPT image areas. The results of this proposed method illustrate that damaged areas are strongly affected by the available fibre-texture information. The proposed approach can be applied to the detection of impact-induced damage and the quantitative evaluation of CFRP structures.
An approach to the systematic analysis of urinary steroids
Menini, E.; Norymberski, J. K.
1965-01-01
1. Human urine, its extracts, extracts of urine pretreated with enzyme preparations containing β-glucuronidase and steroid sulphatase or β-glucuronidase alone, and products derived from the specific solvolysis of urinary steroid sulphates, were submitted to the following sequence of operations: reduction with borohydride; oxidation with a glycol-cleaving agent (bismuthate or periodate); separation of the products into ketones and others; oxidation of each fraction with tert.-butyl chromate, resolution of the end products by means of paper chromatography or gas–liquid chromatography or both. 2. Qualitative experiments indicated the kind of information the method and some of its modifications can provide. Quantitative experiments were restricted to the direct treatment of urine by the basic procedure outlined. It was partly shown and partly argued that the quantitative results were probably as informative about the composition of the major neutral urinary steroids (and certainly about their presumptive secretory precursors) as those obtained by a number of established analytical procedures. 3. A possible extension of the scope of the reported method was indicated. 4. A simple technique was introduced for the quantitative deposition of a solid sample on to a gas–liquid-chromatographic column. PMID:14333557
Kessler, Larry G; Barnhart, Huiman X; Buckler, Andrew J; Choudhury, Kingshuk Roy; Kondratovich, Marina V; Toledano, Alicia; Guimaraes, Alexander R; Filice, Ross; Zhang, Zheng; Sullivan, Daniel C
2015-02-01
The development and implementation of quantitative imaging biomarkers has been hampered by the inconsistent and often incorrect use of terminology related to these markers. Sponsored by the Radiological Society of North America, an interdisciplinary group of radiologists, statisticians, physicists, and other researchers worked to develop a comprehensive terminology to serve as a foundation for quantitative imaging biomarker claims. Where possible, this working group adapted existing definitions derived from national or international standards bodies rather than invent new definitions for these terms. This terminology also serves as a foundation for the design of studies that evaluate the technical performance of quantitative imaging biomarkers and for studies of algorithms that generate the quantitative imaging biomarkers from clinical scans. This paper provides examples of research studies and quantitative imaging biomarker claims that use terminology consistent with these definitions as well as examples of the rampant confusion in this emerging field. We provide recommendations for appropriate use of quantitative imaging biomarker terminological concepts. It is hoped that this document will assist researchers and regulatory reviewers who examine quantitative imaging biomarkers and will also inform regulatory guidance. More consistent and correct use of terminology could advance regulatory science, improve clinical research, and provide better care for patients who undergo imaging studies. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
2017-01-01
Background The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Objective Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. Methods A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Results Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). Conclusions This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. PMID:28292738
Quantitative analysis of histopathological findings using image processing software.
Horai, Yasushi; Kakimoto, Tetsuhiro; Takemoto, Kana; Tanaka, Masaharu
2017-10-01
In evaluating pathological changes in drug efficacy and toxicity studies, morphometric analysis can be quite robust. In this experiment, we examined whether morphometric changes of major pathological findings in various tissue specimens stained with hematoxylin and eosin could be recognized and quantified using image processing software. Using Tissue Studio, hypertrophy of hepatocytes and adrenocortical cells could be quantified based on the method of a previous report, but the regions of red pulp, white pulp, and marginal zones in the spleen could not be recognized under a single setting condition. Using Image-Pro Plus, lipid-derived vacuoles in the liver and mucin-derived vacuoles in the intestinal mucosa could be quantified using two criteria (area and/or roundness). Vacuoles derived from phospholipid could not be quantified when small lipid deposition coexisted in the liver and adrenal cortex. Mononuclear inflammatory cell infiltration in the liver could be quantified to some extent, except for specimens with many clustered infiltrating cells. Adipocyte size and the mean linear intercept could be quantified easily and efficiently using morphological processing and the macro tool provided in Image-Pro Plus. These methodologies are expected to form a base system that can recognize morphometric features and quantitatively analyze pathological findings through the use of information technology.
Quantitative evaluation of simulated functional brain networks in graph theoretical analysis.
Lee, Won Hee; Bullmore, Ed; Frangou, Sophia
2017-02-01
There is increasing interest in the potential of whole-brain computational models to provide mechanistic insights into resting-state brain networks. It is therefore important to determine the degree to which computational models reproduce the topological features of empirical functional brain networks. We used empirical connectivity data derived from diffusion spectrum and resting-state functional magnetic resonance imaging data from healthy individuals. Empirical and simulated functional networks, constrained by structural connectivity, were defined based on 66 brain anatomical regions (nodes). Simulated functional data were generated using the Kuramoto model in which each anatomical region acts as a phase oscillator. Network topology was studied using graph theory in the empirical and simulated data. The difference (relative error) between graph theory measures derived from empirical and simulated data was then estimated. We found that simulated data can be used with confidence to model graph measures of global network organization at different dynamic states and highlight the sensitive dependence of the solutions obtained in simulated data on the specified connection densities. This study provides a method for the quantitative evaluation and external validation of graph theory metrics derived from simulated data that can be used to inform future study designs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
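The Kuramoto model used to generate the simulated functional data can be sketched compactly: each anatomical region is a phase oscillator coupled through the structural connectivity matrix, and simulated functional connectivity is the correlation matrix of the resulting signals. The coupling strength, frequency distribution and simple Euler integration below are illustrative choices, not the paper's settings:

```python
import numpy as np

def simulate_kuramoto(C, K=5.0, omega=None, dt=0.01, steps=2000, seed=0):
    """Euler integration of Kuramoto phase oscillators coupled through C.

    Each node i follows d(theta_i)/dt = omega_i + K * sum_j C[i, j] * sin(theta_j - theta_i).
    Returns the (steps x n) phase time series."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    if omega is None:
        omega = rng.normal(1.0, 0.1, n)              # intrinsic frequencies (illustrative)
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    out = np.empty((steps, n))
    for t in range(steps):
        phase_diff = theta[None, :] - theta[:, None]  # [i, j] = theta_j - theta_i
        theta = theta + dt * (omega + K * (C * np.sin(phase_diff)).sum(axis=1))
        out[t] = theta
    return out

def simulated_fc(theta_ts):
    """Simulated 'functional connectivity': correlation matrix of the oscillator signals."""
    return np.corrcoef(np.sin(theta_ts).T)
```

With strong coupling on a connected graph, the oscillators phase-lock and the simulated functional connectivity between them approaches 1, which is the kind of dynamic-state dependence the study examines with graph measures.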
The physical and biological basis of quantitative parameters derived from diffusion MRI
2012-01-01
Diffusion magnetic resonance imaging is a quantitative imaging technique that measures the underlying molecular diffusion of protons. Diffusion-weighted imaging (DWI) quantifies the apparent diffusion coefficient (ADC) which was first used to detect early ischemic stroke. However this does not take account of the directional dependence of diffusion seen in biological systems (anisotropy). Diffusion tensor imaging (DTI) provides a mathematical model of diffusion anisotropy and is widely used. Parameters, including fractional anisotropy (FA), mean diffusivity (MD), parallel and perpendicular diffusivity can be derived to provide sensitive, but non-specific, measures of altered tissue structure. They are typically assessed in clinical studies by voxel-based or region-of-interest based analyses. The increasing recognition of the limitations of the diffusion tensor model has led to more complex multi-compartment models such as CHARMED, AxCaliber or NODDI being developed to estimate microstructural parameters including axonal diameter, axonal density and fiber orientations. However these are not yet in routine clinical use due to lengthy acquisition times. In this review, I discuss how molecular diffusion may be measured using diffusion MRI, the biological and physical bases for the parameters derived from DWI and DTI, how these are used in clinical studies and the prospect of more complex tissue models providing helpful micro-structural information. PMID:23289085
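The tensor-derived scalar parameters mentioned above are simple functions of the three eigenvalues of the fitted diffusion tensor; the standard definitions of mean diffusivity and fractional anisotropy can be sketched as:

```python
import numpy as np

def md_fa(eigenvalues):
    """Mean diffusivity (MD) and fractional anisotropy (FA) from the three
    eigenvalues (l1, l2, l3) of a fitted diffusion tensor:
        MD = (l1 + l2 + l3) / 3
        FA = sqrt(3/2) * ||l - MD|| / ||l||
    FA is 0 for isotropic diffusion and approaches 1 for diffusion
    restricted to a single direction."""
    lam = np.asarray(eigenvalues, dtype=float)
    md = float(lam.mean())
    fa = float(np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2)))
    return md, fa

# isotropic tensor -> FA = 0; "stick" tensor -> FA = 1
print(md_fa([1.0, 1.0, 1.0]), md_fa([1.0, 0.0, 0.0]))
```

Parallel diffusivity is simply the largest eigenvalue and perpendicular diffusivity the mean of the other two, so all four scalar maps in the review come from the same eigendecomposition.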
NASA Astrophysics Data System (ADS)
Zidikheri, Meelis J.; Lucas, Christopher; Potts, Rodney J.
2017-08-01
Airborne volcanic ash is a hazard to aviation. There is an increasing demand for quantitative forecasts of ash properties such as ash mass load to allow airline operators to better manage the risks of flying through airspace likely to be contaminated by ash. In this paper we show how satellite-derived mass load information at times prior to the issuance of the latest forecast can be used to estimate various model parameters that are not easily obtained by other means such as the distribution of mass of the ash column at the volcano. This in turn leads to better forecasts of ash mass load. We demonstrate the efficacy of this approach using several case studies.
NASA Technical Reports Server (NTRS)
Husson, N.; Barbe, A.; Brown, L. R.; Carli, B.; Goldman, A.; Pickett, H. M.; Roche, A. E.; Rothman, L. S.; Smith, M. A. H.
1985-01-01
Several aspects of quantitative atmospheric spectroscopy are considered, using a classification of the molecules according to the gas amounts in the stratosphere and upper troposphere, and reviews of quantitative atmospheric high-resolution spectroscopic measurements and field measurements systems are given. Laboratory spectroscopy and spectral analysis and prediction are presented with a summary of current laboratory spectroscopy capabilities. Spectroscopic data requirements for accurate derivation of atmospheric composition are discussed, where examples are given for space-based remote sensing experiments of the atmosphere: the ATMOS (Atmospheric Trace Molecule) and UARS (Upper Atmosphere Research Satellite) experiment. A review of the basic parameters involved in the data compilations; a summary of information on line parameter compilations already in existence; and a summary of current laboratory spectroscopy studies are used to assess the data base.
Visualizing dispersive features in 2D image via minimum gradient method
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Yu; Wang, Yan; Shen, Zhi-Xun
2017-07-24
Here, we developed a minimum-gradient-based method to track ridge features in a 2D image plot, which is a typical data representation in many momentum-resolved spectroscopy experiments. Through both analytic formulation and numerical simulation, we compare this new method with existing DC (distribution curve) based and higher-order derivative based analyses. We find that the new method has good noise resilience and enhanced contrast, especially for weak-intensity features, and meanwhile preserves the quantitative local maxima information from the raw image. An algorithm is proposed to extract the 1D ridge dispersion from the 2D image plot, whose quantitative application to angle-resolved photoemission spectroscopy measurements on high-temperature superconductors is demonstrated.
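The core idea, that a ridge sits where the gradient magnitude of the image is minimal, can be illustrated with a toy per-column tracker. Restricting the search to a window around the per-column intensity maximum is a simplification added here to avoid flat-background minima; it is not part of the published algorithm:

```python
import numpy as np

def ridge_rows(img, half_window=3):
    """For each column of a 2D image, return the row where the gradient
    magnitude is minimal near the intensity maximum (ridge position)."""
    gy, gx = np.gradient(img.astype(float))
    gmag = np.hypot(gx, gy)                     # gradient magnitude map
    rows = np.empty(img.shape[1], dtype=int)
    for x in range(img.shape[1]):
        c = int(np.argmax(img[:, x]))           # coarse peak position
        lo = max(c - half_window, 0)
        hi = min(c + half_window + 1, img.shape[0])
        # refine: the ridge is where the gradient magnitude is smallest
        rows[x] = lo + int(np.argmin(gmag[lo:hi, x]))
    return rows
```

On a synthetic Gaussian ridge whose centre drifts across the image (a stand-in for a dispersing band), the minimum-gradient positions recover the true ridge rows exactly, and unlike curvature or second-derivative methods the raw local-maxima intensities are left untouched for quantitative use.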
2001-09-30
...gradually from .0014 m2/m2 at the SWI to 0 at ~25 cm depth.] The stochastic model has also been used to calculate nonlocal irrigation ...(2001); however, the stochastic simulation results do not decrease with depth as quickly as the chemically-derived irrigation coefficients
Sarabipour, Sarvenaz; Hristova, Kalina
2016-01-01
The G380R mutation in the transmembrane domain of FGFR3 is a germline mutation responsible for most cases of Achondroplasia, a common form of human dwarfism. Here we use quantitative Förster Resonance Energy Transfer (FRET) and osmotically derived plasma membrane vesicles to study the effect of the achondroplasia mutation on the early stages of FGFR3 signaling in response to the ligands fgf1 and fgf2. Using a methodology that allows us to capture structural changes on the cytoplasmic side of the membrane in response to ligand binding to the extracellular domain of FGFR3, we observe no measurable effects of the G380R mutation on FGFR3 ligand-bound dimer configurations. Instead, the most notable effect of the achondroplasia mutation is increased propensity for FGFR3 dimerization in the absence of ligand. This work reveals new information about the molecular events that underlie the achondroplasia phenotype, and highlights differences in FGFR3 activation due to different single amino-acid pathogenic mutations. PMID:27040652
Doshi, Ankur M; Ream, Justin M; Kierans, Andrea S; Bilbily, Matthew; Rusinek, Henry; Huang, William C; Chandarana, Hersh
2016-03-01
The purpose of this study was to determine whether qualitative and quantitative MRI feature analysis is useful for differentiating type 1 from type 2 papillary renal cell carcinoma (PRCC). This retrospective study included 21 type 1 and 17 type 2 PRCCs evaluated with preoperative MRI. Two radiologists independently evaluated various qualitative features, including signal intensity, heterogeneity, and margin. For the quantitative analysis, a radiology fellow and a medical student independently drew 3D volumes of interest over the entire tumor on T2-weighted HASTE images, apparent diffusion coefficient parametric maps, and nephrographic phase contrast-enhanced MR images to derive first-order texture metrics. Qualitative and quantitative features were compared between the groups. For both readers, qualitative features with greater frequency in type 2 PRCC included heterogeneous enhancement, indistinct margin, and T2 heterogeneity (all, p < 0.035). Indistinct margins and heterogeneous enhancement were independent predictors (AUC, 0.822). Quantitative analysis revealed that apparent diffusion coefficient, HASTE, and contrast-enhanced entropy were greater in type 2 PRCC (p < 0.05; AUC, 0.682-0.716). A combined quantitative and qualitative model had an AUC of 0.859. Qualitative features within the model had interreader concordance of 84-95%, and the quantitative data had intraclass coefficients of 0.873-0.961. Qualitative and quantitative features can help discriminate between type 1 and type 2 PRCC. Quantitative analysis may capture useful information that complements the qualitative appearance while benefiting from high interobserver agreement.
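First-order texture metrics such as the entropy used in the quantitative analysis above are computed from the intensity histogram within the volume of interest. A minimal sketch (the histogram bin count here is an arbitrary choice, not taken from the study):

```python
import numpy as np

def first_order_entropy(voxels, bins=64):
    """Shannon entropy (bits) of the intensity histogram of a voxel sample.
    Higher values indicate a more heterogeneous intensity distribution,
    as reported for type 2 PRCC."""
    hist, _ = np.histogram(np.asarray(voxels).ravel(), bins=bins)
    p = hist / hist.sum()        # normalize counts to probabilities
    p = p[p > 0]                 # drop empty bins (0 * log 0 := 0)
    return float(-(p * np.log2(p)).sum())
```

A perfectly homogeneous region has entropy 0, while a region split evenly between two intensity levels has entropy 1 bit, which is why entropy serves as a heterogeneity measure complementing the qualitative "heterogeneous enhancement" feature.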
Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.
Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P
2013-12-16
Mass spectrometry is already a well-established protein identification tool, and recent methodological and technological developments have also made possible the extraction of quantitative data on protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and for the statistical assessment of quantifications are possible, each involving specific measurements and therefore different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques as well as targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable effort in improving the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI, namely the MIAPE Quant guidelines, which have been developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases, and thus may have an impact on proteomics laboratories across the world.
This article is part of a Special Issue entitled: Standardization and Quality Control. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Monesi, C.; Meneghini, C.; Bardelli, F.; Benfatto, M.; Mobilio, S.; Manju, U.; Sarma, D. D.
2005-11-01
Hole-doped perovskites such as La1-xCaxMnO3 present special magnetic and magnetotransport properties, and it is commonly accepted that the local atomic structure around the Mn ions plays a crucial role in determining these peculiar features. Therefore experimental techniques directly probing the local atomic structure, like x-ray absorption spectroscopy (XAS), have been widely exploited to better understand the physics of these compounds. Quantitative XAS analysis usually concerns the extended region [extended x-ray absorption fine structure (EXAFS)] of the absorption spectra. The near-edge region [x-ray absorption near-edge spectroscopy (XANES)] of XAS spectra can provide detailed complementary information on the electronic structure and local atomic topology around the absorber. However, the complexity of XANES analysis usually prevents a quantitative understanding of the data. This work exploits the recently developed MXAN code to achieve a quantitative structural refinement of the Mn K-edge XANES of LaMnO3 and CaMnO3, the end compounds of the doped manganite series La1-xCaxMnO3. The results derived from the EXAFS and XANES analyses are in good agreement, demonstrating that a quantitative picture of the local structure can be obtained from XANES in these crystalline compounds. Moreover, the quantitative XANES analysis provides topological information not directly accessible from EXAFS data analysis. This work demonstrates that combining the analysis of the extended and near-edge regions of Mn K-edge XAS spectra can provide a complete and accurate description of the local atomic environment of Mn in these compounds.
NASA Astrophysics Data System (ADS)
Jardine, M. A.; Miller, J. A.; Becker, M.
2018-02-01
Texture is one of the most basic descriptors used in the geological sciences. The value derived from textural characterisation extends into engineering applications associated with mining, mineral processing and metal extraction, where quantitative textural information is required for models predicting the response of the ore through a particular process. This study extends the well-known 2D grey level co-occurrence matrices methodology into 3D as a method for image analysis of 3D x-ray computed tomography grey scale volumes of drill core. Subsequent interrogation of the information embedded within the grey level co-occurrence matrices (GLCM) indicates that they are sensitive to changes in mineralogy and texture of samples derived from a magmatic nickel sulfide ore. The position of the peaks in the GLCM is an indication of the relative density (specific gravity, SG) of the minerals and, when interpreted using a working knowledge of the mineralogy of the ore, presents a means to determine the relative abundance of the sulfide minerals (SG > 4), dense silicate minerals (SG > 3), and lighter silicate minerals (SG < 3). The spread of the peaks in the GLCM away from the diagonal is an indication of the degree of grain boundary interaction, with wide peaks representing fine grain sizes and narrow peaks representing coarse grain sizes. The method lends itself to routine application on large XCT volumes as part of a generic methodology, providing quantitative, timely, meaningful and automated information on mineralogy and texture in 3D.
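The 2D-to-3D extension described above can be sketched in a few lines: for a chosen voxel offset, count how often each pair of quantized grey levels co-occurs across the volume. This is a minimal illustration, not the authors' implementation; the number of grey levels and the offset are arbitrary choices here.

```python
import numpy as np

def glcm_3d(volume, levels=8, offset=(0, 0, 1)):
    """Grey level co-occurrence matrix of a 3D volume for one voxel offset.

    `levels` and `offset` are illustrative choices (offset components
    assumed non-negative here), not parameters from the study.
    """
    # Quantize grey values into `levels` bins (indices 0..levels-1)
    q = (volume.astype(float) / (volume.max() + 1) * levels).astype(int)
    dz, dy, dx = offset
    nz, ny, nx = q.shape
    ref = q[:nz - dz, :ny - dy, :nx - dx]   # reference voxels
    nbr = q[dz:, dy:, dx:]                  # voxels at the given offset
    glcm = np.zeros((levels, levels), dtype=np.int64)
    np.add.at(glcm, (ref.ravel(), nbr.ravel()), 1)  # unbuffered pair counts
    return glcm + glcm.T                    # symmetric GLCM
```

In the terms of the abstract, peaks on the matrix diagonal then mark homogeneous regions of a given grey level (density), while spread away from the diagonal reflects grain boundary interactions.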
Objects and processes: Two notions for understanding biological information.
Mercado-Reyes, Agustín; Padilla-Longoria, Pablo; Arroyo-Santos, Alfonso
2015-09-07
In spite of being ubiquitous in life sciences, the concept of information is harshly criticized. Uses of the concept other than those derived from Shannon's theory are denounced as metaphoric. We perform a computational experiment to explore whether Shannon's information is adequate to describe the uses of said concept in commonplace scientific practice. Our results show that semantic sequences do not have unique complexity values different from the value of meaningless sequences. This result suggests that quantitative theoretical frameworks do not account fully for the complex phenomenon that the term "information" refers to. We propose a restructuring of the concept into two related, but independent notions, and conclude that a complete theory of biological information must account completely not only for both notions, but also for the relationship between them. Copyright © 2015 Elsevier Ltd. All rights reserved.
Use of epidemiologic data in Integrated Risk Information System (IRIS) assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Persad, Amanda S.; Cooper, Glinda S.
2008-11-15
In human health risk assessment, information from epidemiologic studies is typically utilized in the hazard identification step of the risk assessment paradigm. However, in the assessment of many chemicals by the Integrated Risk Information System (IRIS), epidemiologic data, both observational and experimental, have also been used in the derivation of toxicological risk estimates (i.e., reference doses [RfD], reference concentrations [RfC], oral cancer slope factors [CSF] and inhalation unit risks [IUR]). Of the 545 health assessments posted on the IRIS database as of June 2007, 44 assessments derived non-cancer or cancer risk estimates based on human data. RfD and RfC calculations were based on a spectrum of endpoints from changes in enzyme activity to specific neurological or dermal effects. There are 12 assessments with IURs based on human data, two assessments that extrapolated human inhalation data to derive CSFs and one that used human data to directly derive a CSF. Lung or respiratory cancer is the most common endpoint for cancer assessments based on human data. To date, only one chemical, benzene, has utilized human data for derivation of all three quantitative risk estimates (i.e., RfC, RfD, and dose-response modeling for cancer assessment). Through examples from the IRIS database, this paper will demonstrate how epidemiologic data have been used in IRIS assessments for both adding to the body of evidence in the hazard identification process and in the quantification of risk estimates in the dose-response component of the risk assessment paradigm.
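The non-cancer risk estimates mentioned above follow a standard derivation: a reference dose is a point of departure from the critical study divided by a product of uncertainty factors. The sketch below uses hypothetical numbers, not values from any IRIS assessment.

```python
def reference_dose(pod_mg_kg_day, uncertainty_factors):
    """RfD = point of departure (e.g. a NOAEL from an epidemiologic or
    animal study) divided by the product of uncertainty factors.
    All numbers used here are illustrative only."""
    product = 1.0
    for factor in uncertainty_factors:
        product *= factor
    return pod_mg_kg_day / product

# Hypothetical NOAEL of 10 mg/kg-day; 10x for intraspecies variability,
# 3x for database deficiencies
rfd = reference_dose(10.0, [10.0, 3.0])
```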
Prediction of sedimentary facies of x-oilfield in northwest of China by geostatistical inversion
NASA Astrophysics Data System (ADS)
Lei, Zhao; Ling, Ke; Tingting, He
2017-03-01
In the early stages of oilfield development there are only a few wells, and well spacing can reach several kilometers. For alluvial fans and other heterogeneous reservoirs, information from wells alone is not sufficient to derive detailed reservoir information. In this paper, a method of calculating sand thickness through geostatistical inversion is studied, and the quantitative relationships between sedimentary micro-facies are analyzed in combination with single-well sedimentary facies. Further, the plane distribution of sedimentary facies based on seismic inversion is obtained by incorporating the sedimentary model, providing the geological basis for subsequent exploration and deployment.
Observation of the immune response of cells and tissue through multimodal label-free microscopy
NASA Astrophysics Data System (ADS)
Pavillon, Nicolas; Smith, Nicholas I.
2017-02-01
We present applications of a label-free approach to assess the immune response based on the combination of interferometric microscopy and Raman spectroscopy, which makes it possible to simultaneously acquire morphological and molecular information of live cells. We employ this approach to derive statistical models for predicting the activation state of macrophage cells based both on morphological parameters extracted from the high-throughput full-field quantitative phase imaging, and on the molecular content information acquired through Raman spectroscopy. We also employ a system for 3D imaging based on coherence gating, enabling specific targeting of the Raman channel to structures of interest within tissue.
NASA Astrophysics Data System (ADS)
Su, Zhaofeng; Li, Lvzhou; Ling, Jie
2018-04-01
Nonlocality is an important resource for quantum information processing. Genuine tripartite nonlocality, which is confirmed by violation of the Svetlichny inequality, is a more valuable resource than standard nonlocality. Genuine tripartite nonlocality is usually quantified by the amount of maximal violation of the Svetlichny inequality. The problem of detecting and quantifying the genuine tripartite nonlocality of quantum states is of practical significance but still open for general three-qubit quantum states. In this paper, we quantitatively investigate the genuine nonlocality of three-qubit states, including both pure and mixed states. First, we derive a simplified formula for the genuine nonlocality of a general three-qubit state as a function of the corresponding three correlation matrices. Second, we develop three properties of the genuine nonlocality that help in analyzing the genuine nonlocality of complex states and in understanding the nature of quantum nonlocality. Further, we obtain analytical results of genuine nonlocality for two classes of three-qubit states that have special correlation matrices. In particular, the genuine nonlocality of generalized three-qubit GHZ states, derived by Ghose et al. (Phys. Rev. Lett. 102, 250404, 2009), and that of three-qubit GHZ-symmetric states, derived by Paul et al. (Phys. Rev. A 94, 032101, 2016), can easily be recovered by applying the strategy and properties developed in this paper.
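As a concrete illustration of quantifying genuine tripartite nonlocality, the sketch below evaluates one common form of the Svetlichny combination for the GHZ state, using the standard result that equatorial (x-y plane) measurements at angles a, b, c give the three-party correlation cos(a + b + c). The hybrid local/bipartite-nonlocal bound is 4; the angles chosen here reach the maximal quantum value 4*sqrt(2) associated with the GHZ state (cf. the Ghose et al. reference cited above). The specific grouping of the eight terms is one of several equivalent conventions.

```python
import numpy as np

def E(a, b, c):
    # GHZ-state correlation for spin measurements in the x-y plane at
    # azimuthal angles a, b, c: <GHZ| (cos a X + sin a Y) x ... |GHZ> = cos(a+b+c)
    return np.cos(a + b + c)

def svetlichny(a, a2, b, b2, c, c2):
    # One common grouping of the Svetlichny terms; |S| <= 4 for models
    # allowing arbitrary correlations between any two of the parties
    M  = E(a, b, c2) + E(a, b2, c) + E(a2, b, c) - E(a2, b2, c2)
    M2 = E(a2, b2, c) + E(a2, b, c2) + E(a, b2, c2) - E(a, b, c)
    return M + M2

# Measurement angles reaching the maximal quantum value 4*sqrt(2) ~ 5.66
S = svetlichny(0.0, np.pi / 2, 0.0, np.pi / 2, 5 * np.pi / 4, -np.pi / 4)
```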
Multimodality Data Integration in Epilepsy
Muzik, Otto; Chugani, Diane C.; Zou, Guangyu; Hua, Jing; Lu, Yi; Lu, Shiyong; Asano, Eishi; Chugani, Harry T.
2007-01-01
An important goal of software development in the medical field is the design of methods which are able to integrate information obtained from various imaging and nonimaging modalities into a cohesive framework in order to understand the results of qualitatively different measurements in a larger context. Moreover, it is essential to assess the various features of the data quantitatively so that relationships in anatomical and functional domains between complementary modalities can be expressed mathematically. This paper presents a clinically feasible software environment for the quantitative assessment of the relationship among biochemical functions as assessed by PET imaging and electrophysiological parameters derived from intracranial EEG. Based on the developed software tools, quantitative results obtained from individual modalities can be merged into a data structure allowing a consistent framework for advanced data mining techniques and 3D visualization. Moreover, an effort was made to derive quantitative variables (such as the spatial proximity index, SPI) characterizing the relationship between complementary modalities on a more generic level as a prerequisite for efficient data mining strategies. We describe the implementation of this software environment in twelve children (mean age 5.2 ± 4.3 years) with medically intractable partial epilepsy who underwent both high-resolution structural MR and functional PET imaging. Our experiments demonstrate that our approach will lead to a better understanding of the mechanisms of epileptogenesis and might ultimately have an impact on treatment. Moreover, our software environment holds promise to be useful in many other neurological disorders, where integration of multimodality data is crucial for a better understanding of the underlying disease mechanisms. PMID:17710251
Trust-Based Security Level Evaluation Using Bayesian Belief Networks
NASA Astrophysics Data System (ADS)
Houmb, Siv Hilde; Ray, Indrakshi; Ray, Indrajit; Chakraborty, Sudip
Security is not merely about technical solutions and patching vulnerabilities. Security is about trade-offs and adhering to realistic security needs, employed to support core business processes. Moreover, modern systems are subject to a highly competitive market, often demanding rapid development cycles, short lifetimes, short time-to-market, and small budgets. Security evaluation standards, such as the ISO/IEC 15408 Common Criteria and ISO/IEC 27002, are not adequate for evaluating the security of many modern systems because of resource limitations, time-to-market pressure, and other constraints. Towards this end, we propose an alternative, time- and cost-effective approach for evaluating the security level of a security solution, system, or part thereof. Our approach relies on collecting information from different sources, which are trusted to varying degrees, and on using a trust measure to aggregate the available information when deriving the security level. Our approach is quantitative and implemented as a Bayesian Belief Network (BBN) topology, allowing us to reason over uncertain information and to aggregate disparate information seamlessly. We illustrate our approach by deriving the security level of two alternative Denial of Service (DoS) solutions. Our approach can also be used in the context of security solution trade-off analysis.
2016-09-05
...was performed on an LTQ-Orbitrap Elite MS and the final quantitation was derived by comparing the relative response of the 200 fmol AQUA...shown in Figure 3B, the final quantitation is derived by comparing the relative response of the 200 fmol AQUA standards (SEE and IRSEE: Set 1) to...measure of eVLP quality, the western blot and LC-HRMS quantitation results were compared to survival data in mice for each of these eVLP vaccine
Quantitative analysis of protein-ligand interactions by NMR.
Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji
2016-08-01
Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. 
This contrasts with the NMR methods that are used to analyze population-averaged NMR quantities. Essentially, to apply NMR successfully, both the type of experiment and equation to fit the data must be carefully and specifically chosen for the protein-ligand interaction under analysis. In this review, we first explain the exchange regimes and kinetic models of protein-ligand interactions, and then describe the NMR methods that quantitatively analyze these specific interactions. Copyright © 2016 Elsevier B.V. All rights reserved.
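For the chemical-shift titration experiment described above, KD in the fast-exchange regime is typically extracted by fitting the observed shift changes to the quadratic single-site (1:1) binding isotherm. The sketch below fits synthetic data with a simple grid search, a minimal stand-in for the nonlinear least-squares fitting normally used; all concentrations and shift values are illustrative.

```python
import numpy as np

def isotherm(L_tot, Kd, P_tot=0.1):
    """Unit-amplitude fast-exchange 1:1 binding isotherm (fraction of
    protein bound), from the standard quadratic solution. Units: mM."""
    b = P_tot + L_tot + Kd
    return (b - np.sqrt(b ** 2 - 4 * P_tot * L_tot)) / (2 * P_tot)

# Synthetic titration: 0.1 mM protein, 12 ligand points up to 2 mM,
# true Kd = 0.25 mM, maximal shift change 120 ppb (all illustrative)
L = np.linspace(0.0, 2.0, 12)
shifts = 120.0 * isotherm(L, 0.25)

# Grid search over Kd; for each trial Kd the best-fit amplitude has a
# closed-form linear least-squares solution
best_rss, best_Kd = np.inf, None
for Kd in np.arange(0.01, 1.0, 0.001):
    m = isotherm(L, Kd)
    amp = (m @ shifts) / (m @ m)
    rss = np.sum((shifts - amp * m) ** 2)
    if rss < best_rss:
        best_rss, best_Kd = rss, Kd
```

On noise-free data the search recovers the Kd used to generate the curve; with real titration data, the residual surface also indicates how well the protein concentration samples the transition region.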
Quantitative structure activity relationship studies of mushroom tyrosinase inhibitors
NASA Astrophysics Data System (ADS)
Xue, Chao-Bin; Luo, Wan-Chun; Ding, Qi; Liu, Shou-Zhu; Gao, Xing-Xiang
2008-05-01
Here, we report our results from quantitative structure-activity relationship studies on tyrosinase inhibitors. Interactions between benzoic acid derivatives and tyrosinase active sites were also studied using a molecular docking method. These studies indicated that one possible mechanism for the interaction between benzoic acid derivatives and the tyrosinase active site is the formation of a hydrogen bond between the hydroxyl (α-OH) and carbonyl oxygen atoms of Tyr98, which stabilized the position of Tyr98 and prevented Tyr98 from participating in the interaction between tyrosinase and ORF378. Tyrosinase, also known as phenoloxidase, is a key enzyme in animals, plants and insects that is responsible for catalyzing the hydroxylation of tyrosine into o-diphenols and the oxidation of o-diphenols into o-quinones. In the present study, the bioactivities of 48 derivatives of benzaldehyde, benzoic acid, and cinnamic acid compounds were used to construct three-dimensional quantitative structure-activity relationship (3D-QSAR) models using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). After superimposition using common substructure-based alignments, robust and predictive 3D-QSAR models were obtained from CoMFA (q2 = 0.855, r2 = 0.978) and CoMSIA (q2 = 0.841, r2 = 0.946), each with 6 optimum components. Chemical descriptors, including electronic (Hammett σ), hydrophobic (π), and steric (MR) parameters, hydrogen-bond acceptor count (H-acc), and an indicator variable (I), were used to construct a 2D-QSAR model. The results of this QSAR indicated that π, MR, and H-acc account for 34.9, 31.6, and 26.7% of the calculated biological variance, respectively. The molecular interactions between ligand and target were studied using a flexible docking method (FlexX). The best-scored candidates were docked flexibly, and the interaction between the benzoic acid derivatives and the tyrosinase active site was elucidated in detail.
We believe that the QSAR models built here provide important information necessary for the design of novel tyrosinase inhibitors.
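The 2D-QSAR described above is, at its core, a multiple linear regression of biological activity on the descriptors π, MR and H-acc. The sketch below reproduces that workflow on synthetic descriptor data (the real descriptor values are not given in the abstract), fitting the regression by ordinary least squares and computing r2.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 48  # same number of compounds as the study; all values below are synthetic
pi_h = rng.normal(0.0, 1.0, n)    # hydrophobic parameter (pi)
mr = rng.normal(10.0, 2.0, n)     # steric parameter (MR)
h_acc = rng.integers(0, 3, n)     # hydrogen-bond acceptor count
# Hypothetical linear activity with small noise
y = 0.9 * pi_h + 0.3 * mr - 0.5 * h_acc + rng.normal(0.0, 0.1, n)

X = np.column_stack([pi_h, mr, h_acc, np.ones(n)])  # design matrix + intercept
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

The fraction of variance attributed to each descriptor in the study would follow from a decomposition of this fit; cross-validated q2, as reported for the 3D-QSAR models, additionally requires leave-one-out refitting.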
Systems Biology, Neuroimaging, Neuropsychology, Neuroconnectivity and Traumatic Brain Injury
Bigler, Erin D.
2016-01-01
The patient who sustains a traumatic brain injury (TBI) typically undergoes neuroimaging studies, usually in the form of computed tomography (CT) and magnetic resonance imaging (MRI). In most cases the neuroimaging findings are clinically assessed with descriptive statements that provide qualitative information about the presence/absence of visually identifiable abnormalities, though little if any of the potential information in a scan is analyzed in any quantitative manner, except in research settings. Fortunately, major advances have been made, particularly during the last decade, with regard to image quantification techniques, especially those that involve automated image analysis methods. This review argues that a systems biology approach to understanding quantitative neuroimaging findings in TBI provides an appropriate framework for better utilizing the information derived from quantitative neuroimaging and its relation with neuropsychological outcome. Different image analysis methods are reviewed in an attempt to integrate quantitative neuroimaging methods with neuropsychological outcome measures and to illustrate how different neuroimaging techniques tap different aspects of TBI-related neuropathology. Likewise, how different neuropathologies may relate to neuropsychological outcome is explored by examining how damage influences brain connectivity and neural networks. Emphasis is placed on the dynamic changes that occur following TBI and how best to capture those pathologies via different neuroimaging methods. However, traditional clinical neuropsychological techniques are not well suited for interpretation based on contemporary and advanced neuroimaging methods and network analyses. Significant improvements need to be made in the cognitive and behavioral assessment of the brain injured individual to better interface with advances in neuroimaging-based network analyses. 
Viewing both neuroimaging and neuropsychological processes within a systems biology perspective could represent a significant advancement for the field. PMID:27555810
Development of an agricultural job-exposure matrix for British Columbia, Canada.
Wood, David; Astrakianakis, George; Lang, Barbara; Le, Nhu; Bert, Joel
2002-09-01
Farmers in British Columbia (BC), Canada have been shown to have unexplained elevated proportional mortality rates for several cancers. Because agricultural exposures have never been documented systematically in BC, a quantitative agricultural Job-exposure matrix (JEM) was developed containing exposure assessments from 1950 to 1998. This JEM was developed to document historical exposures and to facilitate future epidemiological studies. Available information regarding BC farming practices was compiled and checklists of potential exposures were produced for each crop. Exposures identified included chemical, biological, and physical agents. Interviews with farmers and agricultural experts were conducted using the checklists as a starting point. This allowed the creation of an initial or 'potential' JEM based on three axes: exposure agent, 'type of work' and time. The 'type of work' axis was determined by combining several variables: region, crop, job title and task. This allowed for a complete description of exposures. Exposure assessments were made quantitatively, where data allowed, or by a dichotomous variable (exposed/unexposed). Quantitative calculations were divided into re-entry and application scenarios. 'Re-entry' exposures were quantified using a standard exposure model with some modification while application exposure estimates were derived using data from the North American Pesticide Handlers Exposure Database (PHED). As expected, exposures differed between crops and job titles both quantitatively and qualitatively. Of the 290 agents included in the exposure axis; 180 were pesticides. Over 3000 estimates of exposure were conducted; 50% of these were quantitative. Each quantitative estimate was at the daily absorbed dose level. Exposure estimates were then rated as high, medium, or low based on comparing them with their respective oral chemical reference dose (RfD) or Acceptable Daily Intake (ADI). 
These data were mainly obtained from the US Environmental Protection Agency (EPA) Integrated Risk Information System database. Of the quantitative estimates, 74% were rated as low (<100%) and only 10% were rated as high (>500%). The JEM resulting from this study fills a void concerning exposures for BC farmers and farm workers. While only limited validation of the assessments was possible, this JEM can serve as a benchmark for future studies. Preliminary analysis at the BC Cancer Agency (BCCA) using the JEM with prostate cancer records from a large cancer and occupation study/survey has already shown promising results. Development of this JEM provides a useful model for developing historical quantitative exposure estimates where very little documented information is available.
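The final rating step described above reduces to comparing each quantitative daily-dose estimate with its reference dose. A minimal sketch of that classification, using the cutoffs stated in the abstract (<100% of the RfD/ADI rated low, >500% rated high):

```python
def rate_exposure(daily_dose, reference_dose):
    """Rate a daily absorbed dose against its RfD or ADI using the
    cutoffs described for the BC JEM: <100% low, >500% high,
    otherwise medium. Both arguments in the same units (e.g. mg/kg-day)."""
    pct = 100.0 * daily_dose / reference_dose
    if pct < 100.0:
        return "low"
    if pct > 500.0:
        return "high"
    return "medium"
```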
Provenance of Earth Science Datasets - How Deep Should One Go?
NASA Astrophysics Data System (ADS)
Ramapriyan, H.; Manipon, G. J. M.; Aulenbach, S.; Duggan, B.; Goldstein, J.; Hua, H.; Tan, D.; Tilmes, C.; Wilson, B. D.; Wolfe, R.; Zednik, S.
2015-12-01
For credibility of scientific research, transparency and reproducibility are essential. This fundamental tenet has been emphasized for centuries, and has been receiving increased attention in recent years. The Office of Management and Budget (2002) addressed reproducibility and other aspects of quality and utility of information from federal agencies. Specific guidelines from NASA (2002) are derived from the above. According to these guidelines, "NASA requires a higher standard of quality for information that is considered influential. Influential scientific, financial, or statistical information is defined as NASA information that, when disseminated, will have or does have clear and substantial impact on important public policies or important private sector decisions." For information to be compliant, "the information must be transparent and reproducible to the greatest possible extent." We present how the principles of transparency and reproducibility have been applied to NASA data supporting the Third National Climate Assessment (NCA3). The depth of trace needed of provenance of data used to derive conclusions in NCA3 depends on how the data were used (e.g., qualitatively or quantitatively). Given that the information is diligently maintained in the agency archives, it is possible to trace from a figure in the publication through the datasets, specific files, algorithm versions, instruments used for data collection, and satellites, as well as the individuals and organizations involved in each step. Such trace back permits transparency and reproducibility.
Parush, Avi; Kramer, Chelsea; Foster-Hunt, Tara; Momtahan, Kathryn; Hunter, Aren; Sohmer, Benjamin
2011-06-01
Team Situation Awareness (TSA) is one of the critical factors in effective Operating Room (OR) teamwork and can impact patient safety and quality of care. While previous research showed a relationship between situation awareness, as measured by communication events, and team performance, the implications for developing technology to augment and facilitate TSA were not examined. This research aims to further study situation-related communications in the cardiac OR in order to uncover potential degradation in TSA which may lead to adverse events. The communication loop construct (the full cycle of information flow between the participants in the sequence) was used to assess susceptibility to breakdown. Previous research and the findings here suggest that communication loops that are open, non-directed, or with delayed closure, can be susceptible to information loss. These were quantitatively related to communication indicators of TSA such as questions, replies, and announcements. Taken together, both qualitative and quantitative analyses suggest that a high proportion of TSA-related communication (63%) can be characterized as susceptible to information loss. The findings were then used to derive requirements and design a TSA augmentative display. The design principles and potential benefits of such a display are outlined and discussed. Copyright © 2010 Elsevier Inc. All rights reserved.
Accuracy and Precision of Radioactivity Quantification in Nuclear Medicine Images
Frey, Eric C.; Humm, John L.; Ljungberg, Michael
2012-01-01
The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images. PMID:22475429
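As an example of a quantitative metric derived from images, the standard uptake value mentioned above is computed as the tissue activity concentration normalized by the injected activity per unit body weight. A minimal sketch of the textbook definition (assuming a tissue density of 1 g/mL; the numbers in the example are illustrative):

```python
def suv(tissue_kbq_per_ml, injected_mbq, body_weight_kg):
    """Standard uptake value: tissue activity concentration divided by
    injected activity per gram of body weight (1 g ~ 1 mL assumed).
    Decay correction to a common time point is assumed done upstream."""
    injected_kbq = injected_mbq * 1000.0
    weight_g = body_weight_kg * 1000.0
    return tissue_kbq_per_ml / (injected_kbq / weight_g)

# e.g. 5 kBq/mL lesion uptake, 370 MBq injected, 70 kg patient
value = suv(5.0, 370.0, 70.0)
```

In practice the reliability of this number depends on exactly the physical, patient, and technical factors the abstract lists, since the tissue activity concentration comes from the reconstructed image.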
Sardiu, Mihaela E; Gilmore, Joshua M; Carrozza, Michael J; Li, Bing; Workman, Jerry L; Florens, Laurence; Washburn, Michael P
2009-10-06
Protein complexes are key molecular machines executing a variety of essential cellular processes. Despite the availability of genome-wide protein-protein interaction studies, determining the connectivity between proteins within a complex remains a major challenge. Here we demonstrate a method that is able to predict the relationship of proteins within a stable protein complex. We employed a combination of computational approaches and a systematic collection of quantitative proteomics data from wild-type and deletion strain purifications to build a quantitative deletion-interaction network map and subsequently convert the resulting data into an interdependency-interaction model of a complex. We applied this approach to a data set generated from components of the Saccharomyces cerevisiae Rpd3 histone deacetylase complexes, which consists of two distinct small and large complexes that are held together by a module consisting of Rpd3, Sin3 and Ume1. The resulting representation reveals new protein-protein interactions and new submodule relationships, providing novel information for mapping the functional organization of a complex.
NASA Astrophysics Data System (ADS)
Ragno, Rino; Ballante, Flavio; Pirolli, Adele; Wickersham, Richard B.; Patsilinakos, Alexandros; Hesse, Stéphanie; Perspicace, Enrico; Kirsch, Gilbert
2015-08-01
Vascular endothelial growth factor receptor-2 (VEGFR-2) is a key element in angiogenesis, the process by which new blood vessels are formed, and is thus an important pharmaceutical target. Here, 3-D quantitative structure-activity relationship (3-D QSAR) methods were used to build a quantitative screening and pharmacophore model of the VEGFR-2 receptor for the design of inhibitors with improved activities. Most of the available experimental data were used as a training set to derive eight optimized and fully cross-validated mono-probe models and one multi-probe quantitative model. Notable is the use of 262 molecules, aligned following both structure-based and ligand-based protocols, as an external test set, confirming the 3-D QSAR models' predictive capability and their usefulness in designing new VEGFR-2 inhibitors. From a survey of the literature, this is the first wide-ranging computational medicinal chemistry application on VEGFR-2 inhibitors.
Niwa, Masahiro; Hiraishi, Yasuhiro
2014-01-30
Tablets are the most common form of solid oral dosage produced by pharmaceutical industries. There are several challenges to successful and consistent tablet manufacturing. One well-known quality issue is visible surface defects, which generally occur due to insufficient physical strength, causing breakage or abrasion during processing, packaging, or shipping. Techniques that allow quantitative evaluation of surface strength and the risk of surface defects would greatly aid in quality control. Here, terahertz pulsed imaging (TPI) was employed to evaluate the surface properties of core tablets with visible surface defects of varying severity after film coating. Other analytical methods, such as tensile strength measurements, friability testing, and scanning electron microscopy (SEM), were used to validate the TPI results. Tensile strength and friability provided no information on visible surface defect risk, whereas the unique TPI-derived parameter, terahertz electric field peak strength (TEFPS), provided information on the spatial distribution of surface density/roughness of core tablets, which helped in estimating tablet abrasion risk prior to film coating and in predicting the location of the defects. TPI also revealed the relationship between surface strength and blending condition and is a nondestructive, quantitative approach to aid formulation development and quality control that can reduce visible surface defect risk in tablets. Copyright © 2013 Elsevier B.V. All rights reserved.
Linkage disequilibrium interval mapping of quantitative trait loci.
Boitard, Simon; Abdallah, Jihad; de Rochambeau, Hubert; Cierco-Ayrolles, Christine; Mangin, Brigitte
2006-03-16
For many years gene mapping studies have been performed through linkage analyses based on pedigree data. Recently, linkage disequilibrium methods based on unrelated individuals have been advocated as powerful tools to refine estimates of gene location. Many strategies have been proposed to deal with simply inherited disease traits. However, locating quantitative trait loci is statistically more challenging and considerable research is needed to provide robust and computationally efficient methods. Under a three-locus Wright-Fisher model, we derived approximate expressions for the expected haplotype frequencies in a population. We considered haplotypes comprising one trait locus and two flanking markers. Using these theoretical expressions, we built a likelihood-maximization method, called HAPim, for estimating the location of a quantitative trait locus. For each postulated position, the method only requires information from the two flanking markers. Over a wide range of simulation scenarios it was found to be more accurate than a two-marker composite likelihood method. It also performed as well as identity by descent methods, whilst being valuable in a wider range of populations. Our method makes efficient use of marker information, and can be valuable for fine mapping purposes. Its performance is increased if multiallelic markers are available. Several improvements can be developed to account for more complex evolution scenarios or provide robust confidence intervals for the location estimates.
An information measure for class discrimination [in remote sensing of crop observation]
NASA Technical Reports Server (NTRS)
Shen, S. S.; Badhwar, G. D.
1986-01-01
This article describes a separability measure for class discrimination. This measure is based on the Fisher information measure for estimating the mixing proportion of two classes. The Fisher information measure not only provides a means to assess quantitatively the information content in the features for separating classes, but also gives the lower bound for the variance of any unbiased estimate of the mixing proportion based on observations of the features. Unlike most commonly used separability measures, this measure is not dependent on the form of the probability distribution of the features and does not imply a specific estimation procedure. This is important because the probability distribution function that describes the data for a given class does not have a simple analytic form, such as a Gaussian. Results of applying this measure to compare the information content provided by three Landsat-derived feature vectors for the purpose of separating small grains from other crops are presented.
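The Fisher-information idea in the abstract can be sketched numerically: for a two-class mixture g(x) = p·f1(x) + (1−p)·f2(x), the information about the mixing proportion is I(p) = E[((f1−f2)/g)²], and 1/(n·I(p)) is the Cramér-Rao floor on the variance of any unbiased estimate of p. A minimal Monte Carlo illustration (Gaussian classes chosen here only for convenience; the abstract's point is that the measure itself needs no such assumption):

```python
import numpy as np

def mixture_fisher_info(f1, f2, p, samples):
    """Monte Carlo estimate of the Fisher information for the mixing
    proportion p of the mixture g(x) = p*f1(x) + (1-p)*f2(x).

    The score is d/dp log g = (f1 - f2) / g, so I(p) = E_g[score**2].
    `samples` must be drawn from the mixture itself.
    """
    d1, d2 = f1(samples), f2(samples)
    g = p * d1 + (1.0 - p) * d2
    return np.mean(((d1 - d2) / g) ** 2)

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
p, n = 0.4, 200_000
# Two well-separated classes: nearly every draw identifies its class,
# so I(p) should approach 1 / (p * (1 - p)) = 4.1667.
labels = rng.random(n) < p
x = np.where(labels, rng.normal(-5.0, 1.0, n), rng.normal(5.0, 1.0, n))

info = mixture_fisher_info(lambda s: norm_pdf(s, -5.0, 1.0),
                           lambda s: norm_pdf(s, 5.0, 1.0), p, x)
cramer_rao_bound = 1.0 / (n * info)  # variance floor for unbiased estimates of p
```

With overlapping classes the estimated information drops below 1/(p(1−p)), quantifying exactly how much separating power the features lose.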
NASA Astrophysics Data System (ADS)
Deng, Fangfang; Xie, Meihong; Zhang, Xiaoyun; Li, Peizhen; Tian, Yueli; Zhai, Honglin; Li, Yang
2014-06-01
3,4-Dihydro-2H,6H-pyrimido[1,2-c][1,3]benzothiazin-6-imine is an antiretroviral agent active against human immunodeficiency virus (HIV) infection, but the mechanism of action of pyrimido[1,2-c][1,3]benzothiazin-6-imine derivatives remains ambiguous. In this study, multiple linear regression (MLR) was applied to establish a reliable model with a squared correlation coefficient (R²) of 0.8079. We also used chemical information descriptors based on the simplified molecular input line entry system (SMILES) to obtain a better model, with an R² of 0.9086 for the training set and 0.8031 for the test set. Molecular docking was utilized to provide further information on the interaction between pyrimido[1,2-c][1,3]benzothiazin-6-imine derivatives and HIV-1 protease, such as the active site, binding mode, and important residues. Molecular dynamics simulation was employed to further validate the docking results. This work may lead to a better understanding of the mechanism of action and aid the design of novel and more potent anti-HIV drugs.
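The MLR model-evaluation step described here can be illustrated with ordinary least squares: fit coefficients on a training set and report R² on both training and held-out compounds. A hedged sketch on synthetic descriptors (the paper's SMILES-derived descriptors and activities are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a QSAR table: rows = compounds, columns =
# molecular descriptors; y = activity. Illustrative only.
X = rng.normal(size=(40, 4))
y = X @ np.array([1.5, -0.8, 0.3, 0.0]) + rng.normal(scale=0.2, size=40)
train, test = np.arange(30), np.arange(30, 40)

def fit_mlr(X, y):
    # Append an intercept column and solve the least-squares problem.
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def r_squared(X, y, coef):
    A = np.column_stack([X, np.ones(len(X))])
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

coef = fit_mlr(X[train], y[train])
r2_train = r_squared(X[train], y[train], coef)
r2_test = r_squared(X[test], y[test], coef)
```

Reporting R² on an untouched test set, as the abstract does (0.9086 vs. 0.8031), is what guards an MLR model against overfitting to the training compounds.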
Strain engineering of the elasticity and the Raman shift of nanostructured TiO2
NASA Astrophysics Data System (ADS)
Liu, X. J.; Pan, L. K.; Sun, Z.; Chen, Y. M.; Yang, X. X.; Yang, L. W.; Zhou, Z. F.; Sun, Chang Q.
2011-08-01
Correlation between the elastic modulus (B) and the Raman shift (Δω) of TiO2, and their responses to variation of crystal size, applied pressure, and measuring temperature, have been established as functions of the order, length, and energy of a representative bond for the entire specimen. In addition to derived fundamental information on the atomic cohesive energy, binding energy density, Debye temperature, and nonlinear compressibility, theoretical reproduction of the observations clarified that (i) the size effect arises from under-coordination-induced cohesive energy loss and energy density gain in the surface up to skin depth; (ii) the thermally softened B and Δω result from bond expansion and bond weakening due to vibration; and (iii) the mechanically stiffened B and Δω result from bond compression and bond strengthening due to mechanical work hardening. With this premise, one can predict the trends of the concerned properties, and derive quantitative information on them, from any single measurement alone.
Mazzarelli, Joan M; Brestelli, John; Gorski, Regina K; Liu, Junmin; Manduchi, Elisabetta; Pinney, Deborah F; Schug, Jonathan; White, Peter; Kaestner, Klaus H; Stoeckert, Christian J
2007-01-01
EPConDB (http://www.cbil.upenn.edu/EPConDB) is a public web site that supports research in diabetes, pancreatic development and beta-cell function by providing information about genes expressed in cells of the pancreas. EPConDB displays expression profiles for individual genes and information about transcripts, promoter elements and transcription factor binding sites. Gene expression results are obtained from studies examining tissue expression, pancreatic development and growth, differentiation of insulin-producing cells, islet or beta-cell injury, and genetic models of impaired beta-cell function. The expression datasets are derived using different microarray platforms, including the BCBC PancChips and Affymetrix gene expression arrays. Other datasets include semi-quantitative RT-PCR and MPSS expression studies. For selected microarray studies, lists of differentially expressed genes, derived from PaGE analysis, are displayed on the site. EPConDB provides database queries and tools to examine the relationship between a gene, its transcriptional regulation, protein function and expression in pancreatic tissues.
NASA Astrophysics Data System (ADS)
Liang, Q.; Chipperfield, M.; Daniel, J. S.; Burkholder, J. B.; Rigby, M. L.; Velders, G. J. M.
2015-12-01
The hydroxyl radical (OH) is the major oxidant in the atmosphere. Reaction with OH is the primary removal process for many non-CO2 greenhouse gases (GHGs), ozone-depleting substances (ODSs) and their replacements, e.g. hydrochlorofluorocarbons (HCFCs) and hydrofluorocarbons (HFCs). Traditionally, the global OH abundance is inferred using the observed atmospheric rate of change for methyl chloroform (MCF). Owing to regulation under the Montreal Protocol, the atmospheric abundance of MCF has been decreasing rapidly to near-zero values. It is becoming critical to find an alternative reference compound to continue to provide quantitative information for the global OH abundance. Our model analysis using the NASA 3-D GEOS-5 Chemistry Climate Model suggests that the inter-hemispheric gradients (IHG) of the HCFCs and HFCs show a strong linear correlation with their global emissions. Therefore it is possible (i) to use the observed IHGs of HCFCs and HFCs to estimate their global emissions, and (ii) to use the derived emissions and the observed long-term trend to calculate their lifetimes and infer the global OH abundance. Preliminary analysis using a simple global two-box model (one box for each hemisphere) and information from the global 3-D model suggests that the quantitative relationship between IHG and global emissions varies slightly among individual compounds depending on their lifetime, their emissions history and emission fractions from the two hemispheres. While each compound shows different sensitivity to the above quantities, the combined suite of the HCFCs and HFCs provides a means to derive global OH abundance and the corresponding atmospheric lifetimes of long-lived gases with respect to OH (tOH).
The fact that the OH partial lifetimes of these compounds are highly correlated, with the ratio of tOH equal to the inverse ratio of their OH thermal reaction rates at 272 K, provides an additional constraint that can greatly reduce the uncertainty in the OH abundance and tOH estimates. We will use the observed IHGs and long-term trends of three major HCFCs and six major HFCs in the two-box model to derive their global emissions and atmospheric lifetimes as well as the global OH abundance. The derived global OH abundance between 2000 and 2014 will be compared with that derived using MCF for consistency.
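The IHG-emissions relationship the analysis relies on can be made concrete with a toy two-box model, one box per hemisphere. In the sketch below the 1-year exchange time, lifetimes, and emission numbers are illustrative assumptions in arbitrary units, not the study's values:

```python
import numpy as np

def two_box_steady_gradient(e_n, e_s, tau_oh, tau_ex=1.0):
    """Steady-state inter-hemispheric gradient (IHG) of the two-box model

        dC_N/dt = E_N - C_N/tau_oh - (C_N - C_S)/tau_ex
        dC_S/dt = E_S - C_S/tau_oh + (C_N - C_S)/tau_ex

    Subtracting the equations and setting d/dt = 0 gives
        IHG = (E_N - E_S) / (1/tau_oh + 2/tau_ex),
    i.e. the IHG is linear in the emission difference, which is the
    relationship the abstract exploits.
    """
    return (e_n - e_s) / (1.0 / tau_oh + 2.0 / tau_ex)

def two_box_integrate(e_n, e_s, tau_oh, tau_ex=1.0, years=200, dt=0.01):
    """Forward-Euler integration of the same equations, for comparison."""
    c_n = c_s = 0.0
    for _ in range(int(years / dt)):
        flux = (c_n - c_s) / tau_ex
        c_n += dt * (e_n - c_n / tau_oh - flux)
        c_s += dt * (e_s - c_s / tau_oh + flux)
    return c_n - c_s

# An HCFC-like lifetime of ~10 years, 90% of emissions in the north:
analytic = two_box_steady_gradient(0.9, 0.1, tau_oh=10.0)
numeric = two_box_integrate(0.9, 0.1, tau_oh=10.0)
```

Because the steady-state IHG scales with (E_N − E_S), an observed gradient can be inverted for emissions; combined with the observed trend, the budget then yields the lifetime and hence OH.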
Badran, Hani; Pluye, Pierre; Grad, Roland
2017-03-14
The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n=45,394], respectively). In part 2 (qualitative results), 22 items were deemed representative, while 1 item was not representative. In part 3 (mixing quantitative and qualitative results), the content validity of 21 items was confirmed, and the 2 nonrelevant items were excluded. A fully validated version was generated (IAM-v2014). This study produced a content validated IAM questionnaire that is used by clinicians and information providers to assess the clinical information delivered in continuing education programs. ©Hani Badran, Pierre Pluye, Roland Grad. Originally published in JMIR Medical Education (http://mededu.jmir.org), 14.03.2017.
Objectives and metrics for wildlife monitoring
Sauer, J.R.; Knutson, M.G.
2008-01-01
Monitoring surveys allow managers to document system status and provide the quantitative basis for management decision-making, and large amounts of effort and funding are devoted to monitoring. Still, monitoring surveys often fall short of providing required information; inadequacies exist in survey designs, analysis procedures, or in the ability to integrate the information into an appropriate evaluation of management actions. We describe current uses of monitoring data, provide our perspective on the value and limitations of current approaches to monitoring, and set the stage for 3 papers that discuss current goals and implementation of monitoring programs. These papers were derived from presentations at a symposium at The Wildlife Society's 13th Annual Conference in Anchorage, Alaska, USA, in 2006.
Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan
2015-06-01
Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve the quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data must be estimated with high accuracy. A new spectral estimator based on the singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and the stability analysis of the estimator is given. Theoretical analysis and simulation experimental results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using the analysis of beer spectra. The derivative spectra of the beer and marzipan data sets are used to build the calibration model using partial least squares (PLS) modeling. The results show that the PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
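For context, the Savitzky-Golay baseline against which the SPSE is compared can be sketched in a few lines: each point's derivative is the analytic derivative of a local least-squares polynomial fit, which suppresses the noise amplification of naive finite differences. A self-contained numpy sketch on a synthetic noisy spectrum (a stand-in, not the beer data):

```python
import numpy as np

def savgol_derivative(y, window, polyorder, dx):
    """Minimal numpy Savitzky-Golay first-derivative filter: fit a
    polynomial of degree `polyorder` in each centered window and return
    the analytic derivative of the fit at the window center."""
    half = window // 2
    t = np.arange(-half, half + 1) * dx
    A = np.vander(t, polyorder + 1, increasing=True)
    # Row 1 of the pseudoinverse maps the window samples to the linear
    # coefficient of the local fit, i.e. the derivative at the center.
    kernel = np.linalg.pinv(A)[1]
    return np.convolve(y, kernel[::-1], mode="same")

# Synthetic noisy "spectrum": two overlapping bands plus noise.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 1001)
dx = x[1] - x[0]
clean = np.exp(-((x - 4.0) ** 2) / 0.5) + 0.6 * np.exp(-((x - 6.0) ** 2) / 0.5)
noisy = clean + rng.normal(scale=0.01, size=x.size)

d1_true = np.gradient(clean, dx)
d1_sg = savgol_derivative(noisy, window=31, polyorder=3, dx=dx)
d1_naive = np.gradient(noisy, dx)    # finite differences amplify noise

core = slice(15, -15)                # ignore the zero-padded window edges
rmse_sg = np.sqrt(np.mean((d1_sg[core] - d1_true[core]) ** 2))
rmse_naive = np.sqrt(np.mean((d1_naive[core] - d1_true[core]) ** 2))
```

Differentiation is a high-pass operation, so any derivative estimator is really a smoothing/differentiation trade-off; this is the trade-off the SPSE is designed to improve upon.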
Aguillo, I
2000-01-01
Although the Internet is already a valuable information resource in medicine, there are important challenges to be faced before physicians and general users will have extensive access to this information. As a result of a research effort to compile a health-related Internet directory, new tools and strategies have been developed to solve key problems derived from the explosive growth of medical information on the Net and the great concern over the quality of such critical information. The current Internet search engines lack some important capabilities. We suggest using second generation tools (client-side based) able to deal with large quantities of data and to increase the usability of the records recovered. We tested the capabilities of these programs to solve health-related information problems, recognising six groups according to the kind of task addressed: Z39.50 clients, downloaders, multisearchers, tracing agents, indexers and mappers. The evaluation of the quality of health information available on the Internet could require a large amount of human effort. A possible solution may be to use quantitative indicators based on the hypertext visibility of the Web sites. The cybermetric measures are valid for quality evaluation if they are derived from indirect peer review by experts with Web pages citing the site. The hypertext links acting as citations need to be extracted from a controlled sample of quality super-sites.
Stern, K I; Malkova, T L
The objective of the present study was the development and validation of a method for the quantitative determination of the demethylated derivatives of sibutramine, desmethyl sibutramine and didesmethyl sibutramine. Gas-liquid chromatography with flame ionization detection was used for the quantitative determination of the above substances in dietary supplements. Conditions for the chromatographic determination of the analytes in the presence of the reference standard, methyl stearate, were proposed, allowing efficient separation to be achieved. The method has the necessary sensitivity, specificity, linearity, accuracy, and precision (on an intra-day and inter-day basis), confirming its good validation characteristics. The proposed method can be employed in analytical laboratories for the quantitative determination of sibutramine derivatives in biologically active dietary supplements.
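The internal-standard arithmetic behind such a GC-FID determination can be illustrated with a relative response factor. All areas and concentrations below are invented for the example, not taken from the study:

```python
def relative_response_factor(area_analyte, conc_analyte, area_istd, conc_istd):
    """RRF from a calibration injection: (A_a / C_a) / (A_is / C_is)."""
    return (area_analyte / conc_analyte) / (area_istd / conc_istd)

def quantify(area_analyte, area_istd, conc_istd, rrf):
    """Analyte concentration in a sample injection; ratioing against the
    internal standard cancels injection-volume variation."""
    return (area_analyte / (rrf * area_istd)) * conc_istd

# Hypothetical calibration injection: 50 units of analyte, 40 of standard.
rrf = relative_response_factor(15000.0, 50.0, 12000.0, 40.0)   # = 1.0 here
# Hypothetical sample injection with the same amount of internal standard:
c = quantify(area_analyte=9000.0, area_istd=12000.0, conc_istd=40.0, rrf=rrf)
```

With these numbers the RRF is exactly 1.0 and the sample works out to 30.0 concentration units; in practice the RRF is averaged over several calibration levels.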
Diagnostic causal reasoning with verbal information.
Meder, Björn; Mayrhofer, Ralf
2017-08-01
In diagnostic causal reasoning, the goal is to infer the probability of causes from one or multiple observed effects. Typically, studies investigating such tasks provide subjects with precise quantitative information regarding the strength of the relations between causes and effects or sample data from which the relevant quantities can be learned. By contrast, we sought to examine people's inferences when causal information is communicated through qualitative, rather vague verbal expressions (e.g., "X occasionally causes A"). We conducted three experiments using a sequential diagnostic inference task, where multiple pieces of evidence were obtained one after the other. Quantitative predictions of different probabilistic models were derived using the numerical equivalents of the verbal terms, taken from an unrelated study with different subjects. We present a novel Bayesian model that allows for incorporating the temporal weighting of information in sequential diagnostic reasoning, which can be used to model both primacy and recency effects. On the basis of 19,848 judgments from 292 subjects, we found a remarkably close correspondence between the diagnostic inferences made by subjects who received only verbal information and those of a matched control group to whom information was presented numerically. Whether information was conveyed through verbal terms or numerical estimates, diagnostic judgments closely resembled the posterior probabilities entailed by the causes' prior probabilities and the effects' likelihoods. We observed interindividual differences regarding the temporal weighting of evidence in sequential diagnostic reasoning. Our work provides pathways for investigating judgment and decision making with verbal information within a computational modeling framework. Copyright © 2017 Elsevier Inc. All rights reserved.
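A minimal sketch of such a temporally weighted Bayesian model follows; the exponential weighting scheme and the toy numbers are illustrative assumptions, not the authors' exact parameterization:

```python
import numpy as np

def weighted_posterior(prior, likelihoods, evidence, gamma=1.0):
    """Posterior over causes after a sequence of observed effects, with
    exponential temporal weighting of the evidence:

        P(c | e_1..e_n)  ∝  P(c) * prod_i P(e_i | c) ** w_i,
        w_i = gamma ** (n - i)

    gamma < 1 weights recent evidence more (recency), gamma > 1 weights
    early evidence more (primacy), gamma = 1 is standard Bayes.
    `likelihoods[c][e]` is P(effect e | cause c).
    """
    log_post = np.log(np.asarray(prior, dtype=float))
    n = len(evidence)
    for i, e in enumerate(evidence, start=1):
        w = gamma ** (n - i)
        log_post += w * np.log([likelihoods[c][e] for c in range(len(prior))])
    post = np.exp(log_post - log_post.max())
    return post / post.sum()

# Two candidate causes, two possible effects (coded 0 and 1).
prior = [0.5, 0.5]
lik = [[0.8, 0.2],   # cause 0 mostly produces effect 0
       [0.2, 0.8]]   # cause 1 mostly produces effect 1
evidence = [1, 1, 0]  # sequentially observed effects

bayes = weighted_posterior(prior, lik, evidence, gamma=1.0)      # P(c1|e) = 0.8
recency = weighted_posterior(prior, lik, evidence, gamma=0.5)
```

Under recency weighting the final observation (effect 0, favoring cause 0) dominates, pulling the posterior for cause 1 below the unweighted Bayesian value; primacy weighting does the reverse.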
Automatic Gleason grading of prostate cancer using quantitative phase imaging and machine learning
NASA Astrophysics Data System (ADS)
Nguyen, Tan H.; Sridharan, Shamira; Macias, Virgilia; Kajdacsy-Balla, Andre; Melamed, Jonathan; Do, Minh N.; Popescu, Gabriel
2017-03-01
We present an approach for automatic diagnosis of tissue biopsies. Our methodology consists of a quantitative phase imaging tissue scanner and machine learning algorithms to process these data. We illustrate the performance by automatic Gleason grading of prostate specimens. The imaging system operates on the principle of interferometry and, as a result, reports on the nanoscale architecture of the unlabeled specimen. We use these data to train a random forest classifier to learn textural behaviors of prostate samples and classify each pixel in the image into different classes. Automatic diagnosis results were computed from the segmented regions. By combining morphological features with quantitative information from the glands and stroma, logistic regression was used to discriminate regions with Gleason grade 3 versus grade 4 cancer in prostatectomy tissue. The overall accuracy of this classification derived from a receiver operating characteristic (ROC) curve was 82%, which is in the range of human error when interobserver variability is considered. We anticipate that our approach will provide a clinically objective and quantitative metric for Gleason grading, allowing us to corroborate results across instruments and laboratories and feed the computer algorithms for improved accuracy.
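The final discrimination step can be sketched with a plain numpy logistic regression and an ROC-style AUC on synthetic two-feature data. The features here are stand-ins for the gland/stroma measurements; this illustrates the classification logic, not the published pipeline:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression (binary labels in y)."""
    X = np.column_stack([X, np.ones(len(X))])      # add intercept
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)           # mean log-loss gradient
    return w

def predict_proba(X, w):
    X = np.column_stack([X, np.ones(len(X))])
    return 1.0 / (1.0 + np.exp(-X @ w))

def roc_auc(y, scores):
    """AUC = probability a random positive outscores a random negative."""
    pos, neg = scores[y == 1], scores[y == 0]
    diff = pos[:, None] - neg[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / (len(pos) * len(neg))

# Synthetic region-level measurements: two features, class-shifted means
# (grade 3 vs grade 4 stand-ins).
rng = np.random.default_rng(7)
n = 400
y = (rng.random(n) < 0.5).astype(float)
X = rng.normal(size=(n, 2)) + y[:, None] * 1.5

w = fit_logistic(X, y)
auc = roc_auc(y, predict_proba(X, w))
```

Reporting accuracy off the ROC curve, as the abstract does, separates the quality of the score from the choice of decision threshold.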
IWGT report on quantitative approaches to genotoxicity risk ...
This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose–response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, and the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clastogenic activity.
NASA Astrophysics Data System (ADS)
Bösmeier, Annette; Glaser, Rüdiger; Stahl, Kerstin; Himmelsbach, Iso; Schönbein, Johannes
2017-04-01
Future estimations of flood hazard and risk for developing optimal coping and adaptation strategies inevitably include considerations of the frequency and magnitude of past events. Methods of historical climatology represent one way of assessing flood occurrences beyond the period of instrumental measurements and can thereby substantially help to extend the view into the past and to improve modern risk analysis. Such historical information can be of additional value and has been used in statistical approaches like Bayesian flood frequency analyses during recent years. However, the derivation of quantitative values from vague descriptive information of historical sources remains a crucial challenge. We explored possibilities of parametrization of descriptive flood-related data specifically for the assessment of historical floods in a framework that combines a hermeneutical approach with mathematical and statistical methods. This study forms part of the transnational Franco-German research project TRANSRISK2 (2014-2017), funded by ANR and DFG, with the focus on exploring the flood history of the last 300 years for the regions of the Upper and Middle Rhine. A broad database of flood events had been compiled, dating back to AD 1500. The events had been classified based on hermeneutical methods, depending on intensity, spatial dimension, temporal structure, damages and mitigation measures associated with the specific events. This indexed database allowed the exploration of a link between descriptive data and quantitative information for the overlapping time period of classified floods and instrumental measurements since the end of the 19th century. Thereby, flood peak discharges, as a quantitative measure of the severity of a flood, were used to assess the discharge intervals (upper and lower thresholds) for flood classes within different time intervals, validating the flood classification as well as examining the trend in the perception threshold over time.
Furthermore, within a suitable time period, flood classes and other quantifiable indicators of flood intensity (number of damaged locations mentioned in historical sources, general availability of reports associated with a specific event) were combined with available peak discharge measurements. We argue that this information can be considered as 'expert knowledge', and we used it to develop a fuzzy rule-based model for deriving peak discharge estimates of pre-instrumental events that can finally be introduced into a flood frequency analysis.
Sarabipour, Sarvenaz; Hristova, Kalina
2016-07-01
The G380R mutation in the transmembrane domain of FGFR3 is a germline mutation responsible for most cases of Achondroplasia, a common form of human dwarfism. Here we use quantitative Förster Resonance Energy Transfer (FRET) and osmotically derived plasma membrane vesicles to study the effect of the achondroplasia mutation on the early stages of FGFR3 signaling in response to the ligands fgf1 and fgf2. Using a methodology that allows us to capture structural changes on the cytoplasmic side of the membrane in response to ligand binding to the extracellular domain of FGFR3, we observe no measurable effects of the G380R mutation on FGFR3 ligand-bound dimer configurations. Instead, the most notable effect of the achondroplasia mutation is increased propensity for FGFR3 dimerization in the absence of ligand. This work reveals new information about the molecular events that underlie the achondroplasia phenotype, and highlights differences in FGFR3 activation due to different single amino-acid pathogenic mutations. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Goodman, James Ansell
My research focuses on the development and application of hyperspectral remote sensing as a valuable component in the assessment and management of coral ecosystems. Remote sensing provides an important quantitative ability to investigate the spatial dynamics of coral health and evaluate the impacts of local, regional and global change on this important natural resource. Furthermore, advances in detector capabilities and analysis methods, particularly with respect to hyperspectral remote sensing, are also increasing the accuracy and level of effectiveness of the resulting data products. Using imagery of Kaneohe Bay and French Frigate Shoals in the Hawaiian Islands, acquired in 2000 by NASA's Airborne Visible InfraRed Imaging Spectrometer (AVIRIS), I developed, applied and evaluated algorithms for analyzing coral reefs using hyperspectral remote sensing data. Research included developing methods for acquiring in situ underwater reflectance, collecting spectral measurements of the dominant bottom components in Kaneohe Bay, applying atmospheric correction and sunglint removal algorithms, employing a semianalytical optimization model to derive bathymetry and aquatic optical properties, and developing a linear unmixing approach for deriving bottom composition. Additionally, algorithm development focused on using fundamental scientific principles to facilitate the portability of methods to diverse geographic locations and across variable environmental conditions. Assessments of this methodology compared favorably with available field measurements and habitat information, and the overall analysis demonstrated the capacity to derive information on water properties, bathymetry and habitat composition. Thus, results illustrated a successful approach for extracting environmental information and habitat composition from a coral reef environment using hyperspectral remote sensing.
New service interface for River Forecasting Center derived quantitative precipitation estimates
Blodgett, David L.
2013-01-01
For more than a decade, the National Weather Service (NWS) River Forecast Centers (RFCs) have been estimating spatially distributed rainfall by applying quality-control procedures to radar-indicated rainfall estimates in the eastern United States and other best practices in the western United States to produce a national Quantitative Precipitation Estimate (QPE) (National Weather Service, 2013). The availability of archives of QPE information for analytical purposes has been limited to manual requests for access to raw binary file formats that are difficult for scientists who are not in the climatic sciences to work with. The NWS provided the QPE archives to the U.S. Geological Survey (USGS), and the contents of the real-time feed from the RFCs are being saved by the USGS for incorporation into the archives. The USGS has applied time-series aggregation and added latitude-longitude coordinate variables to publish the RFC QPE data. Web services provide users with direct (index-based) data access, rendered visualizations of the data, and resampled raster representations of the source data in common geographic information formats.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lohmann, R.C.
1992-01-01
Three-dimensional geologic outcrop studies which quantitatively describe the geologic architecture of deposits of a specific depositional environment are a necessary requirement for characterization of the permeability structure of an aquifer. The objective of this study is to address this need for quantitative, three-dimensional outcrop studies. For this study, a 10,000 m², 25 m high outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation located near Belen, New Mexico was mapped in detail, and the geologic architecture was quantified using geostatistical variogram analysis. In general, the information contained in this study should be useful for hydrologists working on the characterization of aquifers from similar depositional environments such as this one. However, for the permeability correlation study to be truly useful, the within-element correlation structure needs to be superimposed on the elements themselves instead of using mean log(k) values, as was done for this study. Such information is derived from outcrop permeability sampling such as the work of Davis (1990) and Goggin et al. (1988).
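The variogram analysis mentioned above can be sketched with the classical Matheron estimator, γ(h) = (1/2N(h)) Σ (z_i − z_j)² over the N(h) pairs separated by approximately h. The synthetic transect below is illustrative, not the outcrop data:

```python
import numpy as np

def empirical_variogram(coords, values, lags, tol):
    """Classical (Matheron) semivariogram estimator on a dense pair grid:
    gamma(h) = 1/(2*N(h)) * sum over pairs with |d_ij - h| < tol of
    (z_i - z_j)**2. Suitable only for small data sets."""
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = (values[:, None] - values[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = np.triu(np.abs(d - h) < tol, k=1)   # each pair counted once
        gamma.append(0.5 * sq[mask].mean())
    return np.array(gamma)

# Synthetic 1-D transect with short-range correlation: a moving average
# of white noise has a variogram that rises and then levels off (sill).
rng = np.random.default_rng(3)
x = np.arange(500.0)
z = np.convolve(rng.normal(size=520), np.ones(21) / 21, mode="valid")
coords = x[:, None]
lags = np.array([1.0, 5.0, 10.0, 40.0])
gamma = empirical_variogram(coords, z, lags, tol=0.5)
```

The lag at which γ(h) levels off estimates the correlation range of the architecture, which is the quantity a permeability correlation study would carry forward.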
Vanmeert, Frederik; De Nolf, Wout; Dik, Joris; Janssens, Koen
2018-06-05
At or below the surface of painted works of art, valuable information is present that provides insight into an object's past, such as the artist's technique, the creative process that was followed, and its conservation history, as well as its current state of preservation. Various noninvasive techniques have been developed over the past 2 decades that can probe this information either locally (via point analysis) or on a macroscopic scale (e.g., full-field imaging and raster scanning). Recently macroscopic X-ray powder diffraction (MA-XRPD) mapping using laboratory X-ray sources was developed. This method can visualize highly specific chemical distributions at the macroscale (dm²). In this work we demonstrate the synergy between the quantitative aspects of powder diffraction and the noninvasive scanning capability of MA-XRPD, highlighting the potential of the method to reveal new types of information. Quantitative data derived from a 15th/16th century illuminated sheet of parchment revealed three lead white pigments with different hydrocerussite-cerussite compositions in specific pictorial elements, while quantification analysis of impurities in the blue azurite pigment revealed two distinct azurite types: one rich in barite and one in quartz. Furthermore, on the same artifact, the depth-selective possibilities of the method that stem from an exploitation of the shift of the measured diffraction peaks with respect to reference data are highlighted. The influence of different experimental parameters on the depth-selective analysis results is briefly discussed. Promising stratigraphic information could be obtained, even though the analysis is hampered by not completely understood variations in the unit cell dimensions of the crystalline pigment phases.
Forest Connectivity Regions of Canada Using Circuit Theory and Image Analysis
Pelletier, David; Lapointe, Marc-Élie; Wulder, Michael A.; White, Joanne C.; Cardille, Jeffrey A.
2017-01-01
Ecological processes are increasingly well understood over smaller areas, yet information regarding interconnections and the hierarchical nature of ecosystems remains less studied and understood. Information on connectivity over large areas with high-resolution source information provides for both local detail and regional context. The emerging capacity to apply circuit theory to create maps of omnidirectional connectivity provides an opportunity for improved and quantitative depictions of forest connectivity, supporting the formation and testing of hypotheses about the density of animal movement, ecosystem structure, and related links to natural and anthropogenic forces. In this research, our goal was to delineate regions where connectivity regimes are similar across the boreal region of Canada using new quantitative analyses for characterizing connectivity over large areas (e.g., millions of hectares). Utilizing the Earth Observation for Sustainable Development of forests (EOSD) circa 2000 Landsat-derived land-cover map, we created and analyzed a national-scale map of omnidirectional forest connectivity at 25 m resolution over 10,000 tiles of 625 km² each, spanning the forested regions of Canada. Using image recognition software to detect corridors, pinch points, and barriers to movement at multiple spatial scales in each tile, we developed a simple measure of the structural complexity of connectivity patterns in omnidirectional connectivity maps. We then mapped the Circuitscape resistance distance measure and used it in conjunction with the complexity data to study connectivity characteristics in each forested ecozone. Ecozone boundaries masked substantial systematic patterns in connectivity characteristics, which were uncovered using a new classification of connectivity patterns that revealed six clear groups of forest connectivity patterns found in Canada.
The resulting maps allow exploration of omnidirectional forest connectivity patterns at full resolution while permitting quantitative analyses of connectivity over broad areas, informing modeling, planning and monitoring efforts. PMID:28146573
Plant leaf chlorophyll content retrieval based on a field imaging spectroscopy system.
Liu, Bo; Yue, Yue-Min; Li, Ru; Shen, Wen-Jing; Wang, Ke-Lin
2014-10-23
A field imaging spectrometer system (FISS; 380-870 nm and 344 bands) was designed for agriculture applications. In this study, FISS was used to gather spectral information from soybean leaves. The chlorophyll content was retrieved using a multiple linear regression (MLR), partial least squares (PLS) regression and support vector machine (SVM) regression. Our objective was to verify the performance of FISS in a quantitative spectral analysis through the estimation of chlorophyll content and to determine a proper quantitative spectral analysis method for processing FISS data. The results revealed that the derivative reflectance was a more sensitive indicator of chlorophyll content and could extract content information more efficiently than the spectral reflectance, which is more significant for FISS data compared to ASD (analytical spectral devices) data, reducing the corresponding RMSE (root mean squared error) by 3.3%-35.6%. Compared with the spectral features, the regression methods had smaller effects on the retrieval accuracy. A multivariate linear model could be the ideal model to retrieve chlorophyll information with a small number of significant wavelengths used. The smallest RMSE of the chlorophyll content retrieved using FISS data was 0.201 mg/g, a relative reduction of more than 30% compared with the RMSE based on a non-imaging ASD spectrometer, which represents a high estimation accuracy compared with the mean chlorophyll content of the sampled leaves (4.05 mg/g). Our study indicates that FISS could obtain both spectral and spatial detailed information of high quality. Its image-spectrum-in-one merit promotes the good performance of FISS in quantitative spectral analyses, and it can potentially be widely used in the agricultural sector.
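The retrieval pipeline described above can be sketched in miniature as follows. This is an illustration only: band spacing and sample values are hypothetical, and the single-band fit stands in for the multivariate MLR/PLS/SVM models actually used on the 344-band FISS data.

```python
# Minimal sketch of derivative-reflectance regression for chlorophyll
# retrieval. All numbers are illustrative placeholders, not FISS data.

def first_derivative(reflectance, step_nm):
    """Finite-difference approximation of the derivative spectrum."""
    return [(reflectance[i + 1] - reflectance[i]) / step_nm
            for i in range(len(reflectance) - 1)]

def fit_ols(x, y):
    """Univariate least squares: returns slope a and intercept b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def rmse(predicted, observed):
    """Root mean squared error, the accuracy metric reported above."""
    return (sum((p - o) ** 2 for p, o in zip(predicted, observed))
            / len(observed)) ** 0.5
```

In practice one would fit the derivative value at a chlorophyll-sensitive wavelength against laboratory-measured chlorophyll content across sampled leaves, then report the RMSE of held-out predictions.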
Novel benzanthrone probes for membrane and protein studies
NASA Astrophysics Data System (ADS)
Ryzhova, Olga; Vus, Kateryna; Trusova, Valeriya; Kirilova, Elena; Kirilov, Georgiy; Gorbenko, Galyna; Kinnunen, Paavo
2016-09-01
The applicability of a series of novel benzanthrone dyes to monitoring the changes in physicochemical properties of lipid bilayer and to differentiating between the native and aggregated protein states has been evaluated. Based on the quantitative parameters of the dye-membrane and dye-protein binding derived from the fluorimetric titration data, the most prospective membrane probes and amyloid tracers have been selected from the group of examined compounds. Analysis of the red edge excitation shifts of the membrane- and amyloid-bound dyes provided information on the properties of benzanthrone binding sites within the lipid and protein matrixes. To understand how amyloid specificity of benzanthrones correlates with their structure, quantitative structure activity relationship (QSAR) analysis was performed involving a range of quantum chemical molecular descriptors. A statistically significant model was obtained for predicting the sensitivity of novel benzanthrone dyes to amyloid fibrils.
A unified perspective on robot control - The energy Lyapunov function approach
NASA Technical Reports Server (NTRS)
Wen, John T.
1990-01-01
A unified framework for the stability analysis of robot tracking control is presented. By using an energy-motivated Lyapunov function candidate, closed-loop stability is shown for a large family of control laws sharing a common structure of proportional and derivative feedback and a model-based feedforward. The feedforward can be zero, partial or complete linearized dynamics, partial or complete nonlinear dynamics, or linearized or nonlinear dynamics with parameter adaptation. As a result, the dichotomous approaches to the robot control problem based on open-loop linearization and nonlinear Lyapunov analysis are both included in this treatment. Furthermore, quantitative estimates of the trade-offs between different schemes in terms of tracking performance, steady-state error, domain of convergence, real-time computation load and required a priori model information are derived.
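A generic energy-motivated candidate of the kind described (a sketch of the common form, not necessarily the exact function used in the paper) is:

```latex
% e = q - q_d is the tracking error, M(q) the robot inertia matrix,
% K_p a positive-definite proportional gain.
V(e,\dot e) \;=\; \tfrac{1}{2}\,\dot e^{\top} M(q)\,\dot e
             \;+\; \tfrac{1}{2}\, e^{\top} K_p\, e
```

With PD feedback plus a matching feedforward, one typically shows $\dot V \le -\dot e^{\top} K_d \dot e$ for a positive-definite derivative gain $K_d$, from which closed-loop stability follows.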
Wang, Junhua; Westenskow, Peter D.; Fang, Mingliang; Friedlander, Martin
2016-01-01
Photoreceptor degeneration is characteristic of vision-threatening diseases including age-related macular degeneration. Photoreceptors are metabolically demanding cells in the retina, but specific details about their metabolic behaviours are unresolved. The quantitative metabolomics of retinal degeneration could provide valuable insights and inform future therapies. Here, we determined the metabolomic ‘fingerprint’ of healthy and dystrophic retinas in rat models using optimized metabolite extraction techniques. A number of classes of metabolites were consistently dysregulated during degeneration: vitamin A analogues, fatty acid amides, long-chain polyunsaturated fatty acids, acyl carnitines and several phospholipid species. For the first time, a distinct temporal trend of several important metabolites including DHA (4Z,7Z,10Z,13Z,16Z,19Z-docosahexaenoic acid), all-trans-retinal and its toxic end-product N-retinyl-N-retinylidene-ethanolamine was observed between healthy and dystrophic retinas. In this study, metabolomics was further used to determine the temporal effects of the therapeutic intervention of grafting stem cell-derived retinal pigment epithelium (RPE) in dystrophic retinas, which significantly prevented photoreceptor atrophy in our previous studies. The results revealed that levels of lipids such as phosphatidylethanolamine were restored in the eyes of animals receiving the RPE grafts. In conclusion, this study provides insight into the metabolomics of retinal degeneration, and further understanding of the efficacy of RPE transplantation. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644974
Iwamoto, Noriko; Shimada, Takashi
2018-05-01
Since the turn of the century, mass spectrometry (MS) technologies have continued to improve dramatically, and advanced strategies that were impossible a decade ago are increasingly becoming available. The basic characteristics behind these advancements are MS resolution, quantitative accuracy, and information science for appropriate data processing. The spectral data from MS contain various types of information. The benefits of improving the resolution of MS data include accurate molecular structure-derived information, and as a result, refined biomolecular structure determination can be performed in a sequential and large-scale manner. Moreover, in MS data, not only accurate structural information but also the amount of generated ions plays an important role. This progress has greatly contributed to a research field that captures biological events as a system by comprehensively tracing the various changes in biomolecular dynamics. The sequential changes of proteome expression in biological pathways are essential, and the magnitudes of the changes often directly become the targets of drug discovery or indicators of clinical efficacy. To take this proteomic approach, it is necessary to separate the individual MS spectra derived from each biomolecule in complex biological samples. MS alone cannot achieve complete peak separation, so methods for sample processing and purification should be improved to make samples suitable for injection into MS. Among analytical instruments, only MS can deliver the characteristics described above. Moreover, MS is expected to be applied and to expand into many fields, not only basic life sciences but also forensic medicine, plant sciences, materials, and natural products.
In this review, we focus on the technical fundamentals and future aspects of the strategies for accurate structural identification, structure-indicated quantitation, and on the challenges for pharmacokinetics of high-molecular-weight protein biopharmaceuticals. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Seasonal vegetation characteristics of the United States
Reed, Bradley C.; Yang, Limin
1997-01-01
The U.S. Geological Survey's EROS Data Center has created a prototype 1‐km resolution data base of vegetation seasonal characteristics. The characteristics are derived from time‐series NDVI data collected by the AVHRR satellite sensor. Information covering the 5 years 1989–1993 is included in the data base. Although quantitative validation of the seasonal characteristics cannot be made until several evaluation efforts are completed, general observations are possible by viewing images of the seasonal parameters. Figures 2 through 8 show several examples of the seasonal characteristics data base.
Topology of large-scale structure. IV - Topology in two dimensions
NASA Technical Reports Server (NTRS)
Melott, Adrian L.; Cohen, Alexander P.; Hamilton, Andrew J. S.; Gott, J. Richard, III; Weinberg, David H.
1989-01-01
In a recent series of papers, an algorithm was developed for quantitatively measuring the topology of the large-scale structure of the universe and this algorithm was applied to numerical models and to three-dimensional observational data sets. In this paper, it is shown that topological information can be derived from a two-dimensional cross section of a density field, and analytic expressions are given for a Gaussian random field. The application of a two-dimensional numerical algorithm for measuring topology to cross sections of three-dimensional models is demonstrated.
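The two-dimensional analytic expression referred to above has the well-known Gaussian form, shown here schematically (the amplitude $A$ depends on the field's power spectrum, which I leave unspecified):

```latex
% Genus per unit area of the excursion set at threshold \nu
% (in units of the field's standard deviation), 2D Gaussian field.
g(\nu) \;=\; A\,\nu\, e^{-\nu^{2}/2}
```

The odd symmetry captures the expected behaviour: isolated high-density spots at large positive $\nu$, isolated holes at large negative $\nu$, and a sponge-like balance at the median.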
Analysis of nonlinear internal waves observed by Landsat thematic mapper
NASA Astrophysics Data System (ADS)
Artale, V.; Levi, D.; Marullo, S.; Santoleri, R.
1990-09-01
In this work we test the compatibility between the theoretical parameters of a nonlinear wave model and the quantitative information that one can deduce from satellite-derived data. The theoretical parameters are obtained by applying an inverse problem to the solution of the Cauchy problem for the Korteweg-de Vries equation. Our results are applied to the case of internal wave patterns elaborated from two different satellite sensors at the south of Messina (the thematic mapper) and at the north of Messina (the synthetic aperture radar).
Uncertainty vs. Information (Invited)
NASA Astrophysics Data System (ADS)
Nearing, Grey
2017-04-01
Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact yields essentially trivial solutions to several longstanding problems that are generally considered intractable, including:
• Alleviating the need for any likelihood function or error model.
• Derivation of purely logical falsification criteria for hypothesis testing.
• Specification of a general quantitative method for process-level model diagnostics.
More generally, I make the following arguments:
1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask.
2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles.
Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.
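As a minimal worked example of the maximum-entropy argument (my illustration, not taken from the abstract): for a non-negative quantity constrained only by a conserved mean $\mu$, maximizing entropy yields the exponential (Boltzmann-type) distribution.

```latex
% Maximize H[p] = -\int_0^\infty p(x)\ln p(x)\,dx subject to
% \int_0^\infty p(x)\,dx = 1 and \int_0^\infty x\,p(x)\,dx = \mu.
% Lagrange multipliers give:
p(x) \;=\; \frac{1}{\mu}\, e^{-x/\mu}, \qquad x \ge 0
```

Predictive probabilities then follow directly from the constraint, with no likelihood function assumed.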
Napolitano, Assunta; Akay, Seref; Mari, Angela; Bedir, Erdal; Pizza, Cosimo; Piacente, Sonia
2013-11-01
Astragalus species are widely used as health foods and dietary supplements, as well as drugs in traditional medicine. To rapidly evaluate metabolite similarities and differences among the EtOH extracts of the roots of eight commercial Astragalus spp., an approach based on direct analyses by ESI-MS followed by PCA of ESI-MS data was carried out. Subsequently, quali-quantitative analyses of cycloartane derivatives in the eight Astragalus spp. by LC-ESI-MS(n) and PCA of LC-ESI-MS data were performed. This approach allowed prompt highlighting of metabolite similarities and differences among the various Astragalus spp. PCA results from LC-ESI-MS data of Astragalus samples were in reasonable agreement with both PCA results of ESI-MS data and quantitative results. This study affords an analytical method for the quali-quantitative determination of cycloartane derivatives in herbal preparations used as health and food supplements. Copyright © 2013 Elsevier B.V. All rights reserved.
First and second energy derivative analyses for open-shell self-consistent field wavefunctions
NASA Astrophysics Data System (ADS)
Yamaguchi, Yukio; Schaefer, Henry F., III; Frenking, Gernot
A study of first and second derivatives of the orbital, electronic, nuclear and total energies for the self-consistent field (SCF) wavefunction has been applied to general open-shell SCF systems. The diagonal elements of the Lagrangian matrix for the general open-shell SCF wavefunction are adopted as the 'orbital' energies. The first and second derivatives of the orbital energies in terms of the normal coordinates are determined via the finite difference method, while those of the electronic, nuclear and total energies are obtained by analytical techniques. Using three low-lying states of the CH2 and H2CO molecules as examples, it is demonstrated that the derivatives of the SCF energetic quantities with respect to the normal coordinates provide useful chemical information concerning the respective molecular structures and reactivities. The conventional concept of the highest occupied molecular orbital (HOMO) and the lowest unoccupied molecular orbital (LUMO) has been extended to molecular vibrational motion, and the terminology of vibrationally active MOs (va-MOs), va-HOMO and va-LUMO has been introduced for each normal coordinate. The energy derivative analysis method may be used as a powerful semi-quantitative model in understanding and interpreting various chemical phenomena.
Towards precision medicine: from quantitative imaging to radiomics
Acharya, U. Rajendra; Hagiwara, Yuki; Sudarshan, Vidya K.; Chan, Wai Yee; Ng, Kwan Hoong
2018-01-01
Radiology (imaging) and imaging-guided interventions, which provide multi-parametric morphologic and functional information, are playing an increasingly significant role in precision medicine. Radiologists are trained to understand the imaging phenotypes, transcribe those observations (phenotypes) to correlate with underlying diseases and to characterize the images. However, in order to understand and characterize the molecular phenotype (to obtain genomic information) of solid heterogeneous tumours, the advanced sequencing of those tissues using biopsy is required. Thus, radiologists image the tissues from various views and angles in order to have the complete image phenotypes, thereby acquiring a huge amount of data. Deriving meaningful details from all these radiological data becomes challenging and raises the big data issues. Therefore, interest in the application of radiomics has been growing in recent years as it has the potential to provide significant interpretive and predictive information for decision support. Radiomics is a combination of conventional computer-aided diagnosis, deep learning methods, and human skills, and thus can be used for quantitative characterization of tumour phenotypes. This paper discusses the overview of radiomics workflow, the results of various radiomics-based studies conducted using various radiological images such as computed tomography (CT), magnetic resonance imaging (MRI), and positron-emission tomography (PET), the challenges we are facing, and the potential contribution of radiomics towards precision medicine. PMID:29308604
Molecular-level understanding of the carbonisation of polysaccharides.
Shuttleworth, Peter S; Budarin, Vitaliy; White, Robin J; Gun'ko, Vladimir M; Luque, Rafael; Clark, James H
2013-07-08
Understanding of both the textural and functionality changes occurring during (mesoporous) polysaccharide carbonisation at the molecular level provides a deeper insight into the whole spectrum of material properties, from chemical activity to pore shape and surface energy, which is crucial for the successful application of carbonaceous materials in adsorption, catalysis and chromatography. The information obtained will help to identify the most appropriate applications of the carbonaceous material generated during torrefaction and different types of pyrolysis processes and will therefore be important for the development of cost- and energy-efficient zero-waste biorefineries. The presented approach is informative and semi-quantitative, with the potential to be extended to the formation of other biomass-derived carbonaceous materials. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Georgi, Laura; Johnson-Cicalese, Jennifer; Honig, Josh; Das, Sushma Parankush; Rajah, Veeran D; Bhattacharya, Debashish; Bassil, Nahla; Rowland, Lisa J; Polashock, James; Vorsa, Nicholi
2013-03-01
The first genetic map of cranberry (Vaccinium macrocarpon) has been constructed, comprising 14 linkage groups totaling 879.9 cM with an estimated coverage of 82.2 %. This map, based on four mapping populations segregating for field fruit-rot resistance, contains 136 distinct loci. Mapped markers include blueberry-derived simple sequence repeat (SSR) and cranberry-derived sequence-characterized amplified region markers previously used for fingerprinting cranberry cultivars. In addition, SSR markers were developed near cranberry sequences resembling genes involved in flavonoid biosynthesis or defense against necrotrophic pathogens, or conserved orthologous set (COS) sequences. The cranberry SSRs were developed from next-generation cranberry genomic sequence assemblies; thus, the positions of these SSRs on the genomic map provide information about the genomic location of the sequence scaffold from which they were derived. The use of SSR markers near COS and other functional sequences, plus 33 SSR markers from blueberry, facilitates comparisons of this map with maps of other plant species. Regions of the cranberry map were identified that showed conservation of synteny with Vitis vinifera and Arabidopsis thaliana. Positioned on this map are quantitative trait loci (QTL) for field fruit-rot resistance (FFRR), fruit weight, titratable acidity, and sound fruit yield (SFY). The SFY QTL is adjacent to one of the fruit weight QTL and may reflect pleiotropy. Two of the FFRR QTL are in regions of conserved synteny with grape and span defense gene markers, and the third FFRR QTL spans a flavonoid biosynthetic gene.
Prediction of Environmental Impact of High-Energy Materials with Atomistic Computer Simulations
2010-11-01
from a training set of compounds. Other methods include Quantitative Structure-Activity Relationship (QSAR) and Quantitative Structure-Property... the development of QSPR/QSAR models, in contrast to boiling points and critical parameters derived from empirical correlations, to improve... Quadratic Configuration Interaction Singles Doubles; QSAR: Quantitative Structure-Activity Relationship; QSPR: Quantitative Structure-Property
Lu, Shao Hua; Li, Bao Qiong; Zhai, Hong Lin; Zhang, Xin; Zhang, Zhuo Yong
2018-04-25
Terahertz time-domain spectroscopy has been applied to many fields; however, it still encounters drawbacks in the analysis of multicomponent mixtures due to serious spectral overlapping. Here, an effective approach to quantitative analysis was proposed and applied to the determination of three amino acids in a foxtail millet substrate. Utilizing three parameters derived from the THz-TDS, images were constructed and Tchebichef image moments were used to extract the information of the target components. The quantitative models were then obtained by stepwise regression. The correlation coefficients of leave-one-out cross-validation (R²_loo-cv) were more than 0.9595. For the external test set, the predictive correlation coefficients (R²_p) were more than 0.8026 and the root mean square errors of prediction (RMSE_p) were less than 1.2601. Compared with the traditional methods (PLS and N-PLS), our approach is more accurate, robust and reliable, and can be a potentially excellent approach to quantifying multiple components with THz-TDS spectroscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.
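The leave-one-out cross-validation statistic reported above can be sketched for a simple univariate model (a pure-Python illustration of the validation scheme only; the paper's Tchebichef-moment models and stepwise regression are not reproduced here):

```python
# Illustrative leave-one-out cross-validated R^2 for a univariate
# linear model y = a*x + b; not the paper's moment-based models.

def fit_ols(x, y):
    """Univariate least squares: returns slope a and intercept b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def loo_cv_r2(x, y):
    """R^2 of predictions made with each sample held out in turn."""
    preds = []
    for i in range(len(x)):
        a, b = fit_ols(x[:i] + x[i + 1:], y[:i] + y[i + 1:])
        preds.append(a * x[i] + b)
    my = sum(y) / len(y)
    ss_res = sum((p - yi) ** 2 for p, yi in zip(preds, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot
```

For perfectly linear data the held-out predictions are exact and the statistic reaches 1; overlapping or noisy spectra pull it down, which is why values above 0.95 indicate a robust calibration.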
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2013 CFR
2013-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2011 CFR
2011-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2014 CFR
2014-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2012 CFR
2012-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M
2017-08-01
Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not (P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not (P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not (P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT (P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.
Loubet, Philippe; Roux, Philippe; Loiseau, Eleonore; Bellon-Maurel, Veronique
2014-12-15
Water is a growing concern in cities, and its sustainable management is very complex. Life cycle assessment (LCA) has been increasingly used to assess the environmental impacts of water technologies during the last 20 years. This review aims at compiling all LCA papers related to water technologies, out of which 18 LCA studies deal with whole urban water systems (UWS). These 18 case studies are analyzed according to criteria derived from the four phases of the LCA international standards. The results show that whereas the case studies share a common goal, i.e., providing quantitative information to policy makers on the environmental impacts of urban water systems and their forecasting scenarios, they are based on different scopes, resulting in the selection of different functional units and system boundaries. A quantitative comparison of life cycle inventory and life cycle impact assessment data is provided, and the results are discussed. It shows the superiority of the information offered by multi-criteria approaches for decision making compared to that derived from a mono-criterion approach. From this review, recommendations are given on how to conduct the environmental assessment of urban water systems, e.g., the need to provide consistent mass balances in terms of emissions and water flows. Remaining challenges for urban water system LCAs are identified, such as a better consideration of water users and resources and the inclusion of recent LCA developments (territorial approaches and water-related impacts). Copyright © 2014 Elsevier Ltd. All rights reserved.
A Mode-of-Action Approach for the Identification of Genotoxic Carcinogens
Hernández, Lya G.; van Benthem, Jan; Johnson, George E.
2013-01-01
Distinguishing between clastogens and aneugens is vital in cancer risk assessment because the default assumption is that clastogens and aneugens have linear and non-linear dose-response curves, respectively. Any observed non-linearity must be supported by mode of action (MOA) analyses where biological mechanisms are linked with dose-response evaluations. For aneugens, the MOA has been well characterised as disruption of mitotic machinery, where chromosome loss via micronuclei (MN) formation is an accepted endpoint used in risk assessment. In this study we performed the cytokinesis-block micronucleus assay and immunofluorescence mitotic machinery visualisation in human lymphoblastoid (AHH-1) and Chinese hamster fibroblast (V79) cell lines after treatment with the aneugen 17-β-oestradiol (E2). Results were compared to previously published data on bisphenol-A (BPA) and Rotenone. Two concentration-response approaches (the threshold [Td] and benchmark-dose [BMD] approaches) were applied to derive a point of departure (POD) for in vitro MN induction. BMDs were also derived from the most sensitive carcinogenic endpoint. Ranking comparisons of the PODs from the in vitro MN and the carcinogenicity studies demonstrated a link between these two endpoints for BPA, E2 and Rotenone. This analysis was extended to include 5 additional aneugens, 5 clastogens and 3 mutagens, and further concentration- and dose-response correlations were observed between PODs from the in vitro MN and carcinogenicity studies. This approach is promising and may be further extended to other genotoxic carcinogens, where MOA and quantitative information from in vitro MN studies could be used in a quantitative manner to further inform cancer risk assessment. PMID:23675539
Zhou, Yong; Dong, Guichun; Tao, Yajun; Chen, Chen; Yang, Bin; Wu, Yue; Yang, Zefeng; Liang, Guohua; Wang, Baohe; Wang, Yulong
2016-01-01
Identification of quantitative trait loci (QTLs) associated with rice root morphology provides useful information for avoiding drought stress and maintaining yield production under irrigation conditions. In this study, a set of chromosome segment substitution lines derived from 9311 as the recipient and Nipponbare as the donor were used to analyze root morphology. By combining the resequencing-based bin-map with a multiple linear regression analysis, QTL identification was conducted on root number (RN), total root length (TRL), root dry weight (RDW), maximum root length (MRL), root thickness (RTH), total absorption area (TAA) and root vitality (RV), using the CSSL population grown under hydroponic conditions. A total of thirty-eight QTLs were identified: six for TRL, six for RDW, eight for MRL, four for RTH, seven for RN, two for TAA, and five for RV. Phenotypic effect variance explained by these QTLs ranged from 2.23% to 37.08%, and four single QTLs each explained more than 10% of the phenotypic variance for three root traits. We also examined the correlations between grain yield (GY) and root traits, and found that TRL, RTH and MRL had significantly positive correlations with GY. However, TRL, RDW and MRL had significantly positive correlations with biomass yield (BY). Several QTLs identified in our population were co-localized with some loci for grain yield or biomass. This information may be immediately exploited for improving rice water and fertilizer use efficiency in molecular breeding of root system architectures.
Atmospheric correction for remote sensing image based on multi-spectral information
NASA Astrophysics Data System (ADS)
Wang, Yu; He, Hongyan; Tan, Wei; Qi, Wenwen
2018-03-01
Light collected by spaceborne remote sensors must pass through the Earth's atmosphere, so all satellite images are affected to some degree by scattering and absorption from aerosols, water vapor and particulates. To generate high-quality scientific data, atmospheric correction is required to remove these atmospheric effects and to convert digital number (DN) values to surface reflectance (SR). Every optical satellite in orbit observes the Earth through the same atmosphere, but each image is affected differently because atmospheric conditions change constantly. The detailed physics-based radiative transfer model 6SV requires substantial key ancillary information about the atmospheric conditions at acquisition time. This paper investigates the simultaneous retrieval of atmospheric radiation parameters from the multi-spectral information itself, in order to improve surface reflectance estimates through physics-based atmospheric correction. Ancillary information on aerosol optical depth (AOD) and total water vapor (TWV), derived from the multi-spectral bands on the basis of specific spectral properties, was supplied to the 6SV model. Experiments were carried out on images from Sentinel-2, whose Multispectral Instrument (MSI) records 13 spectral bands covering wavelengths from 440 nm up to 2200 nm. The results suggest that per-pixel atmospheric correction with the 6SV model, integrating AOD and TWV derived from multi-spectral information, is better suited for accurate analysis of satellite images and quantitative remote sensing applications.
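As a hedged sketch of the final correction step: the 6S/6SV family of codes reports per-band coefficients (commonly labelled xa, xb, xc) from which surface reflectance is recovered from at-sensor radiance. The coefficient values and radiance below are invented for illustration:

```python
# Sketch of the last step of a 6S/6SV-style atmospheric correction.
# The codes report per-band coefficients (often labelled xa, xb, xc);
# the values and the at-sensor radiance below are invented.
def surface_reflectance(L, xa, xb, xc):
    y = xa * L - xb            # gas/path-corrected reflectance term
    return y / (1.0 + xc * y)  # couple with the spherical-albedo term

rho = surface_reflectance(L=80.0, xa=0.0025, xb=0.12, xc=0.18)
```

In a per-pixel scheme, the coefficients themselves would vary across the scene with the AOD and TWV retrieved from the multi-spectral bands, rather than being constant per band.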
NASA Astrophysics Data System (ADS)
Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei
2016-04-01
Distributed models offer the potential to resolve catchment systems in more detail, and therefore simulate the hydrological impacts of spatial changes in catchment forcing (e.g. landscape change). Such models tend to contain a large number of poorly defined and spatially varying model parameters which are therefore computationally expensive to calibrate. Insufficient data can result in model parameter and structural equifinality, particularly when calibration is reliant on catchment outlet discharge behaviour alone. Evaluating spatial patterns of internal hydrological behaviour has the potential to reveal simulations that, whilst consistent with measured outlet discharge, are qualitatively dissimilar to our perceptual understanding of how the system should behave. We argue that such understanding, which may be derived from stakeholder knowledge across different catchments for certain process dynamics, is a valuable source of information to help reject non-behavioural models, and therefore identify feasible model structures and parameters. The challenge, however, is to convert different sources of often qualitative and/or semi-qualitative information into robust quantitative constraints of model states and fluxes, and combine these sources of information together to reject models within an efficient calibration framework. Here we present the development of a framework to incorporate different sources of data to efficiently calibrate distributed catchment models. For each source of information, an interval or inequality is used to define the behaviour of the catchment system. These intervals are then combined to produce a hyper-volume in state space, which is used to identify behavioural models. We apply the methodology to calibrate the Penn State Integrated Hydrological Model (PIHM) at the Wye catchment, Plynlimon, UK. 
Outlet discharge behaviour is successfully simulated when perceptual understanding of relative groundwater levels between lowland peat, upland peat and valley slopes within the catchment is used to identify behavioural models. The process of converting qualitative information into quantitative constraints forces us to evaluate the assumptions behind our perceptual understanding in order to derive robust constraints, and therefore to reject models fairly and avoid type II errors. Likewise, consideration needs to be given to the commensurability problem when mapping perceptual understanding to constraints on model states.
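The hyper-volume rejection step described above can be sketched with invented numbers: each constraint is an interval or inequality on a simulated state or flux, and a run is retained as behavioural only if it satisfies all of them jointly:

```python
import numpy as np

# Invented-numbers sketch of hyper-volume model rejection: every
# constraint is an interval or inequality on a simulated quantity, and a
# run is behavioural only if it satisfies all constraints jointly.
rng = np.random.default_rng(1)

# Columns: outlet-discharge error, lowland groundwater level, upland
# groundwater level (all hypothetical, scaled to [0, 1]).
runs = rng.uniform(0.0, 1.0, size=(1000, 3))

behavioural = (
    (runs[:, 0] < 0.2)            # interval constraint on discharge error
    & (runs[:, 1] > runs[:, 2])   # inequality from perceptual understanding
)
n_kept = int(behavioural.sum())
```

Each additional constraint shrinks the behavioural hyper-volume, which is what lets soft, qualitative knowledge discriminate between parameter sets that fit the outlet hydrograph equally well.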
NASA Astrophysics Data System (ADS)
Croft, Holly; Anderson, Karen; Kuhn, Nikolaus J.
2010-05-01
The ability to quantitatively and spatially assess soil surface roughness is important in geomorphology and land degradation studies. Soils can experience rapid structural degradation in response to land cover changes, resulting in increased susceptibility to erosion and a loss of Soil Organic Matter (SOM). Changes in soil surface condition can also alter sediment detachment, transport and deposition processes, infiltration rates and surface runoff characteristics. Deriving spatially distributed quantitative information on soil surface condition for inclusion in hydrological and soil erosion models is therefore paramount. However, due to the time and resources involved in using traditional field sampling techniques, there is a lack of spatially distributed information on soil surface condition. Laser techniques can provide data for a rapid three dimensional representation of the soil surface at a fine spatial resolution. This provides the ability to capture changes at the soil surface associated with aggregate breakdown, flow routing, erosion and sediment re-distribution. Semi-variogram analysis of the laser data can be used to represent spatial dependence within the dataset; providing information about the spatial character of soil surface structure. This experiment details the ability of semi-variogram analysis to spatially describe changes in soil surface condition. Soil for three soil types (silt, silt loam and silty clay) was sieved to produce aggregates between 1 mm and 16 mm in size and placed evenly in sample trays (25 x 20 x 2 cm). Soil samples for each soil type were exposed to five different durations of artificial rainfall, to produce progressively structurally degraded soil states. A calibrated laser profiling instrument was used to measure surface roughness over a central 10 x 10 cm plot of each soil state, at 2 mm sample spacing. 
The laser data were analysed within a geostatistical framework, where semi-variogram analysis quantitatively represented the change in soil surface structure during crusting. The laser data were also used to create digital surface models (DSM) of the soil states for visual comparison. This research has shown that aggregate breakdown and soil crusting can be quantified by a decrease in sill variance (silt soil: 11.67 (control) to 1.08 (after 90 mins rainfall)). Features present within semi-variograms were spatially linked to features at the soil surface, such as soil cracks, tillage lines and areas of deposition. Directional semi-variograms were used to provide a spatially orientated component, where the directional sill variance associated with a soil crack was shown to increase from 7.95 to 19.33. Periodicity within the semi-variograms was also shown to quantify the spatial scale of soil cracking networks and potentially surface flowpaths; an average distance between soil cracks of 37 mm closely corresponded to the distance of 38 mm shown in the semi-variogram. The results provide a strong basis for the future retrieval of spatio-temporal variations in soil surface condition. Furthermore, the presence of process-based information on hydrological pathways within semi-variograms may work towards the inclusion of geostatistically derived information in land surface models and the understanding of complex surface processes at different spatial scales.
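The core computation, an empirical semi-variogram, can be sketched for a one-dimensional transect; the surface heights below are synthetic (a periodic toy surface), whereas the study used 2 mm spaced laser-profile data:

```python
import numpy as np

# Empirical semi-variogram for a 1-D height transect:
# gamma(h) = 0.5 * mean((z(x+h) - z(x))^2).
def semivariogram(z, max_lag):
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

z = np.sin(np.linspace(0.0, 4.0 * np.pi, 200))  # toy surface, period ~100 samples
g = semivariogram(z, max_lag=60)
```

On a surface like this, semi-variance rises with lag toward a sill; a periodic surface (such as regularly spaced cracks) produces the kind of periodicity in gamma(h) that the study used to recover crack spacing.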
He, Gu; Qiu, Minghua; Li, Rui; Ouyang, Liang; Wu, Fengbo; Song, Xiangrong; Cheng, Li; Xiang, Mingli; Yu, Luoting
2012-06-01
Aurora-A has been identified as one of the most important targets for cancer therapy, and some Aurora-A inhibitors have entered clinical trials. In this study, a combination of ligand-based and structure-based methods was used to clarify the essential quantitative structure-activity relationships of known Aurora-A inhibitors, and a multicomplex-based pharmacophore-guided method is suggested for generating a comprehensive pharmacophore of Aurora-A kinase based on a collection of crystal structures of Aurora-A-inhibitor complexes. This model was successfully used to identify the bioactive conformation and to align 37 structurally diverse N-substituted 2'-(aminoaryl)benzothiazole derivatives. Quantitative structure-activity relationship analyses were then performed on these Aurora-A inhibitors based on the multicomplex-based pharmacophore-guided alignment. These results may provide important information for the further design and virtual screening of novel Aurora-A inhibitors. © 2012 John Wiley & Sons A/S.
Quantitative imaging of disease signatures through radioactive decay signal conversion
Thorek, Daniel LJ; Ogirala, Anuja; Beattie, Bradley J; Grimm, Jan
2013-01-01
In the era of personalized medicine there is an urgent need for in vivo techniques able to sensitively detect and quantify molecular activities. Sensitive imaging of gamma rays is widely used, but radioactive decay is a physical constant and signal is independent of biological interactions. Here we introduce a framework of novel targeted and activatable probes excited by a nuclear decay-derived signal to identify and measure molecular signatures of disease. This was accomplished utilizing Cerenkov luminescence (CL), the light produced by β-emitting radionuclides such as clinical positron emission tomography (PET) tracers. Disease markers were detected using nanoparticles to produce secondary Cerenkov-induced fluorescence. This approach reduces background signal compared to conventional fluorescence imaging. In addition to information from a PET scan, we demonstrate novel medical utility by quantitatively determining prognostically relevant enzymatic activity. This technique can be applied to monitor other markers and facilitates a shift towards activatable nuclear medicine agents. PMID:24013701
Quantification of local and global benefits from air pollution control in Mexico City.
Mckinley, Galen; Zuk, Miriam; Höjer, Morten; Avalos, Montserrat; González, Isabel; Iniestra, Rodolfo; Laguna, Israel; Martínez, Miguel A; Osnaya, Patricia; Reynales, Luz M; Valdés, Raydel; Martínez, Julia
2005-04-01
Complex sociopolitical, economic, and geographical realities cause the 20 million residents of Mexico City to suffer from some of the worst air pollution conditions in the world. Greenhouse gas emissions from the city are also substantial, and opportunities for joint local-global air pollution control are being sought. Although a plethora of measures to improve local air quality and reduce greenhouse gas emissions have been proposed for Mexico City, resources are not available for implementation of all proposed controls and thus prioritization must occur. Yet policy makers often do not conduct comprehensive quantitative analyses to inform these decisions. We reanalyze a subset of currently proposed control measures, and derive cost and health benefit estimates that are directly comparable. This study illustrates that improved quantitative analysis can change implementation prioritization for air pollution and greenhouse gas control measures in Mexico City.
Falcone, James A.; Carlisle, Daren M.; Wolock, David M.; Meador, Michael R.
2010-01-01
In addition, watersheds were assessed for their reference quality within nine broad regions for use in studies intended to characterize stream flows under conditions minimally influenced by human activities. Three primary criteria were used to assess reference quality: (1) a quantitative index of anthropogenic modification within the watershed based on GIS-derived variables, (2) visual inspection of every stream gage and drainage basin from recent high-resolution imagery and topographic maps, and (3) information about man-made influences from USGS Annual Water Data Reports. From the set of 6785 sites, we identified 1512 as reference-quality stream gages. All data derived for these watersheds as well as the reference condition evaluation are provided as an online data set termed GAGES (geospatial attributes of gages for evaluating stream flow).
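The three-criterion screening reduces to a conjunctive filter over the gage records. The field names, thresholds, and records below are invented for illustration; the study's actual disturbance index and review flags differ:

```python
# Conjunctive reference-quality screening sketch. Field names, the
# threshold, and the records are invented; the study combined a
# GIS-derived disturbance index, imagery/map inspection, and Annual
# Water Data Report notes.
gages = [
    {"id": "A", "disturb_index": 3.2,  "imagery_ok": True,  "reports_ok": True},
    {"id": "B", "disturb_index": 18.7, "imagery_ok": True,  "reports_ok": True},
    {"id": "C", "disturb_index": 2.1,  "imagery_ok": False, "reports_ok": True},
]
reference = [g["id"] for g in gages
             if g["disturb_index"] < 10 and g["imagery_ok"] and g["reports_ok"]]
```

A gage fails if any one criterion fails, which is why only a fraction of the 6785 candidate sites (1512 in the study) survive as reference quality.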
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... Request; Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... clearance. Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... research has proposed that providing quantitative information about product efficacy enables consumers to...
NASA Astrophysics Data System (ADS)
Engelmann, Brett Warren
The Src homology 2 (SH2) domains evolved alongside protein tyrosine kinases (PTKs) and phosphatases (PTPs) in metazoans to recognize the phosphotyrosine (pY) post-translational modification. The human genome encodes 121 SH2 domains within 111 SH2 domain-containing proteins, which represent the primary mechanism for cellular signal transduction immediately downstream of PTKs. Despite pY recognition contributing roughly half of the binding energy, SH2 domains possess substantial binding specificity, or affinity discrimination between phosphopeptide ligands. This specificity is largely imparted by amino acids (AAs) adjacent to the pY, typically at positions +1 to +4 C-terminal to the pY. Much experimental effort has been devoted to constructing preferred binding motifs for many SH2 domains. However, due to limitations in previous experimental methodologies, these motifs do not account for the interplay between AAs. It was therefore not known how AAs within the context of individual peptides function to impart SH2 domain specificity. In this work we identified the critical role context plays in defining SH2 domain specificity for physiological ligands. We also constructed a high-quality interactome using 50 SH2 domains and 192 physiological ligands. We next developed a quantitative high-throughput (Q-HTP) peptide microarray platform to assess the affinities of four SH2 domains for 124 physiological ligands. We demonstrated the superior characteristics of our platform relative to preceding approaches and validated our results using established biophysical techniques, literature corroboration, and predictive algorithms. The quantitative information provided by the arrays was leveraged to investigate SH2 domain binding distributions and identify points of binding overlap. Our microarray-derived affinity estimates were integrated to produce quantitative interaction motifs capable of predicting interactions.
Furthermore, our microarrays proved capable of resolving subtle contextual differences within motifs that modulate interaction affinities. We conclude that contextually informed specificity profiling of protein interaction domains using the methodologies developed in this study can inform efforts to understand the interconnectivity of signaling networks in normal and aberrant states. Three supplementary tables containing detailed lists of peptides, interactions, and sources of corroborative information are provided.
Lankford, Christopher L; Does, Mark D
2018-02-01
Quantitative MRI may require correcting for nuisance parameters, which can or must be constrained to independently measured or assumed values. The noise and/or bias in these constraints propagate to the fitted parameters. As an example, the case of refocusing-pulse flip angle constraint in multiple-spin-echo T2 mapping is explored. An analytical expression for the mean-squared error of a parameter of interest was derived as a function of the accuracy and precision of an independent estimate of a nuisance parameter. The expression was validated by simulations and then used to evaluate the effects of flip angle (θ) constraint on the accuracy and precision of the T2 estimate (T̂2) for a variety of multi-echo T2 mapping protocols. Constraining θ improved T̂2 precision when the θ-map signal-to-noise ratio was greater than approximately one-half that of the first spin-echo image. For many practical scenarios, constrained fitting was calculated to reduce not just the variance but the full mean-squared error of T̂2, for bias in the flip angle estimate (θ̂) of ≲6%. The analytical expression derived in this work can be applied to inform experimental design in quantitative MRI. The example application to T2 mapping provided specific cases, depending on θ̂ accuracy and precision, in which θ̂ measurement and constraint would be beneficial to T̂2 variance or mean-squared error. Magn Reson Med 79:673-682, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
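The analysis rests on the standard decomposition of mean-squared error into variance plus squared bias, which governs how bias in a constrained nuisance parameter trades against precision in the parameter of interest. A numerical check of the identity, with invented numbers rather than the paper's T2/flip-angle expressions:

```python
import numpy as np

# Numerical check of the bias-variance decomposition behind the analysis:
# MSE = variance + bias^2. Numbers are synthetic.
rng = np.random.default_rng(2)
true_T2 = 80.0                                            # ms, assumed truth
estimates = true_T2 + 3.0 + rng.normal(0.0, 2.0, 10_000)  # bias 3 ms, sd 2 ms

mse = np.mean((estimates - true_T2) ** 2)
bias = np.mean(estimates) - true_T2
var = np.var(estimates)   # population variance (ddof=0), so the identity is exact
```

A constraint can therefore lower the full MSE even when it introduces bias, provided the variance reduction it buys exceeds the squared bias it adds, which is the regime the paper quantifies for flip-angle constraint.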
Nondestructive Redox Quantification Reveals Glassmaking of Rare French Gothic Stained Glasses
2017-01-01
The sophisticated colors of medieval glasses arise from their transition metal (TM) impurities and capture information about ancient glassmaking techniques. Beyond the glass chemical composition, the TM redox is also a key factor in the glass color, but its quantification without any sampling is a challenge. We report a combination of nondestructive and noninvasive quantitative analyses of the chemical composition by particle-induced X-ray emission and particle-induced γ-ray emission mappings, and of the color and TM element speciation by optical absorption spectroscopy, performed on a red-blue-purple striped glass from the stained glass windows of the Sainte-Chapelle in Paris, France, during its restoration. These particular glass pieces must have been produced as a single shot, which guarantees that the chemical variations reflect the recipe in use in a specific medieval workshop. The quantitative elemental mappings demonstrate that the colored glass parts are derived from the same base glass, to which TMs were deliberately added. Optical absorption spectra reveal the origin of the colors: blue from Co(II), red from copper nanoparticles, and purple from Mn(III). Furthermore, the derivation of the quantitative redox state of each TM in each color shows that the contents of Fe, Cu, and Mn were adjusted to ensure a reducing glass matrix in the red stripe or a metastable overoxidized glass in the purple stripe. We infer that the agility of the medieval glassmaker allowed him to master the redox kinetics in the glass by rapid shaping and cooling to obtain a snapshot of the thermodynamically unstable glass colors. PMID:28494150
21 CFR 101.60 - Nutrient content claims for the calorie content of foods.
Code of Federal Regulations, 2014 CFR
2014-04-01
... percent fewer calories than regular cupcakes”); and (B) Quantitative information comparing the level of..., the quantitative information may be located elsewhere on the information panel in accordance with... fewer calories per oz (or 3 oz) than our regular Lasagna”); and (B) Quantitative information comparing...
21 CFR 101.60 - Nutrient content claims for the calorie content of foods.
Code of Federal Regulations, 2013 CFR
2013-04-01
... percent fewer calories than regular cupcakes”); and (B) Quantitative information comparing the level of..., the quantitative information may be located elsewhere on the information panel in accordance with... fewer calories per oz (or 3 oz) than our regular Lasagna”); and (B) Quantitative information comparing...
21 CFR 101.60 - Nutrient content claims for the calorie content of foods.
Code of Federal Regulations, 2010 CFR
2010-04-01
... percent fewer calories than regular cupcakes”); and (B) Quantitative information comparing the level of..., the quantitative information may be located elsewhere on the information panel in accordance with... fewer calories per oz (or 3 oz) than our regular Lasagna”); and (B) Quantitative information comparing...
21 CFR 101.60 - Nutrient content claims for the calorie content of foods.
Code of Federal Regulations, 2011 CFR
2011-04-01
... percent fewer calories than regular cupcakes”); and (B) Quantitative information comparing the level of..., the quantitative information may be located elsewhere on the information panel in accordance with... fewer calories per oz (or 3 oz) than our regular Lasagna”); and (B) Quantitative information comparing...
21 CFR 101.60 - Nutrient content claims for the calorie content of foods.
Code of Federal Regulations, 2012 CFR
2012-04-01
... percent fewer calories than regular cupcakes”); and (B) Quantitative information comparing the level of..., the quantitative information may be located elsewhere on the information panel in accordance with... fewer calories per oz (or 3 oz) than our regular Lasagna”); and (B) Quantitative information comparing...
Quantitative analysis of task selection for brain-computer interfaces
NASA Astrophysics Data System (ADS)
Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.
2014-10-01
Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.
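The subject-dependent selection policy can be sketched with an invented accuracy table: compare the single task-pair that is best on average across users against letting each user take their own best pair:

```python
import numpy as np

# Invented accuracy table: rows are users, columns are imagery task-pairs.
acc = np.array([
    [0.92, 0.60, 0.55],
    [0.58, 0.88, 0.62],
    [0.65, 0.63, 0.90],
])

fixed_best = int(acc.mean(axis=0).argmax())  # best single pair across users
fixed_perf = acc[:, fixed_best]
adaptive_perf = acc.max(axis=1)              # each user's own best pair

threshold = 0.7                              # nominal "control" criterion
n_fixed = int((fixed_perf > threshold).sum())
n_adaptive = int((adaptive_perf > threshold).sum())
```

In this toy table every user clears the control threshold under per-subject selection but only one does under the fixed pair, which is the (exaggerated) analogue of the roughly 20% gain the study reports over 109 users.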
Feared consequences of panic attacks in panic disorder: a qualitative and quantitative analysis.
Raffa, Susan D; White, Kamila S; Barlow, David H
2004-01-01
Cognitions are hypothesized to play a central role in panic disorder (PD). Previous studies have used questionnaires to assess cognitive content, focusing on prototypical cognitions associated with PD; however, few studies have qualitatively examined cognitions associated with the feared consequences of panic attacks. The purpose of this study was to conduct a qualitative and quantitative analysis of feared consequences of panic attacks. The initial, qualitative analysis resulted in the development of 32 categories of feared consequences. The categories were derived from participant responses to a standardized, semi-structured question (n = 207). Five expert-derived categories were then utilized to quantitatively examine the relationship between cognitions and indicators of PD severity. Cognitions did not predict PD severity; however, correlational analyses indicated some predictive validity to the expert-derived categories. The qualitative analysis identified additional areas of patient-reported concern not included in previous research that may be important in the assessment and treatment of PD.
The structure and timescales of heat perception in larval zebrafish.
Haesemeyer, Martin; Robson, Drew N; Li, Jennifer M; Schier, Alexander F; Engert, Florian
2015-11-25
Avoiding temperatures outside the physiological range is critical for animal survival, but how temperature dynamics are transformed into behavioral output is largely not understood. Here, we used an infrared laser to challenge freely swimming larval zebrafish with "white-noise" heat stimuli and built quantitative models relating external sensory information and internal state to behavioral output. These models revealed that larval zebrafish integrate temperature information over a time-window of 400 ms preceding a swimbout and that swimming is suppressed right after the end of a bout. Our results suggest that larval zebrafish compute both an integral and a derivative across heat in time to guide their next movement. Our models put important constraints on the type of computations that occur in the nervous system and reveal principles of how somatosensory temperature information is processed to guide behavioral decisions such as sensitivity to both absolute levels and changes in stimulation.
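The two stimulus features the models point to, an integral of heat over the 400 ms preceding a bout and a derivative (change) across that window, can be sketched on a toy temperature trace; the trace, the 10 ms sampling, and the bout time below are invented:

```python
import numpy as np

# Toy computation of the two features: the integral of heat over the
# 400 ms preceding a bout, and the change across that window.
dt = 0.01                                            # 10 ms bins (assumed)
t = np.arange(0.0, 2.0, dt)
heat = np.clip(np.sin(2.0 * np.pi * t), 0.0, None)   # toy heat stimulus

win = int(0.4 / dt)        # 400 ms = 40 samples
bout_idx = 120             # bout at t = 1.2 s (arbitrary)
window = heat[bout_idx - win:bout_idx]

integral = float(window.sum() * dt)                  # absolute heat level
derivative = float((window[-1] - window[0]) / 0.4)   # heating rate
```

Regressing bout probability on features like these (rather than on the raw trace) is one simple way to capture the paper's finding that behaviour depends on both the absolute level and the change of the stimulus.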
17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 17 Commodity and Securities Exchanges 2 2011-04-01 2011-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...
17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...
17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 17 Commodity and Securities Exchanges 2 2013-04-01 2013-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...
17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 17 Commodity and Securities Exchanges 3 2014-04-01 2014-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...
17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 17 Commodity and Securities Exchanges 2 2012-04-01 2012-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...
A multi-agent architecture for geosimulation of moving agents
NASA Astrophysics Data System (ADS)
Vahidnia, Mohammad H.; Alesheikh, Ali A.; Alavipanah, Seyed Kazem
2015-10-01
In this paper, a novel architecture is proposed in which an axiomatic derivation system in the form of first-order logic facilitates declarative explanation and spatial reasoning. Simulation of environmental perception and interaction between autonomous agents is designed with a geographic belief-desire-intention and a request-inform-query model. The architecture has a complementary quantitative component that supports collaborative planning based on the concepts of equilibrium and game theory. This new architecture represents a departure from current best practice in geographic agent-based modelling. Implementation tasks are discussed in some detail, as are scenarios for fleet management and disaster management.
Linear Calibration of Radiographic Mineral Density Using Video-Digitizing Methods
NASA Technical Reports Server (NTRS)
Martin, R. Bruce; Papamichos, Thomas; Dannucci, Greg A.
1990-01-01
Radiographic images can provide quantitative as well as qualitative information if they are subjected to densitometric analysis. Using modern video-digitizing techniques, such densitometry can be readily accomplished using relatively inexpensive computer systems. However, such analyses are made more difficult by the fact that the density values read from the radiograph have a complex, nonlinear relationship to bone mineral content. This article derives the relationship between these variables from the nature of the intermediate physical processes, and presents a simple mathematical method for obtaining a linear calibration function using a step wedge or other standard.
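The linearization idea can be sketched as follows: simulated grey values fall off exponentially with step-wedge thickness, so a logarithmic transform followed by a straight-line fit recovers a linear calibration. The wedge thicknesses and attenuation constant are invented, and real film response is more complex than this single-exponential model:

```python
import numpy as np

# Simulated step-wedge calibration: grey value falls exponentially with
# wedge thickness, so log10 linearizes it. I0, mu, and the thicknesses
# are invented; real film/digitizer response is more complex.
thickness = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # wedge steps (mm)
I0, mu = 200.0, 0.3
grey = I0 * 10.0 ** (-mu * thickness)             # simulated grey values

# log10(grey) = log10(I0) - mu * thickness  ->  straight-line fit
slope, intercept = np.polyfit(thickness, np.log10(grey), 1)
```

Once the fit is in hand, any pixel's grey value can be mapped through the inverse relation to an equivalent wedge thickness, giving the linear mineral-density scale the article describes.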
Wave field restoration using three-dimensional Fourier filtering method.
Kawasaki, T; Takai, Y; Ikuta, T; Shimizu, R
2001-11-01
A wave field restoration method in transmission electron microscopy (TEM) was mathematically derived based on a three-dimensional (3D) image formation theory. Wave field restoration using this method together with spherical aberration correction was experimentally confirmed in through-focus images of amorphous tungsten thin film, and the resolution of the reconstructed phase image was successfully improved from the Scherzer resolution limit to the information limit. In an application of this method to a crystalline sample, the surface structure of Au(110) was observed in a profile-imaging mode. The processed phase image showed quantitatively the atomic relaxation of the topmost layer.
Wang, Chengjian; Zhang, Ping; Jin, Wanjun; Li, Lingmei; Qiang, Shan; Zhang, Ying; Huang, Linjuan; Wang, Zhongfu
2017-01-06
Rapid, simple and versatile methods for quantitative analysis of glycoprotein O-glycans are urgently required for current studies on protein O-glycosylation patterns and the search for disease O-glycan biomarkers. Relative quantitation of O-glycans using stable isotope labeling followed by mass spectrometric analysis represents an ideal and promising technique. However, it is hindered by the shortage of reliable nonreductive O-glycan release methods, as well as by inconsistent (too large or too small) mass differences between the light and heavy isotope derivatives of O-glycans, which complicate the recognition and quantitative analysis of O-glycans by mass spectrometry. Herein we report a facile and versatile O-glycan relative quantification strategy, based on an improved one-pot method that can quantitatively achieve nonreductive release and in situ chromophoric labeling of intact mucin-type O-glycans in one step. In this study, the one-pot method is optimized and applied for quantitative O-glycan release and tagging with either non-deuterated (d0-) or deuterated (d5-) 1-phenyl-3-methyl-5-pyrazolone (PMP). The obtained O-glycan derivatives feature a permanent 10-Da mass difference between the d0- and d5-PMP forms, allowing complete discrimination and comparative quantification of these isotopically labeled O-glycans by mass spectrometric techniques. Moreover, the d0- and d5-PMP derivatives of O-glycans also have a relatively high hydrophobicity as well as strong UV absorption, making them especially suitable for high-resolution separation and high-sensitivity detection by RP-HPLC-UV. We have refined the conditions for the one-pot reaction as well as the corresponding sample purification approach. The good quantitation feasibility, reliability and linearity of this strategy have been verified using bovine fetuin and porcine stomach mucin as model O-glycoproteins.
Additionally, we have successfully applied this method to the quantitative O-glycomic comparison between perch and salmon eggs by ESI-MS, MS/MS and online RP-HPLC-UV-ESI-MS/MS, demonstrating its excellent applicability to various complex biological samples. O-Linked glycoproteins, generated via a widespread glycosylation modification process on serine (Ser) or threonine (Thr) residues of nascent proteins, play essential roles in a series of biological processes. As informational molecules, the O-glycans of these glycoproteins participate directly in these biological mechanisms. Thus, characteristic differences or changes in O-glycan expression levels usually relate to the pathologies of many diseases and represent an important opportunity to uncover the functional mechanisms of various glycoprotein O-glycans. The novel strategy introduced here provides a simple and versatile analytical method for the precise quantitation of glycoprotein O-glycans by mass spectrometry, enabling rapid evaluation of differences or changes in O-glycan expression levels. It is attractive for the field of quantitative/comparative O-glycomics, which has great significance for exploring the complex structure-function relationship of O-glycans, as well as for the search for O-glycan biomarkers of major diseases and O-glycan-related drug targets. Copyright © 2016 Elsevier B.V. All rights reserved.
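The d0-/d5-PMP pairing scheme described above lends itself to a simple peak-matching computation: light- and heavy-labeled forms of the same O-glycan appear as peak pairs separated by the fixed ~10-Da tag difference, and their intensity ratio gives the relative quantity between the two samples. A minimal sketch, using hypothetical peak masses and intensities (the exact shift value and pairing tolerance are illustrative assumptions, not taken from the paper):

```python
# Sketch: relative quantitation of d0-/d5-PMP labeled glycans.
# Each light (d0) peak is matched to a heavy (d5) peak a fixed
# ~10-Da tag shift higher; the light/heavy intensity ratio then
# estimates the relative abundance between the two samples.

def pair_and_quantify(peaks, shift=10.0, tol=0.05):
    """peaks: list of (m/z, intensity). Returns [(light_mz, light/heavy ratio)]."""
    pairs = []
    for mz_l, int_l in peaks:
        for mz_h, int_h in peaks:
            if abs((mz_h - mz_l) - shift) <= tol and int_h > 0:
                pairs.append((mz_l, int_l / int_h))
    return pairs

# Hypothetical spectrum: one glycan labeled light (1000.00 Da, 2e5 counts)
# and heavy (1010.02 Da, 1e5 counts) -> light/heavy ratio 2.0
spectrum = [(1000.00, 2.0e5), (1010.02, 1.0e5)]
print(pair_and_quantify(spectrum))
```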
Experimental and numerical analysis of interfilament resistances in NbTi strands
NASA Astrophysics Data System (ADS)
Breschi, M.; Massimini, M.; Ribani, P. L.; Spina, T.; Corato, V.
2014-05-01
Superconducting strands are composite wires made of fine superconducting filaments embedded in a metallic matrix. The transverse resistivity among superconducting filaments affects the coupling losses during electromagnetic transients and the electro-thermal behavior of the wire in case of a quench. A direct measurement of the transverse interfilament resistance as a function of temperature in NbTi multi-filamentary wires was performed at the ENEA Frascati Superconductivity Division, Italy by means of a four-probe method. The complexity of these measurements is remarkable, due to the current distribution phenomena that occur among superconducting filaments during these tests. A two-dimensional finite element method model of the wire cross section and a three-dimensional electrical circuit model of the wire sample developed at the University of Bologna are applied here to derive qualitative and quantitative information about the transverse electrical resistance matrix. The experiment is aimed at verifying the qualitative behaviors and trends predicted by the numerical calculations, especially concerning the current redistribution length and consequent length effects of the sample under test. A fine tuning of the model parameters at the filament level allowed us to reproduce the experimental results and get quantitative information about the current distribution phenomena between filaments.
NASA Astrophysics Data System (ADS)
Yan, X. L.; Coetsee, E.; Wang, J. Y.; Swart, H. C.; Terblans, J. J.
2017-07-01
Polycrystalline Ni/Cu multilayer thin films consisting of 8 alternating layers of Ni and Cu were deposited on a SiO2 substrate by means of electron beam evaporation in a high vacuum. Concentration-depth profiles of the as-deposited multilayered Ni/Cu thin films were determined with Auger electron spectroscopy (AES) in combination with Ar+ ion sputtering, under various bombardment conditions with the samples held stationary or, in some cases, rotating. The Mixing-Roughness-Information depth (MRI) model used for fitting the concentration-depth profiles accounts for the interface broadening observed in experimental depth profiling. The interface broadening incorporates the effects of atomic mixing, surface roughness and the information depth of the Auger electrons. The roughness values extracted from the MRI model fitting of the depth profiling data agree well with those measured by atomic force microscopy (AFM). The surface roughness induced by ion sputtering during depth profiling was accordingly evaluated quantitatively from the fitted MRI parameters under sample rotation and stationary conditions. The depth resolutions of the AES depth profiles were derived directly from the fitting parameters of the MRI model.
Spatio-temporal models of mental processes from fMRI.
Janoos, Firdaus; Machiraju, Raghu; Singh, Shantanu; Morocz, Istvan Ákos
2011-07-15
Understanding the highly complex, spatially distributed and temporally organized phenomena entailed by mental processes using functional MRI is an important research problem in cognitive and clinical neuroscience. Conventional analysis methods focus on the spatial dimension of the data discarding the information about brain function contained in the temporal dimension. This paper presents a fully spatio-temporal multivariate analysis method using a state-space model (SSM) for brain function that yields not only spatial maps of activity but also its temporal structure along with spatially varying estimates of the hemodynamic response. Efficient algorithms for estimating the parameters along with quantitative validations are given. A novel low-dimensional feature-space for representing the data, based on a formal definition of functional similarity, is derived. Quantitative validation of the model and the estimation algorithms is provided with a simulation study. Using a real fMRI study for mental arithmetic, the ability of this neurophysiologically inspired model to represent the spatio-temporal information corresponding to mental processes is demonstrated. Moreover, by comparing the models across multiple subjects, natural patterns in mental processes organized according to different mental abilities are revealed. Copyright © 2011 Elsevier Inc. All rights reserved.
Röwer, Claudia; Vissers, Johannes P C; Koy, Cornelia; Kipping, Marc; Hecker, Michael; Reimer, Toralf; Gerber, Bernd; Thiesen, Hans-Jürgen; Glocker, Michael O
2009-12-01
As more and more alternative treatments become available for breast carcinoma, there is a need to stratify patients, and individual molecular information seems suitable for this purpose. In this study, we applied label-free protein quantitation by nanoscale LC-MS and investigated whether this approach could be used for defining a proteome signature for invasive ductal breast carcinoma. Tissue samples from healthy breast and tumor were collected from three patients. Protein identifications were based on LC-MS peptide fragmentation data, which were obtained simultaneously with the quantitative information. In this way, an invasive ductal breast carcinoma proteome signature containing 60 protein entries was generated. The on-column concentrations for osteoinductive factor, vimentin, GAP-DH, and NDKA are provided as examples. These proteins represent distinctive gene ontology groups of differentially expressed proteins and are discussed as risk markers for primary tumor pathogenesis. The developed methodology was found to be well applicable in a clinical environment in which standard operating procedures can be maintained; a prerequisite for the definition of molecular parameter sets capable of stratifying patients.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-08
... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. The FHWA received no comments in...
Normalizing the causality between time series.
Liang, X San
2015-08-01
Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
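The information-flow formula referred to above has a closed-form maximum-likelihood estimator for a pair of time series (Liang's linear estimator), built entirely from sample covariances. The sketch below implements that estimator under the assumptions of a linear model and unit time step; the seeded AR test series are illustrative toy data, not the IBM/GE series analyzed in the paper:

```python
# Sketch of Liang's information-flow estimator between two time series:
#   T_{2->1} = (C11*C12*C2d1 - C12^2*C1d1) / (C11^2*C22 - C11*C12^2)
# where Cij are sample covariances and Cid1 is the covariance of series i
# with the forward-difference derivative of series 1.
import random

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def info_flow(x1, x2, dt=1.0):
    """Rate of information flow from series x2 to series x1 (nats per unit time)."""
    d1 = [(x1[i + 1] - x1[i]) / dt for i in range(len(x1) - 1)]
    a, b = x1[:-1], x2[:-1]
    c11, c22, c12 = cov(a, a), cov(b, b), cov(a, b)
    c1d1, c2d1 = cov(a, d1), cov(b, d1)
    return (c11 * c12 * c2d1 - c12 ** 2 * c1d1) / (c11 ** 2 * c22 - c11 * c12 ** 2)

# Toy autoregressive pair: x2 drives x1, but not vice versa.
random.seed(0)
x1, x2 = [0.0], [0.0]
for _ in range(5000):
    x2.append(0.5 * x2[-1] + random.gauss(0, 1))
    x1.append(0.2 * x1[-1] + 0.7 * x2[-2] + random.gauss(0, 1))
print(abs(info_flow(x1, x2)), abs(info_flow(x2, x1)))  # flow 2->1 should dominate
```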
Thermodynamics of information exchange between two coupled quantum dots
NASA Astrophysics Data System (ADS)
Kutvonen, Aki; Sagawa, Takahiro; Ala-Nissila, Tapio
2016-03-01
We propose a setup based on two coupled quantum dots where thermodynamics of a measurement can be quantitatively characterized. The information obtained in the measurement can be utilized by performing feedback in a manner apparently breaking the second law of thermodynamics. In this way the setup can be operated as a Maxwell's demon, where both the measurement and feedback are performed separately by controlling an external parameter. This is analogous to the case of the original Szilard engine. Since the setup contains both the microscopic demon and the engine itself, the operation of the whole measurement-feedback cycle can be explained in detail at the level of single realizations. In addition, we derive integral fluctuation relations for both the bare and coarse-grained entropy productions in the setup.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Tao; Tsui, Benjamin M. W.; Li, Xin
Purpose: The radioligand ¹¹C-KR31173 has been introduced for positron emission tomography (PET) imaging of the angiotensin II subtype 1 receptor in the kidney in vivo. To study the biokinetics of ¹¹C-KR31173 with a compartmental model, the input function is needed. Collection and analysis of arterial blood samples are the established approach to obtain the input function, but they are not feasible in patients with renal diseases. The goal of this study was to develop a quantitative technique that can provide an accurate image-derived input function (ID-IF) to replace the conventional invasive arterial sampling, and to test the method in pigs with the goal of translation into human studies. Methods: The experimental animals were injected with [¹¹C]KR31173 and scanned up to 90 min with dynamic PET. Arterial blood samples were collected for the artery-derived input function (AD-IF) and used as a gold standard for the ID-IF. Before PET, magnetic resonance angiography of the kidneys was obtained to provide the anatomical information required for derivation of the recovery coefficients in the abdominal aorta, a requirement for partial volume correction of the ID-IF. Different image reconstruction methods, filtered back projection (FBP) and ordered subset expectation maximization (OS-EM), were investigated for the best trade-off between bias and variance of the ID-IF. The effects of kidney uptake on the quantitative accuracy of the ID-IF were also studied. Biological variables such as red blood cell binding and radioligand metabolism were also taken into consideration. A single blood sample was used for calibration in the later phase of the input function. Results: In the first 2 min after injection, the OS-EM based ID-IF was found to be biased, and the bias was found to be induced by the kidney uptake. No such bias was found with the FBP based image reconstruction method.
However, the OS-EM based image reconstruction was found to reduce variance in the subsequent phase of the ID-IF. The combined use of FBP and OS-EM resulted in reduced bias and noise. After performing all the necessary corrections, the areas under the curves (AUCs) of the ID-IF were close to those of the AD-IF (average AUC ratio = 1 ± 0.08) during the early phase. When applied in a two-tissue-compartment kinetic model, the average difference between the model parameters estimated from the ID-IF and the AD-IF was 10%, which was within the error of the estimation method. Conclusions: The bias of radioligand concentration in the aorta from the OS-EM image reconstruction is significantly affected by radioligand uptake in the adjacent kidney and cannot be neglected in quantitative evaluation. With careful calibrations and corrections, the ID-IF derived from quantitative dynamic PET images can be used as the input function of the compartmental model to quantify the renal kinetics of ¹¹C-KR31173 in experimental animals, and the authors intend to evaluate this method in future human studies.
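The single-sample calibration and AUC comparison steps described above can be sketched numerically: the image-derived curve is rescaled so that it passes through one late blood sample, and agreement with the arterial gold standard is summarized as a ratio of trapezoidal areas under the two curves. A minimal illustration with synthetic curves (the time grid, concentration values, and nearest-point matching are hypothetical simplifications, not the study's actual processing):

```python
# Sketch: single-sample calibration of an image-derived input function (ID-IF)
# and AUC-ratio comparison against an arterial (AD-IF) gold standard.

def auc(t, c):
    """Trapezoidal area under a sampled curve."""
    return sum((t[i + 1] - t[i]) * (c[i + 1] + c[i]) / 2 for i in range(len(t) - 1))

def calibrate(t, id_if, t_sample, measured):
    """Rescale the ID-IF so it matches one late blood sample."""
    # nearest time point to the blood sample (a simple stand-in for interpolation)
    i = min(range(len(t)), key=lambda k: abs(t[k] - t_sample))
    scale = measured / id_if[i]
    return [scale * c for c in id_if]

t = [0, 1, 2, 5, 10, 30, 60, 90]               # minutes (hypothetical)
ad_if = [0, 40, 25, 12, 6, 3, 2, 1.5]          # arterial samples (kBq/mL)
id_if = [0, 32, 20, 9.6, 4.8, 2.4, 1.6, 1.2]   # image-derived, biased low by 0.8
cal = calibrate(t, id_if, t_sample=60, measured=2.0)
print(auc(t, cal) / auc(t, ad_if))  # AUC ratio ~1 after calibration
```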
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-07
... the Draft Guidance of Applying Quantitative Data To Develop Data-Derived Extrapolation Factors for.... SUMMARY: EPA is announcing that Eastern Research Group, Inc. (ERG), a contractor to the EPA, will convene an independent panel of experts to review the draft document, ``Guidance for Applying Quantitative...
Quantitative Verse in a Quantity-Insensitive Language: Baif's "vers mesures."
ERIC Educational Resources Information Center
Bullock, Barbara E.
1997-01-01
Analysis of the quantitative metrical verse of French Renaissance poet Jean-Antoine de Baif finds that the metrics, often seen as unscannable and using an incomprehensible phonetic orthography, derive largely from a system that is accentual, with the orthography permitting the poet to encode quantitative distinctions that coincide with the meter.…
Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy
NASA Astrophysics Data System (ADS)
Sindern, Sven; Meyer, F. Michael
2016-09-01
Increasing industrial demand of rare earth elements (REEs) stems from the central role they play for advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical as well as textural complexity of the ores with a need for better understanding of their salient properties. This is not only essential for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy and cost-efficient processing of REE ores depends heavily on information about REE element deportment that can be made available employing automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for acquisition of backscatter electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four different SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, which are commercially available, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown. 
SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become increasingly important for supply of REEs in the future.
Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S
2015-01-16
Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
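The "traditional" calibration that the authors contrast with their Bayesian approach is essentially an inverse prediction from a fitted standard curve: the assay readout is regressed on log density for standards of known concentration, and unknown samples are read back through the inverted line. A minimal least-squares sketch (the standards and readouts below are synthetic illustration following a qPCR-like relationship, not QT-NASBA data from the paper):

```python
# Sketch of traditional standard-curve calibration for a quantitative
# molecular assay: fit readout = a + b*log10(density) on standards of
# known density, then invert the line to estimate an unknown density.
import math

def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b  # intercept a, slope b

def estimate_density(readout, a, b):
    """Inverse prediction: density whose expected readout equals `readout`."""
    return 10 ** ((readout - a) / b)

# Standards (densities per microliter) with synthetic readouts that follow
# readout = 40 - 3.3*log10(density), a typical qPCR-like slope.
dens = [1e2, 1e3, 1e4, 1e5, 1e6]
readout = [40 - 3.3 * math.log10(d) for d in dens]
a, b = fit_line([math.log10(d) for d in dens], readout)
print(estimate_density(40 - 3.3 * 4.5, a, b))  # recovers ~10**4.5
```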
Danner, Marion; Vennedey, Vera; Hiligsmann, Mickaël; Fauser, Sascha; Stock, Stephanie
2016-02-01
Patients suffering from age-related macular degeneration (AMD) are rarely actively involved in decision-making, despite facing preference-sensitive treatment decisions. This paper presents a qualitative study to prepare quantitative preference elicitation in AMD patients. The aims of this study were (1) to gain familiarity with and learn about the special requirements of the AMD patient population for quantitative data collection; and (2) to select/refine patient-relevant treatment attributes and levels, and gain insights into preference structures. Semi-structured focus group interviews were performed. An interview guide including preselected categories in the form of seven potentially patient-relevant treatment attributes was followed. To identify the most patient-relevant treatment attributes, a ranking exercise was performed. Deductive content analyses were done by two independent reviewers for each attribute to derive subcategories (potential levels of attributes) and depict preference trends. The focus group interviews included 21 patients. The interviews revealed that quantitative preference surveys in this population will have to be interviewer assisted to make the survey feasible for patients. The five most patient-relevant attributes were the effect on visual function [ranking score (RS): 139], injection frequency (RS: 101), approval status (RS: 83), side effects (RS: 79), and monitoring frequency (RS: 76). Attribute and level refinement was based on patients' statements. Preference trends and dependencies between attributes informed the quantitative instrument design. This study suggests that qualitative research is a very helpful step to prepare the design and administration of quantitative preference elicitation instruments. It especially facilitated familiarization with the target population and its preferences, and it supported attribute/level refinement.
Bayesian networks and information theory for audio-visual perception modeling.
Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis
2010-09-01
Thanks to their different senses, human observers acquire multiple information coming from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in term of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
Toxicology profiles of chemical and radiological contaminants at Hanford
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, B.L.; Strenge, D.L.; Stenner, R.D.
1995-07-01
This document summarizes toxicology information required under Section 3.3 (Toxicity Assessment) of HSRAM, and can also be used to develop the short toxicology profiles required in site assessments (described in HSRAM, Section 3.3.5). Toxicology information is used in the dose-response step of the risk assessment process. The dose-response assessment describes the quantitative relationship between the amount of exposure to a substance and the extent of toxic injury or disease. Data are derived from animal studies or, less frequently, from studies in exposed human populations. The risks of a substance cannot be ascertained with any degree of confidence unless dose-response relations are quantified. This document summarizes dose-response information available from the US Environmental Protection Agency (EPA). The contaminants selected for inclusion in this document represent most of the contaminants found at Hanford (both radiological and chemical), based on sampling and analysis performed during site investigations, and historical information on waste disposal practices at the Hanford Site.
The information needs and information seeking behaviour of family doctors.
Bryant, Sue Lacey
2004-06-01
To explore the information needs and information seeking behaviour of family doctors, identifying any differences in attitudes and behaviours deriving from membership of a training practice and investigating the impact of a practice librarian. A case study of general practitioners (GPs) in Aylesbury Vale incorporated a quantitative study of use of the medical library, and two qualitative techniques, in-depth interviews and group discussions. A total of 58 GPs, almost three quarters of those in the Vale, participated; 19 via individual interviews and a further 39 via two group discussions. Family doctors are prompted to seek information by needs arising from a combination of professional responsibilities and personal characteristics. A need for problem-orientated information, related to the care of individual patients, was the predominant factor that prompted these GPs to seek information. Personal collections remain the preferred information resource; electronic sources rank second. The study demonstrated low use of the medical library. However, both vocational training and the employment of a practice librarian impacted on library use. The study illuminates the information needs and preferences of GPs and illustrates the contribution that librarians may make at practice level, indicating the importance of outreach work.
21 CFR 101.62 - Nutrient content claims for fat, fatty acid, and cholesterol content of foods.
Code of Federal Regulations, 2011 CFR
2011-04-01
... fat—50 percent less fat than our regular brownies”); and (B) Quantitative information comparing the... quantitative information may be located elsewhere on the information panel in accordance with § 101.2. (iii... percent less fat per 3 oz than our regular spinach souffle”); and (B) Quantitative information comparing...
Neuroergonomics: Quantitative Modeling of Individual, Shared, and Team Neurodynamic Information.
Stevens, Ronald H; Galloway, Trysha L; Willemsen-Dunlap, Ann
2018-06-01
The aim of this study was to use the same quantitative measure and scale to directly compare the neurodynamic information/organizations of individual team members with those of the team. Team processes are difficult to separate from those of individual team members due to the lack of quantitative measures that can be applied to both process sets. Second-by-second symbolic representations were created of each team member's electroencephalographic power, and quantitative estimates of their neurodynamic organizations were calculated from the Shannon entropy of the symbolic data streams. The information in the neurodynamic data streams of health care (n = 24), submarine navigation (n = 12), and high school problem-solving (n = 13) dyads was separated into the information of each team member, the information shared by team members, and the overall team information. Most of the team information was the sum of each individual's neurodynamic information. The remaining team information was shared among the team members. This shared information averaged ~15% of the individual information, with momentary levels of 1% to 80%. Continuous quantitative estimates can be made from the shared, individual, and team neurodynamic information about the contributions of different team members to the overall neurodynamic organization of a team and the neurodynamic interdependencies among the team members. Information models provide a generalizable quantitative method for separating a team's neurodynamic organization into that of individual team members and that shared among team members.
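The decomposition described above can be expressed with standard Shannon quantities: each member's individual information is the entropy of their symbol stream, and the information shared by a dyad is the mutual information H(A) + H(B) − H(A,B) of the two streams. A compact sketch (the symbol streams are toy data, not EEG-derived):

```python
# Sketch: Shannon entropy of symbolic data streams and the information
# shared by two team members, I(A;B) = H(A) + H(B) - H(A,B), in bits.
from collections import Counter
from math import log2

def entropy(stream):
    n = len(stream)
    return -sum((c / n) * log2(c / n) for c in Counter(stream).values())

def shared_info(a, b):
    # joint entropy is computed over the paired symbols
    return entropy(a) + entropy(b) - entropy(list(zip(a, b)))

a = [0, 1, 0, 1] * 25   # member A's symbol stream (toy)
b = a                   # perfectly coupled member B: all information shared
c = [0, 0, 1, 1] * 25   # stream statistically independent of a: none shared
print(entropy(a), shared_info(a, b), shared_info(a, c))
```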
Wavelet entropy: a new tool for analysis of short duration brain electrical signals.
Rosso, O A; Blanco, S; Yordanova, J; Kolev, V; Figliola, A; Schürmann, M; Başar, E
2001-01-30
Since traditional electrical brain signal analysis is mostly qualitative, the development of new quantitative methods is crucial for restricting the subjectivity in the study of brain signals. These methods are particularly fruitful when they are strongly correlated with intuitive physical concepts that allow a better understanding of brain dynamics. Here, a new method based on the orthogonal discrete wavelet transform (ODWT) is applied. It takes as a basic element the ODWT of the EEG signal, and defines the relative wavelet energy, the wavelet entropy (WE) and the relative wavelet entropy (RWE). The relative wavelet energy provides information about the relative energy associated with different frequency bands present in the EEG and their corresponding degree of importance. The WE carries information about the degree of order/disorder associated with a multi-frequency signal response, and the RWE measures the degree of similarity between different segments of the signal. In addition, the time evolution of the WE is calculated to give information about the dynamics in the EEG records. Within this framework, the major objective of the present work was to characterize in a quantitative way functional dynamics of order/disorder microstates in short duration EEG signals. For that aim, spontaneous EEG signals under different physiological conditions were analyzed. Further, specific quantifiers were derived to characterize how stimulus affects electrical events in terms of frequency synchronization (tuning) in the event related potentials.
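The quantifiers defined above follow directly from the ODWT coefficients: the energy in resolution level j is E_j = Σ_k |C_j(k)|², the relative wavelet energy is p_j = E_j / Σ_j E_j, and the wavelet entropy is WE = −Σ_j p_j ln p_j. A self-contained sketch using a hand-rolled Haar transform (the Haar basis and the restriction to detail levels are simplifying assumptions for illustration, not the paper's exact wavelet choice):

```python
# Sketch of wavelet entropy: Haar DWT -> per-level energies ->
# relative wavelet energy p_j -> WE = -sum_j p_j ln p_j.
from math import log, sqrt

def haar_detail_levels(x):
    """Return Haar detail coefficients per resolution level (coarsest last)."""
    levels, a = [], list(x)
    while len(a) > 1:
        d = [(a[i] - a[i + 1]) / sqrt(2) for i in range(0, len(a) - 1, 2)]
        a = [(a[i] + a[i + 1]) / sqrt(2) for i in range(0, len(a) - 1, 2)]
        levels.append(d)
    return levels

def wavelet_entropy(x):
    energies = [sum(c * c for c in d) for d in haar_detail_levels(x)]
    total = sum(energies)
    if total == 0:
        return 0.0  # constant signal: no detail energy at any level
    p = [e / total for e in energies]
    return -sum(pj * log(pj) for pj in p if pj > 0)  # 0*log(0) := 0

# A pure highest-frequency alternation puts all energy in one level: WE = 0.
print(wavelet_entropy([1, -1, 1, -1, 1, -1, 1, -1]))
```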
Burstyn, Igor; Boffetta, Paolo; Kauppinen, Timo; Heikkilä, Pirjo; Svane, Ole; Partanen, Timo; Stücker, Isabelle; Frentzel-Beyme, Rainer; Ahrens, Wolfgang; Merzenich, Hiltrud; Heederik, Dick; Hooiveld, Mariëtte; Langård, Sverre; Randem, Britt G; Järvholm, Bengt; Bergdahl, Ingvar; Shaham, Judith; Ribak, Joseph; Kromhout, Hans
2003-01-01
An exposure matrix (EM) for known and suspected carcinogens was required for a multicenter international cohort study of cancer risk and bitumen among asphalt workers. Production characteristics in companies enrolled in the study were ascertained through use of a company questionnaire (CQ). Exposures to coal tar, bitumen fume, organic vapor, polycyclic aromatic hydrocarbons, diesel fume, silica, and asbestos were assessed semi-quantitatively using information from CQs, expert judgment, and statistical models. Exposures of road paving workers to bitumen fume, organic vapor, and benzo(a)pyrene were estimated quantitatively by applying regression models, based on monitoring data, to exposure scenarios identified by the CQs. Exposures estimates were derived for 217 companies enrolled in the cohort, plus the Swedish asphalt paving industry in general. Most companies were engaged in road paving and asphalt mixing, but some also participated in general construction and roofing. Coal tar use was most common in Denmark and The Netherlands, but the practice is now obsolete. Quantitative estimates of exposure to bitumen fume, organic vapor, and benzo(a)pyrene for pavers, and semi-quantitative estimates of exposure to these agents among all subjects were strongly correlated. Semi-quantitative estimates of exposure to bitumen fume and coal tar exposures were only moderately correlated. EM assessed non-monotonic historical decrease in exposures to all agents assessed except silica and diesel exhaust. We produced a data-driven EM using methodology that can be adapted for other multicenter studies. Copyright 2003 Wiley-Liss, Inc.
Ratio maps of iron ore deposits, Atlantic City district, Wyoming
NASA Technical Reports Server (NTRS)
Vincent, R. K.
1973-01-01
Preliminary results of a spectral ratioing technique are shown for a region at the southern end of the Wind River Range, Wyoming. Digital ratio graymaps and analog ratio images have been produced for the test site, but ground truth is not yet available for thorough interpretation of these products. ERTS analog ratio images were found generally better than either ERTS single-channel images or high altitude aerial photos for the discrimination of vegetation from non-vegetation in the test site region. Some linear geological features smaller than the ERTS spatial resolution are seen as well in ERTS ratio and single-channel images as in high altitude aerial photography. Geochemical information appears to be extractable from ERTS data. Good preliminary quantitative agreement between ERTS-derived ratios and laboratory-derived reflectance ratios of rocks and minerals encourages plans to use lab data as training sets for a simple ratio gating logic approach to automatic recognition maps.
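The ratio gating logic mentioned above reduces to a pixel-wise band ratio followed by an interval test. This is an illustrative sketch; the band choice and thresholds are hypothetical, not taken from the study.

```python
import numpy as np

def ratio_map(band_a, band_b, eps=1e-6):
    """Pixel-wise ratio of two co-registered spectral bands; eps guards
    against division by zero in dark pixels."""
    return np.asarray(band_a, float) / (np.asarray(band_b, float) + eps)

def ratio_gate(band_a, band_b, low, high):
    """Simple ratio gating logic: flag pixels whose band ratio falls
    inside a training-derived interval [low, high]."""
    r = ratio_map(band_a, band_b)
    return (r >= low) & (r <= high)
```

Vegetation has a high near-infrared/red ratio, so gating that ratio on an interval such as [2, 10] separates vegetation from non-vegetation, in the spirit of the ERTS discrimination results described above.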
Ohyama, Akio; Shirasawa, Kenta; Matsunaga, Hiroshi; Negoro, Satomi; Miyatake, Koji; Yamaguchi, Hirotaka; Nunome, Tsukasa; Iwata, Hiroyoshi; Fukuoka, Hiroyuki; Hayashi, Takeshi
2017-08-01
Using newly developed euchromatin-derived genomic SSR markers and a flexible Bayesian mapping method, 13 significant agricultural QTLs were identified in a segregating population derived from a four-way cross of tomato. So far, many QTL mapping studies in tomato have been performed on progeny obtained from crosses between two genetically distant parents, e.g., domesticated tomatoes and wild relatives. However, QTL information on quantitative traits related to yield (e.g., flower or fruit number, and total or average weight of fruits) in such intercross populations would be of limited use for breeding commercial tomato cultivars, because individuals in these populations have specific genetic backgrounds underlying extremely different phenotypes between the parents, such as large fruit in domesticated tomatoes and small fruit in wild relatives, which may not be reflective of the genetic variation in tomato breeding populations. In this study, we constructed an F2 population derived from a cross between two commercial F1 cultivars of tomato to extract QTL information practical for tomato breeding. This cross corresponded to a four-way cross, because the four parental lines of the two F1 cultivars were considered to be the founders. We developed 2510 new expressed sequence tag (EST)-based (euchromatin-derived) genomic SSR markers and selected 262 markers from these new SSR markers and publicly available SSR markers to construct a linkage map. QTL analysis for ten agricultural traits of tomato was performed based on the phenotypes and marker genotypes of F2 plants using a flexible Bayesian method. As a result, 13 QTL regions were detected for six traits by the Bayesian method developed in this study.
Molecular Structure-Based Large-Scale Prediction of Chemical-Induced Gene Expression Changes.
Liu, Ruifeng; AbdulHameed, Mohamed Diwan M; Wallqvist, Anders
2017-09-25
The quantitative structure-activity relationship (QSAR) approach has been used to model a wide range of chemical-induced biological responses. However, it had not been utilized to model chemical-induced genomewide gene expression changes until very recently, owing to the complexity of training and evaluating a very large number of models. To address this issue, we examined the performance of a variable nearest neighbor (v-NN) method that uses information on near neighbors conforming to the principle that similar structures have similar activities. Using a data set of gene expression signatures of 13,150 compounds derived from cell-based measurements in the NIH Library of Integrated Network-based Cellular Signatures program, we were able to make predictions for 62% of the compounds in a 10-fold cross validation test, with a correlation coefficient of 0.61 between the predicted and experimentally derived signatures, a reproducibility rivaling that of high-throughput gene expression measurements. To evaluate the utility of the predicted gene expression signatures, we compared the predicted and experimentally derived signatures in their ability to identify drugs known to cause specific liver, kidney, and heart injuries. Overall, the predicted and experimentally derived signatures had similar receiver operating characteristics, whose areas under the curve ranged from 0.71 to 0.77 and 0.70 to 0.73, respectively, across the three organ injury models. However, detailed analyses of enrichment curves indicate that signatures predicted from multiple near neighbors outperformed those derived from experiments, suggesting that averaging information from near neighbors may help improve the signal from gene expression measurements. Our results demonstrate that the v-NN method can serve as a practical approach for modeling large-scale, genomewide, chemical-induced, gene expression changes.
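The v-NN idea above can be sketched in a few lines. This is an illustrative reading, not the authors' implementation: binary fingerprints with Tanimoto similarity, Gaussian distance weighting with smoothing parameter h, and the threshold values are all assumptions.

```python
import numpy as np

def tanimoto(a, b):
    """Tanimoto similarity between two binary fingerprint vectors."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    union = np.sum(a | b)
    return np.sum(a & b) / union if union else 0.0

def vnn_predict(query_fp, train_fps, train_profiles, sim_threshold=0.5, h=0.3):
    """Variable nearest-neighbour prediction: a similarity-weighted average
    over ALL training compounds above the similarity threshold (so the
    neighbour count varies per query). Returns None when no neighbour
    qualifies, i.e. the query is outside the applicability domain."""
    sims = np.array([tanimoto(query_fp, fp) for fp in train_fps])
    mask = sims >= sim_threshold
    if not mask.any():
        return None
    w = np.exp(-((1.0 - sims[mask]) / h) ** 2)   # Gaussian distance weights
    profiles = np.asarray(train_profiles, float)[mask]
    return (w[:, None] * profiles).sum(axis=0) / w.sum()
```

The "62% of compounds" coverage figure in the abstract corresponds to queries for which at least one neighbour passes the threshold; the rest return None here.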
Sedykh, Alexander; Zhu, Hao; Tang, Hao; Zhang, Liying; Richard, Ann; Rusyn, Ivan; Tropsha, Alexander
2011-01-01
Background: Quantitative high-throughput screening (qHTS) assays are increasingly being used to inform chemical hazard identification. Hundreds of chemicals have been tested in dozens of cell lines across extensive concentration ranges by the National Toxicology Program in collaboration with the National Institutes of Health Chemical Genomics Center. Objectives: Our goal was to test a hypothesis that dose–response data points of the qHTS assays can serve as biological descriptors of assayed chemicals and, when combined with conventional chemical descriptors, improve the accuracy of quantitative structure–activity relationship (QSAR) models applied to prediction of in vivo toxicity end points. Methods: We obtained cell viability qHTS concentration–response data for 1,408 substances assayed in 13 cell lines from PubChem; for a subset of these compounds, rodent acute toxicity half-maximal lethal dose (LD50) data were also available. We used the k nearest neighbor classification and random forest QSAR methods to model LD50 data using chemical descriptors either alone (conventional models) or combined with biological descriptors derived from the concentration–response qHTS data (hybrid models). Critical to our approach was the use of a novel noise-filtering algorithm to treat qHTS data. Results: Both the external classification accuracy and coverage (i.e., fraction of compounds in the external set that fall within the applicability domain) of the hybrid QSAR models were superior to those of conventional models. Conclusions: Concentration–response qHTS data may serve as informative biological descriptors of molecules that, when combined with conventional chemical descriptors, may considerably improve the accuracy and utility of computational approaches for predicting in vivo animal toxicity end points. PMID:20980217
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... and opinions, but are not statistical surveys that yield quantitative results that can be generalized... generic clearance for qualitative information will not be used for quantitative information collections... for submission for other generic mechanisms that are designed to yield quantitative results. The...
Using seismic derived lithology parameters for hydrocarbon indication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Riel, P.; Sisk, M.
1996-08-01
The last two decades have shown a strong increase in the use of seismic amplitude information for direct hydrocarbon indication. However, working with seismic amplitudes (and seismic attributes) has several drawbacks: tuning effects must be handled; quantitative analysis is difficult because seismic amplitudes are not directly related to lithology; and seismic amplitudes are reflection events, making it unclear whether amplitude changes relate to lithology variations above or below the interface. These drawbacks are overcome by working directly on seismic-derived lithology data, lithology being a layer property rather than an interface property. Technology to extract lithology from seismic data has made great strides, and a large range of methods are now available to users, including: (1) bandlimited acoustic impedance (AI) inversion; (2) reconstruction of the low AI frequencies from seismic velocities, from spatial well log interpolation, and using constrained sparse spike inversion techniques; (3) full bandwidth reconstruction of multiple lithology properties (porosity, sand fraction, density, etc.) in time and depth using inverse modeling. For these technologies to be fully leveraged, accessibility by end users is critical. All these technologies are available as interactive 2D and 3D workstation applications, integrated with seismic interpretation functionality. Using field data examples, we will demonstrate the impact of these different approaches on deriving lithology, and in particular show how accuracy and resolution are increased as more geologic and well information is added.
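The bandlimited acoustic impedance inversion in item (1) is commonly built on the recursive relation between reflection coefficients and layer impedances. A minimal sketch, ignoring wavelet deconvolution and noise, and assuming the top-layer impedance is supplied from well data as the abstract notes:

```python
import numpy as np

def reflectivity_from_impedance(ai):
    """Reflection coefficient at each interface:
    r[i] = (AI[i+1] - AI[i]) / (AI[i+1] + AI[i])."""
    ai = np.asarray(ai, float)
    return (ai[1:] - ai[:-1]) / (ai[1:] + ai[:-1])

def impedance_from_reflectivity(r, ai0):
    """Recursive inversion of the relation above:
    AI[i+1] = AI[i] * (1 + r[i]) / (1 - r[i]).
    ai0 is the impedance of the top layer; the missing low-frequency AI
    trend must come from well logs or seismic velocities."""
    ai = [float(ai0)]
    for ri in np.asarray(r, float):
        ai.append(ai[-1] * (1.0 + ri) / (1.0 - ri))
    return np.array(ai)
```

The two functions are exact inverses of each other, which is the round-trip property a real inversion only approximates once bandlimiting and noise enter.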
Gougoulias, Christos; Clark, Joanna M; Shaw, Liz J
2014-01-01
It is well known that atmospheric concentrations of carbon dioxide (CO2) (and other greenhouse gases) have increased markedly as a result of human activity since the industrial revolution. It is perhaps less appreciated that natural and managed soils are an important source and sink for atmospheric CO2 and that, primarily as a result of the activities of soil microorganisms, there is a soil-derived respiratory flux of CO2 to the atmosphere that overshadows by tenfold the annual CO2 flux from fossil fuel emissions. Therefore small changes in the soil carbon cycle could have large impacts on atmospheric CO2 concentrations. Here we discuss the role of soil microbes in the global carbon cycle and review the main methods that have been used to identify the microorganisms responsible for the processing of plant photosynthetic carbon inputs to soil. We discuss whether application of these techniques can provide the information required to underpin the management of agro-ecosystems for carbon sequestration and increased agricultural sustainability. We conclude that, although crucial in enabling the identification of plant-derived carbon-utilising microbes, current technologies lack the high-throughput ability to quantitatively apportion carbon use by phylogenetic groups, or to resolve its use efficiency and destination within the microbial metabolome. It is this information that is required to inform rational manipulation of the plant–soil system to favour organisms or physiologies most important for promoting soil carbon storage in agricultural soil. PMID:24425529
Rationalising the 'irrational': a think aloud study of discrete choice experiment responses.
Ryan, Mandy; Watson, Verity; Entwistle, Vikki
2009-03-01
Stated preference methods assume respondents' preferences are consistent with utility theory, but many empirical studies report evidence of preferences that violate utility theory. This evidence is often derived from quantitative tests that occur naturally within, or are added to, stated preference tasks. In this study, we use qualitative methods to explore three axioms of utility theory: completeness, monotonicity, and continuity. We take a novel approach, adopting a 'think aloud' technique to identify violations of the axioms of utility theory and to consider how well the quantitative tests incorporated within a discrete choice experiment are able to detect these. Results indicate that quantitative tests classify respondents as being 'irrational' when qualitative statements would indicate they are 'rational'. In particular, 'non-monotonic' responses can often be explained by respondents inferring additional information beyond what is presented in the task, and individuals who appear to adopt non-compensatory decision-making strategies do so because they rate particular attributes very highly (they are not attempting to simplify the task). The results also provide evidence of 'cost-based responses': respondents assumed tests with higher costs would be of higher quality. The value of including in-depth qualitative validation techniques in the development of stated preference tasks is shown.
Cheng, Hai-Ling Margaret; Loai, Yasir; Beaumont, Marine; Farhat, Walid A
2010-08-01
Bladder acellular matrices (ACMs) derived from natural tissue are gaining increasing attention for their role in tissue engineering and regeneration. Unlike conventional scaffolds based on biodegradable polymers or gels, ACMs possess native biomechanical and many acquired biologic properties. Efforts to optimize ACM-based scaffolds are ongoing and would be greatly assisted by a noninvasive means to characterize scaffold properties and monitor interaction with cells. MRI is well suited to this role, but research with MRI for scaffold characterization has been limited. This study presents initial results from quantitative MRI measurements for bladder ACM characterization and investigates the effects of incorporating hyaluronic acid, a natural biomaterial useful in tissue-engineering and regeneration. Measured MR relaxation times (T(1), T(2)) and diffusion coefficient were consistent with increased water uptake and glycosaminoglycan content observed on biochemistry in hyaluronic acid ACMs. Multicomponent MRI provided greater specificity, with diffusion data showing an acellular environment and T(2) components distinguishing the separate effects of increased glycosaminoglycans and hydration. These results suggest that quantitative MRI may provide useful information on matrix composition and structure, which is valuable in guiding further development using bladder ACMs for organ regeneration and in strategies involving the use of hyaluronic acid.
Optical properties of volcanic ash: improving remote sensing observations.
NASA Astrophysics Data System (ADS)
Whelley, Patrick; Colarco, Peter; Aquila, Valentina; Krotkov, Nickolay; Bleacher, Jake; Garry, Brent; Young, Kelsey; Rocha Lima, Adriana; Martins, Vanderlei; Carn, Simon
2016-04-01
Many times each year, explosive volcanic eruptions loft ash into the atmosphere. Global travel and trade rely on aircraft vulnerable to encounters with airborne ash. Volcanic ash advisory centers (VAACs) rely on dispersion forecasts and satellite data to issue timely warnings. To improve ash forecasts, model developers and satellite data providers need realistic information about volcanic ash microphysical and optical properties. In anticipation of future large eruptions, we can study smaller events to improve our remote sensing and modeling skills, so that when the next eruption the size of Pinatubo 1991 or larger occurs, ash can confidently be tracked in a quantitative way. At distances >100 km from their sources, drifting ash plumes, often above meteorological clouds, are not easily detected from conventional remote sensing platforms, let alone characterized quantitatively in terms such as mass density. Quantitative interpretation of these observations depends on a priori knowledge of the spectral optical properties of the ash at UV (>0.3 μm) and TIR wavelengths (>10 μm). Incorrect assumptions about the optical properties result in large errors in inferred column mass loading and size distribution, which misguide operational ash forecasts. Similarly, simulating ash properties in global climate models also requires some knowledge of optical properties to improve aerosol speciation.
Naik, P K; Singh, T; Singh, H
2009-07-01
Quantitative structure-activity relationship (QSAR) analyses were performed independently on data sets belonging to two groups of insecticides, namely the organophosphates and carbamates. Several types of descriptors including topological, spatial, thermodynamic, information content, lead likeness and E-state indices were used to derive quantitative relationships between insecticide activities and structural properties of chemicals. A systematic search approach based on missing value, zero value, simple correlation and multi-collinearity tests as well as the use of a genetic algorithm allowed the optimal selection of the descriptors used to generate the models. The QSAR models developed for both organophosphate and carbamate groups revealed good predictability with r(2) values of 0.949 and 0.838 as well as [image omitted] values of 0.890 and 0.765, respectively. In addition, a linear correlation was observed between the predicted and experimental LD(50) values for the test set data with r(2) of 0.871 and 0.788 for the organophosphate and carbamate groups, respectively, indicating that the prediction accuracy of the QSAR models was acceptable. The models were also tested successfully against external validation criteria. QSAR models developed in this study should help the further design of novel potent insecticides.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-25
... Comprehensive Quantitative Impact Study.'' DATES: You should submit comments by March 26, 2010. ADDRESSES... requesting approval of the following new information collection: Title: Basel Comprehensive Quantitative... quantitative impact study (QIS) to assess the impact of the proposed revisions that were published by the Basel...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-04
... not statistical surveys that yield quantitative results that can be generalized to the population of... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. No comments were received in response...
Hyshka, Elaine; Karekezi, Kamagaju; Tan, Benjamin; Slater, Linda G; Jahrig, Jesse; Wild, T Cameron
2017-03-20
A growing body of research assesses population need for substance use services. However, the extent to which survey research incorporates expert versus consumer perspectives on service need is unknown. We conducted a large, international review to (1) describe extant research on population need for substance use services, and the extent to which it incorporates expert and consumer perspectives on service need, (2) critically assess methodological and measurement approaches used to study consumer-defined need, and (3) examine the potential for existing research that prioritizes consumer perspectives to inform substance use service system planning. Systematic searches of seven databases identified 1930 peer-reviewed articles addressing population need for substance use services between January 1980 and May 2015. Empirical studies (n = 1887) were categorized according to source(s) of data used to derive population estimates of service need (administrative records, biological samples, qualitative data, and/or quantitative surveys). Quantitative survey studies (n = 1594) were categorized as to whether service need was assessed from an expert and/or consumer perspective; studies employing consumer-defined need measures (n = 217) received further in-depth quantitative coding to describe study designs and measurement strategies. Almost all survey studies (96%; n = 1534) used diagnostically-oriented measures derived from an expert perspective to assess service need. Of the small number (14%, n = 217) of survey studies that assessed consumers' perspectives, most (77%) measured perceived need for generic services (i.e. 'treatment'), with fewer examining self-assessed barriers to service use (42%) or informal help-seeking from family and friends (10%). Unstandardized measures were commonly used, and very little research was longitudinal or tested hypotheses. Only one study used a consumer-defined need measure to estimate required service system capacity.
Rhetorical calls for including consumer perspectives in substance use service system planning are belied by the empirical literature, which is dominated by expert-driven approaches to measuring population need. Studies addressing consumer-defined need for substance use services are conceptually underdeveloped, and exhibit methodological and measurement weaknesses. Further scholarship is needed to integrate multidisciplinary perspectives in this literature, and fully realize the promise of incorporating consumer perspectives into substance use service system planning.
Hofland, J; Tenbrinck, R; van Eijck, C H J; Eggermont, A M M; Gommers, D; Erdmann, W
2003-04-01
Agreement between continuously measured oxygen consumption during quantitative closed system anaesthesia and intermittently Fick-derived calculated oxygen consumption was assessed in 11 patients undergoing simultaneous occlusion of the aorta and inferior vena cava for hypoxic treatment of pancreatic cancer. All patients were mechanically ventilated using a quantitative closed system anaesthesia machine (PhysioFlex) and had pulmonary and radial artery catheters inserted. During the varying haemodynamic conditions that accompany this procedure, 73 paired measurements were obtained. A significant correlation between Fick-derived and closed system-derived oxygen consumption was found (r = 0.78, p = 0.006). Linear regression showed that Fick-derived measure = [(1.19 x closed system derived measure) - 72], with the overall closed circuit-derived values being higher. However, the level of agreement between the two techniques was poor. Bland-Altman analysis found that the bias was 36 ml.min(-1), precision 39 ml.min(-1), difference between 95% limits of agreement 153 ml.min(-1). Therefore, we conclude that the two measurement techniques are not interchangeable in a clinical setting.
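The Bland-Altman statistics quoted above (bias, precision, and 95% limits of agreement) follow directly from the paired differences. A minimal sketch:

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for two paired measurement
    methods: bias (mean difference), precision (SD of the differences),
    and the 95% limits of agreement (bias +/- 1.96 SD)."""
    d = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = d.mean()
    precision = d.std(ddof=1)            # sample SD of paired differences
    limits = (bias - 1.96 * precision, bias + 1.96 * precision)
    return bias, precision, limits
```

The span between the limits is 2 x 1.96 x SD, roughly 3.92 SD; with the precision of 39 ml.min(-1) reported above, that span is about 153 ml.min(-1), matching the quoted difference between the 95% limits of agreement.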
Lin, Long-Ze; Harnly, James M.
2013-01-01
A general method was developed for the systematic quantitation of flavanols, proanthocyanidins, isoflavones, flavanones, dihydrochalcones, stilbenes, and hydroxybenzoic acid derivatives (mainly hydrolyzable tannins) based on UV band II absorbance arising from the benzoyl structure. The compound structures and the wavelength maximum were well correlated and were divided into four groups: the flavanols and proanthocyanidins at 278 nm, hydrolyzable tannins at 274 nm, flavanones at 288 nm, and isoflavones at 260 nm. Within each group, molar relative response factors (MRRFs) were computed for each compound based on the absorbance ratio of the compound and the group reference standard. Response factors were computed for the compounds as purchased (MRRF), after drying (MRRFD), and as the best predicted value (MRRFP). Concentrations for each compound were computed based on calibration with the group reference standard and the MRRFP. The quantitation of catechins, proanthocyanidins, and gallic acid derivatives in white tea was used as an example. PMID:22577798
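Once the group reference standard is calibrated, quantitation via the predicted response factor reduces to a one-line calculation. This sketch assumes a linear calibration (area = slope x molar concentration); the numbers in the usage note are hypothetical, not taken from the white tea example.

```python
def quantify_with_mrrf(peak_area, ref_slope, mrrf_p):
    """Molar concentration of an analyte from its UV band II peak area.

    ref_slope: calibration slope of the group reference standard
               (absorbance area per unit molar concentration)
    mrrf_p:    predicted molar relative response factor (MRRFP) of the
               analyte relative to that reference standard
    """
    return peak_area / (ref_slope * mrrf_p)
```

For a hypothetical analyte with MRRFP = 1.5, a peak area of 150 against a reference slope of 100 area units per mmol/L gives 1.0 mmol/L.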
Benefit-risk analysis : a brief review and proposed quantitative approaches.
Holden, William L
2003-01-01
Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
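The RV-NNT comparison described above can be sketched numerically. This is a simplified reading of the abstract, not the authors' full method: the multiplicative form of the relative-value adjustment and all probabilities below are assumptions for illustration.

```python
def nnt(p_benefit_treat, p_benefit_control):
    """Number needed to treat for one additional patient to benefit."""
    arr = p_benefit_treat - p_benefit_control   # absolute risk reduction
    if arr <= 0:
        raise ValueError("treatment shows no benefit over control")
    return 1.0 / arr

def rv_adjusted_nnh(p_ae_treat, p_ae_control, relative_value):
    """Number needed to harm, adjusted by the patients' relative value
    (from utility data) of the benefit versus the adverse event."""
    ari = p_ae_treat - p_ae_control             # absolute risk increase
    if ari <= 0:
        return float("inf")                     # no excess harm observed
    return (1.0 / ari) * relative_value

def favourable(nnt_value, rv_nnh_value):
    """Benefit outweighs risk when NNT < RV-adjusted NNH."""
    return nnt_value < rv_nnh_value
```

With a hypothetical 20% absolute benefit (NNT = 5), a 10% absolute adverse-event excess (NNH = 10), and a relative value of 2, the RV-adjusted NNH is 20, so treatment would be judged favourable under this rule.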
Rosedale, Mary; Malaspina, Dolores; Malamud, Daniel; Strauss, Shiela M; Horne, Jaclyn D; Abouzied, Salman; Cruciani, Ricardo A; Knotkova, Helena
2012-01-01
This article reports and discusses how quantitative (physiological and behavioral) and qualitative methods are being combined in an open-label pilot feasibility study. The study evaluates safety, tolerability, and acceptability of a protocol to treat depression in HIV-infected individuals, using a 2-week block of transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex. Major depressive disorder (MDD) is the second most prevalent psychiatric disorder after substance abuse among HIV-positive adults, and novel antidepressant treatments are needed for this vulnerable population. The authors describe the challenges and contributions derived from different research perspectives and methodological approaches and provide a philosophical framework for combining quantitative and qualitative measurements for a fuller examination of the disorder. Four methodological points are presented: (1) the value of combining quantitative and qualitative approaches; (2) the need for context-specific measures when studying patients with medical and psychiatric comorbidities; (3) the importance of research designs that integrate physiological, behavioral, and qualitative approaches when evaluating novel treatments; and (4) the need to explore the relationships between biomarkers, clinical symptom assessments, patient self-evaluations, and patient experiences when developing new, patient-centered protocols. The authors conclude that the complexity of studying novel treatments in complex and new patient populations requires complex research designs to capture the richness of data that inform translational research.
[Quantitative estimation of the sources of urban atmospheric CO2 by carbon isotope composition].
Liu, Wei; Wei, Nan-Nan; Wang, Guang-Hua; Yao, Jian; Zeng, You-Shi; Fan, Xue-Bo; Geng, Yan-Hong; Li, Yan
2012-04-01
To effectively reduce urban carbon emissions and to verify the effectiveness of current projects for urban carbon emission reduction, correct quantitative estimation of the sources of urban atmospheric CO2 is necessary. Since little carbon isotope fractionation occurs during transport from pollution sources to the receptor, the carbon isotope composition can be used for source apportionment. In the present study, a method was established to quantitatively estimate the sources of urban atmospheric CO2 from the carbon isotope composition. Both diurnal and height variations of the concentrations of CO2 derived from biomass, vehicle exhaust and coal burning were then determined for atmospheric CO2 in the Jiading district of Shanghai. Biomass-derived CO2 accounts for the largest portion of atmospheric CO2. The concentrations of CO2 derived from coal burning are larger in the night-time (00:00, 04:00 and 20:00) than in the daytime (08:00, 12:00 and 16:00), and increase with height. Those derived from vehicle exhaust decrease with increasing height. The diurnal and height variations of the sources reflect the emission and transport characteristics of atmospheric CO2 in the Jiading district of Shanghai.
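Isotope-based source apportionment of this kind rests on a mixing mass balance: the source fractions sum to one and reproduce the observed isotope values. A generic least-squares sketch (not the authors' procedure; the end-member values in the usage note are hypothetical):

```python
import numpy as np

def source_fractions(signatures, observed):
    """Isotope mass-balance source apportionment.

    signatures: (n_tracers, n_sources) matrix of per-source isotope values
    observed:   (n_tracers,) observed atmospheric values

    Solves the mixing equations by least squares with the sum-to-one
    constraint appended as an extra row. Non-negativity of the fractions
    is not enforced in this sketch."""
    signatures = np.asarray(signatures, float)
    a = np.vstack([signatures, np.ones((1, signatures.shape[1]))])
    b = np.append(np.asarray(observed, float), 1.0)
    fractions, *_ = np.linalg.lstsq(a, b, rcond=None)
    return fractions
```

For two hypothetical end members with delta-13C of -25 per mil (biomass) and -8 per mil (coal), an observed -16.5 per mil apportions equally to both. Resolving the three sources named in the abstract requires at least one additional tracer (e.g., radiocarbon) alongside delta-13C.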
Fang, Jiansong; Pang, Xiaocong; Wu, Ping; Yan, Rong; Gao, Li; Li, Chao; Lian, Wenwen; Wang, Qi; Liu, Ai-lin; Du, Guan-hua
2016-05-01
A dataset of 67 berberine derivatives for the inhibition of butyrylcholinesterase (BuChE) was studied based on a combination of quantitative structure-activity relationship models, molecular docking, and molecular dynamics methods. First, a series of berberine derivatives were reported, and their inhibitory activities toward butyrylcholinesterase (BuChE) were evaluated. In the 2D quantitative structure-activity relationship studies, the best model, built by partial least-squares, had a conventional correlation coefficient for the training set (R(2)) of 0.883, a cross-validation correlation coefficient (Qcv2) of 0.777, and a conventional correlation coefficient for the test set (Rpred2) of 0.775. The model was also confirmed by Y-randomization examination. In addition, molecular docking and molecular dynamics simulations were performed to better elucidate the inhibitory mechanism of three typical berberine derivatives (berberine, C2, and C55) toward BuChE. The predicted binding free energy results were consistent with the experimental data and showed that the difference in the van der Waals energy term (ΔEvdw) played the most important role in differentiating the activity among the three inhibitors (berberine, C2, and C55). The developed quantitative structure-activity relationship models provide details on the fine relationship linking structure and activity and offer clues for structural modifications, and the molecular simulation helps in understanding the inhibitory mechanism of the three typical inhibitors. In conclusion, the results of this study provide useful clues for new drug design and discovery of BuChE inhibitors from berberine derivatives. © 2015 John Wiley & Sons A/S.
Boe, S G; Dalton, B H; Harwood, B; Doherty, T J; Rice, C L
2009-05-01
To establish the inter-rater reliability of decomposition-based quantitative electromyography (DQEMG) derived motor unit number estimates (MUNEs) and quantitative motor unit (MU) analysis. Using DQEMG, two examiners independently obtained a sample of needle and surface-detected motor unit potentials (MUPs) from the tibialis anterior muscle from 10 subjects. Coupled with a maximal M wave, surface-detected MUPs were used to derive a MUNE for each subject and each examiner. Additionally, size-related parameters of the individual MUs were obtained following quantitative MUP analysis. Test-retest MUNE values were similar with high reliability observed between examiners (ICC=0.87). Additionally, MUNE variability from test-retest as quantified by a 95% confidence interval was relatively low (+/-28 MUs). Lastly, quantitative data pertaining to MU size, complexity and firing rate were similar between examiners. MUNEs and quantitative MU data can be obtained with high reliability by two independent examiners using DQEMG. Establishing the inter-rater reliability of MUNEs and quantitative MU analysis using DQEMG is central to the clinical applicability of the technique. In addition to assessing response to treatments over time, multiple clinicians may be involved in the longitudinal assessment of the MU pool of individuals with disorders of the central or peripheral nervous system.
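The MUNE derived from DQEMG data is, at its core, the ratio of the maximal M wave to the mean surface-detected MUP. A minimal sketch with hypothetical values; real DQEMG applies further corrections not shown here:

```python
import numpy as np

def mune(max_m_wave, smup_sizes):
    """Motor unit number estimate (MUNE): size of the maximal compound
    M wave divided by the mean size of the sampled surface-detected
    motor unit potentials (S-MUPs)."""
    return max_m_wave / np.mean(smup_sizes)
```

With a hypothetical 6000 uV.ms M wave and S-MUP sizes averaging 50 uV.ms, the estimate is 120 motor units; the abstract's 95% confidence interval of +/-28 MUs indicates the scale of test-retest variability around such an estimate.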
Imaging Cerebral Microhemorrhages in Military Service Members with Chronic Traumatic Brain Injury
Liu, Wei; Soderlund, Karl; Senseney, Justin S.; Joy, David; Yeh, Ping-Hong; Ollinger, John; Sham, Elyssa B.; Liu, Tian; Wang, Yi; Oakes, Terrence R.; Riedy, Gerard
2017-01-01
Purpose: To detect cerebral microhemorrhages in military service members with chronic traumatic brain injury by using susceptibility-weighted magnetic resonance (MR) imaging. The longitudinal evolution of microhemorrhages was monitored in a subset of patients by using quantitative susceptibility mapping. Materials and Methods: The study was approved by the Walter Reed National Military Medical Center institutional review board and is compliant with HIPAA guidelines. All participants underwent two-dimensional conventional gradient-recalled-echo MR imaging and three-dimensional flow-compensated multi-echo gradient-recalled-echo MR imaging (processed to generate susceptibility-weighted images and quantitative susceptibility maps), and a subset of patients underwent follow-up imaging. Microhemorrhages were identified by two radiologists independently. Comparisons of microhemorrhage number, size, and magnetic susceptibility derived from quantitative susceptibility maps between baseline and follow-up imaging examinations were performed by using the paired t test. Results: Among the 603 patients, cerebral microhemorrhages were identified in 43 patients, with six excluded from further analysis owing to artifacts. Seventy-seven percent (451 of 585) of the microhemorrhages on susceptibility-weighted images had a more conspicuous appearance than on gradient-recalled-echo images. Thirteen of the 37 patients underwent follow-up imaging examinations. In these patients, a smaller number of microhemorrhages were identified at follow-up imaging compared with baseline on quantitative susceptibility maps (mean ± standard deviation, 9.8 microhemorrhages ± 12.8 vs 13.7 microhemorrhages ± 16.6; P = .019). Quantitative susceptibility mapping–derived quantitative measures of microhemorrhages also decreased over time: −0.85 mm3 per day ± 1.59 for total volume (P = .039) and −0.10 parts per billion per day ± 0.14 for mean magnetic susceptibility (P = .016). 
Conclusion The number of microhemorrhages and quantitative susceptibility mapping–derived quantitative measures of microhemorrhages all decreased over time, suggesting that hemosiderin products undergo continued, subtle evolution in the chronic stage. PMID:26371749
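The baseline-versus-follow-up comparison described above uses a paired t test on per-subject differences. A minimal pure-Python sketch of that statistic, with hypothetical microhemorrhage counts (not patient data):

```python
import math
import statistics

def paired_t(baseline, followup):
    """Paired t statistic for per-subject differences (follow-up minus baseline)."""
    diffs = [f - b for b, f in zip(baseline, followup)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample standard deviation of differences
    return mean_d / (sd_d / math.sqrt(n))   # compare against t with n-1 degrees of freedom

# Hypothetical microhemorrhage counts for five subjects
baseline = [14, 9, 20, 7, 16]
followup = [10, 8, 15, 6, 12]
t = paired_t(baseline, followup)
```

A negative t here indicates fewer lesions at follow-up, mirroring the direction of the effect reported in the abstract; in practice the p value would come from the t distribution with n-1 degrees of freedom.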
NASA Astrophysics Data System (ADS)
Brehme, Marc; Koschmieder, Steffen; Montazeri, Maryam; Copland, Mhairi; Oehler, Vivian G.; Radich, Jerald P.; Brümmendorf, Tim H.; Schuppert, Andreas
2016-04-01
Modelling the parameters of multistep carcinogenesis is key for a better understanding of cancer progression, biomarker identification and the design of individualized therapies. Using chronic myeloid leukemia (CML) as a paradigm for hierarchical disease evolution, we show that combined population dynamic modelling and CML patient biopsy genomic analysis enables patient stratification at unprecedented resolution. Linking CD34+ similarity as a disease progression marker to patient-derived gene expression entropy separated established CML progression stages and uncovered additional heterogeneity within disease stages. Importantly, our patient-data-informed model enables quantitative approximation of individual patients’ disease history within chronic phase (CP) and significantly separates “early” from “late” CP. Our findings provide a novel rationale for personalized and genome-informed disease progression risk assessment that is independent of and complementary to conventional measures of CML disease burden and prognosis.
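The gene expression entropy used above as a progression marker reduces, in its simplest form, to the Shannon entropy of a normalized expression vector. A minimal sketch with made-up expression values (the specific normalization in the paper may differ):

```python
import math

def expression_entropy(expression):
    """Shannon entropy (bits) of a non-negative expression vector,
    normalized to a probability distribution."""
    total = sum(expression)
    probs = [x / total for x in expression if x > 0]
    return -sum(p * math.log2(p) for p in probs)

# A perfectly uniform profile maximizes entropy: log2(4) = 2 bits
h_uniform = expression_entropy([5.0, 5.0, 5.0, 5.0])
# A concentrated profile has lower entropy
h_skewed = expression_entropy([97.0, 1.0, 1.0, 1.0])
```

Higher entropy corresponds to a less differentiated, more heterogeneous expression state, which is the intuition behind using it as a stemness or progression surrogate.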
Inference of relativistic electron spectra from measurements of inverse Compton radiation
NASA Astrophysics Data System (ADS)
Craig, I. J. D.; Brown, J. C.
1980-07-01
The inference of relativistic electron spectra from spectral measurement of inverse Compton radiation is discussed for the case where the background photon spectrum is a Planck function. The problem is formulated in terms of an integral transform that relates the measured spectrum to the unknown electron distribution. A general inversion formula is used to provide a quantitative assessment of the information content of the spectral data. It is shown that the observations must generally be augmented by additional information if anything other than a rudimentary two- or three-parameter model of the source function is to be derived. It is also pointed out that since a similar equation governs the continuum spectra emitted by a distribution of black-body radiators, the analysis is relevant to the problem of stellar population synthesis from galactic spectra.
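The integral transform described above, relating the measured spectrum to the unknown electron distribution, is a Fredholm equation of the first kind. Written schematically (the symbols are our own shorthand, not the paper's notation; the kernel encodes inverse Compton scattering of the Planck photon field):

```latex
I(\epsilon) = \int_{1}^{\infty} K(\epsilon, \gamma)\, N(\gamma)\, \mathrm{d}\gamma
```

where \(I(\epsilon)\) is the measured photon spectrum at energy \(\epsilon\) and \(N(\gamma)\) is the electron distribution in Lorentz factor \(\gamma\). The ill-conditioning of inverting such smooth kernels is precisely what limits the recoverable information to a few parameters, as the abstract concludes.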
Theoretical Foundations of Remote Sensing for Glacier Assessment and Mapping
NASA Technical Reports Server (NTRS)
Bishop, Michael P.; Bush, Andrew B. G.; Furfaro, Roberto; Gillespie, Alan R.; Hall, Dorothy K.; Haritashya, Umesh K.; Shroder, John F., Jr.
2014-01-01
The international scientific community is actively engaged in assessing ice sheet and alpine glacier fluctuations at a variety of scales. The availability of stereoscopic, multitemporal, and multispectral satellite imagery from the optical wavelength regions of the electromagnetic spectrum has greatly increased our ability to assess glaciological conditions and map the cryosphere. There are, however, important issues and limitations associated with accurate satellite information extraction and mapping, as well as new opportunities for assessment and mapping that are all rooted in understanding the fundamentals of the radiation transfer cascade. We address the primary radiation transfer components, relate them to glacier dynamics and mapping, and summarize the analytical approaches that permit transformation of spectral variation into thematic and quantitative parameters. We also discuss the integration of satellite-derived information into numerical modeling approaches to facilitate understanding of glacier dynamics and causal mechanisms.
NASA Astrophysics Data System (ADS)
Sylwester, J.; Mewe, R.; Schrijver, J.
1980-06-01
In this paper, the third in a series dealing with plasmas out of equilibrium, we present quantitative methods for the analysis of non-stationary flare plasma parameters. The method is designed to be used for the interpretation of the SMM XRP Bent Crystal Spectrometer spectra. Our analysis is based on measurements of 11 specific lines in the 1.77-3.3 Å range. Using the proposed method we are able to derive information about the temperature, density, emission measure, and other related parameters of the flare plasma. It is shown that the measurements to be made by the XRP can give detailed information on these parameters and their time evolution. The method is then tested on some artificial flares, and proves to be useful and accurate.
Saint: a lightweight integration environment for model annotation.
Lister, Allyson L; Pocock, Matthew; Taschuk, Morgan; Wipat, Anil
2009-11-15
Saint is a web application which provides a lightweight annotation integration environment for quantitative biological models. The system enables modellers to rapidly mark up models with biological information derived from a range of data sources. Saint is freely available for use on the web at http://www.cisban.ac.uk/saint. The web application is implemented in Google Web Toolkit and Tomcat, with all major browsers supported. The Java source code is freely available for download at http://saint-annotate.sourceforge.net. The Saint web server requires an installation of libSBML and has been tested on Linux (32-bit Ubuntu 8.10 and 9.04).
NASA Technical Reports Server (NTRS)
Schaber, G. G.
1991-01-01
The contacts between 34 geological/geomorphic terrain units in the northern quarter of Venus mapped from Venera 15/16 data were digitized and converted to a Sinusoidal Equal-Area projection. The result was then registered with a merged Pioneer Venus/Venera 15/16 altimetric database, root mean square (rms) slope values, and radar reflectivity values derived from Pioneer Venus. The resulting information includes comparisons among individual terrain units and terrain groups to which they are assigned in regard to percentage of map area covered, elevation, rms slopes, distribution of suspected craters greater than 10 km in diameter.
Detection of regional air pollution episodes utilizing satellite digital data in the visual range
NASA Technical Reports Server (NTRS)
Burke, H.-H. K.
1982-01-01
Digital analyses of satellite visible data for selected high-sulfate cases over the northeastern U.S., on July 21 and 22, 1978, are compared with ground-based measurements. Quantitative information on total aerosol loading derived from the satellite digitized data using an atmospheric radiative transfer model is found to agree with the ground measurements, and it is shown that the extent and transport of the haze pattern may be monitored from the satellite data over the period of maximum intensity for the episode. Attention is drawn to the potential benefits of satellite monitoring of pollution episodes demonstrated by the model.
Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar
2015-01-01
The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow for direct conclusions about real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area, during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5–6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance. PMID:26656260
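The areal metric described above, the fraction of the surface still classified as coated, can be sketched as a threshold-and-count over an image. The tiny "image" and the threshold below are synthetic, illustrative choices, not the paper's actual processing pipeline:

```python
def coated_fraction(image, threshold):
    """Fraction of pixels whose intensity is at or above `threshold`
    (i.e., still classified as coated)."""
    pixels = [p for row in image for p in row]
    coated = sum(1 for p in pixels if p >= threshold)
    return coated / len(pixels)

# Synthetic 4x4 grayscale patch after abrasion (intensity 0..255;
# bright pixels = intact coating, dark pixels = abraded spots)
patch = [
    [200, 210, 40, 205],
    [198, 30, 25, 202],
    [207, 204, 199, 38],
    [45, 201, 196, 203],
]
frac = coated_fraction(patch, threshold=128)
```

Tracking this fraction during and after abrasion cycles gives exactly the kind of areal-loss curve the abstract describes correlating with diffuse reflectance.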
Capelli, R; Mahne, N; Koshmak, K; Giglia, A; Doyle, B P; Mukherjee, S; Nannarone, S; Pasquali, L
2016-07-14
Resonant soft X-ray reflectivity at the carbon K edge, with linearly polarized light, was used to derive quantitative information on film morphology, molecular arrangement, and electronic orbital anisotropies of an ultrathin 3,4,9,10-perylene tetracarboxylic dianhydride (PTCDA) film on Au(111). The experimental spectra were simulated by computing the propagation of the electromagnetic field in a trilayer system (vacuum/PTCDA/Au), where the organic film was treated as an anisotropic medium. Optical constants were derived from the calculated (through density functional theory) absorption cross sections of the single molecule along the three principal molecular axes. These were used to construct the dielectric tensor of the film, assuming the molecules to be lying flat with respect to the substrate and with a herringbone arrangement parallel to the substrate plane. Resonant soft X-ray reflectivity proved to be extremely sensitive to film thickness, down to the single molecular layer. The best agreement between simulation and experiment was found for a film of 1.6 nm, with a flat-lying configuration of the molecules. The high sensitivity to experimental geometries in terms of beam incidence and light polarization was also clarified through simulations. The optical anisotropies of the organic film were experimentally determined, and through comparison with calculations it was possible to relate them to the orbital symmetry of the empty electronic states.
Indication of Horizontal DNA Gene Transfer by Extracellular Vesicles
Speiseder, Thomas; Badbaran, Anita; Reimer, Rudolph; Indenbirken, Daniela; Grundhoff, Adam; Brunswig-Spickenheier, Bärbel; Alawi, Malik; Lange, Claudia
2016-01-01
The biological relevance of extracellular vesicles (EV) in intercellular communication has been well established. Thus far, proteins and RNA were described as the main cargo. Here, we show that EV released from human bone marrow derived mesenchymal stromal cells (BM-hMSC) also carry high-molecular-weight DNA. Extensive EV characterization revealed that this DNA is mainly associated with the outer EV membrane and, to a smaller degree, is also present inside the EV. Our EV purification protocol ensured that the DNA is not derived from apoptotic or necrotic cells. To analyze the relevance of EV-associated DNA we lentivirally transduced Arabidopsis thaliana-DNA (A.t.-DNA) as an indicator into BM-hMSC and generated EV. Using quantitative polymerase chain reaction (qPCR) techniques we detected high copy numbers of A.t.-DNA in EV. In recipient hMSC incubated with tagged EV for two weeks we identified A.t.-DNA transferred to recipient cells. Investigation of recipient cell DNA using quantitative PCR and verification of PCR products by sequencing suggested stable integration of A.t.-DNA. In conclusion, for the first time our proof-of-principle experiments point to horizontal DNA transfer into recipient cells via EV. Based on our results we assume that eukaryotic cells are able to exchange genetic information in the form of DNA, extending the known cargo of EV to include genomic DNA. This mechanism might be of relevance in cancer but also during cell evolution and development. PMID:27684368
Automation of the ELISpot assay for high-throughput detection of antigen-specific T-cell responses.
Almeida, Coral-Ann M; Roberts, Steven G; Laird, Rebecca; McKinnon, Elizabeth; Ahmed, Imran; Pfafferott, Katja; Turley, Joanne; Keane, Niamh M; Lucas, Andrew; Rushton, Ben; Chopra, Abha; Mallal, Simon; John, Mina
2009-05-15
The enzyme-linked immunospot (ELISpot) assay is a fundamental tool in cellular immunology, providing both quantitative and qualitative information on cellular cytokine responses to defined antigens. It enables the comprehensive screening of patient-derived peripheral blood mononuclear cells to reveal the antigenic restriction of T-cell responses and is an emerging technique in clinical laboratory investigation of certain infectious diseases. As with all cellular-based assays, the final results of the assay are dependent on a number of technical variables that may impact precision if not highly standardised between operators. When studies that are large scale or use multiple antigens are set up manually, these assays may be labour-intensive, have many manual handling steps, are subject to data and sample integrity failure and may show large inter-operator variability. Here we describe the successful automated performance of the interferon (IFN)-gamma ELISpot assay from cell counting through to electronic capture of cytokine quantitation, and present the results of a comparison between automated and manual performance of the ELISpot assay. The mean numbers of spot-forming units enumerated by both methods for limiting dilutions of CMV, EBV and influenza (CEF)-derived peptides in six healthy individuals were highly correlated (r>0.83, p<0.05). The precision results from the automated system compared favourably with the manual ELISpot and further ensured electronic tracking, increased throughput and reduced turnaround time.
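The manual-versus-automated agreement above is reported as a Pearson correlation. A self-contained sketch of that coefficient, with hypothetical spot-forming-unit counts (not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical spot-forming-unit counts from six donors
manual    = [120, 45, 300, 80, 15, 210]
automated = [115, 50, 290, 85, 20, 205]
r = pearson_r(manual, automated)
```

An r close to 1, as in this synthetic example, is what "highly correlated" (r>0.83) quantifies in the abstract.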
Li, Pu; Wang, Xin; Li, Jian; Meng, Zhi-Yun; Li, Shu-Chun; Li, Zhong-Jun; Lu, Ying-Yuan; Ren, Hong; Lou, Ya-Qing; Lu, Chuang; Dou, Gui-Fang; Zhang, Guo-Liang
2015-01-01
Fructose-based 3-acetyl-2,3-dihydro-1,3,4-oxadiazole (GLB) is a novel antitumor agent belonging to the glycosylated spiro-heterocyclic oxadiazole scaffold class of derivatives. This study is the first to report a simple, specific, sensitive and stable high-performance liquid chromatography-ultraviolet detection (HPLC-UV) method for the quantitative determination of GLB in plasma. In this method, the chromatographic separation was achieved with a reversed-phase C18 column. The calibration curve for GLB, with detection at 300 nm, was linear. The lower limit of quantification was 10 ng/mL. The precision, accuracy and stability of the method were validated adequately. This method was successfully applied to a pharmacokinetic study in rats for detection of GLB after oral administration. Moreover, the structures of the parent compound GLB and its two major metabolites M1 and M2 were identified in plasma using an ultra-performance liquid chromatography-electrospray ionization-quadrupole time-of-flight mass spectrometry (UPLC-ESI-QTOF-MS) method. Our results indicated that the di-hydroxylation (M1) and hydroxylation (M2) products of GLB are the major metabolites. In conclusion, the present study provides valuable information on an analytical method for the determination of GLB and its metabolites in rats, which can be used to support further development of this antitumor agent. PMID:26148672
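The calibration-curve step above is ordinary least squares on standards of known concentration, with unknown samples back-calculated from the fitted line. The standards below are invented for illustration; they are not the study's data:

```python
def fit_line(x, y):
    """Least-squares slope and intercept for y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

# Hypothetical calibration standards: concentration (ng/mL) vs peak area
conc = [10, 50, 100, 500, 1000]
area = [0.8, 4.0, 8.0, 40.0, 80.0]   # perfectly linear synthetic responses
slope, intercept = fit_line(conc, area)

def back_calculate(peak_area):
    """Concentration of an unknown sample from its measured peak area."""
    return (peak_area - intercept) / slope

unknown = back_calculate(12.0)
```

In a validated method, the same machinery underlies the reported lower limit of quantification: the smallest standard whose back-calculated concentration still meets the precision and accuracy criteria.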
Use of in Vitro HTS-Derived Concentration-Response Data as ...
Background: Quantitative high-throughput screening (qHTS) assays are increasingly being employed to inform chemical hazard identification. Hundreds of chemicals have been tested in dozens of cell lines across extensive concentration ranges by the National Toxicology Program in collaboration with the NIH Chemical Genomics Center. Objectives: To test the hypothesis that dose-response data points of the qHTS assays can serve as biological descriptors of assayed chemicals and, when combined with conventional chemical descriptors, may improve the accuracy of Quantitative Structure-Activity Relationship (QSAR) models applied to prediction of in vivo toxicity endpoints. Methods and Results: The cell viability qHTS concentration-response data for 1,408 substances assayed in 13 cell lines were obtained from PubChem; for a subset of these compounds, rodent acute toxicity LD50 data were also available. The classification k-Nearest Neighbor and Random Forest QSAR methods were employed for modeling LD50 data using either chemical descriptors alone (conventional models) or in combination with biological descriptors derived from the concentration-response qHTS data (hybrid models). Critical to our approach was the use of a novel noise-filtering algorithm to treat qHTS data. We show that both the external classification accuracy and coverage (i.e., fraction of compounds in the external set that fall within the applicability domain) of the hybrid QSAR models were superior to those of the conventional models.
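The hybrid-descriptor idea above amounts to concatenating chemical and biological feature vectors before classification. A toy pure-Python k-nearest-neighbor sketch; all feature values, labels, and the query compound are fabricated, and the real study additionally used Random Forest and noise filtering:

```python
import math

def knn_predict(train_X, train_y, query, k=3):
    """Majority-vote k-nearest-neighbor classification with Euclidean distance."""
    dists = sorted(
        (math.dist(x, query), label) for x, label in zip(train_X, train_y)
    )
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Toy compounds: conventional chemical descriptors + qHTS-derived biological descriptors
chem = [[0.1, 0.2], [0.9, 0.8], [0.2, 0.1], [0.8, 0.9]]
bio  = [[0.0],      [1.0],      [0.1],      [0.9]]
hybrid_X = [c + b for c, b in zip(chem, bio)]   # concatenated "hybrid" descriptors
labels   = ["nontoxic", "toxic", "nontoxic", "toxic"]

pred = knn_predict(hybrid_X, labels, query=[0.85, 0.85, 0.95], k=3)
```

The point of the hybrid representation is that the appended biological coordinates can separate compounds that look similar chemically but behave differently in the cell-viability assays.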
A Unified Theory of Impact Crises and Mass Extinctions: Quantitative Tests
NASA Technical Reports Server (NTRS)
Rampino, Michael R.; Haggerty, Bruce M.; Pagano, Thomas C.
1997-01-01
Several quantitative tests of a general hypothesis linking impacts of large asteroids and comets with mass extinctions of life are possible based on astronomical data, impact dynamics, and geological information. Waiting times for large-body impacts on the Earth, derived from the flux of Earth-crossing asteroids and comets, together with the estimated size of impacts capable of causing large-scale environmental disasters, predict that impacts of objects greater than or equal to 5 km in diameter (greater than or equal to 10(exp 7) Mt TNT equivalent) could be sufficient to explain the record of approximately 25 extinction pulses in the last 540 Myr, with the 5 recorded major mass extinctions related to impacts of the largest objects of greater than or equal to 10 km in diameter (greater than or equal to 10(exp 8) Mt events). Smaller impacts (approximately 10(exp 6) Mt), with significant regional environmental effects, could be responsible for the lesser boundaries in the geologic record.
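The bookkeeping behind the abstract's numbers can be checked directly: roughly 25 extinction pulses in 540 Myr imply a mean interval of about 21.6 Myr, which the impact-flux estimates must reproduce for objects above the ~5 km threshold:

```python
# Mean intervals between extinction events implied by the record cited above
pulses = 25          # approximate extinction pulses in the last 540 Myr
majors = 5           # recorded major mass extinctions in the same span
span_myr = 540       # length of the record, Myr

mean_interval = span_myr / pulses   # Myr between extinction pulses
major_interval = span_myr / majors  # Myr between major mass extinctions
```

These intervals (roughly 22 Myr and 108 Myr) are the quantities the hypothesis compares against expected waiting times for greater-than-or-equal-to 5 km and 10 km impactors, respectively.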
Quantitative Characterization of Tissue Microstructure with Temporal Diffusion Spectroscopy
Xu, Junzhong; Does, Mark D.; Gore, John C.
2009-01-01
The signals recorded by diffusion-weighted magnetic resonance imaging (DWI) are dependent on the microstructural properties of biological tissues, so it is possible to obtain quantitative structural information non-invasively from such measurements. Oscillating gradient spin echo (OGSE) methods have the ability to probe the behavior of water diffusion over different time scales and the potential to detect variations in intracellular structure. To assist in the interpretation of OGSE data, analytical expressions have been derived for diffusion-weighted signals with OGSE methods for restricted diffusion in some typical structures, including parallel planes, cylinders and spheres, using the theory of temporal diffusion spectroscopy. These analytical predictions have been confirmed with computer simulations. These expressions suggest how OGSE signals from biological tissues should be analyzed to characterize tissue microstructure, including how to estimate cell nuclear sizes. This approach provides a model to interpret diffusion data obtained from OGSE measurements that can be used for applications such as monitoring tumor response to treatment in vivo. PMID:19616979
da Silva Nunes, Wilian; de Oliveira, Caroline Silva; Alcantara, Glaucia Braz
2016-04-01
This study reports the chemical composition of five types of industrial frozen fruit pulps (acerola, cashew, grape, passion fruit and pineapple fruit pulps) and compares them with homemade pulps at two different stages of ripening. The fruit pulps were characterized by analyzing their metabolic profiles and determining their ethanol content using quantitative Nuclear Magnetic Resonance (qNMR). In addition, principal component analysis (PCA) was applied to extract more information from the NMR data. We detected ethanol in all industrial and homemade pulps; and acetic acid in cashew, grape and passion fruit industrial and homemade pulps. The ethanol content in some industrial pulps is above the level recommended by regulatory agencies and is near the levels of some post-ripened homemade pulps. This study demonstrates that qNMR can be used to rapidly detect ethanol content in frozen fruit pulps and food derivatives. Copyright © 2015 John Wiley & Sons, Ltd.
Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane
2016-09-01
Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer patients result in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated in a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology, can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Investigating fold structures of 2D materials by quantitative transmission electron microscopy.
Wang, Zhiwei; Zhang, Zengming; Liu, Wei; Wang, Zhong Lin
2017-04-01
We report an approach developed for deriving 3D structural information of 2D membrane folds based on the recently established quantitative transmission electron microscopy (TEM) in combination with density functional theory (DFT) calculations. Systematic multislice simulations reveal that the membrane folding leads to sufficiently strong electron scattering, which enables a precise determination of the bending radius. The image contrast also depends on the folding angles of 2D materials due to the variation of projection potentials, which, however, exerts a much smaller effect compared with the bending radii. DFT calculations show that folded edges are typically characteristic of (fractional) nanotubes, with the same curvature retained after energy optimization. Owing to the exclusion of the Stobbs-factor issue, numerical simulations were directly compared with the experimental measurements on an absolute contrast scale, which results in a successful determination of the bending radius of folded monolayer MoS2 films. The method should be applicable to characterizing all 2D membranes with 3D folding features. Copyright © 2017 Elsevier Ltd. All rights reserved.
SYN-JEM: A Quantitative Job-Exposure Matrix for Five Lung Carcinogens.
Peters, Susan; Vermeulen, Roel; Portengen, Lützen; Olsson, Ann; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans
2016-08-01
The use of measurement data in occupational exposure assessment allows more quantitative analyses of possible exposure-response relations. We describe a quantitative exposure assessment approach for five lung carcinogens (i.e. asbestos, chromium-VI, nickel, polycyclic aromatic hydrocarbons (via their proxy benzo(a)pyrene (BaP)) and respirable crystalline silica). A quantitative job-exposure matrix (JEM) was developed based on statistical modeling of large quantities of personal measurements. Empirical linear models were developed using personal occupational exposure measurements (n = 102,306) from Europe and Canada, as well as auxiliary information like job (industry), year of sampling, region, an a priori exposure rating of each job (none, low, and high exposed), sampling and analytical methods, and sampling duration. The model outcomes were used to create a JEM with a quantitative estimate of the level of exposure by job, year, and region. Decreasing time trends were observed for all agents between the 1970s and 2009, ranging from -1.2% per year for personal BaP and nickel exposures to -10.7% for asbestos (in the time period before an asbestos ban was implemented). Regional differences in exposure concentrations (adjusted for measured jobs, years of measurement, and sampling method and duration) varied by agent, ranging from a factor of 3.3 for chromium-VI up to a factor of 10.5 for asbestos. We estimated time-, job-, and region-specific exposure levels for four (asbestos, chromium-VI, nickel, and RCS) of the five lung carcinogens considered. Through statistical modeling of large amounts of personal occupational exposure measurement data we were able to derive a quantitative JEM to be used in community-based studies. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
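The reported time trends are multiplicative declines per year; for example, the asbestos trend of -10.7% per year compounds over a period as sketched below. This is a back-of-the-envelope illustration of how such a trend is applied, not part of the JEM model itself:

```python
def exposure_after(initial, annual_change, years):
    """Exposure level after `years` of a constant multiplicative annual trend.

    `annual_change` is fractional, e.g. -0.107 for -10.7% per year."""
    return initial * (1.0 + annual_change) ** years

# Asbestos: -10.7% per year (pre-ban trend reported above), over 20 years
remaining_asbestos = exposure_after(1.0, -0.107, 20)
# BaP and nickel: -1.2% per year, over the same 20 years
remaining_bap = exposure_after(1.0, -0.012, 20)
```

Over two decades the asbestos trend compounds to roughly a tenth of the starting level, while the shallower BaP/nickel trend leaves most of the exposure intact, which is why the JEM assigns estimates specific to job, year, and region rather than a single level per agent.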
Calibration of Sea Ice Motion from QuikSCAT with those from SSM/I and Buoy
NASA Technical Reports Server (NTRS)
Liu, Antony K.; Zhao, Yun-He; Zukor, Dorothy J. (Technical Monitor)
2001-01-01
QuikSCAT backscatter and DMSP SSM/I radiance data are used to derive sea ice motion for both the Arctic and Antarctic regions using a wavelet analysis method. This technique provides improved spatial coverage over the existing array of Arctic Ocean buoys and better temporal resolution than techniques utilizing satellite data from Synthetic Aperture Radar (SAR). Sea ice motion of the Arctic for the period from October 1999 to March 2000 derived from QuikSCAT and SSM/I data agrees quantitatively with that derived from ocean buoys; the ice-tracking results from QuikSCAT and SSM/I thus complement each other. The daily sea-ice drift results from QuikSCAT, SSM/I, and buoy data can then be merged to generate composite maps with more complete coverage of sea ice motion than those from a single data source. A series of composite sea ice motion maps for December 1999 shows that the major circulation patterns of sea ice motion change and shift significantly within every four days and are dominated by wind forcing. Sea-ice drift in the summer cannot be derived from NSCAT and SSM/I data. In late summer of 1999 (September), however, QuikSCAT data can provide good sea ice motion information in the Arctic. QuikSCAT can also provide at least partial sea ice motion information until June 15 in early summer 1999. For the Antarctic, a case study shows that sea ice motion derived from QuikSCAT data is predominantly forced by, and is consistent with, the wind field derived from QuikSCAT around the polar region. These calibrated/validated results indicate that the QuikSCAT, SSM/I, and buoy merged daily ice motion estimates are sufficiently accurate to identify and closely locate sea ice processes, and to improve our current knowledge of sea ice drift and related processes through data assimilation into coupled ocean-ice numerical models.
NASA Astrophysics Data System (ADS)
Storlazzi, C. D.; Field, M. E.; Bothner, M. H.
2011-03-01
Sediment traps are commonly used as standard tools for monitoring "sedimentation" in coral reef environments. In much of the literature where sediment traps were used to measure the effects of "sedimentation" on corals, it is clear from deployment descriptions and interpretations of the resulting data that information derived from sediment traps has frequently been misinterpreted or misapplied. Despite their widespread use in this setting, sediment traps do not provide quantitative information about "sedimentation" on coral surfaces. Traps can provide useful information about the relative magnitude of sediment dynamics if trap deployment standards are used. This conclusion is based first on a brief review of the state of knowledge of sediment trap dynamics, which has primarily focused on traps deployed high above the seabed in relatively deep water, followed by our understanding of near-bed sediment dynamics in shallow-water environments that characterize coral reefs. This overview is followed by the first synthesis of near-bed sediment trap data collected with concurrent hydrodynamic information in coral reef environments. This collective information is utilized to develop nine protocols for using sediment traps in coral reef environments, which focus on trap parameters that researchers can control such as trap height (H), trap mouth diameter (D), the height of the trap mouth above the substrate (z0), and the spacing between traps. The hydrodynamic behavior of sediment traps and the limitations of data derived from these traps should be forefront when interpreting sediment trap data to infer sediment transport processes in coral reef environments.
NASA Astrophysics Data System (ADS)
Takegami, Shigehiko; Kitamura, Keisuke; Ohsugi, Mayuko; Ito, Aya; Kitade, Tatsuya
2015-06-01
In order to quantitatively examine the lipophilicity of the widely used organophosphorus pesticides (OPs) chlorfenvinphos (CFVP), chlorpyrifos-methyl (CPFM), diazinon (DZN), fenitrothion (FNT), fenthion (FT), isofenphos (IFP), profenofos (PFF) and pyraclofos (PCF), their partition coefficient (Kp) values between phosphatidylcholine (PC) small unilamellar vesicles (SUVs) and water (liposome-water system) were determined by second-derivative spectrophotometry. The second-derivative spectra of these OPs in the presence of PC SUV showed a bathochromic shift with increasing PC concentration and distinct derivative isosbestic points, demonstrating the complete elimination of the residual background signal effects that were observed in the absorption spectra. The Kp values were calculated from the second-derivative intensity change induced by addition of PC SUV and obtained with good precision (R.S.D. below 10%). The Kp values were in the order CPFM > FT > PFF > PCF > IFP > CFVP > FNT ⩾ DZN and did not show a linear correlation with the reported partition coefficients obtained using an n-octanol-water system (R2 = 0.530). The results also quantitatively clarified the effect of chemical-group substitution in OPs on their lipophilicity. Since the partition coefficient for the liposome-water system is more effective for modeling the quantitative structure-activity relationship than that for the n-octanol-water system, the obtained results are toxicologically important for estimating the accumulation of these OPs in human cell membranes.
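Extracting Kp from the second-derivative intensity change can be sketched with the common partition model and its double-reciprocal linearisation. The model form ΔD = ΔDmax·Kp·[L] / ([W] + Kp·[L]) with molar water concentration [W] = 55.3 M is a standard assumption for liposome-water partitioning, not necessarily the paper's exact formulation:

```python
import numpy as np

W_MOLAR = 55.3  # molar concentration of water, M

def fit_kp(lipid_conc, delta_d):
    """Estimate the liposome-water partition coefficient Kp from the
    second-derivative intensity change delta_d induced by added lipid.
    Uses the double-reciprocal form of
        dD = dDmax * Kp*[L] / ([W] + Kp*[L]),
    i.e. 1/dD is linear in 1/[L]. A sketch, not the authors' code."""
    slope, intercept = np.polyfit(1.0 / lipid_conc, 1.0 / delta_d, 1)
    d_max = 1.0 / intercept
    kp = W_MOLAR * intercept / slope
    return kp, d_max

# Synthetic check with a known Kp (values arbitrary)
kp_true, dmax_true = 2.0e4, 0.05
L = np.array([0.2, 0.5, 1.0, 2.0, 4.0]) * 1e-3          # PC concentration, M
dD = dmax_true * kp_true * L / (W_MOLAR + kp_true * L)
kp_est, dmax_est = fit_kp(L, dD)
```

With noise-free synthetic data the linearised fit recovers Kp exactly; with real spectra a nonlinear fit of the same model is usually preferred.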
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Tony, E-mail: tc282@nau.edu; Nielsen, Erik, E-mail: erik.nielsen@nau.edu; Auberle, William, E-mail: william.auberle@nau.edu
2013-01-15
The environmental impact assessment (EIA) has been a tool for decision makers since the enactment of the National Environmental Policy Act (NEPA). Since that time, few analyses have been performed to verify the quality of information and content within EIAs. High quality information within assessments is vital for decision makers, stakeholders, and the public to understand the potential impact of proposed actions on the ecosystem and wildlife species. Low quality information has been a major cause of litigation and economic loss. Since 1999, wind energy development has seen exponential growth with unknown levels of impact on wildlife species, in particular bird and bat species. The purpose of this article is to: (1) develop, validate, and apply a quantitative index to review avian/bat assessment quality for wind energy EIAs; and (2) assess the trends and status of avian/bat assessment quality in a sample of wind energy EIAs. This research presents the development and testing of the Avian and Bat Assessment Quality Index (ABAQI), a new approach to quantify the information quality of ecological assessments within wind energy development EIAs in relation to avian and bat species, based on review areas and factors derived from 23 state wind/wildlife siting guidance documents. The ABAQI was tested through a review of 49 publicly available EIA documents and validated by identifying high variation in avian and bat assessment quality for wind energy developments. Of all the reviewed EIAs, 66% failed to provide high levels of preconstruction avian and bat survey information compared to recommended factors from state guidelines. This suggests the need for greater consistency in recommended guidelines across states, and mandatory compliance by EIA preparers, to avoid possible habitat and species loss, wind energy development shutdowns, and future lawsuits.
Highlights: (1) We developed, validated, and applied a quantitative index to review avian/bat assessment quality for wind energy EIAs. (2) We assessed the trends and status of avian/bat assessment quality in a sample of wind energy EIAs. (3) Applying the index to 49 EIA documents identified high variation in assessment quality for wind energy developments. (4) Of the reviewed EIAs, 66% provided inadequate preconstruction avian and bat survey information.
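A quality index of this kind is essentially a weighted score over review factors, normalised to a fixed scale. A schematic sketch; the factor names, 0-2 scoring scale, and equal weights are illustrative only, not the published ABAQI definition:

```python
def assessment_quality_index(scores, weights=None):
    """Compute a simple quality index in [0, 100] from per-factor review
    scores (0 = absent, 1 = partial, 2 = adequate). The real ABAQI is
    built from review areas and factors derived from 23 state siting
    guidance documents; everything here is a stand-in."""
    if weights is None:
        weights = [1.0] * len(scores)
    max_score = 2.0 * sum(weights)               # best possible weighted total
    total = sum(s * w for s, w in zip(scores, weights))
    return 100.0 * total / max_score

# Hypothetical EIA scored on four avian/bat review factors
factors = {"preconstruction surveys": 2, "species lists": 1,
           "migration data": 0, "mortality estimates": 1}
index = assessment_quality_index(list(factors.values()))
```

Scoring many documents with the same rubric is what makes the between-EIA variation quantifiable.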
Yu, S; Gao, S; Gan, Y; Zhang, Y; Ruan, X; Wang, Y; Yang, L; Shi, J
2016-04-01
Quantitative structure-property relationship modelling can be a valuable alternative method to replace or reduce experimental testing. In particular, some endpoints such as octanol-water (KOW) and organic carbon-water (KOC) partition coefficients of polychlorinated biphenyls (PCBs) are easier to predict, and various models have already been developed. In this paper, two different methods, namely multiple linear regression based on descriptors generated using Dragon software, and hologram quantitative structure-activity relationships, were employed to predict suspended particulate matter (SPM) derived log KOC and generator column, shake flask and slow stirring method derived log KOW values of 209 PCBs. The predictive ability of the derived models was validated using a test set. The performances of all these models were compared with EPI Suite™ software. The results indicated that the proposed models were robust and satisfactory, and could provide feasible and promising tools for the rapid assessment of the SPM derived log KOC and generator column, shake flask and slow stirring method derived log KOW values of PCBs.
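The multiple-linear-regression branch of this workflow can be sketched with ordinary least squares. The descriptor matrix below is synthetic stand-in data, not Dragon descriptors for the 209 PCB congeners:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a descriptor matrix (the study used Dragon-derived
# descriptors); y mimics experimental log KOW values.
n_train, n_desc = 150, 3
X = rng.normal(size=(n_train, n_desc))                 # standardised descriptors
true_coef = np.array([0.9, -0.4, 0.2])                 # arbitrary "true" effects
y = 5.0 + X @ true_coef + rng.normal(scale=0.1, size=n_train)

A = np.column_stack([np.ones(n_train), X])             # intercept + descriptors
coef, *_ = np.linalg.lstsq(A, y, rcond=None)           # MLR fit

y_hat = A @ coef
r2 = 1.0 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

In the paper the models are additionally validated on a held-out test set and benchmarked against EPI Suite; the fit above is training-set only.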
PTMScout, a Web Resource for Analysis of High Throughput Post-translational Proteomics Studies*
Naegle, Kristen M.; Gymrek, Melissa; Joughin, Brian A.; Wagner, Joel P.; Welsch, Roy E.; Yaffe, Michael B.; Lauffenburger, Douglas A.; White, Forest M.
2010-01-01
The rate of discovery of post-translational modification (PTM) sites is increasing rapidly and is significantly outpacing our biological understanding of the function and regulation of those modifications. To help meet this challenge, we have created PTMScout, a web-based interface for viewing, manipulating, and analyzing high throughput experimental measurements of PTMs in an effort to facilitate biological understanding of protein modifications in signaling networks. PTMScout is constructed around a custom database of PTM experiments and contains information from external protein and post-translational resources, including gene ontology annotations, Pfam domains, and Scansite predictions of kinase and phosphopeptide binding domain interactions. PTMScout functionality comprises data set comparison tools, data set summary views, and tools for protein assignments of peptides identified by mass spectrometry. Analysis tools in PTMScout focus on informed subset selection via common criteria and on automated hypothesis generation through subset labeling derived from identification of statistically significant enrichment of other annotations in the experiment. Subset selection can be applied through the PTMScout flexible query interface available for quantitative data measurements and data annotations as well as an interface for importing data set groupings by external means, such as unsupervised learning. We exemplify the various functions of PTMScout in application to data sets that contain relative quantitative measurements as well as data sets lacking quantitative measurements, producing a set of interesting biological hypotheses. PTMScout is designed to be a widely accessible tool, enabling generation of multiple types of biological hypotheses from high throughput PTM experiments and advancing functional assignment of novel PTM sites. PTMScout is available at http://ptmscout.mit.edu. PMID:20631208
Rusyn, Ivan; Sedykh, Alexander; Guyton, Kathryn Z.; Tropsha, Alexander
2012-01-01
Quantitative structure-activity relationship (QSAR) models are widely used for in silico prediction of in vivo toxicity of drug candidates or environmental chemicals, adding value to candidate selection in drug development or in a search for less hazardous and more sustainable alternatives for chemicals in commerce. The development of traditional QSAR models is enabled by numerical descriptors representing the inherent chemical properties that can be easily defined for any number of molecules; however, traditional QSAR models often have limited predictive power due to the lack of data and complexity of in vivo endpoints. Although it has been indeed difficult to obtain experimentally derived toxicity data on a large number of chemicals in the past, the results of quantitative in vitro screening of thousands of environmental chemicals in hundreds of experimental systems are now available and continue to accumulate. In addition, publicly accessible toxicogenomics data collected on hundreds of chemicals provide another dimension of molecular information that is potentially useful for predictive toxicity modeling. These new characteristics of molecular bioactivity arising from short-term biological assays, i.e., in vitro screening and/or in vivo toxicogenomics data can now be exploited in combination with chemical structural information to generate hybrid QSAR–like quantitative models to predict human toxicity and carcinogenicity. Using several case studies, we illustrate the benefits of a hybrid modeling approach, namely improvements in the accuracy of models, enhanced interpretation of the most predictive features, and expanded applicability domain for wider chemical space coverage. PMID:22387746
Iancu, Ovidiu D; Darakjian, Priscila; Kawane, Sunita; Bottomly, Daniel; Hitzemann, Robert; McWeeney, Shannon
2012-01-01
Complex Mus musculus crosses, e.g., heterogeneous stock (HS), provide increased resolution for quantitative trait loci detection. However, increased genetic complexity challenges detection methods, with discordant results due to low data quality or complex genetic architecture. We quantified the impact of these factors across three mouse crosses and two different detection methods, identifying procedures that greatly improve detection quality. Importantly, HS populations have complex genetic architectures not fully captured by the whole genome kinship matrix, calling for incorporating chromosome-specific relatedness information. We analyze three increasingly complex crosses, using gene expression levels as quantitative traits. The three crosses were an F(2) intercross, a HS formed by crossing four inbred strains (HS4), and a HS (HS-CC) derived from the eight lines found in the collaborative cross. Brain (striatum) gene expression and genotype data were obtained using the Illumina platform. We found large disparities between methods, with concordance varying as genetic complexity increased; this problem was more acute for probes with distant regulatory elements (trans). A suite of data filtering steps resulted in substantial increases in reproducibility. Genetic relatedness between samples generated an overabundance of detected eQTLs; an adjustment procedure that includes the kinship matrix attenuates this problem. However, we find that relatedness between individuals is not evenly distributed across the genome; information from distinct chromosomes results in relatedness structure different from the whole genome kinship matrix. Shared polymorphisms from distinct chromosomes collectively affect expression levels, confounding eQTL detection. We suggest that considering chromosome-specific relatedness can result in improved eQTL detection.
Abortion and mental health: quantitative synthesis and analysis of research published 1995-2009.
Coleman, Priscilla K
2011-09-01
Given the methodological limitations of recently published qualitative reviews of abortion and mental health, a quantitative synthesis was deemed necessary to represent more accurately the published literature and to provide clarity to clinicians. To measure the association between abortion and indicators of adverse mental health, with subgroup effects calculated based on comparison groups (no abortion, unintended pregnancy delivered, pregnancy delivered) and particular outcomes. A secondary objective was to calculate population-attributable risk (PAR) statistics for each outcome. After the application of methodologically based selection criteria and extraction rules to minimise bias, the sample comprised 22 studies, 36 measures of effect and 877 181 participants (163 831 experienced an abortion). Random effects pooled odds ratios were computed using adjusted odds ratios from the original studies and PAR statistics were derived from the pooled odds ratios. Women who had undergone an abortion experienced an 81% increased risk of mental health problems, and nearly 10% of the incidence of mental health problems was shown to be attributable to abortion. The strongest subgroup estimates of increased risk occurred when abortion was compared with term pregnancy and when the outcomes pertained to substance use and suicidal behaviour. This review offers the largest quantitative estimate of mental health risks associated with abortion available in the world literature. Calling into question the conclusions from traditional reviews, the results revealed a moderate to highly increased risk of mental health problems after abortion. Consistent with the tenets of evidence-based medicine, this information should inform the delivery of abortion services.
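The two computations at the core of such a synthesis, random-effects pooling of study odds ratios and population-attributable risk, can be sketched as follows. The study values are hypothetical, and the DerSimonian-Laird estimator with Levin's PAR formula is a standard choice rather than necessarily the review's exact procedure:

```python
import math

def pool_random_effects(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of study log odds ratios.
    Returns the pooled OR on the original (exponentiated) scale."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, log_ors)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_ors))
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_ors)) / sum(w_star)
    return math.exp(pooled)

def population_attributable_risk(exposure_prev, odds_ratio):
    """Levin's formula, treating the OR as an approximation of relative risk."""
    return exposure_prev * (odds_ratio - 1) / (1 + exposure_prev * (odds_ratio - 1))

ors = [1.6, 1.9, 2.1, 1.7]                       # hypothetical study ORs
variances = [0.04, 0.06, 0.05, 0.03]             # variances of the log ORs
pooled_or = pool_random_effects([math.log(o) for o in ors], variances)
par = population_attributable_risk(0.19, pooled_or)   # ~19% exposure prevalence
```

The pooled estimate always lies between the smallest and largest study effects; the PAR then converts it into the fraction of outcome incidence attributable to exposure.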
CT-derived Biomechanical Metrics Improve Agreement Between Spirometry and Emphysema
Bhatt, Surya P.; Bodduluri, Sandeep; Newell, John D.; Hoffman, Eric A.; Sieren, Jessica C.; Han, Meilan K.; Dransfield, Mark T.; Reinhardt, Joseph M.
2016-01-01
Rationale and Objectives Many COPD patients have marked discordance between FEV1 and degree of emphysema on CT. Biomechanical differences between these patients have not been studied. We aimed to identify reasons for the discordance between CT and spirometry in some patients with COPD. Materials and Methods Subjects with GOLD stage I–IV from a large multicenter study (COPDGene) were arranged by percentiles of %predicted FEV1 and emphysema on CT. Three categories were created using differences in percentiles: Catspir with predominant airflow obstruction/minimal emphysema, CatCT with predominant emphysema/minimal airflow obstruction, and Catmatched with matched FEV1 and emphysema. Image registration was used to derive Jacobian determinants, a measure of lung elasticity, anisotropy and strain tensors, to assess biomechanical differences between groups. Regression models were created with the above categories as outcome variable, adjusting for demographics, scanner type, quantitative CT-derived emphysema, gas trapping, and airway thickness (Model 1), and after adding biomechanical CT metrics (Model 2). Results Jacobian determinants, anisotropy and strain tensors were strongly associated with FEV1. With Catmatched as control, Model 2 predicted Catspir and CatCT better than Model 1 (Akaike Information Criterion, AIC 255.8 vs. 320.8). In addition to demographics, the strongest independent predictors of FEV1 were Jacobian mean (β = 1.60, 95% CI = 1.16 to 1.98; p < 0.001), coefficient of variation (CV) of Jacobian (β = 1.45, 95% CI = 0.86 to 2.03; p < 0.001) and CV strain (β = 1.82, 95% CI = 0.68 to 2.95; p = 0.001). CVs of Jacobian and strain are both potential markers of biomechanical lung heterogeneity. Conclusions CT-derived measures of lung mechanics improve the link between quantitative CT and spirometry, offering the potential for new insights into the linkage between regional parenchymal destruction and global decrement in lung function in COPD patients. PMID:27055745
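The Jacobian determinant used here is computed from the deformation field produced by image registration: J > 1 marks local expansion, J < 1 local contraction, and the coefficient of variation of J summarises biomechanical heterogeneity. A 2-D sketch (the study works on 3-D CT volumes):

```python
import numpy as np

def jacobian_determinant(disp_y, disp_x, spacing=1.0):
    """Voxel-wise Jacobian determinant of a 2-D displacement field:
    J = det(I + grad(u)). The 3-D analogue used for lung CT registration
    adds a third component; this 2-D version is only a sketch."""
    dudy, dudx = np.gradient(disp_y, spacing)
    dvdy, dvdx = np.gradient(disp_x, spacing)
    return (1.0 + dudy) * (1.0 + dvdx) - dudx * dvdy

# Uniform 10% expansion in both directions -> J = 1.1 * 1.1 = 1.21 everywhere
y, x = np.mgrid[0:8, 0:8]
J = jacobian_determinant(0.1 * y, 0.1 * x)
mean_J = J.mean()
cv_J = J.std() / mean_J        # CV of Jacobian as a heterogeneity index
```

For a spatially uniform deformation the CV is zero; regional destruction in emphysema produces a spatially varying J and hence a larger CV.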
Meyners, Christian; Baud, Matthias G J; Fuchter, Matthew J; Meyer-Almes, Franz-Josef
2014-09-01
Performing kinetic studies on protein-ligand interactions provides important information on complex formation and dissociation. Besides kinetic parameters such as association rates and residence times, kinetic experiments also reveal insights into reaction mechanisms. Exploiting intrinsic tryptophan fluorescence, a parallelized high-throughput Förster resonance energy transfer (FRET)-based reporter displacement assay with very low protein consumption was developed to enable the large-scale kinetic characterization of the binding of ligands to recombinant human histone deacetylases (HDACs) and a bacterial histone deacetylase-like amidohydrolase (HDAH) from Bordetella/Alcaligenes. For the binding of trichostatin A (TSA), suberoylanilide hydroxamic acid (SAHA), and two other SAHA derivatives to HDAH, two different modes of action, simple one-step binding and a two-step mechanism comprising initial binding and induced fit, were verified. In contrast to HDAH, all compounds bound to human HDAC1, HDAC6, and HDAC8 through a two-step mechanism. A quantitative view of the inhibitor-HDAC systems revealed two types of interaction, fast binding and slow dissociation. We provide arguments for the thesis that the relationship between quantitative kinetic and mechanistic information and the chemical structures of compounds will serve as a valuable tool for drug optimization. Copyright © 2014 Elsevier Inc. All rights reserved.
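One-step and two-step (induced-fit) binding are typically distinguished by how the observed rate constant kobs depends on ligand concentration: linear for one-step binding, saturating for induced fit. A sketch with assumed rate constants, not values from the paper:

```python
import numpy as np

L = np.logspace(-7, -4, 12)          # ligand concentration, M (hypothetical range)

# One-step binding: kobs = kon*[L] + koff rises linearly with [L]
kon, koff = 1e5, 0.01                # M^-1 s^-1 and s^-1, assumed values
kobs_one = kon * L + koff

# Induced fit: kobs = k_minus2 + k2*[L]/(K1 + [L]) saturates at high [L]
K1, k2, k_minus2 = 5e-6, 0.5, 0.02   # assumed values
kobs_two = k_minus2 + k2 * L / (K1 + L)

def linearity_r2(x, y):
    """R^2 of a straight-line fit; near 1 is consistent with one-step binding."""
    coef = np.polyfit(x, y, 1)
    resid = y - np.polyval(coef, x)
    return 1.0 - resid.var() / y.var()

r2_one = linearity_r2(L, kobs_one)
r2_two = linearity_r2(L, kobs_two)
```

In practice the diagnostic is run on kobs values extracted from the displacement-assay progress curves; the saturation of kobs is the signature of the induced-fit step.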
Stochastic time series analysis of fetal heart-rate variability
NASA Astrophysics Data System (ADS)
Shariati, M. A.; Dripps, J. H.
1990-06-01
Fetal heart rate (FHR) is one of the important features of fetal biophysical activity, and its long-term monitoring is used for the antepartum (the period of pregnancy before labour) assessment of fetal well-being. As yet, however, no successful method has been proposed to quantitatively represent the variety of random non-white patterns seen in FHR. The objective of this paper is to address this issue. In this study the Box-Jenkins method of model identification and diagnostic checking was used on phonocardiographically derived (averaged) FHR time series. Models remained exclusively autoregressive (AR). Kalman filtering in conjunction with a maximum likelihood estimation technique forms the parametric estimator. Diagnostics performed on the residuals indicated that a second-order model may be adequate in capturing the type of variability observed in 1- to 2-min data windows of FHR. The scheme may be viewed as a means of data reduction of a highly redundant information source, allowing much more efficient transmission of FHR information from remote locations to places with the facilities and expertise for closer analysis. The extracted parameters are intended to reflect numerically the important FHR features that are normally picked up visually by experts for their assessments. As a result, long-term FHR recorded during the antepartum period could be screened quantitatively for the detection of patterns considered normal or abnormal.
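The AR identification step can be sketched with the Yule-Walker equations; this stand-in omits the Kalman-filter/maximum-likelihood estimator actually used in the study:

```python
import numpy as np

def yule_walker_ar2(x):
    """Estimate AR(2) coefficients from a series via the Yule-Walker
    equations, a minimal stand-in for the Box-Jenkins identification
    step: solve R * phi = r using sample autocovariances."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = [np.dot(x[:n - k], x[k:]) / n for k in range(3)]  # autocovariances r0..r2
    R = np.array([[r[0], r[1]],
                  [r[1], r[0]]])
    return np.linalg.solve(R, np.array([r[1], r[2]]))

# Simulate a known AR(2) process and recover its coefficients
rng = np.random.default_rng(1)
phi_true = (0.5, -0.3)
x = np.zeros(5000)
for t in range(2, len(x)):
    x[t] = phi_true[0] * x[t - 1] + phi_true[1] * x[t - 2] + rng.normal()
phi_hat = yule_walker_ar2(x)
```

The two fitted coefficients (plus the residual variance) are exactly the kind of compact numeric summary the scheme transmits in place of the raw FHR trace.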
The Effect of Radiation on Selected Photographic Film
NASA Technical Reports Server (NTRS)
Slater, Richard; Kinard, John; Firsov, Ivan
2000-01-01
We conducted this film test to evaluate several manufacturers' photographic films for their ability to acquire imagery on the International Space Station. We selected 25 motion picture, photographic slide, and negative films from three different film manufacturers. We based this selection on the fact that their films ranked highest in other similar film tests, and on their general acceptance by the international community. This test differed from previous tests because the entire evaluation process leading up to the final selection was based on information derived after the original flight film was scanned to a digital file. Previously conducted tests were evaluated entirely on the basis of 8 x 10 prints produced from the film either directly or through the internegative process. This new evaluation procedure provided accurate quantitative data on granularity and contrast from the digital data. This test did not try to define which film was best visually, since such judgments are too often based on personal preference. However, the test results did group the films as good, marginal, or unacceptable. We developed, and included in this report, a template containing quantitative, graphical, and visual information for each film. These templates should be sufficient for comparing the different films tested and subsequently selecting a film or films to be used for experiments and general documentation on the International Space Station.
NASA Astrophysics Data System (ADS)
Su, Xin; Fang, Shaoyin; Zhang, Daosen; Zhang, Qinnan; He, Yingtian; Lu, Xiaoxu; Liu, Shengde; Zhong, Liyun
2015-12-01
Mesenchymal stem cells (MSCs) differentiate into islet-like cells, providing a possible solution for type I diabetes treatment. To uncover the precise molecular mechanism of the directional differentiation of MSCs into islet-like cells, information on biomolecular composition and structural conformation during MSC differentiation is required. Because islet-like cells lack specific surface markers, the commonly employed immunostaining technique is not suitable for their identification, physical separation, and enrichment. Combining Raman spectroscopic data, a biochemical component analysis with improved fitting accuracy, and a multiple-peak fitting approach, we identified the quantitative biochemical changes and the intensity changes of Raman peaks that mark the differentiation of MSCs into islet-like cells. Along with increases in protein and glycogen content, and decreases in deoxyribonucleic acid and ribonucleic acid content, in islet-like cells relative to MSCs, it was found that a characteristic peak of insulin (665 cm-1) has twice the intensity in islet-like cells relative to MSCs, indicating that the differentiation of MSCs into islet-like cells was successful. Importantly, these Raman signatures provide useful information on the structural and pathological states during MSC differentiation and help in developing noninvasive, label-free Raman sorting methods for stem cells and their lineages.
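With peak centers and widths held fixed, the multiple-peak fitting step becomes linear in the band amplitudes, so least squares recovers the per-band intensities directly. A sketch with two Gaussian bands; the second band position, the shared width, and the 1:2 amplitude ratio are illustrative, not the paper's fitted values:

```python
import numpy as np

def fit_peak_amplitudes(shifts, spectrum, centers, width):
    """Decompose a baseline-corrected spectrum into Gaussian bands with
    fixed centers and a shared width; only the amplitudes are fitted,
    so the problem is linear least squares."""
    basis = np.exp(-0.5 * ((shifts[:, None] - np.asarray(centers)[None, :]) / width) ** 2)
    amps, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
    return amps

# Synthetic spectrum: an "insulin" band near 665 cm^-1 plus a neighbouring band
shifts = np.linspace(600, 760, 400)
centers, width = [665.0, 720.0], 8.0
true_amps = np.array([1.0, 2.0])
spectrum = (true_amps * np.exp(-0.5 * ((shifts[:, None] - np.array(centers)) / width) ** 2)).sum(axis=1)
amps = fit_peak_amplitudes(shifts, spectrum, centers, width)
```

Comparing the fitted 665 cm^-1 amplitude between MSC and islet-like spectra is the kind of intensity ratio reported above; real spectra additionally need baseline correction and, usually, refinement of centers and widths.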
A methodology to estimate uncertainty for emission projections through sensitivity analysis.
Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación
2015-04-01
Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically the 12 highest-emitting sectors for greenhouse gas and air pollutant emissions. Examples of the methodology's application to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend under a low economic-growth perspective and official figures for 2010, showing very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
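The sensitivity-based band amounts to recomputing the projection for high/low values of each driving factor and taking the envelope of the resulting trajectories. A schematic sketch with a toy constant-growth model and made-up perturbations; real driving factors (GDP, fuel mix, livestock numbers, ...) are sector-specific:

```python
import numpy as np
from itertools import product

def emission_projection(e0, growth, years):
    """Toy projection: emissions grow at a constant annual rate."""
    return e0 * (1.0 + growth) ** np.arange(years + 1)

def uncertainty_band(e0, growth, years, de0=0.05, dg=0.01):
    """Nonstatistical band from sensitivity analysis: rerun the projection
    with each driving factor at its low/central/high value and take the
    min/max envelope. The perturbations (+/-5% base-year emissions,
    +/-1 percentage point growth) are illustrative, not calibrated."""
    runs = np.array([
        emission_projection(e0 * (1 + s_e0 * de0), growth + s_g * dg, years)
        for s_e0, s_g in product((-1, 0, 1), repeat=2)
    ])
    return runs.min(axis=0), runs.max(axis=0)

central = emission_projection(100.0, 0.02, 10)     # e.g. kt of pollutant
low, high = uncertainty_band(100.0, 0.02, 10)
```

Because the central run is included in the factor grid, the band always contains the central projection; the band width grows with the horizon as the growth-rate perturbation compounds.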
Recommended approaches in the application of ...
ABSTRACT: Only a fraction of chemicals in commerce have been fully assessed for their potential hazards to human health due to difficulties involved in conventional regulatory tests. It has recently been proposed that quantitative transcriptomic data can be used to determine a benchmark dose (BMD) and estimate a point of departure (POD). Several studies have shown that transcriptional PODs correlate with PODs derived from analysis of pathological changes, but there is no consensus on how the genes used to derive a transcriptional POD should be selected. Because of the very large number of unrelated genes in gene expression data, the process of selecting subsets of informative genes is a major challenge. We used published microarray data from studies on rats exposed orally to multiple doses of six chemicals for 5, 14, 28, and 90 days. We evaluated eight different approaches to selecting genes for POD derivation and compared them to three previously proposed approaches. The transcriptional BMDs derived using these 11 approaches were compared with PODs derived from apical data that might be used in a human health risk assessment. We found that transcriptional benchmark dose values for all 11 approaches were remarkably aligned with different apical PODs, while a subset of between 3 and 8 of the approaches met standard statistical criteria across the 5-, 14-, 28-, and 90-day time points and thus qualify as effective estimates of apical PODs. Our r
Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang
2017-01-01
Objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is important for the treatment monitoring of brain disorders. Therefore, a computer aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach achieves 93.4% accuracy in localizing recovered regions. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
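Once the longitudinal volumes are registered and intensity-normalised, the change-rate map itself reduces to a simple voxel-wise computation. A sketch; the masking rule for near-zero baseline voxels is an assumption, and the CAA pipeline's normalisation is not reproduced:

```python
import numpy as np

def change_rate_map(baseline, followup, mask_threshold=0.0):
    """Voxel-wise change-rate map between two registered, normalised
    SPECT volumes: CRM = (followup - baseline) / baseline.
    Voxels with baseline at or below the threshold are left NaN."""
    baseline = np.asarray(baseline, dtype=float)
    followup = np.asarray(followup, dtype=float)
    crm = np.full_like(baseline, np.nan)
    valid = baseline > mask_threshold
    crm[valid] = (followup[valid] - baseline[valid]) / baseline[valid]
    return crm

# Toy 2x2 "volume": 30% recovery in one voxel, 20% decline in another
base = np.array([[100.0, 80.0], [0.0, 50.0]])
post = np.array([[130.0, 80.0], [0.0, 40.0]])
crm = change_rate_map(base, post)
```

Thresholding the CRM (e.g. change rate above 20%, per the simulation results) then segments the recovered regions from which the quantitative indexes are derived.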
Unbiased Quantitative Models of Protein Translation Derived from Ribosome Profiling Data
Gritsenko, Alexey A.; Hulsman, Marc; Reinders, Marcel J. T.; de Ridder, Dick
2015-01-01
Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of protein synthesis. Although a number of computational models of the process have been proposed, their application is limited by the assumptions they make. Ribosome profiling (RP), a relatively new sequencing-based technique capable of recording snapshots of the locations of actively translating ribosomes, is a promising source of information for deriving unbiased data-driven translation models. However, quantitative analysis of RP data is challenging due to high measurement variance and the inability to discriminate between the number of ribosomes measured on a gene and their speed of translation. We propose a solution in the form of a novel multi-scale interpretation of RP data that allows for deriving models with translation dynamics extracted from the snapshots. We demonstrate the usefulness of this approach by simultaneously determining for the first time per-codon translation elongation and per-gene translation initiation rates of Saccharomyces cerevisiae from RP data for two versions of the Totally Asymmetric Exclusion Process (TASEP) model of translation. We do this in an unbiased fashion, by fitting the models using only RP data with a novel optimization scheme based on Monte Carlo simulation to keep the problem tractable. The fitted models match the data significantly better than existing models and their predictions show better agreement with several independent protein abundance datasets than existing models. Results additionally indicate that the tRNA pool adaptation hypothesis is incomplete, with evidence suggesting that tRNA post-transcriptional modifications and codon context may play a role in determining codon elongation rates. PMID:26275099
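The TASEP model class fitted in this work can be illustrated with a minimal open-boundary Monte Carlo simulation: ribosomes (particles) enter a lattice of codons (sites), hop forward when the next site is free, and exit at the end. The rates below are arbitrary, and this is the model class only, not the paper's RP-fitting code:

```python
import random

def tasep_current(n_sites=100, alpha=0.25, beta=0.6, steps=200000, seed=7):
    """Random-sequential open-boundary TASEP: particles enter at rate alpha,
    hop right (rate 1) if the next site is empty, and exit at rate beta.
    Returns the measured current (exits per site-update sweep)."""
    random.seed(seed)
    lattice = [0] * n_sites
    exits = 0
    for _ in range(steps):
        i = random.randrange(n_sites + 1)            # pick one of n_sites+1 bonds
        if i == 0:                                   # entry bond
            if lattice[0] == 0 and random.random() < alpha:
                lattice[0] = 1
        elif i == n_sites:                           # exit bond
            if lattice[-1] == 1 and random.random() < beta:
                lattice[-1] = 0
                exits += 1
        elif lattice[i - 1] == 1 and lattice[i] == 0:  # bulk hop
            lattice[i - 1], lattice[i] = 0, 1
    return exits * (n_sites + 1) / steps

J = tasep_current()
```

For alpha < 1/2 and alpha < beta the system sits in the low-density phase, where the steady-state current should approach alpha(1 - alpha), here about 0.19; fitting per-codon hop rates to RP data, as in the paper, replaces the uniform bulk rate with codon-specific elongation rates.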
Unbiased Quantitative Models of Protein Translation Derived from Ribosome Profiling Data.
Gritsenko, Alexey A; Hulsman, Marc; Reinders, Marcel J T; de Ridder, Dick
2015-08-01
Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of protein synthesis. Although a number of computational models of the process have been proposed, their application is limited by the assumptions they make. Ribosome profiling (RP), a relatively new sequencing-based technique capable of recording snapshots of the locations of actively translating ribosomes, is a promising source of information for deriving unbiased data-driven translation models. However, quantitative analysis of RP data is challenging due to high measurement variance and the inability to discriminate between the number of ribosomes measured on a gene and their speed of translation. We propose a solution in the form of a novel multi-scale interpretation of RP data that allows for deriving models with translation dynamics extracted from the snapshots. We demonstrate the usefulness of this approach by simultaneously determining for the first time per-codon translation elongation and per-gene translation initiation rates of Saccharomyces cerevisiae from RP data for two versions of the Totally Asymmetric Exclusion Process (TASEP) model of translation. We do this in an unbiased fashion, by fitting the models using only RP data with a novel optimization scheme based on Monte Carlo simulation to keep the problem tractable. The fitted models match the data significantly better than existing models and their predictions show better agreement with several independent protein abundance datasets than existing models. Results additionally indicate that the tRNA pool adaptation hypothesis is incomplete, with evidence suggesting that tRNA post-transcriptional modifications and codon context may play a role in determining codon elongation rates.
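The TASEP dynamics described in this abstract can be illustrated with a minimal Monte Carlo simulation: ribosomes load at the first codon at an initiation rate and hop forward at per-codon elongation rates, blocked by the ribosome ahead. The sketch below is a toy version only (all rates are invented; the paper's fitted models also handle ribosome footprints and a far larger state space):

```python
import random

def simulate_tasep(rates, init_rate, steps=200000, seed=0):
    """Toy Monte Carlo simulation of the Totally Asymmetric Exclusion Process.

    rates[i] is the hop (elongation) probability out of codon i per attempt;
    init_rate is the probability that a ribosome loads at codon 0 when empty.
    Returns the time-averaged occupancy (ribosome density) of each codon.
    """
    rng = random.Random(seed)
    n = len(rates)
    occupied = [False] * n
    occupancy = [0.0] * n
    for _ in range(steps):
        site = rng.randrange(-1, n)  # -1 selects an initiation attempt
        if site == -1:
            if not occupied[0] and rng.random() < init_rate:
                occupied[0] = True
        elif occupied[site] and rng.random() < rates[site]:
            if site == n - 1:                # last codon: ribosome terminates
                occupied[site] = False
            elif not occupied[site + 1]:     # hop forward only if next codon is free
                occupied[site] = False
                occupied[site + 1] = True
        for i, occ in enumerate(occupied):
            occupancy[i] += occ
    return [c / steps for c in occupancy]

# A slow codon (index 1) makes ribosomes queue behind it:
densities = simulate_tasep(rates=[0.9, 0.2, 0.9, 0.9, 0.9], init_rate=0.5)
```

As expected for TASEP, density piles up at and upstream of the slow codon while downstream codons stay sparse, which is the kind of behaviour the fitted elongation rates must reproduce.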
Recent development in preparation of European soil hydraulic maps
NASA Astrophysics Data System (ADS)
Toth, B.; Weynants, M.; Pasztor, L.; Hengl, T.
2017-12-01
Reliable quantitative information on soil hydraulic properties is crucial for modelling hydrological, meteorological, ecological and biological processes of the Critical Zone. Most Earth system models need information on soil moisture retention capacity and hydraulic conductivity over the full matric potential range. These soil hydraulic properties can be quantified, but their measurement is expensive and time-consuming; therefore, measurement-based catchment-scale mapping of these soil properties is not feasible. The increasing availability of soil information, together with methods describing relationships between simple soil characteristics and soil hydraulic properties, makes it possible to derive soil hydraulic maps from spatial soil datasets and pedotransfer functions (PTFs). Over the last decade there has been significant development in the preparation of soil hydraulic maps. Spatial datasets of model parameters describing soil hydraulic processes have become available for countries, continents and even the whole globe. Our aim is to present European soil hydraulic maps, show their performance, highlight their advantages and drawbacks, and propose possible ways to further improve their performance.
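Pedotransfer functions typically predict the parameters of a closed-form retention model; the van Genuchten (1980) curve is one widely used target. The sketch below evaluates that curve for illustrative loam-like parameter values (invented for this example, not taken from the European maps discussed above):

```python
import math

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content at matric potential head h (cm, positive =
    suction) from the van Genuchten (1980) retention model, with m = 1 - 1/n."""
    if h <= 0:  # at or above saturation
        return theta_s
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * h) ** n) ** (-m)   # effective saturation, 0..1
    return theta_r + (theta_s - theta_r) * se

# Illustrative loam-like parameters (hypothetical values):
params = dict(theta_r=0.078, theta_s=0.43, alpha=0.036, n=1.56)
field_capacity = van_genuchten_theta(330.0, **params)     # ~ -33 kPa suction
wilting_point = van_genuchten_theta(15000.0, **params)    # ~ -1500 kPa suction
plant_available = field_capacity - wilting_point          # plant-available water
```

A PTF would supply theta_r, theta_s, alpha and n from simple soil characteristics (texture, bulk density, organic matter), after which curves like this can be evaluated at every map pixel.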
Code of Federal Regulations, 2014 CFR
2014-07-01
... productivity data, including unit cost information. This quantitative information will be reported on Forms OSM-51A and OSM-51B or OSM-51C, Quantitative Program Management Information, as applicable. (c) The...
Code of Federal Regulations, 2012 CFR
2012-07-01
... productivity data, including unit cost information. This quantitative information will be reported on Forms OSM-51A and OSM-51B or OSM-51C, Quantitative Program Management Information, as applicable. (c) The...
Code of Federal Regulations, 2011 CFR
2011-07-01
... productivity data, including unit cost information. This quantitative information will be reported on Forms OSM-51A and OSM-51B or OSM-51C, Quantitative Program Management Information, as applicable. (c) The...
Rock-avalanche Deposits Record Quantitative Information On Internal Deformation During Runout
NASA Astrophysics Data System (ADS)
McSaveney, M. J.; Zhang, M.
2016-12-01
The rock avalanche deposit at Wenjiagou Creek, China, shows grain-size changes with distance from source and with depth below the surface. To see what quantitative information on internal deformation might be inferred from such data, we conducted a series of laboratory tests using a conventional ring-shear apparatus (Torshear Model 27-WF2202) at GNS Science, Lower Hutt, NZ. Lacking ready access to the limestone of the Wenjiagou Creek deposit, we used locally sourced 0.5-2 mm sand sieved from the greywacke-derived gravel bed of the Hutt River. To keep within the reliable operating limits of the apparatus, we conducted 38 dry tests using the combinations of normal stress, shear rate and shear displacement listed in Table 1. Size distributions were determined over the range 0.1-2000 µm using a laser sizer. Results showed that the number of grain breakages increased systematically with increasing normal stress and shear displacement, while shear rate had no significant influence. We concluded that, if calibrated using appropriate materials, we would be able to quantify amounts of internal shear deformation in a rock avalanche by analysis of grain-size variations in the deposit.

Table 1. Ring-shear test program
Normal stress (kPa)   Shear rates (mm/min)   Shear displacements (mm)
200                   100, 74.2, 37.1        0, 100, 200, 500, 1000, 3000
400                   100, 74.2, 37.1        0, 100, 200, 500, 1000
600                   100, 74.2              0, 100, 200, 500, 1000
NASA Astrophysics Data System (ADS)
Wang, Hui; Wellmann, Florian; Verweij, Elizabeth; von Hebel, Christian; van der Kruk, Jan
2017-04-01
Lateral and vertical spatial heterogeneity of subsurface properties such as soil texture and structure influences the available water and resource supply for crop growth. High-resolution mapping of subsurface structures using non-invasive geo-referenced geophysical measurements, like electromagnetic induction (EMI), enables a characterization of 3D soil structures, which has shown correlations with remote sensing information on crop states. The benefit of EMI is that it can return 3D subsurface information; however, its spatial coverage is limited by the labor-intensive measurement procedure. Although active and passive sensors mounted on air- or space-borne platforms return only 2D images, they offer much larger spatial coverage. Combining both approaches provides us with a potential pathway to extend the detailed 3D geophysical information to a larger area by using remote sensing information. In this study, we aim at extracting and providing insights into the spatial and statistical correlation of the geophysical and remote sensing observations of the soil/vegetation continuum system. To this end, two key points need to be addressed: 1) how to detect and recognize the geometric patterns (i.e., spatial heterogeneity) from multiple data sets, and 2) how to quantitatively describe the statistical correlation between remote sensing information and geophysical measurements. In the current study, the spatial domain is restricted to shallow depths up to 3 meters, and the geostatistical database contains normalized difference vegetation index (NDVI) derived from RapidEye satellite images and apparent electrical conductivities (ECa) measured from multi-receiver EMI sensors for nine depths of exploration ranging from 0-2.7 m. The integrated data sets are mapped into both the physical space (i.e. the spatial domain) and feature space (i.e. a two-dimensional space framed by the NDVI and the ECa data).
Hidden Markov Random Fields (HMRF) are employed to model the underlying heterogeneities in spatial domain and finite Gaussian mixture models are adopted to quantitatively describe the statistical patterns in terms of center vectors and covariance matrices in feature space. A recently developed parallel stochastic clustering algorithm is adopted to implement the HMRF models and the Markov chain Monte Carlo based Bayesian inference. Certain spatial patterns such as buried paleo-river channels covered by shallow sediments are investigated as typical examples. The results indicate that the geometric patterns of the subsurface heterogeneity can be represented and quantitatively characterized by HMRF. Furthermore, the statistical patterns of the NDVI and the EMI data from the soil/vegetation-continuum system can be inferred and analyzed in a quantitative manner.
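The finite Gaussian mixture modelling mentioned above can be illustrated with a bare-bones expectation-maximization fit. The sketch below is one-dimensional for brevity (the study's feature space pairs NDVI with multi-depth ECa and uses full covariance matrices plus an HMRF spatial prior, none of which is reproduced here):

```python
import math, random

def em_gmm_1d(x, iters=100):
    """Fit a two-component 1-D Gaussian mixture by expectation-maximization."""
    mu = [min(x), max(x)]      # crude initialisation at the data extremes
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for xi in x:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(xi - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            p = [max(pk, 1e-300) for pk in p]  # guard against float underflow
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weight, mean and variance of each component
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(x)
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var[k] = max(sum(r[k] * (xi - mu[k]) ** 2
                             for r, xi in zip(resp, x)) / nk, 1e-6)
    return mu, var, w

# Synthetic 1-D "feature" drawn from two latent soil units:
rng = random.Random(0)
data = [rng.gauss(5, 1) for _ in range(300)] + [rng.gauss(20, 2) for _ in range(300)]
mu, var, w = em_gmm_1d(data)  # centers converge near 5 and 20
```

The fitted centers and (co)variances are exactly the "center vectors and covariance matrices" the abstract refers to; the HMRF adds spatial smoothness on top of this purely statistical clustering.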
Rethinking health numeracy: a multidisciplinary literature review.
Ancker, Jessica S; Kaufman, David
2007-01-01
The purpose of this review is to organize various published conceptions of health numeracy and to discuss how health numeracy contributes to the productive use of quantitative information for health. We define health numeracy as the individual-level skills needed to understand and use quantitative health information, including basic computation skills, ability to use information in documents and non-text formats such as graphs, and ability to communicate orally. We also identify two other factors affecting whether a consumer can use quantitative health information: design of documents and other information artifacts, and health-care providers' communication skills. We draw upon the distributed cognition perspective to argue that essential ingredients for the productive use of quantitative health information include not only health numeracy but also good provider communication skills, as well as documents and devices that are designed to enhance comprehension and cognition.
Semiotic foundation for multisensor-multilook fusion
NASA Astrophysics Data System (ADS)
Myler, Harley R.
1998-07-01
This paper explores the concept of an application of semiotic principles to the design of a multisensor-multilook fusion system. Semiotics is an approach to analysis that attempts to process media in a unified way using qualitative methods as opposed to quantitative ones. The term semiotic refers to signs, or signatory data that encapsulates information. Semiotic analysis involves the extraction of signs from information sources and the subsequent processing of the signs into meaningful interpretations of the information content of the source. The multisensor fusion problem predicated on a semiotic system structure and incorporating semiotic analysis techniques is explored, along with the design of a multisensor system as an information fusion system. Semiotic analysis opens the possibility of using non-traditional sensor sources and modalities in the fusion process, such as verbal and textual intelligence derived from human observers. Examples of how multisensor/multimodality data might be analyzed semiotically are shown, and a discussion of how a semiotic system for multisensor fusion could be realized is outlined. The architecture of a semiotic multisensor fusion processor that can accept situational awareness data is described, although an implementation has not as yet been constructed.
Quantitation of permethylated N-glycans through multiple-reaction monitoring (MRM) LC-MS/MS.
Zhou, Shiyue; Hu, Yunli; DeSantos-Garcia, Janie L; Mechref, Yehia
2015-04-01
The important biological roles of glycans and their implications in disease development and progression have created a demand for the development of sensitive quantitative glycomics methods. Quantitation of glycans existing at low abundance is still analytically challenging. In this study, an N-linked glycan quantitation method using multiple-reaction monitoring (MRM) on a triple quadrupole instrument was developed. The optimum normalized collision energy (CE) for N-glycans that are both sialylated and fucosylated was determined to be 30%, whereas it was found to be 35% for N-glycans that are either fucosylated or sialylated. The optimum CE for mannose and complex-type N-glycans was determined to be 35%. Additionally, the use of three transitions was shown to facilitate reliable quantitation. A total of 88 N-glycan compositions in human blood serum were quantified using this MRM approach. Reliable detection and quantitation of these glycans was achieved when the equivalent of 0.005 μL of blood serum was analyzed. Accordingly, N-glycans can be reliably quantified from as little as one hundredth of a microliter of pooled human blood serum, spanning a dynamic concentration range of three orders of magnitude. MRM was also effectively utilized to quantitatively compare the expression of N-glycans derived from brain-targeting breast carcinoma cells (MDA-MB-231BR) and metastatic breast cancer cells (MDA-MB-231). Thus, the described MRM method for permethylated N-glycans enables rapid and reliable identification and quantitation of glycans derived from glycoproteins purified or present in complex biological samples.
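Monitoring three transitions per glycan lends itself to a simple quantitation scheme: sum the integrated areas of a glycan's transitions into one response, then compare responses across samples. A minimal sketch of that idea (all peak areas and composition labels below are invented; the real workflow also involves calibration and normalization):

```python
def glycan_abundance(transition_areas):
    """Sum the integrated peak areas of the monitored MRM transitions
    (three per glycan in the described method) into one response value."""
    return sum(transition_areas)

def relative_quantitation(sample_a, sample_b):
    """Fold change of each glycan between two samples, each given as
    {glycan: [area_transition1, area_transition2, area_transition3]}."""
    return {g: glycan_abundance(sample_a[g]) / glycan_abundance(sample_b[g])
            for g in sample_a if g in sample_b}

# Invented example areas for two hypothetical N-glycan compositions:
mda231br = {"HexNAc4Hex5NeuAc2": [9.1e5, 4.4e5, 2.2e5],
            "HexNAc4Hex5Fuc1":   [1.2e5, 0.8e5, 0.4e5]}
mda231   = {"HexNAc4Hex5NeuAc2": [3.0e5, 1.6e5, 0.8e5],
            "HexNAc4Hex5Fuc1":   [1.1e5, 0.9e5, 0.5e5]}
fold_changes = relative_quantitation(mda231br, mda231)
```

Summing several transitions averages out noise in any single fragment channel, which is one reason the abstract reports that three transitions improved reliability.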
Martinez-Pinna, Roxana; Gonzalez de Peredo, Anne; Monsarrat, Bernard; Burlet-Schiltz, Odile; Martin-Ventura, Jose Luis
2014-08-01
To find potential biomarkers of abdominal aortic aneurysms (AAA), we performed a differential proteomic study based on human plasma-derived microvesicles. Exosomes and microparticles isolated from plasma of AAA patients and control subjects (n = 10 each group) were analyzed by a label-free quantitative MS-based strategy. Homemade and publicly available software packages were used for MS data analysis. The application of two kinds of bioinformatic tools allowed us to find differential protein profiles in AAA patients. Some of the proteins found by both analysis methods belong to the main pathological mechanisms of AAA, such as oxidative stress, immune-inflammation, and thrombosis. Data analysis from label-free MS-based experiments requires the use of sophisticated bioinformatic approaches to perform quantitative studies of complex protein mixtures. The application of two of these bioinformatic tools provided us with a preliminary list of differential proteins found in plasma-derived microvesicles not previously associated with AAA, which could help us to understand the pathological mechanisms related to this disease. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Takahashi, Yuji; Shomura, Ayahiko; Sasaki, Takuji; Yano, Masahiro
2001-01-01
Hd6 is a quantitative trait locus involved in rice photoperiod sensitivity. It was detected in backcross progeny derived from a cross between the japonica variety Nipponbare and the indica variety Kasalath. To isolate a gene at Hd6, we used a large segregating population for the high-resolution and fine-scale mapping of Hd6 and constructed genomic clone contigs around the Hd6 region. Linkage analysis with P1-derived artificial chromosome clone-derived DNA markers delimited Hd6 to a 26.4-kb genomic region. We identified a gene encoding the α subunit of protein kinase CK2 (CK2α) in this region. The Nipponbare allele of CK2α contains a premature stop codon, and the resulting truncated product is undoubtedly nonfunctional. Genetic complementation analysis revealed that the Kasalath allele of CK2α increases days-to-heading. Map-based cloning with advanced backcross progeny enabled us to identify a gene underlying a quantitative trait locus even though it exhibited a relatively small effect on the phenotype. PMID:11416158
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-01
... emissions, (applicants are encouraged to provide quantitative information regarding expected reductions in...). Applicants are encouraged to provide quantitative information that validates the existence of substantial... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...
78 FR 8113 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-05
... to collect information as part of quantitative research related to residential mortgage loan... is the subject of the disclosure. The quantitative research will involve testing the mortgage loan... quantitative research methodologies. The contractors will select participants via screening questionnaires to...
A Primer on Disseminating Applied Quantitative Research
ERIC Educational Resources Information Center
Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.
2010-01-01
Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research are often lacking in much-needed procedural information. In an effort to promote researchers dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... human health assessment program that evaluates quantitative and qualitative risk information on effects... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-12
... INFORMATION: Title: Comparative Effectiveness Research Inventory. Abstract: The information collection... will not be used for quantitative information collections that are designed to yield reliably... mechanisms that are designed to yield quantitative results. The Agency received no comments in response to...
NASA Astrophysics Data System (ADS)
Adabi, Saba; Conforto, Silvia; Hosseinzadeh, Matin; Noe, Shahryar; Daveluy, Steven; Mehregan, Darius; Nasiriavanaki, Mohammadreza
2017-02-01
Optical Coherence Tomography (OCT) offers real-time high-resolution three-dimensional images of tissue microstructures. In this study, we used OCT skin images acquired from ten volunteers, none of whom had any skin conditions affecting the imaged anatomic locations. Segmented OCT images are analyzed based on their optical properties (attenuation coefficient) and textural image features, e.g., contrast, correlation, homogeneity, energy and entropy. Utilizing this information and referring to clinical insight, we aim to build a comprehensive computational model of healthy skin. The derived parameters represent the OCT microstructural morphology and might provide biological information for generating an atlas of normal skin from different anatomic sites of human skin and may allow for identification of cell microstructural changes in cancer patients. We then compared the parameters of healthy samples with those of abnormal skin and classified them using a linear Support Vector Machine (SVM) with 82% accuracy.
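The textural features named above are classically derived from a grey-level co-occurrence matrix (GLCM). A small sketch computing a GLCM and four Haralick-style features follows (correlation is omitted for brevity, and the study's OCT segmentation and SVM pipeline are not reproduced; the two toy "images" are invented):

```python
import math

def glcm(img, dx=1, dy=0, levels=8):
    """Normalised grey-level co-occurrence matrix for one pixel offset."""
    m = [[0.0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    total = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[img[y][x]][img[y2][x2]] += 1
                total += 1
    return [[v / total for v in row] for row in m]

def texture_features(p):
    """Haralick-style contrast, energy, homogeneity and entropy of a GLCM."""
    n = len(p)
    contrast = sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))
    energy = sum(v * v for row in p for v in row)
    homogeneity = sum(p[i][j] / (1 + abs(i - j))
                      for i in range(n) for j in range(n))
    entropy = -sum(v * math.log(v) for row in p for v in row if v > 0)
    return {"contrast": contrast, "energy": energy,
            "homogeneity": homogeneity, "entropy": entropy}

flat = [[3] * 8 for _ in range(8)]                       # featureless patch
noisy = [[(x * 5 + y * 3) % 8 for x in range(8)] for y in range(8)]
f_flat, f_noisy = texture_features(glcm(flat)), texture_features(glcm(noisy))
```

A uniform patch yields zero contrast and entropy with maximal energy, while a varying patch does the opposite; it is this kind of contrast between tissue types that a classifier can exploit.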
Cost benefit analysis of the transfer of NASA remote sensing technology to the state of Georgia
NASA Technical Reports Server (NTRS)
Zimmer, R. P. (Principal Investigator); Wilkins, R. D.; Kelly, D. L.; Brown, D. M.
1977-01-01
The author has identified the following significant results. First order benefits can generally be quantified, thus allowing quantitative comparisons of candidate land cover data systems. A meaningful dollar evaluation of LANDSAT can be made by a cost comparison with equally effective data systems. Users of LANDSAT data can be usefully categorized as performing three general functions: planning, permitting, and enforcing. The value of LANDSAT data to the State of Georgia is most sensitive to three parameters: discount rate, digitization cost, and photo acquisition cost. Under a constrained budget, LANDSAT could provide digitized land cover information roughly seven times more frequently than could otherwise be obtained. Thus, the services derived from LANDSAT data have a positive net present value in comparison to the baseline system; moreover, under a constrained budget, the LANDSAT system could provide information more frequently than could otherwise be obtained.
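The sensitivity to the discount rate noted above is a property of the net-present-value calculation itself. A minimal sketch with invented cash flows (not the study's actual figures):

```python
def npv(cash_flows, discount_rate):
    """Net present value of yearly cash flows, with year 0 first."""
    return sum(cf / (1.0 + discount_rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical stream: an up-front digitization cost followed by yearly
# benefits from more frequent land-cover updates (all figures invented):
flows = [-500_000.0] + [120_000.0] * 7
npv_low, npv_high = npv(flows, 0.04), npv(flows, 0.12)
```

Because later benefits are divided by (1 + r)^t, raising the discount rate shrinks the NPV of any investment whose costs come early and whose benefits come late, which is exactly why the study found the result sensitive to that parameter.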
Steps through the revision process of reproductive health sections of ICD-11.
Chou, Doris; Tunçalp, Özge; Hotamisligil, Selen; Norman, Jane; Say, Lale; Volkmer, Björn; Pattinson, Bob; Rooney, Cleo; Serour, Gamal; de Mouzon, Jacques; Gardosi, Jason; Thueroff, Joachim; Mark, Morgan; D'Hooghe, Thomas
2012-01-01
In 2007, the WHO initiated an organizational structure for the 11th revision of the International Classification of Diseases (ICD). Effective deployment of ICD-derived tools facilitates the use and collection of health information in a variety of resource settings, promoting quantitatively informed decisions. They also facilitate comparison of disease incidence and outcomes between different countries and different health care systems around the world. The Department of Reproductive Health and Research (RHR) coordinates the revision of chapters 14 (diseases of the genitourinary system), 15 (pregnancy, childbirth, and puerperium), and 16 (conditions originating in the perinatal period). RHR convened a technical advisory group (TAG), the Genito-Urinary Reproductive Medicine (GURM) TAG, for the ICD revision. The TAG's work reflects the collective understanding of sexual and reproductive health and is now available for review within the ICD-11 revision process. Copyright © 2012 S. Karger AG, Basel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, S.C.; Bartle, K.D.; Holden, K.M.L.
1994-12-31
A series of heteroatom-rich coal and coal-derived liquids have been analysed using gas chromatography (GC) in combination with three different element-selective detectors. Selected chromatograms, including a supercritical extract (Mequinenza lignite) and aromatic fractions isolated from coal tar pitch samples, are presented. In each case a series of sulphur- and/or nitrogen-containing compounds have been identified using either flame photometric detection (GC/FID/FPD) or nitrogen-phosphorous detection (GC/FID/NPD) and the information compared with that obtained from a GC coupled to an atomic emission detector (GC-AED). Preliminary results have demonstrated the relative response characteristics of each detector and their respective ability to acquire qualitative and quantitative information in interfering background matrices. Further, due to the unique capabilities of GC-AED, a number of dual heteroatomic (sulphur-oxygen and nitrogen-oxygen) compounds have been identified.
NASA Technical Reports Server (NTRS)
Cloutis, E. A.; Lambert, J.; Smith, D. G. W.; Gaffey, M. J.
1987-01-01
High-resolution visible and near-infrared diffuse reflectance spectra of mafic silicates can be deconvolved to yield quantitative information concerning mineral mixture properties, and the results can be directly applied to remotely sensed data. Spectral reflectance measurements of laboratory mixtures of olivine, orthopyroxene, and clinopyroxene with known chemistries, phase abundances, and particle size distributions have been utilized to develop correlations between spectral properties and the physicochemical parameters of the samples. A large number of mafic silicate spectra were measured and examined for systematic variations in spectral properties as a function of chemistry, phase abundance, and particle size. Three classes of spectral parameters (ratioed, absolute, and wavelength) were examined for any correlations. Each class is sensitive to particular mafic silicate properties. Spectral deconvolution techniques have been developed for quantifying, with varying degrees of accuracy, the assemblage properties (chemistry, phase abundance, and particle size).
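The deconvolution step can be illustrated with the simplest case: a linear two-endmember mixing model solved by least squares. The spectra below are invented placeholder curves, and real mafic-silicate mixtures deviate from linearity, so treat this purely as a sketch of the idea:

```python
def unmix_two_endmembers(mixture, em1, em2):
    """Least-squares abundances for a two-endmember linear mixing model,
    mixture ~ a1*em1 + a2*em2, solving the 2x2 normal equations directly."""
    s11 = sum(a * a for a in em1)
    s22 = sum(b * b for b in em2)
    s12 = sum(a * b for a, b in zip(em1, em2))
    r1 = sum(a * m for a, m in zip(em1, mixture))
    r2 = sum(b * m for b, m in zip(em2, mixture))
    det = s11 * s22 - s12 * s12
    return (r1 * s22 - r2 * s12) / det, (r2 * s11 - r1 * s12) / det

# Invented "olivine" and "orthopyroxene" reflectance curves at six wavelengths:
olivine = [0.30, 0.25, 0.18, 0.22, 0.28, 0.33]
opx     = [0.35, 0.30, 0.28, 0.15, 0.20, 0.31]
observed = [0.6 * a + 0.4 * b for a, b in zip(olivine, opx)]  # 60/40 mixture
a_ol, a_opx = unmix_two_endmembers(observed, olivine, opx)    # recovers 0.6, 0.4
```

With noise-free synthetic data the abundances are recovered exactly; with real spectra, non-linear mixing and particle-size effects are what make the calibration work described above necessary.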
Assessment of Health Effects of Exogenous Urea: Summary and Key Findings.
Dickerson, Aisha S; Lee, Janice S; Keshava, Channa; Hotchkiss, Andrew; Persad, Amanda S
2018-05-01
Urea has been utilized as a reductant in diesel fuels to lower emission of nitrogen oxides, igniting interest in probable human health hazards associated with exposure to exogenous urea. Here, we summarize and update key findings on potential health effects of exogenous urea, including carcinogenicity. No definitive target organs for oral exposure were identified; however, results in animal studies suggest that the liver and kidney could be potential target organs of urea toxicity. The available human-subject literature suggests that the impact on lung function is minimal. Based on the literature on exogenous urea, we concluded that there was inadequate information to assess the carcinogenic potential of urea, or perform a quantitative assessment to derive reference values. Given the limited information on exogenous urea, additional research to address gaps for exogenous urea should include long-term cancer bioassays, two-generation reproductive toxicity studies, and mode-of-action investigations.
Brehme, Marc; Koschmieder, Steffen; Montazeri, Maryam; Copland, Mhairi; Oehler, Vivian G.; Radich, Jerald P.; Brümmendorf, Tim H.; Schuppert, Andreas
2016-01-01
Modelling the parameters of multistep carcinogenesis is key for a better understanding of cancer progression, biomarker identification and the design of individualized therapies. Using chronic myeloid leukemia (CML) as a paradigm for hierarchical disease evolution, we show that combined population dynamic modelling and CML patient biopsy genomic analysis enables patient stratification at unprecedented resolution. Linking CD34+ similarity as a disease progression marker to patient-derived gene expression entropy separated established CML progression stages and uncovered additional heterogeneity within disease stages. Importantly, our patient-data-informed model enables quantitative approximation of individual patients’ disease history within chronic phase (CP) and significantly separates “early” from “late” CP. Our findings provide a novel rationale for personalized and genome-informed disease progression risk assessment that is independent of and complementary to conventional measures of CML disease burden and prognosis. PMID:27048866
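Gene expression entropy, used above as a heterogeneity measure, is at its core the Shannon entropy of a normalised expression profile. A minimal sketch with toy counts (the study's actual marker combines such an entropy with CD34+ similarity and population dynamics, which are not reproduced here):

```python
import math

def expression_entropy(counts):
    """Shannon entropy (in bits) of a gene-expression profile normalised to
    a probability distribution; higher values mean a more mixed profile."""
    total = float(sum(counts))
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in ps)

uniform = [10] * 16        # maximally mixed profile over 16 genes -> 4 bits
skewed = [100] + [1] * 15  # dominated by one transcript -> low entropy
```

A profile spread evenly over 16 genes carries log2(16) = 4 bits, while a profile dominated by a single transcript approaches zero; shifts along this axis are what allow entropy to track progression-related changes in expression heterogeneity.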
Volumetric visualization of multiple-return LIDAR data: Using voxels
Stoker, Jason M.
2009-01-01
Elevation data are an important component in the visualization and analysis of geographic information. The creation and display of 3D models representing bare earth, vegetation, and surface structures have become a major focus of light detection and ranging (lidar) remote sensing research in the past few years. Lidar is an active sensor that records the distance, or range, of a laser usually fired from an airplane, helicopter, or satellite. By converting the millions of 3D lidar returns from a system into bare ground, vegetation, or structural elevation information, extremely accurate, high-resolution elevation models can be derived and produced to visualize and quantify scenes in three dimensions. These data can be used to produce high-resolution bare-earth digital elevation models; quantitative estimates of vegetative features such as canopy height, canopy closure, and biomass; and models of urban areas such as building footprints and 3D city models.
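Converting millions of returns into voxels is, at its core, integer binning of coordinates. A minimal sketch with an invented five-return point cloud, counting returns per voxel and deriving a crude per-cell canopy-height value:

```python
from collections import defaultdict

def voxelize(points, voxel_size):
    """Bin lidar returns (x, y, z) into cubic voxels; returns {index: count}."""
    counts = defaultdict(int)
    for x, y, z in points:
        counts[(int(x // voxel_size), int(y // voxel_size),
                int(z // voxel_size))] += 1
    return dict(counts)

def canopy_height(points, cell_size):
    """Max-minus-min z per x-y cell: a crude canopy-height surrogate that
    treats the lowest return in a cell as ground."""
    zmin, zmax = {}, {}
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size))
        zmin[key] = min(zmin.get(key, z), z)
        zmax[key] = max(zmax.get(key, z), z)
    return {k: zmax[k] - zmin[k] for k in zmax}

# Invented returns: ground hits near z = 0 plus a tall tree in cell (0, 0):
cloud = [(0.2, 0.3, 0.0), (0.4, 0.1, 12.5), (0.6, 0.8, 7.0),
         (3.1, 0.2, 0.1), (3.4, 0.6, 0.3)]
vox = voxelize(cloud, voxel_size=1.0)
heights = canopy_height(cloud, cell_size=1.0)
```

Production pipelines add return classification, interpolation, and occupancy statistics per voxel, but the data structure is this same sparse index-to-count mapping.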
Zhang, Xin-Ke; Lan, Yi-Bin; Zhu, Bao-Qing; Xiang, Xiao-Feng; Duan, Chang-Qing; Shi, Ying
2018-01-01
Monosaccharides, organic acids and amino acids are important flavour-related components in wines. The aim of this article is to develop and validate a method that could simultaneously analyse these compounds in wine based on silylation derivatisation and gas chromatography-mass spectrometry (GC-MS), and apply this method to the investigation of the changes of these compounds and speculate upon their related influences on Cabernet Sauvignon wine flavour during wine ageing. This work presented a new approach for wine analysis and provided more information concerning red wine ageing. This method could simultaneously quantitatively analyse 2 monosaccharides, 8 organic acids and 13 amino acids in wine. A validation experiment showed good linearity, sensitivity, reproducibility and recovery. Multiple derivatives of five amino acids were found, but their effects on quantitative analysis were negligible, except for methionine. The evolution pattern of each category was different, and we speculated that the corresponding mechanisms involving microorganism activities, physical interactions and chemical reactions were strongly correlated with red wine flavours during ageing. Simultaneous quantitative analysis of monosaccharides, organic acids and amino acids in wine was feasible and reliable, and this method has extensive application prospects. © 2017 Society of Chemical Industry.
Quantitative MRI of kidneys in renal disease.
Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J
2018-03-01
To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m²) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging: T1w, T2w, FIESTA, and quantitative imaging: 2D cine phase contrast of the renal arteries, and parenchymal diffusion weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent differences between quantitative parameters measured ≥24 h apart were: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patients' non-cystic renal parenchyma (NCRP) had statistically significant differences with regard to quantitative parenchymal measures: lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2, p < 0.05), higher ADCs (2.46 ± 0.20 vs. 2.18 ± 0.10 × 10⁻³ mm²/s, p < 0.05), lower R2*s (14.9 ± 1.7 vs. 18.1 ± 1.6 s⁻¹, p < 0.05), and lower tissue stiffness (3.2 ± 0.3 vs. 3.8 ± 0.5 kPa, p < 0.05). Excellent reproducibility of the quantitative measurements was obtained in all cases.
MTI, DWI, BOLD, and MRE yielded significantly different quantitative parenchymal measurements between ADPKD patients and normal controls, indicating the potential for detecting and following renal disease at an earlier stage than is possible with conventional qualitative imaging techniques.
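The scan-rescan reproducibility figures above can be reproduced with simple arithmetic. The sketch below assumes the common convention of normalising the absolute difference to the mean of the two visits; the visit values are hypothetical illustrations, not the study's data.

```python
def abs_percent_diff(scan1, scan2):
    """Scan-rescan absolute percent difference, normalised to the
    mean of the two measurements (an assumed convention)."""
    return abs(scan1 - scan2) / ((scan1 + scan2) / 2.0) * 100.0

def mean_sd(values):
    """Mean and sample (n-1) standard deviation."""
    n = len(values)
    m = sum(values) / n
    var = sum((v - m) ** 2 for v in values) / (n - 1)
    return m, var ** 0.5

# Hypothetical ADC values (x10^-3 mm^2/s): five subjects, two visits each
visit1 = [2.20, 2.15, 2.25, 2.10, 2.18]
visit2 = [2.05, 2.22, 2.17, 2.21, 2.09]
diffs = [abs_percent_diff(a, b) for a, b in zip(visit1, visit2)]
mean_apd, sd_apd = mean_sd(diffs)  # reported as mean +/- SD percent
```

The mean ± SD pair computed this way corresponds to figures such as "ADC = 6.5 ± 3.4%" in the abstract.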
Wheelan, P; Zirrolli, J A; Clay, K L
1992-01-01
A method has been developed for the analysis of derivatized diradylglycerols obtained from glycerophosphocholine (GPC) of transformed murine bone marrow-derived mast cells that provides high performance liquid chromatography (HPLC) separation of GPC subclasses and molecular species with on-line quantitation using UV detection. In addition, the derivatized diradylglycerol species were unequivocally identified by continuous-flow fast-atom bombardment mass spectrometry. GPC was initially isolated by thin-layer chromatography (TLC), the phosphocholine group was hydrolyzed, and the resultant diradylglycerol was derivatized with 7-[(chlorocarbonyl)-methoxy]-4-methylcoumarin (CMMC). After separation of the derivatized subclasses by normal phase HPLC, the individual molecular species of the alkylacyl and diacyl subclasses were quantitated and collected during a subsequent reverse phase HPLC step. With an extinction coefficient of 14,700 L mol⁻¹ cm⁻¹ at a detection wavelength of 320 nm, the CMMC derivatives afforded sensitive UV detection (100 pmol) and quantitation of the molecular species. Continuous-flow fast-atom bombardment mass spectrometry of the alkylacyl CMMC derivatives yielded abundant [MH]+ ions and a single fragment ion formed by loss of alkylketene from the sn-2 acyl group, [MH-(R=C=O)]+. No fragmentation of the sn-1 alkyl chain was observed. Diacyl derivatives also produced abundant [MH]+ ions plus two fragment ions arising from loss of RCOOH from each of the acyl substituents and two fragment ions from the loss of alkylketene from each acyl group. Individual molecular species substituents were assigned from these ions.
Takegami, Shigehiko; Kitamura, Keisuke; Ohsugi, Mayuko; Ito, Aya; Kitade, Tatsuya
2015-06-15
In order to quantitatively examine the lipophilicity of the widely used organophosphorus pesticides (OPs) chlorfenvinphos (CFVP), chlorpyrifos-methyl (CPFM), diazinon (DZN), fenitrothion (FNT), fenthion (FT), isofenphos (IFP), profenofos (PFF) and pyraclofos (PCF), their partition coefficient (Kp) values between phosphatidylcholine (PC) small unilamellar vesicles (SUVs) and water (liposome-water system) were determined by second-derivative spectrophotometry. The second-derivative spectra of these OPs in the presence of PC SUVs showed a bathochromic shift with increasing PC concentration and distinct derivative isosbestic points, demonstrating the complete elimination of the residual background-signal effects observed in the absorption spectra. The Kp values were calculated from the second-derivative intensity change induced by the addition of PC SUVs and were obtained with good precision (R.S.D. below 10%). The Kp values were in the order CPFM>FT>PFF>PCF>IFP>CFVP>FNT⩾DZN and did not show a linear correlation with the reported partition coefficients obtained using an n-octanol-water system (R²=0.530). The results also quantitatively clarified the effect of chemical-group substitution in OPs on their lipophilicity. Since the partition coefficient for the liposome-water system is more effective for modeling quantitative structure-activity relationships than that for the n-octanol-water system, the obtained results are toxicologically important for estimating the accumulation of these OPs in human cell membranes. Copyright © 2015 Elsevier B.V. All rights reserved.
76 FR 13018 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. Total Burden Estimate for the...
75 FR 81665 - Notice of Intent to Seek Approval to Reinstate an Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... are both quantitative and descriptive. Quantitative information from the most recently completed... activities with respect to industrial collaboration [cir] Conducting a survey of all center participants to probe the participant satisfaction with center activities [cir] Compiling a set of quantitative...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-31
... quantitative information regarding expected reductions in emissions of CO 2 or fuel consumption as a result of... encouraged to provide quantitative information that validates the existence of substantial transportation... quantitative and qualitative measures. Therefore, applicants for TIGER Discretionary Grants are generally...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-01
... provide quantitative information regarding expected reductions in emissions of CO 2 or fuel consumption as... provide quantitative information that validates the existence of substantial transportation-related costs... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...
76 FR 52383 - Reports, Forms, and Recordkeeping Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-22
... OMB: Title: 49 CFR 575--Consumer Information Regulations (sections 103 and 105) Quantitative Research... research and is now requesting to conduct follow- up quantitative research with consumers to assess current.... The results of that research phase were used to inform the quantitative phase of research which this...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... insights on perceptions and opinions, but are not statistical surveys that yield quantitative results that.... This type of generic clearance for qualitative information will not be used for quantitative... for submission for other generic mechanisms that are designed to yield quantitative results. The...
Comprehensive proteomic characterization of stem cell-derived extracellular matrices.
Ragelle, Héloïse; Naba, Alexandra; Larson, Benjamin L; Zhou, Fangheng; Prijić, Miralem; Whittaker, Charles A; Del Rosario, Amanda; Langer, Robert; Hynes, Richard O; Anderson, Daniel G
2017-06-01
In the stem-cell niche, the extracellular matrix (ECM) serves as a structural support that additionally provides stem cells with signals that contribute to the regulation of stem-cell function, via reciprocal interactions between cells and components of the ECM. Recently, cell-derived ECMs have emerged as in vitro cell culture substrates to better recapitulate the native stem-cell microenvironment outside the body. Significant changes in cell number, morphology and function have been observed when mesenchymal stem cells (MSC) were cultured on ECM substrates as compared to standard tissue-culture polystyrene (TCPS). As select ECM components are known to regulate specific stem-cell functions, a robust characterization of cell-derived ECM proteomic composition is critical to better comprehend the role of the ECM in directing cellular processes. Here, we characterized and compared the protein composition of ECM produced in vitro by bone marrow-derived MSC, adipose-derived MSC and neonatal fibroblasts from different donors, employing quantitative proteomic methods. Each cell-derived ECM displayed a specific and unique matrisome signature, yet they all shared a common set of proteins. We evaluated the biological response of cells cultured on the different matrices and compared them to cells on standard TCPS. The matrices led to differential survival and gene-expression profiles among the cell types and relative to TCPS, indicating that the cell-derived ECMs influence each cell type in a different manner. This general approach to understanding the protein composition of different tissue-specific and cell-derived ECMs will inform the rational design of defined systems and biomaterials that recapitulate critical ECM signals for stem-cell culture and tissue engineering. Copyright © 2017 Elsevier Ltd. All rights reserved.
A general theory of multimetric indices and their properties
Schoolmaster, Donald R.; Grace, James B.; Schweiger, E. William
2012-01-01
1. Stewardship of biological and ecological resources requires the ability to make integrative assessments of ecological integrity. One of the emerging methods for making such integrative assessments is multimetric indices (MMIs). These indices synthesize data, often from multiple levels of biological organization, with the goal of deriving a single index that reflects the overall effects of human disturbance. Despite the widespread use of MMIs, there is uncertainty about why this approach can be effective. An understanding of MMIs requires a quantitative theory that illustrates how the properties of candidate metrics relate to MMIs generated from those metrics. 2. We present the initial basis for such a theory by deriving the general mathematical characteristics of MMIs assembled from metrics. We then use the theory to derive quantitative answers to the following questions: Is there an optimal number of metrics to comprise an index? How does covariance among metrics affect the performance of the index derived from those metrics? And what are the criteria to decide whether a given metric will improve the performance of an index? 3. We find that the optimal number of metrics to be included in an index depends on the theoretical distribution of the signal of the disturbance gradient contained in each metric. For example, if the rank-ordered parameters of a metric-disturbance regression can be described by a monotonically decreasing function, then an optimum number of metrics exists and can often be derived analytically. We derive the conditions under which adding a given metric can be expected to improve an index. 4. We find that the criterion defining such conditions depends nonlinearly on the signal of the disturbance gradient, the noise (error) of the metric and the correlation of the metric errors. Importantly, we find that correlation among metric errors increases the signal required for the metric to improve the index. 5.
The theoretical framework presented in this study provides the basis for understanding the properties of MMIs. It can also be useful throughout the index construction process. Specifically, it can be used to aid understanding of the benefits and limitations of combining metrics into indices; it can inform selection/collection of candidate metrics; and it can be used directly as a decision aid in effective index construction.
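The effect of correlated metric errors described in point 4 can be illustrated with a toy calculation. The sketch below assumes a simple averaged index, equal error variance across metrics and a single common pairwise error correlation rho; the monotonically decreasing signal distribution is hypothetical, not taken from the paper.

```python
def index_snr(signals, sigma, rho):
    """Signal-to-noise ratio of a simple averaged multimetric index.
    Assumes each metric i carries disturbance signal signals[i], equal
    error variance sigma**2, and common pairwise error correlation rho,
    so the variance of the mean is sigma**2 * (1 + (n-1)*rho) / n."""
    n = len(signals)
    mean_signal = sum(signals) / n
    noise_var = sigma ** 2 * (1.0 + (n - 1) * rho) / n
    return mean_signal / noise_var ** 0.5

# Hypothetical monotonically decreasing signal across rank-ordered metrics
signals = [1.0 / (i + 1) for i in range(10)]

def best_n(signals, sigma, rho):
    """Number of metrics (taken in rank order) maximising index SNR."""
    return max(range(1, len(signals) + 1),
               key=lambda n: index_snr(signals[:n], sigma, rho))
```

With these assumptions, rho = 0 favours combining the two strongest metrics, while rho = 0.5 makes the single best metric optimal, consistent with the finding that error correlation raises the signal a metric must carry to improve the index.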
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2014 CFR
2014-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2011 CFR
2011-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2010 CFR
2010-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2012 CFR
2012-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
21 CFR 101.13 - Nutrient content claims-general principles.
Code of Federal Regulations, 2013 CFR
2013-04-01
... must also bear: (A) Clear and concise quantitative information comparing the amount of the subject... is on the information panel, the quantitative information may be located elsewhere on the information...., “Modified fat cheesecake”). This statement of identity must be immediately followed by the comparative...
XX/XY System of Sex Determination in the Geophilomorph Centipede Strigamia maritima
Green, Jack E.; Dalíková, Martina; Sahara, Ken; Marec, František; Akam, Michael
2016-01-01
We show that the geophilomorph centipede Strigamia maritima possesses an XX/XY system of sex chromosomes, with males being the heterogametic sex. This is, to our knowledge, the first report of sex chromosomes in any geophilomorph centipede. Using the recently assembled Strigamia genome sequence, we identified a set of scaffolds differentially represented in male and female DNA sequence. Using quantitative real-time PCR, we confirmed that three candidate X chromosome-derived scaffolds are present at approximately twice the copy number in females as in males. Furthermore, we confirmed that six candidate Y chromosome-derived scaffolds contain male-specific sequences. Finally, using this molecular information, we designed an X chromosome-specific DNA probe and performed fluorescent in situ hybridization against mitotic and meiotic chromosome spreads to identify the Strigamia XY sex-chromosome pair cytologically. We found that the X and Y chromosomes are recognizably different in size during the early pachytene stage of meiosis, and exhibit incomplete and delayed pairing. PMID:26919730
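The twofold copy-number difference between sexes can be illustrated with the standard relative quantification used in real-time PCR. The sketch below uses the generic 2^-ddCt calculation with an assumed ~100% amplification efficiency; the Ct values are hypothetical, not the study's data.

```python
def relative_copy_number(ct_target_a, ct_ref_a, ct_target_b, ct_ref_b):
    """2**(-ddCt) relative quantification (assumes ~100% PCR efficiency):
    copy number of the target scaffold in sample A relative to sample B,
    each normalised to an autosomal reference locus."""
    ddct = (ct_target_a - ct_ref_a) - (ct_target_b - ct_ref_b)
    return 2.0 ** (-ddct)

# Hypothetical Ct values: an X-linked scaffold crosses threshold one cycle
# later in a male (one X copy) than in a female (two X copies)
female_vs_male = relative_copy_number(24.0, 20.0, 25.0, 20.0)  # ~2.0
```

A ratio near 2.0 for females relative to males is the signature expected of an X chromosome-derived scaffold under an XX/XY system.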
Cesa, Stefania; Carradori, Simone; Bellagamba, Giuseppe; Locatelli, Marcello; Casadei, Maria Antonietta; Masci, Alessandra; Paolicelli, Patrizia
2017-10-01
Colour is the first organoleptic property that consumers appreciate in a foodstuff. In blueberry (Vaccinium spp.) fruits, the anthocyanins are the principal pigments determining the colour as well as many of the beneficial effects attributed to this functional food. Commercial blueberry-derived products represent important sources of these healthy molecules all year round. In this study, blueberries were processed into purees, comparing two homogenization methods, and further heated following different thermal treatments. All the supernatants of the homogenates were monitored for pH. The hydroalcoholic extracts of the same samples were then characterized by CIELAB and HPLC-DAD analyses. These analytical techniques provide complementary information on the overall fruit pigment content and on the quali-quantitative profile of the individual bioactive colorants. These data help identify the manufacturing procedure that yields blueberry-derived products well accepted by consumers while maintaining their healthy properties unaltered. Copyright © 2017. Published by Elsevier Ltd.
Toropova, Alla P; Schultz, Terry W; Toropov, Andrey A
2016-03-01
Data on toxicity toward Tetrahymena pyriformis are an indicator of a substance's applicability in ecological and pharmaceutical contexts. Quantitative structure-activity relationships (QSARs) between the molecular structure of benzene derivatives and toxicity toward T. pyriformis (expressed as the negative logarithm of the population growth inhibition dose, mmol/L) are established. The available data were randomly distributed three times into visible training and calibration sets and invisible validation sets. The statistical characteristics for the validation sets are the following: r²=0.8179 and s=0.338 (first distribution); r²=0.8682 and s=0.341 (second distribution); r²=0.8435 and s=0.323 (third distribution). These models are built using only information on the molecular structure: no physicochemical parameters, 3D features of the molecular structure or quantum-mechanical descriptors are involved in the modeling process. Copyright © 2016 Elsevier B.V. All rights reserved.
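Validation statistics of the kind reported here (r² and s) can be computed as follows. This is a sketch assuming r² is defined as 1 − SS_res/SS_tot and s as the standard error of estimate with a two-parameter model; the paper's exact definitions may differ slightly.

```python
def r_squared(obs, pred):
    """Coefficient of determination on a validation set,
    defined as 1 - SS_res / SS_tot (an assumed convention)."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def std_error(obs, pred, n_params=2):
    """Standard error of estimate s, assuming n_params fitted
    parameters (hypothetical choice for illustration)."""
    n = len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    return (ss_res / (n - n_params)) ** 0.5
```

Applied to observed and predicted pIGC50 values of an external validation set, these two functions yield pairs such as "r²=0.8179 and s=0.338" above.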
Shape dependence of two-cylinder Rényi entropies for free bosons on a lattice
NASA Astrophysics Data System (ADS)
Chojnacki, Leilee; Cook, Caleb Q.; Dalidovich, Denis; Hayward Sierens, Lauren E.; Lantagne-Hurtubise, Étienne; Melko, Roger G.; Vlaar, Tiffany J.
2016-10-01
Universal scaling terms occurring in Rényi entanglement entropies have the potential to bring new understanding to quantum critical points in free and interacting systems. Quantitative comparisons between analytical continuum theories and numerical calculations on lattice models play a crucial role in advancing such studies. In this paper, we exactly calculate the universal two-cylinder shape dependence of entanglement entropies for free bosons on finite-size square lattices, and compare to approximate functions derived in the continuum using several different Ansätze. Although none of these Ansätze are exact in the thermodynamic limit, we find that numerical fits are in good agreement with continuum functions derived using the anti-de Sitter/conformal field theory correspondence, an extensive mutual information model, and a quantum Lifshitz model. We use fits of our lattice data to these functions to calculate universal scalars defined in the thin-cylinder limit, and compare to values previously obtained for the free boson field theory in the continuum.
NASA Astrophysics Data System (ADS)
Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke
2016-05-01
Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells that may be refractory to radiation and chemotherapy, and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches; traditional techniques are insufficient to interpret and visualize the resulting experimental data. The emphasis of this paper lies in the application of novel approaches for visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However, the extracted molecular information remains insufficient for classifying GSCs and paving the way to improved therapeutics for this heterogeneous tumor.
3D-QSAR and molecular docking studies on HIV protease inhibitors
NASA Astrophysics Data System (ADS)
Tong, Jianbo; Wu, Yingji; Bai, Min; Zhan, Pei
2017-02-01
To better understand the chemical-biological interactions governing their activity toward HIV protease, QSAR models of 34 cyclic-urea derivatives with HIV-inhibitory activity were developed. The quantitative structure-activity relationship (QSAR) model was built using the comparative molecular similarity indices analysis (CoMSIA) technique. The best CoMSIA model has r²cv and r²ncv values of 0.586 and 0.931 for the cross-validated and non-cross-validated cases, respectively. The predictive ability of the CoMSIA model was further validated with a test set of 7 compounds, giving an r²pred value of 0.973. Docking studies were used to find the actual conformations of the chemicals in the active site of HIV protease, as well as their binding mode to the binding site of the protease enzyme. The information provided by the 3D-QSAR model and molecular docking may lead to a better understanding of the structural requirements of these 34 cyclic-urea derivatives and help in the design of potential anti-HIV protease molecules.
NASA Astrophysics Data System (ADS)
Böttger, U.; Waser, R.
2017-07-01
The existence of non-ferroelectric regions in ferroelectric thin films evokes depolarization effects leading to a tilt of the P(E) hysteresis loop. The analysis of measured hysteresis of lead zirconate titanate (PZT) thin films is used to determine a depolarization factor which contains quantitative information about interfacial layers as well as ferroelectrically passive zones in the bulk. The derived interfacial capacitance is smaller than that estimated from conventional extrapolation techniques. In addition, the concept of depolarization is used for the investigation of fatigue behavior of PZT thin films indicating that the mechanism of seed inhibition, which is responsible for the effect, occurs in the entire film.
Information transfer in verbal presentations at scientific meetings
NASA Astrophysics Data System (ADS)
Flinn, Edward A.
The purpose of this note is to suggest a quantitative approach to deciding how much time to give a speaker at a scientific meeting. The elementary procedure is to use the preacher's rule of thumb that no souls are saved after the first 20 minutes. This is in qualitative agreement with the proverb that one cannot listen to a single voice for more than an hour without going to sleep. A refinement of this crude approach can be made by considering the situation from the point of view of a linear physical system with an input, a transfer function, and an output. We attempt here to derive an optimum speaking time through these considerations.
Detection of regional air pollution episodes utilizing satellite data in the visual range
NASA Technical Reports Server (NTRS)
Bowley, C. J.; Burke, H. K.; Barnes, J. C.
1981-01-01
A comparative analysis of satellite-observed haze patterns and ground-based aerosol measurements is carried out for July 20-23, 1978. During this period, a significant regional air pollution episode existed across the northeastern United States, accompanied by widespread haze, reduced surface visibility, and elevated sulfate levels measured by the Sulfate Regional Experiment (SURE) network. The results show that the satellite-observed haze patterns correlate closely with the area of reported low surface visibility (less than 4 mi) and high sulfate levels. Quantitative information on total aerosol loading derived from the satellite-digitized data, using an atmospheric radiative transfer model, agrees well with the results obtained from the ground-based measurements.
NASA Technical Reports Server (NTRS)
Roman, Miguel O.; Nightingale, Joanne; Nickeson, Jaime; Schaepman-Strub, Gabriela
2011-01-01
The goals and objectives of the subgroup are: to foster and coordinate quantitative validation of higher-level global land products derived from remotely sensed data, in a traceable way, and to relay results so they are relevant to users; and to increase the quality and efficiency of global satellite product validation by developing and promoting international standards and protocols for: (1) field sampling, (2) scaling techniques, (3) accuracy reporting, (4) data/information exchange; also to provide feedback to international structures (GEOSS) for: (1) requirements on product accuracy and quality assurance (QA4EO), (2) terrestrial ECV measurement standards, (3) definitions for future missions.
Evaluation of the performance of hydrological variables derived from GLDAS-2 and MERRA-2 in Mexico
NASA Astrophysics Data System (ADS)
Real-Rangel, R. A.; Pedrozo-Acuña, A.; Breña-Naranjo, J. A.
2017-12-01
Hydrological studies have found in data assimilation systems and global reanalyses of land surface variables (e.g., soil moisture, streamflow) a wide range of applications, from drought monitoring to water balance and hydro-climatological variability assessment. Indeed, these hydrological data sources have led to improvements in developing and testing monitoring and prediction systems in poorly gauged regions of the world. This work tests the accuracy and error of land surface variables (precipitation, soil moisture, runoff and temperature) derived from the data assimilation reanalysis products GLDAS-2 and MERRA-2. The performance of these data platforms must be thoroughly evaluated in order to characterize the error of the hydrological variables derived from the reanalysis products. For this purpose, a quantitative assessment was performed at 2,892 climatological stations, 42 stream gauges and 44 soil moisture probes located in Mexico, across different climate regimes (hyper-arid to tropical humid). Results show comparisons of these gridded products against ground-based observational stations for 1979-2014. The analysis displays the spatial distribution of errors and accuracy over Mexico and discusses differences between climates, enabling the informed use of these products.
Improving Marine Ecosystem Models with Biochemical Tracers
NASA Astrophysics Data System (ADS)
Pethybridge, Heidi R.; Choy, C. Anela; Polovina, Jeffrey J.; Fulton, Elizabeth A.
2018-01-01
Empirical data on food web dynamics and predator-prey interactions underpin ecosystem models, which are increasingly used to support strategic management of marine resources. These data have traditionally derived from stomach content analysis, but new and complementary forms of ecological data are increasingly available from biochemical tracer techniques. Extensive opportunities exist to improve the empirical robustness of ecosystem models through the incorporation of biochemical tracer data and derived indices, an area that is rapidly expanding because of advances in analytical developments and sophisticated statistical techniques. Here, we explore the trophic information required by ecosystem model frameworks (species, individual, and size based) and match them to the most commonly used biochemical tracers (bulk tissue and compound-specific stable isotopes, fatty acids, and trace elements). Key quantitative parameters derived from biochemical tracers include estimates of diet composition, niche width, and trophic position. Biochemical tracers also provide powerful insight into the spatial and temporal variability of food web structure and the characterization of dominant basal and microbial food web groups. A major challenge in incorporating biochemical tracer data into ecosystem models is scale and data type mismatches, which can be overcome with greater knowledge exchange and numerical approaches that transform, integrate, and visualize data.
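One of the tracer-derived parameters mentioned here, trophic position, is commonly estimated from bulk δ15N values. A minimal sketch follows, assuming a fixed trophic enrichment factor of 3.4‰ per trophic level and a primary-consumer baseline (trophic position 2); both are conventional but context-dependent choices, not values taken from this paper.

```python
def trophic_position(d15n_consumer, d15n_base, base_tp=2.0, tef=3.4):
    """Trophic position from bulk nitrogen stable isotopes.
    Assumes a fixed trophic enrichment factor (tef, permil per trophic
    level) and a baseline organism of known trophic position."""
    return base_tp + (d15n_consumer - d15n_base) / tef

# Hypothetical values: a predator enriched 6.8 permil over a
# primary-consumer baseline sits two levels higher (trophic position 4)
tp = trophic_position(13.8, 7.0)
```

Estimates like this one feed directly into the diet-composition and trophic-position parameters that species- and size-based ecosystem models require.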
Pharmacokinetic Steady-States Highlight Interesting Target-Mediated Disposition Properties.
Gabrielsson, Johan; Peletier, Lambertus A
2017-05-01
In this paper, we derive explicit expressions for the concentrations of ligand L, target R and ligand-target complex RL at steady state for the classical model describing target-mediated drug disposition, in the presence of a constant-rate infusion of ligand. We demonstrate that by graphing the steady-state values of ligand, target and ligand-target complex, we obtain striking and often singular patterns, which yield a great deal of insight and understanding about the underlying processes. Deriving explicit expressions for the dependence of L, R and RL on the infusion rate, and displaying graphs of the relations between L, R and RL, we give qualitative and quantitative information for the experimentalist about the processes involved. Understanding target turnover is pivotal for optimising these processes when target-mediated drug disposition (TMDD) prevails. By a combination of mathematical analysis and simulations, we also show that the evolution of the three concentration profiles towards their respective steady states can be quite complex, especially for lower infusion rates. We also show how parameter estimates obtained from IV bolus studies can be used to derive steady-state concentrations of ligand, target and complex. The latter may serve as a template for future experimental designs.
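The paper derives closed-form steady-state expressions; as a generic numerical complement, the steady state of the classical TMDD system under constant infusion can also be found by bisection. All rate constants below are hypothetical, and the equations are the standard TMDD model rather than the authors' exact parameterisation.

```python
def tmdd_steady_state(infusion, kel, kon, koff, kint, ksyn, kdeg,
                      tol=1e-12):
    """Steady state of the classical TMDD model under constant infusion
    (a sketch; all rate constants are hypothetical):
      dL/dt  = infusion - kel*L - kon*L*R + koff*RL
      dR/dt  = ksyn - kdeg*R - kon*L*R + koff*RL
      dRL/dt = kon*L*R - (koff + kint)*RL
    Returns (L, R, RL)."""
    kss = (koff + kint) / kon  # quasi-steady-state constant

    def residual(l):
        # At steady state: R = ksyn / (kdeg + kint*L/kss),
        # and the ligand balance is infusion = kel*L + kint*RL.
        r = ksyn / (kdeg + kint * l / kss)
        return kel * l + kint * l * r / kss - infusion

    # residual is monotone increasing, negative at L=0 and
    # non-negative at L = infusion/kel, so bisection converges.
    lo, hi = 0.0, infusion / kel
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    l = 0.5 * (lo + hi)
    r = ksyn / (kdeg + kint * l / kss)
    return l, r, l * r / kss
```

At the returned steady state the ligand mass balance infusion = kel·L + kint·RL holds, which is a convenient consistency check on any parameter set.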
Cannon, Jonathan
2017-01-01
Mutual information is a commonly used measure of communication between neurons, but little theory exists describing the relationship between mutual information and the parameters of the underlying neuronal interaction. Such a theory could help us understand how specific physiological changes affect the capacity of neurons to communicate synaptically and, in particular, could help us characterize the mechanisms by which neuronal dynamics gate the flow of information in the brain. Here we study a pair of linear-nonlinear-Poisson neurons coupled by a weak synapse. We derive an analytical expression describing the mutual information between their spike trains in terms of synapse strength, neuronal activation function, the time course of postsynaptic currents, and the time course of the background input received by the two neurons. This expression allows mutual information calculations that would otherwise be computationally intractable. We use this expression to analytically explore the interaction of excitation, information transmission, and the convexity of the activation function. Then, using this expression to quantify mutual information in simulations, we illustrate the information-gating effects of neural oscillations and oscillatory coherence, which may either increase or decrease the mutual information across the synapse depending on parameters. Finally, we show analytically that our results can quantitatively describe the selection of one information pathway over another when multiple sending neurons project weakly to a single receiving neuron.
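The analytical expression in the paper is specific to weakly coupled linear-nonlinear-Poisson neurons; for comparison, mutual information between two discrete signals (e.g., binned spike counts) can always be computed directly from a joint probability table. This generic plug-in estimator is a sketch, not the paper's formula.

```python
from math import log2

def mutual_information(joint):
    """Mutual information (bits) between two discrete variables,
    given their joint probability table joint[x][y]."""
    px = [sum(row) for row in joint]               # marginal of X
    py = [sum(col) for col in zip(*joint)]         # marginal of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0.0:
                mi += p * log2(p / (px[i] * py[j]))
    return mi
```

Independent variables give 0 bits, and two perfectly coupled binary variables give 1 bit, the two sanity checks any estimator of synaptic information flow should pass.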
77 FR 75498 - Request for Comments on a New Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-20
... statistical surveys that yield quantitative results that can be generalized to the population of study. DATES... surveys that yield quantitative results that can be generalized to the population of study. This feedback... qualitative information will not be used for quantitative information collections that are designed to yield...
78 FR 57903 - Notice of Intent To Seek Approval To Renew an Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-20
.... The indicators are both quantitative and descriptive. Quantitative information from the most recently... center activities with respect to industrial collaboration. [cir] Conducting a survey of all center... quantitative indicators determined by NSF to analyze the management and operation of the center. [cir...
Solving Quantitative Problems: Guidelines for Teaching Derived from Research.
ERIC Educational Resources Information Center
Kramers-Pals, H.; Pilot, A.
1988-01-01
Presents four guidelines for teaching quantitative problem-solving based on research results: analyze difficulties of students, develop a system of heuristics, select and map key relations, and design instruction with proper orientation, exercise, and feedback. Discusses the four guidelines and uses flow charts and diagrams to show how the…
Evaluation and Quantitative trait loci mapping of resistance to powdery mildew in lettuce
USDA-ARS?s Scientific Manuscript database
Lettuce (Lactuca sativa L.) is the major leafy vegetable that is susceptible to powdery mildew disease under greenhouse and field conditions. We mapped quantitative trait loci (QTLs) for resistance to powdery mildew under greenhouse conditions in an interspecific population derived from a cross betw...
A set of literature data was used to derive several quantitative structure-activity relationships (QSARs) to predict the rate constants for the microbial reductive dehalogenation of chlorinated aromatics. Dechlorination rate constants for 25 chloroaromatics were corrected for th...
Shape based segmentation of MRIs of the bones in the knee using phase and intensity information
NASA Astrophysics Data System (ADS)
Fripp, Jurgen; Bourgeat, Pierrick; Crozier, Stuart; Ourselin, Sébastien
2007-03-01
The segmentation of the bones from MR images is useful for performing subsequent segmentation and quantitative measurements of cartilage tissue. In this paper, we present a shape based segmentation scheme for the bones that uses texture features derived from the phase and intensity information in the complex MR image. The phase can provide additional information about the tissue interfaces, but due to the phase unwrapping problem, this information is usually discarded. By using a Gabor filter bank on the complex MR image, texture features (including phase) can be extracted without requiring phase unwrapping. These texture features are then analyzed using a support vector machine classifier to obtain probability tissue matches. The segmentation of the bone is fully automatic and performed using a 3D active shape model based approach driven using gradient and texture information. The 3D active shape model is automatically initialized using a robust affine registration. The approach is validated using a database of 18 FLASH MR images that are manually segmented, with an average segmentation overlap (Dice similarity coefficient) of 0.92 compared to 0.9 obtained using the classifier only.
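The abstract's key trick — extracting phase-bearing texture features with a complex Gabor filter so that no phase unwrapping is required — can be illustrated with a minimal 1D sketch. This is hypothetical Python, not the authors' implementation; the filter parameters and signals are illustrative only:

```python
import cmath
import math

def gabor_1d(signal, freq, sigma):
    """Complex 1D Gabor filtering: a Gaussian envelope times a complex
    exponential. The magnitude and (wrapped) phase of the response serve
    as texture features, with no phase unwrapping needed."""
    n = int(3 * sigma)
    kernel = [math.exp(-0.5 * (x / sigma) ** 2) *
              cmath.exp(1j * 2 * math.pi * freq * x) for x in range(-n, n + 1)]
    out = []
    for i in range(len(signal)):
        acc = 0j
        for j, k in enumerate(kernel):
            idx = i + j - n          # kernel is simply truncated at the borders
            if 0 <= idx < len(signal):
                acc += signal[idx] * k
        out.append(acc)
    return [abs(z) for z in out], [cmath.phase(z) for z in out]

# a texture oscillating at the matched frequency responds strongly,
# while a flat region (no texture) gives almost no response
sig = [math.sin(2 * math.pi * 0.1 * i) for i in range(60)]
mags, phases = gabor_1d(sig, 0.1, 5.0)
mags_flat, _ = gabor_1d([1.0] * 60, 0.1, 5.0)
```

In the paper such magnitude/phase responses (over a 2D filter bank) are what feed the support vector machine classifier.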
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
... research to explore issues of quantitative benefit information. They all described the collection of data... research will involve quantitative assessment of the comprehension of important information in the document... of experiences and varying degrees of satisfaction with information currently provided at the time...
Development of Mass Spectrometric Ionization Methods for Fullerenes and Fullerene Derivatives
Currently, investigations into the environmental behavior of fullerenes and fullerene derivatives are hampered by the lack of well characterized standards and by the lack of readily available quantitative analytical methods. Reported herein are investigations into the utility of ma...
Wang, Dan; Silkie, Sarah S; Nelson, Kara L; Wuertz, Stefan
2010-09-01
Cultivation- and library-independent, quantitative PCR-based methods have become the method of choice in microbial source tracking. However, these qPCR assays are not 100% specific and sensitive for the target sequence in their respective hosts' genome. The factors that can lead to false positive and false negative information in qPCR results are well defined. It is highly desirable to have a way of removing such false information to estimate the true concentration of host-specific genetic markers and help guide the interpretation of environmental monitoring studies. Here we propose a statistical model based on the Law of Total Probability to predict the true concentration of these markers. The distributions of the probabilities of obtaining false information are estimated from representative fecal samples of known origin. Measurement error is derived from the sample precision error of replicated qPCR reactions. Then, the Monte Carlo method is applied to sample from these distributions of probabilities and measurement error. The set of equations given by the Law of Total Probability allows one to calculate the distribution of true concentrations, from which their expected value, confidence interval and other statistical characteristics can be easily evaluated. The output distributions of predicted true concentrations can then be used as input to watershed-wide total maximum daily load determinations, quantitative microbial risk assessment and other environmental models. This model was validated by both statistical simulations and real world samples. It was able to correct the intrinsic false information associated with qPCR assays and output the distribution of true concentrations of Bacteroidales for each animal host group. Model performance was strongly affected by the precision error. It could perform reliably and precisely when the standard deviation of the precision error was small (≤ 0.1). 
Further improvement on the precision of sample processing and qPCR reaction would greatly improve the performance of the model. This methodology, built upon Bacteroidales assays, is readily transferable to any other microbial source indicator where a universal assay for fecal sources of that indicator exists. Copyright © 2010 Elsevier Ltd. All rights reserved.
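The correction scheme described above — the Law of Total Probability combined with Monte Carlo sampling of false-signal probabilities and measurement error — might be sketched roughly as follows. The observation model, parameter names, and numbers are assumptions for illustration, not the authors' actual model:

```python
import random
import statistics

def predict_true_concentration(c_obs, fn_alpha, fn_beta, c_false, p_fp,
                               sd_log, n_draws=20000, seed=42):
    """Monte Carlo sketch of a Law-of-Total-Probability correction.

    c_obs     : measured marker concentration (e.g. copies/mL)
    fn_alpha/fn_beta : Beta parameters for the false-negative probability,
                estimated from reference fecal samples of known origin
    c_false   : typical concentration contributed by a false-positive signal
    p_fp      : probability that a false-positive signal is present
    sd_log    : precision error (sd of log10 concentration over replicates)
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        p_fn = rng.betavariate(fn_alpha, fn_beta)  # false-negative probability
        err = 10 ** rng.gauss(0.0, sd_log)         # multiplicative meas. error
        fp_part = c_false * p_fp                   # expected false-positive part
        # assumed observation model: c_obs = (c_true*(1 - p_fn) + fp_part) * err
        c_true = max((c_obs / err - fp_part) / (1.0 - p_fn), 0.0)
        draws.append(c_true)
    draws.sort()
    return {"mean": statistics.fmean(draws),
            "ci95": (draws[int(0.025 * n_draws)], draws[int(0.975 * n_draws)])}

result = predict_true_concentration(1e4, 2, 48, 50.0, 0.05, 0.1)
```

The returned distribution summary (mean and 95% interval) corresponds to the "distribution of true concentrations" the abstract describes feeding into TMDL and risk models.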
Analysis of a document/reporting system
NASA Technical Reports Server (NTRS)
Narrow, B.
1971-01-01
An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. It is believed that this is the first documented study which utilizes quantitative measures for full-scale system analysis. The quantitative measures and techniques for collecting and qualifying the basic data, as described, are applicable to any information system. Therefore this report is considered to be of interest to any persons concerned with the management, design, analysis, or evaluation of information systems.
Quantitative study of protein-protein interactions by quartz nanopipettes
NASA Astrophysics Data System (ADS)
Tiwari, Purushottam Babu; Astudillo, Luisana; Miksovska, Jaroslava; Wang, Xuewen; Li, Wenzhi; Darici, Yesim; He, Jin
2014-08-01
In this report, protein-modified quartz nanopipettes were used to quantitatively study protein-protein interactions in attoliter sensing volumes. As shown by numerical simulations, the ionic current through the conical-shaped nanopipette is very sensitive to the surface charge variation near the pore mouth. With the appropriate modification of negatively charged human neuroglobin (hNgb) onto the inner surface of a nanopipette, we were able to detect a concentration-dependent current change when the hNgb-modified nanopipette tip was exposed to positively charged cytochrome c (Cyt c) at a series of concentrations in the bath solution. Such current change is due to the adsorption of Cyt c to the inner surface of the nanopipette through specific interactions with hNgb. In contrast, a smaller current change with weak concentration dependence was observed when Cyt c was replaced with lysozyme, which does not specifically bind to hNgb. The equilibrium dissociation constant (KD) for the Cyt c-hNgb complex formation was derived, and the value matched very well with the result from surface plasmon resonance measurement. This is the first quantitative study of protein-protein interactions by a conical-shaped nanopore based on charge sensing. Our results demonstrate that nanopipettes can potentially be used as a label-free analytical tool to quantitatively characterize protein-protein interactions. Electronic supplementary information (ESI) available: determination of nanopipette diameter; surface modification scheme; numerical simulation; noise analysis; SPR experiments. See DOI: 10.1039/c4nr02964j
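The KD derivation mentioned in the abstract presumably rests on fitting the concentration-dependent signal to a binding isotherm. A minimal sketch, assuming a simple single-site Langmuir model and synthetic data (not the paper's measurements):

```python
def fit_kd(concs, signals, kd_grid=None):
    """Least-squares Langmuir fit: signal = S_max * C / (KD + C).
    Grid search over KD; S_max is solved analytically for each KD
    by linear least squares through the origin."""
    if kd_grid is None:
        kd_grid = [10 ** (i / 50.0) for i in range(-200, 200)]  # 1e-4..1e4
    best = None
    for kd in kd_grid:
        x = [c / (kd + c) for c in concs]
        s_max = (sum(xi * yi for xi, yi in zip(x, signals)) /
                 sum(xi * xi for xi in x))
        sse = sum((yi - s_max * xi) ** 2 for xi, yi in zip(x, signals))
        if best is None or sse < best[0]:
            best = (sse, kd, s_max)
    return best[1], best[2]

# synthetic titration generated with KD = 2.0 and S_max = 100 (arbitrary units)
concs = [0.1, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0]
signals = [100.0 * c / (2.0 + c) for c in concs]
kd, s_max = fit_kd(concs, signals)
```

With noiseless synthetic data the grid search recovers the generating KD to within the grid resolution.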
Analysis of commercial and public bioactivity databases.
Tiikkainen, Pekka; Franke, Lutz
2012-02-27
Activity data for small molecules are invaluable in chemoinformatics. Various bioactivity databases exist containing detailed information of target proteins and quantitative binding data for small molecules extracted from journals and patents. In the current work, we have merged several public and commercial bioactivity databases into one bioactivity metabase. The molecular presentation, target information, and activity data of the vendor databases were standardized. The main motivation of the work was to create a single relational database which allows fast and simple data retrieval by in-house scientists. Second, we wanted to know the amount of overlap between databases by commercial and public vendors to see whether the former contain data complementing the latter. Third, we quantified the degree of inconsistency between data sources by comparing data points derived from the same scientific article cited by more than one vendor. We found that each data source contains unique data which is due to different scientific articles cited by the vendors. When comparing data derived from the same article we found that inconsistencies between the vendors are common. In conclusion, using databases of different vendors is still useful since the data overlap is not complete. It should be noted that this can be partially explained by the inconsistencies and errors in the source data.
Evaluating Lignocellulosic Biomass, Its Derivatives, and Downstream Products with Raman Spectroscopy
Lupoi, Jason S.; Gjersing, Erica; Davis, Mark F.
2015-01-01
The creation of fuels, chemicals, and materials from plants can aid in replacing products fabricated from non-renewable energy sources. Before using biomass in downstream applications, it must be characterized to assess chemical traits, such as cellulose, lignin, or lignin monomer content, or the sugars released following an acid or enzymatic hydrolysis. The measurement of these traits allows researchers to gauge the recalcitrance of the plants and develop efficient deconstruction strategies to maximize yields. Standard methods for assessing biomass phenotypes often have experimental protocols that limit their use for screening sizeable numbers of plant species. Raman spectroscopy, a non-destructive, non-invasive vibrational spectroscopy technique, is capable of providing qualitative, structural information and quantitative measurements. Applications of Raman spectroscopy have aided in alleviating the constraints of standard methods by coupling spectral data with multivariate analysis to construct models capable of predicting analytes. Hydrolysis and fermentation products, such as glucose and ethanol, can be quantified off-, at-, or on-line. Raman imaging has enabled researchers to develop a visual understanding of reactions, such as different pretreatment strategies, in real-time, while also providing integral chemical information. This review provides an overview of what Raman spectroscopy is, and how it has been applied to the analysis of whole lignocellulosic biomass, its derivatives, and downstream process monitoring. PMID:25941674
Analytical-Based Partial Volume Recovery in Mouse Heart Imaging
NASA Astrophysics Data System (ADS)
Dumouchel, Tyler; deKemp, Robert A.
2011-02-01
Positron emission tomography (PET) is a powerful imaging modality that has the ability to yield quantitative images of tracer activity. Physical phenomena such as photon scatter, photon attenuation, random coincidences and spatial resolution limit quantification potential and must be corrected to preserve the accuracy of reconstructed images. This study focuses on correcting the partial volume effects that arise in mouse heart imaging when resolution is insufficient to resolve the true tracer distribution in the myocardium. The correction algorithm is based on fitting 1D profiles through the myocardium in gated PET images to derive myocardial contours along with blood, background and myocardial activity. This information is interpolated onto a 2D grid and convolved with the tomograph's point spread function to derive regional recovery coefficients enabling partial volume correction. The point spread function was measured by placing a line source inside a small animal PET scanner. PET simulations were created based on noise properties measured from a reconstructed PET image and on the digital MOBY phantom. The algorithm can estimate the myocardial activity to within 5% of the truth when different wall thicknesses, backgrounds and noise properties are encountered that are typical of healthy FDG mouse scans. The method also significantly improves partial volume recovery in simulated infarcted tissue. The algorithm offers a practical solution to the partial volume problem without the need for co-registered anatomic images and offers a basis for improved quantitative 3D heart imaging.
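The core of the partial volume problem — a thin myocardial wall convolved with the scanner's point spread function yields an apparent activity below the true value, and the ratio of the two is the recovery coefficient — can be reproduced in a toy 1D calculation. This is illustrative Python; the profile values and PSF width are made up, not the study's data:

```python
import math

def gaussian_psf(fwhm, step, half_width=5.0):
    """Normalized, sampled Gaussian point spread function."""
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    n = int(half_width * sigma / step)
    kernel = [math.exp(-0.5 * (i * step / sigma) ** 2) for i in range(-n, n + 1)]
    s = sum(kernel)
    return [k / s for k in kernel]

def convolve(profile, kernel):
    """Direct convolution; the kernel is truncated at the profile borders."""
    half = len(kernel) // 2
    out = []
    for i in range(len(profile)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(profile):
                acc += profile[idx] * k
        out.append(acc)
    return out

# toy 1D profile through the heart: blood pool | 1 mm wall | background
step = 0.1  # mm per sample; wall activity 4.0, blood 1.0, background 0.5
profile = [1.0] * 50 + [4.0] * 10 + [0.5] * 50
blurred = convolve(profile, gaussian_psf(1.8, step))   # ~1.8 mm FWHM PSF
recovery = max(blurred) / 4.0   # apparent peak over true wall activity
```

Dividing a measured wall value by such a regionally derived recovery coefficient is the basic idea behind the correction; the paper derives the coefficients from fitted 2D myocardial geometry rather than a fixed profile.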
Falach, Reut; Sapoznikov, Anita; Gal, Yoav; Israeli, Ofir; Leitner, Moshe; Seliger, Nehama; Ehrlich, Sharon; Kronman, Chanoch; Sabo, Tamar
2016-09-06
The plant-derived toxins ricin and abrin operate by site-specific depurination of ribosomes, which in turn leads to protein synthesis arrest. The clinical manifestation following pulmonary exposure to these toxins is that of severe lung inflammation and respiratory insufficiency. Deciphering the pathways mediating between the catalytic activity and the developing lung inflammation requires a quantitative appreciation of the catalytic activity of the toxins in vivo. In the present study, we monitored truncated cDNA molecules which are formed by reverse transcription when a depurinated 28S rRNA serves as template. We found that maximal depurination after intranasal exposure of mice to 2LD50 ricin was reached at 48 h, when nearly 40% of the ribosomes had been depurinated, and that depurination can be halted by post-exposure administration of anti-ricin antibodies. We next demonstrated that the effect of ricin intoxication on different cell types populating the lungs differs greatly, and that outstandingly high levels of damage (80% depurination) were observed in particular for pulmonary epithelial cells. Finally, we found that the magnitude of depurination induced by the related plant-derived toxin abrin was significantly lower in comparison to ricin, and can be attributed mostly to reduced depurination of pulmonary epithelial cells by abrin. This study provides for the first time vital information regarding the scope and timing of the catalytic performance of ricin and abrin in the lungs of intact animals. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
3D-Reconstruction of recent volcanic activity from ROV-video, Charles Darwin Seamounts, Cape Verdes
NASA Astrophysics Data System (ADS)
Kwasnitschka, T.; Hansteen, T. H.; Kutterolf, S.; Freundt, A.; Devey, C. W.
2011-12-01
As well as providing well-localized samples, Remotely Operated Vehicles (ROVs) produce huge quantities of visual data whose potential for geological data mining has seldom if ever been fully realized. We present a new workflow to derive essential results of field geology such as quantitative stratigraphy and tectonic surveying from ROV-based photo and video material. We demonstrate the procedure on the Charles Darwin Seamounts, a field of small hot spot volcanoes recently identified at a depth of ca. 3500 m southwest of the island of Santo Antao in the Cape Verdes. The Charles Darwin Seamounts feature a wide spectrum of volcanic edifices with forms suggestive of scoria cones, lava domes, tuff rings and maar-type depressions, all of comparable dimensions. These forms, coupled with the highly fragmented volcaniclastic samples recovered by dredging, motivated surveying parts of some edifices down to centimeter scale. ROV-based surveys yielded volcaniclastic samples of key structures linked by extensive coverage of stereoscopic photographs and high-resolution video. Based upon the latter, we present our workflow to derive three-dimensional models of outcrops from a single-camera video sequence, allowing quantitative measurements of fault orientation, bedding structure, grain size distribution and photo mosaicking within a geo-referenced framework. With this information we can identify episodes of repetitive eruptive activity at individual volcanic centers and see changes in eruptive style over time which, despite the proximity of the centers to each other, are highly variable.
Quantitation of Permethylated N-Glycans through Multiple-Reaction Monitoring (MRM) LC-MS/MS
Zhou, Shiyue; Hu, Yunli; DeSantos-Garcia, Janie L.; Mechref, Yehia
2015-01-01
The important biological roles of glycans and their implications in disease development and progression have created a demand for the development of sensitive quantitative glycomics methods. Quantitation of glycans existing at low abundance is still analytically challenging. In this study, an N-linked glycan quantitation method using multiple reaction monitoring (MRM) on a triple quadrupole instrument was developed. The optimum normalized collision energy (CE) for structures that were both sialylated and fucosylated was determined to be 30%, while it was found to be 35% for structures that were either fucosylated or sialylated. The optimum CE for high-mannose and complex type N-glycan structures was also determined to be 35%. Additionally, the use of three transitions was shown to facilitate reliable quantitation. A total of 88 N-glycan structures in human blood serum were quantified using this MRM approach. Reliable detection and quantitation of these structures was achieved when the equivalent of 0.005 μL of blood serum was analyzed. Accordingly, N-glycans can be reliably quantified in pooled human blood serum down to the hundredth-of-a-microliter level, spanning a dynamic concentration range of three orders of magnitude. MRM was also effectively utilized to quantitatively compare the expression of N-glycans derived from brain-targeting breast carcinoma cells (MDA-MB-231BR) and metastatic breast cancer cells (MDA-MB-231). Thus, the described MRM method for permethylated N-glycan structures enables rapid and reliable identification and quantitation of glycans derived from glycoproteins purified from or present in complex biological samples. PMID:25698222
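The role of the three MRM transitions in making quantitation reliable can be illustrated with a small sketch: average the analyte/internal-standard area ratio over the transitions and flag any transition whose ratio deviates markedly, which would suggest interference. This is hypothetical Python; the 20% tolerance and the numbers are assumptions, not values from the paper:

```python
def quantify_glycan(transition_areas, is_areas, tol=0.2):
    """Relative quantitation from several MRM transitions: average the
    analyte/internal-standard peak-area ratio, and flag transitions whose
    ratio deviates from the mean by more than `tol` (interference check)."""
    ratios = [a / b for a, b in zip(transition_areas, is_areas)]
    mean = sum(ratios) / len(ratios)
    flagged = [i for i, r in enumerate(ratios) if abs(r - mean) / mean > tol]
    return mean, flagged

# three transitions for one glycan against an internal standard
mean_ratio, flagged = quantify_glycan([3.1e5, 2.9e5, 3.0e5],
                                      [1.0e5, 1.0e5, 1.0e5])
```

Agreement across transitions (an empty flag list) is what supports treating the mean ratio as a trustworthy relative abundance.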
2014-01-01
In the post-genomic era, it has become evident that genetic changes alone are not sufficient to understand most disease processes, including pancreatic cancer. Genome sequencing has revealed a complex set of genetic alterations in pancreatic cancer such as point mutations, chromosomal losses, gene amplifications and telomere shortening that drive cancerous growth through specific signaling pathways. Proteome-based approaches are important complements to genomic data and provide crucial information on the target driver molecules and their post-translational modifications. Applying quantitative mass spectrometry provides an alternative way to identify biomarkers for early diagnosis and personalized medicine. We review the current quantitative mass spectrometric technologies and analyses that have been developed and applied in the last decade in the context of pancreatic cancer. Examples of candidate biomarkers that have been identified from these pancreas studies include, among others, asporin, CD9, CXC chemokine ligand 7, fibronectin 1, galectin-1, gelsolin, intercellular adhesion molecule 1, insulin-like growth factor binding protein 2, metalloproteinase inhibitor 1, stromal cell derived factor 4, and transforming growth factor beta-induced protein. Many of these proteins are involved in various steps in pancreatic tumor progression including cell proliferation, adhesion, migration, invasion, metastasis, immune response and angiogenesis. These new protein candidates may provide essential information for the development of protein diagnostics and targeted therapies. We further argue that new strategies must be advanced and established for the integration of proteomic, transcriptomic and genomic data, in order to enhance biomarker translation. Large-scale studies with metadata processing will pave the way for novel and unexpected correlations within pancreatic cancer that will benefit the patient through targeted treatment. PMID:24708694
Toraman, Hilal E; Franz, Kristina; Ronsse, Frederik; Van Geem, Kevin M; Marin, Guy B
2016-08-19
Insight into the composition of algae-derived bio-oils is crucial for the development of efficient conversion processes and better upgrading strategies for microalgae. Comprehensive two-dimensional gas chromatography (GC×GC) coupled to a nitrogen chemiluminescence detector (NCD) and a time-of-flight mass spectrometer (TOF-MS) makes it possible to obtain the detailed quantitative composition of the nitrogen-containing compounds in the aqueous and organic fractions of fast pyrolysis bio-oils from microalgae. Normal phase (apolar×mid-polar) and reverse phase (polar×apolar) column combinations were investigated to optimize the separation of the detected nitrogen-containing compounds. The reverse phase column combination gives the most detailed information on the nitrogen-containing compounds. The combined information from the GC×GC-TOF-MS (qualitative) and GC×GC-NCD (quantitative), with the use of a well-chosen internal standard, i.e. caprolactam, enables the identification and quantification of nitrogen-containing compounds belonging to 13 different classes: amines, imidazoles, amides, imides, nitriles, pyrazines, pyridines, indoles, pyrazoles, pyrimidines, quinolines, pyrimidinediones and other nitrogen-containing compounds not assigned to a specific class. The aqueous fraction mostly consists of amines (4.0 wt%) and imidazoles (2.8 wt%), corresponding to approximately 80 wt% of the total identified nitrogen-containing compounds. On the other hand, the organic fraction shows a more diverse distribution of nitrogen-containing compounds, with the majority of the compounds quantified as amides (3.0 wt%), indoles (2.0 wt%), amines (1.7 wt%) and imides (1.3 wt%), corresponding to approximately 65 wt% of the total identified nitrogen-containing compounds. Copyright © 2016 Elsevier B.V. All rights reserved.
Dong, Yi; Wang, Wen-Ping; Lin, Pan; Fan, Peili; Mao, Feng
2016-01-01
We performed a prospective study to evaluate the value of contrast-enhanced ultrasound (CEUS) in quantitative evaluation of renal cortex perfusion in patients suspected of early diabetic nephropathy (DN), with the estimated GFR (MDRD equation) as the gold standard. The study protocol was approved by the hospital review board; each patient gave written informed consent. Our study included 46 clinically confirmed early DN patients (21 males and 25 females, mean age 55.6 ± 4.14 years). After intravenous bolus injection of 1 ml of sulfur hexafluoride microbubble ultrasound contrast agent, real-time CEUS of the renal cortex was performed using a 2-5 MHz convex probe. Time-intensity curves (TICs) and quantitative indexes were created with Qlab software. Receiver operating characteristic (ROC) curves were used to establish diagnostic criteria for the CEUS quantitative indexes, and their diagnostic efficiencies were compared with the resistance index (RI) and peak systolic velocity (PSV) of renal segmental arteries by the chi-square test. Our control group included forty-five healthy volunteers. Differences were considered statistically significant at P < 0.05. Changes in the area under the curve (AUC) and derived peak intensity (DPI) were statistically significant (P < 0.05). DPI less than 12 and AUC greater than 1400 had high utility in DN, with 71.7% and 67.3% sensitivity, and 77.8% and 80.0% specificity, respectively. These results were significantly better than those obtained with RI and PSV, which showed no significant difference in the early stage of DN (P > 0.05). CEUS might be helpful to improve early diagnosis of DN by quantitative analysis. AUC and DPI might be valuable quantitative indexes.
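The TIC-derived indexes named in this abstract (area under the curve and peak intensity) are straightforward to compute from a sampled time-intensity curve. A minimal sketch with synthetic data, not Qlab's algorithm:

```python
def tic_indexes(times, intensities):
    """Quantitative indexes from a CEUS time-intensity curve:
    area under the curve (trapezoidal rule), derived peak intensity
    (peak minus baseline), and time to peak."""
    auc = sum((t2 - t1) * (i1 + i2) / 2.0
              for t1, t2, i1, i2 in zip(times, times[1:],
                                        intensities, intensities[1:]))
    peak = max(intensities)
    dpi = peak - intensities[0]      # baseline taken as the first sample
    time_to_peak = times[intensities.index(peak)]
    return auc, dpi, time_to_peak

# synthetic wash-in/wash-out curve (seconds, arbitrary intensity units)
auc, dpi, ttp = tic_indexes([0, 1, 2, 3, 4], [0, 10, 20, 10, 0])
```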
75 FR 18571 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-12
... Comprehensive Quantitative Impact Study.'' The OCC has also given notice that it has sent this collection to OMB... following new information collection: Title: Basel Comprehensive Quantitative Impact Study. OMB Control No... the Basel II Capital Accord, the Basel Committee will conduct a quantitative impact study (QIS) to...
78 FR 68450 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... hours. Key informant interviews and the quantitative survey will be conducted by telephone. As telephone... qualitative and quantitative data in order to develop and refine the Tool, and assess feasibility and audience... collection will be used to help inform a quantitative stage of work to include a national sample of...
Visualizing the Critique: Integrating Quantitative Reasoning with the Design Process
ERIC Educational Resources Information Center
Weinstein, Kathryn
2017-01-01
In the age of "Big Data," information is often quantitative in nature. The ability to analyze information through the sifting of data has been identified as a core competency for success in navigating daily life and participation in the contemporary workforce. This skill, known as Quantitative Reasoning (QR), is characterized by the…
O'Sullivan, F; Kirrane, J; Muzi, M; O'Sullivan, J N; Spence, A M; Mankoff, D A; Krohn, K A
2010-03-01
Kinetic quantitation of dynamic positron emission tomography (PET) studies via compartmental modeling usually requires the time-course of the radio-tracer concentration in the arterial blood as an arterial input function (AIF). For human and animal imaging applications, significant practical difficulties are associated with direct arterial sampling, and as a result there is substantial interest in alternative methods that require no blood sampling at the time of the study. A fixed population template input function derived from prior experience with directly sampled arterial curves is one possibility. Image-based extraction, including requisite adjustment for spillover and recovery, is another approach. The present work considers a hybrid statistical approach based on a penalty formulation in which the information derived from a priori studies is combined in a Bayesian manner with information contained in the sampled image data in order to obtain an input function estimate. The absolute scaling of the input is achieved by an empirical calibration equation involving the injected dose together with the subject's weight, height and gender. The technique is illustrated in the context of (18)F-fluorodeoxyglucose (FDG) PET studies in humans. A collection of 79 arterially sampled FDG blood curves are used as a basis for a priori characterization of input function variability, including scaling characteristics. Data from a series of 12 dynamic cerebral FDG PET studies in normal subjects are used to evaluate the performance of the penalty-based AIF estimation technique. The focus of the evaluations is on quantitation of FDG kinetics over a set of 10 regional brain structures. As well as the new method, a fixed population template AIF and a direct AIF estimate based on segmentation are also considered. Kinetic analyses resulting from these three AIFs are compared with those resulting from arterially sampled AIFs.
The proposed penalty-based AIF extraction method is found to achieve significant improvements over the fixed template and the segmentation methods. As well as achieving acceptable kinetic parameter accuracy, the quality of fit of the region of interest (ROI) time-course data based on the extracted AIF, matches results based on arterially sampled AIFs. In comparison, significant deviation in the estimation of FDG flux and degradation in ROI data fit are found with the template and segmentation methods. The proposed AIF extraction method is recommended for practical use.
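The penalty formulation described above amounts to a Bayesian compromise between the image-derived curve and the population template. As a much-simplified illustration (pointwise precision weighting; the paper's actual estimator is considerably more elaborate, and the variances and curves here are invented):

```python
def penalized_aif(image_curve, template_curve, var_image, var_template):
    """Pointwise precision-weighted compromise between a noisy
    image-derived blood curve and a population template AIF — a crude
    stand-in for a penalized (Bayesian) estimator: each point is pulled
    toward the template in proportion to the template's relative precision."""
    w_img, w_tpl = 1.0 / var_image, 1.0 / var_template
    return [(w_img * y + w_tpl * p) / (w_img + w_tpl)
            for y, p in zip(image_curve, template_curve)]

# noisy image-derived samples vs. a smoother template (arbitrary units)
image = [0.0, 9.0, 5.5, 3.2, 2.1]
template = [0.0, 8.0, 5.0, 3.0, 2.0]
aif = penalized_aif(image, template, var_image=1.0, var_template=3.0)
```

A larger template variance (weaker prior) leaves the estimate closer to the image data; a smaller one pulls it toward the population curve.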
Invited Article: Concepts and tools for the evaluation of measurement uncertainty
NASA Astrophysics Data System (ADS)
Possolo, Antonio; Iyer, Hari K.
2017-01-01
Measurements involve comparisons of measured values with reference values traceable to measurement standards and are made to support decision-making. While the conventional definition of measurement focuses on quantitative properties (including ordinal properties), we adopt a broader view and entertain the possibility of regarding qualitative properties also as legitimate targets for measurement. A measurement result comprises the following: (i) a value that has been assigned to a property based on information derived from an experiment or computation, possibly also including information derived from other sources, and (ii) a characterization of the margin of doubt that remains about the true value of the property after taking that information into account. Measurement uncertainty is this margin of doubt, and it can be characterized by a probability distribution on the set of possible values of the property of interest. Mathematical or statistical models enable the quantification of measurement uncertainty and underlie the varied collection of methods available for uncertainty evaluation. Some of these methods have been in use for over a century (for example, as introduced by Gauss for the combination of mutually inconsistent observations or for the propagation of "errors"), while others are of fairly recent vintage (for example, Monte Carlo methods including those that involve Markov Chain Monte Carlo sampling). This contribution reviews the concepts, models, methods, and computations that are commonly used for the evaluation of measurement uncertainty, and illustrates their application in realistic examples drawn from multiple areas of science and technology, aiming to serve as a general, widely accessible reference.
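The Monte Carlo approach to uncertainty evaluation mentioned above can be shown in a few lines: sample the input quantities from their distributions, propagate them through the measurement model, and summarize the output distribution. This is illustrative Python in the spirit of GUM Supplement 1; the measurement model and standard uncertainties are made up:

```python
import random
import statistics

def mc_uncertainty(model, samplers, n=100000, seed=7):
    """Propagate input distributions through a measurement model by
    Monte Carlo and summarize the resulting output distribution."""
    rng = random.Random(seed)
    ys = sorted(model(*[draw(rng) for draw in samplers]) for _ in range(n))
    return {"mean": statistics.fmean(ys),
            "u": statistics.stdev(ys),                     # standard uncertainty
            "ci95": (ys[int(0.025 * n)], ys[int(0.975 * n)])}

# toy measurement model: resistance R = V / I,
# with V ~ N(5.0 V, 0.02 V) and I ~ N(0.1 A, 0.001 A)
res = mc_uncertainty(lambda v, i: v / i,
                     [lambda r: r.gauss(5.0, 0.02),
                      lambda r: r.gauss(0.1, 0.001)])
```

Unlike first-order "propagation of errors", this carries the full distributions through a possibly nonlinear model, which is why the review pairs the two families of methods.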
Wei, Jiangyong; Hu, Xiaohua; Zou, Xiufen; Tian, Tianhai
2017-12-28
Recent advances in omics technologies have raised great opportunities to study large-scale regulatory networks inside the cell. In addition, single-cell experiments have measured the gene and protein activities in a large number of cells under the same experimental conditions. However, a significant challenge in computational biology and bioinformatics is how to derive quantitative information from the single-cell observations and how to develop sophisticated mathematical models to describe the dynamic properties of regulatory networks using the derived quantitative information. This work designs an integrated approach to reverse-engineer gene networks for regulating early blood development based on single-cell experimental observations. The wanderlust algorithm is initially used to develop the pseudo-trajectory for the activities of a number of genes. Since the gene expression data in the developed pseudo-trajectory show large fluctuations, we then use Gaussian process regression methods to smooth the gene expression data in order to obtain pseudo-trajectories with much smaller fluctuations. The proposed integrated framework consists of both bioinformatics algorithms to reconstruct the regulatory network and mathematical models using differential equations to describe the dynamics of gene expression. The developed approach is applied to study the network regulating early blood cell development. A graphic model is constructed for a regulatory network with forty genes, and a dynamic model using differential equations is developed for a network of nine genes. Numerical results suggest that the proposed model is able to match experimental data very well. We also examine networks with more regulatory relations, and numerical results show that more regulations may exist. We test the possibility of auto-regulation, but numerical simulations do not support positive auto-regulation.
In addition, robustness is used as an important additional criterion to select candidate networks. The results of this work show that the developed approach is an efficient and effective method for reverse-engineering gene networks using single-cell experimental observations.
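The smoothing step described above can be illustrated with a lightweight stand-in for full Gaussian process regression: a Nadaraya-Watson smoother with a Gaussian (RBF) kernel, which produces the same kind of smooth mean curve over a noisy pseudo-trajectory without the GP posterior covariance. The trajectory data below are synthetic, not from the study.

```python
import math

def gaussian_kernel_smooth(t, y, bandwidth=0.5):
    """Nadaraya-Watson kernel smoother with a Gaussian (RBF) kernel.

    A lighter-weight stand-in for Gaussian process regression: each
    smoothed value is a kernel-weighted average of all observations."""
    smoothed = []
    for ti in t:
        weights = [math.exp(-0.5 * ((ti - tj) / bandwidth) ** 2) for tj in t]
        wsum = sum(weights)
        smoothed.append(sum(w * yj for w, yj in zip(weights, y)) / wsum)
    return smoothed

# Hypothetical noisy pseudo-trajectory for one gene: a smooth signal
# plus alternating measurement noise
t = [i * 0.5 for i in range(20)]
noisy = [math.sin(ti) + 0.3 * (-1) ** i for i, ti in enumerate(t)]
smooth = gaussian_kernel_smooth(t, noisy, bandwidth=0.5)
```

With a bandwidth comparable to the sampling spacing, the alternating noise largely cancels while the underlying trend is only mildly attenuated.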
NASA Astrophysics Data System (ADS)
Evans, Ben; Moeller, Iris; Smith, Geoff; Spencer, Tom
2017-04-01
Saltmarshes provide valuable ecosystem services and are protected; nevertheless, they are generally thought to be declining in extent in North West Europe and beyond. The drivers of this decline and its variability are complex and inadequately described. When considering management for future ecosystem service provision, it is important to understand why, where, and to what extent areal decline is likely to occur. Physically-based morphological modelling of fine-sediment systems is in its infancy, and the models and the necessary expertise and facilities to run and validate them are rarely directly accessible to practitioners. This paper uses an accessible and easily applied data-driven modelling approach for the quantitative estimation of current marsh system status and likely future marsh development. Central to this approach are monitoring datasets providing high-resolution spatial data and the recognition that antecedent morphology exerts a principal control on future landform change (morphodynamic feedback). Further, current morphology can also be regarded as an integrated response of the intertidal system to the process environment; it may therefore represent proxy information on historical conditions beyond the period of observational records. Novel methods are developed to extract quantitative morphological information from aerial photographic, LiDAR and satellite datasets. Morphometric indices are derived relating to the functional configuration of landform units that go beyond previous efforts and basic descriptions of extent. The incorporation of morphometric indices derived from existing monitoring datasets is shown to improve the performance of statistical models for predicting salt marsh evolution, and wider applications and benefits are expected. The indices are useful landscape descriptors when assessing system status and may provide relatively robust measures for comparison against historical datasets.
They are also valuable metrics when considering how the landscape delivers ecosystem services and are essential for the testing and validation of morphological models of salt marshes and other systems.
NASA Astrophysics Data System (ADS)
Morin, Efrat; Marra, Francesco; Peleg, Nadav; Mei, Yiwen; Anagnostou, Emmanouil N.
2017-04-01
Rainfall frequency analysis is used to quantify the probability of occurrence of extreme rainfall and is traditionally based on rain gauge records. The limited spatial coverage of rain gauges is insufficient to sample the spatiotemporal variability of extreme rainfall and to provide the areal information required by management and design applications. Conversely, remote sensing instruments, even if quantitatively uncertain, offer coverage and spatiotemporal detail that allow these issues to be overcome. In recent years, remote sensing datasets began to be used for frequency analyses, taking advantage of increased record lengths and quantitative adjustments of the data. However, the studies so far made use of concepts and techniques developed for rain gauge (i.e. point or multiple-point) data and have been validated by comparison with gauge-derived analyses. These procedures add further sources of uncertainty, prevent data uncertainties from being isolated from methodological ones, and prevent full exploitation of the available information. In this study, we step out of the gauge-centered concept, presenting a direct comparison between at-site Intensity-Duration-Frequency (IDF) curves derived from different remote sensing datasets on corresponding spatial scales, temporal resolutions and records. We analyzed 16 years of homogeneously corrected and gauge-adjusted C-band weather radar estimates, high-resolution CMORPH and gauge-adjusted high-resolution CMORPH over the Eastern Mediterranean. Results of this study include: (a) good spatial correlation between radar and satellite IDFs (0.7 for 2-5 year return periods); (b) consistent correlation and dispersion in the raw and gauge-adjusted CMORPH; (c) bias that is almost uniform with return period for 12-24 h durations; (d) radar identifying thicker-tailed distributions than CMORPH, with the tail of the distributions depending on the spatial and temporal scales.
These results demonstrate the potential of remote sensing datasets for rainfall frequency analysis for management (e.g. warning and early-warning systems) and design (e.g. sewer design, large-scale drainage planning).
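The frequency-analysis step behind an IDF curve can be sketched with the simplest empirical estimator: sort the annual maxima and assign return periods via the Weibull plotting position T = (n + 1) / m, where m is the descending rank. This is a minimal stand-in for the at-site analysis in the abstract; the rainfall values below are hypothetical.

```python
def empirical_return_levels(annual_maxima):
    """Empirical return periods from a series of annual rainfall maxima
    using the Weibull plotting position T = (n + 1) / m, where m is the
    rank of the value sorted in descending order. Returns a list of
    (intensity, return_period) pairs, largest event first."""
    xs = sorted(annual_maxima, reverse=True)
    n = len(xs)
    return [(x, (n + 1) / m) for m, x in enumerate(xs, start=1)]

# 16 hypothetical annual maxima of 1-h rainfall intensity (mm/h),
# matching the 16-year record length mentioned in the abstract
maxima = [42, 31, 55, 28, 61, 39, 47, 33, 70, 25, 36, 50, 29, 44, 58, 34]
levels = empirical_return_levels(maxima)
```

With 16 years of data the largest observed event is assigned a 17-year return period, which is why distribution tails (point (d) above) must be extrapolated with a fitted model rather than read directly from the empirical curve.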
Mobility-Related Teacher Turnover and the Unequal Distribution of Experienced Teachers in Turkey
ERIC Educational Resources Information Center
Özoglu, Murat
2015-01-01
This study investigates the issue of mobility-related teacher turnover in Turkey through both quantitative and qualitative methods. The quantitative findings derived from descriptive and correlational analyses of countrywide teacher-assignment and transfer data indicate that a high rate of mobility-related turnover is observed in the…
Exciting New Images | Lunar Reconnaissance Orbiter Camera
The Moon's topography is slowly and relentlessly reshaped. Comparative study of the shapes of lunar craters raises two questions: how can a quantitative comparison be derived, and how can the topography of a large number of craters be quantified and compared? For quantitative characterization of impact crater topography, see Mahanti, P. et al. (2014), Icarus v. 241.
Gifford, Aliya; Walker, Ronald C.; Towse, Theodore F.; Brian Welch, E.
2015-01-01
Beyond estimation of depot volumes, quantitative analysis of adipose tissue properties could improve understanding of how adipose tissue correlates with metabolic risk factors. We investigated whether the fat signal fraction (FSF) derived from quantitative fat–water magnetic resonance imaging (MRI) scans at 3.0 T correlates to CT Hounsfield units (HU) of the same tissue. These measures were acquired in the subcutaneous white adipose tissue (WAT) at the umbilical level of 21 healthy adult subjects. A moderate correlation exists between MRI- and CT-derived WAT values for all subjects, R2=0.54, p<0.0001, with a slope of −2.6 (95% CI [−3.3, −1.8]), indicating that a decrease of 1 HU corresponds to a mean increase of 0.38% in FSF. We demonstrate that FSF estimates obtained using quantitative fat–water MRI techniques correlate with CT HU values in subcutaneous WAT, and therefore MRI-based FSF could be used as an alternative to CT HU for assessing metabolic risk factors. PMID:26702407
A method for three-dimensional quantitative observation of the microstructure of biological samples
NASA Astrophysics Data System (ADS)
Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying
2009-07-01
Contemporary biology has developed into the era of cell biology and molecular biology, and researchers now try to study the mechanisms of all kinds of biological phenomena at the microcosmic level. Accurate description of the microstructure of biological samples is an exigent need in many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of vital biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy that excels in low optical damage, high resolution, deep penetration depth and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and afterward their original shapes were obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. Thus the 3-dimensionally and quantitatively depicted microstructure of the samples was finally derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with excellent results.
Zhao, Y; Gran, B; Pinilla, C; Markovic-Plese, S; Hemmer, B; Tzou, A; Whitney, L W; Biddison, W E; Martin, R; Simon, R
2001-08-15
The interaction of TCRs with MHC peptide ligands can be highly flexible, so that many different peptides are recognized by the same TCR in the context of a single restriction element. We provide a quantitative description of such interactions, which allows the identification of T cell epitopes and molecular mimics. The response of T cell clones to positional scanning synthetic combinatorial libraries is analyzed with a mathematical approach that is based on a model of independent contribution of individual amino acids to peptide Ag recognition. This biometric analysis compares the information derived from these libraries composed of trillions of decapeptides with all the millions of decapeptides contained in a protein database to rank and predict the most stimulatory peptides for a given T cell clone. We demonstrate the predictive power of the novel strategy and show that, together with gene expression profiling by cDNA microarrays, it leads to the identification of novel candidate autoantigens in the inflammatory autoimmune disease, multiple sclerosis.
NASA Astrophysics Data System (ADS)
Soltis, Joseph M.; Savage, Anne; Leong, Kirsten M.
2004-05-01
The most commonly occurring elephant vocalization is the rumble, a frequency-modulated call with infrasonic components. Upwards of ten distinct rumble subtypes have been proposed, but little quantitative work on the acoustic properties of rumbles has been conducted. Rumble vocalizations (N = 269) from six females housed at Disney's Animal Kingdom were analyzed. Vocalizations were recorded from microphones in collars around subject necks, and rumbles were digitized and measured using SIGNAL software. Sixteen acoustic variables were measured for each call, extracting both source and filter features. Multidimensional scaling analysis indicates that there are no acoustically distinct rumble subtypes, but that there is quantitative variation across rumbles. Discriminant function analysis showed that the acoustic characteristics of rumbles differ across females. A classification success rate of 65% was achieved when assigning unselected rumbles to one of the six females (test set = 64 calls) according to the functions derived from the originally selected calls (training set = 205 calls). The rumble is best viewed as a single call type with graded variation, but information regarding individual identity is encoded in female rumbles.
Xue, Angli; Wang, Hongcheng; Zhu, Jun
2017-09-28
Startle behavior is important for survival, and abnormal startle responses are related to several neurological diseases. Drosophila melanogaster provides a powerful system to investigate the genetic underpinnings of variation in startle behavior, since mechanically induced startle responses and environmental conditions can be readily quantified and precisely controlled. The 156 wild-derived fully sequenced lines of the Drosophila Genetic Reference Panel (DGRP) were used to identify SNPs and transcripts associated with variation in startle behavior. The results validated highly significant effects of 33 quantitative trait SNPs (QTSs) and 81 quantitative trait transcripts (QTTs) directly associated with phenotypic variation of the startle response. We also detected QTT variation controlled by 20 QTSs (tQTSs) and 73 transcripts (tQTTs). Association mapping based on genomic and transcriptomic data enabled us to construct a complex genetic network that underlies variation in startle behavior. Based on principles of evolutionary conservation, human orthologous genes could be superimposed on this network. This study provided both genetic and biological insights into the variation of startle response behavior of Drosophila melanogaster, and highlighted the importance of genetic networks in understanding the genetic architecture of complex traits.
Measuring Quantum Coherence with Entanglement.
Streltsov, Alexander; Singh, Uttam; Dhar, Himadri Shekhar; Bera, Manabendra Nath; Adesso, Gerardo
2015-07-10
Quantum coherence is an essential ingredient in quantum information processing and plays a central role in emergent fields such as nanoscale thermodynamics and quantum biology. However, our understanding and quantitative characterization of coherence as an operational resource are still very limited. Here we show that any degree of coherence with respect to some reference basis can be converted to entanglement via incoherent operations. This finding allows us to define a novel general class of measures of coherence for a quantum system of arbitrary dimension, in terms of the maximum bipartite entanglement that can be generated via incoherent operations applied to the system and an incoherent ancilla. The resulting measures are proven to be valid coherence monotones satisfying all the requirements dictated by the resource theory of quantum coherence. We demonstrate the usefulness of our approach by proving that the fidelity-based geometric measure of coherence is a full convex coherence monotone, and deriving a closed formula for it on arbitrary single-qubit states. Our work provides a clear quantitative and operational connection between coherence and entanglement, two landmark manifestations of quantum theory and both key enablers for quantum technologies.
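The abstract derives a fidelity-based geometric measure of coherence; as a simpler illustration of quantitative coherence measures in general, the l1-norm of coherence (a different, well-established monotone from the same resource theory: the sum of moduli of off-diagonal density-matrix elements in the reference basis) can be computed directly. The states below are standard textbook examples, not taken from the paper.

```python
def l1_coherence(rho):
    """l1-norm of coherence: sum of the moduli of the off-diagonal
    elements of the density matrix in the chosen reference basis.
    This is a simpler standard coherence monotone than the
    fidelity-based geometric measure discussed in the abstract."""
    dim = len(rho)
    return sum(abs(rho[i][j])
               for i in range(dim) for j in range(dim) if i != j)

# Maximally coherent qubit state |+><+| in the computational basis
plus = [[0.5, 0.5], [0.5, 0.5]]
# Incoherent (diagonal) state |0><0|
zero = [[1.0, 0.0], [0.0, 0.0]]
print(l1_coherence(plus))   # 1.0 for a maximally coherent qubit
```

Any incoherent (diagonal) state scores zero, and for a single qubit the measure is maximized by the equal-superposition states, mirroring the qualitative ordering of the monotones discussed in the abstract.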
Quantitative mapping of solute accumulation in a soil-root system by magnetic resonance imaging
NASA Astrophysics Data System (ADS)
Haber-Pohlmeier, S.; Vanderborght, J.; Pohlmeier, A.
2017-08-01
Differential uptake of water and solutes by plant roots generates heterogeneous concentration distributions in soils. Noninvasive observations of root system architecture and concentration patterns therefore provide information about root water and solute uptake. We present the application of magnetic resonance imaging (MRI) to image and monitor root architecture and the distribution of a tracer, GdDTPA2- (gadolinium diethylenetriaminepentaacetate), noninvasively during an infiltration experiment in a soil column planted with white lupin. We show that inversion recovery preparation within the MRI sequence can quantitatively map concentrations of a tracer in a complex root-soil system. Instead of a simple T1 weighting, the procedure is extended by a wide range of inversion times to precisely map T1 and subsequently to cover a much broader concentration range of the solute. The derived concentration patterns were consistent with mass balances and showed that the GdDTPA2- tracer represents a solute that is excluded by roots. Monitoring and imaging the accumulation of the tracer in the root zone therefore offers the potential to determine where and by which roots water is taken up.
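The multi-inversion-time T1 mapping described above can be sketched as a two-step calculation: fit T1 to the inversion-recovery signal model S(TI) = S0(1 − 2·exp(−TI/T1)), then convert T1 to concentration via the relaxation-rate enhancement 1/T1 = 1/T1₀ + r1·C. The grid-search fit, signal values, and relaxivity numbers below are hypothetical illustrations, not the authors' processing pipeline.

```python
import math

def fit_t1(tis, signals, s0):
    """Fit T1 to signed inversion-recovery data
    S(TI) = S0 * (1 - 2 * exp(-TI / T1)) by a simple grid search
    (a minimal stand-in for a proper nonlinear least-squares fit)."""
    best_t1, best_err = None, float("inf")
    t1 = 0.05
    while t1 <= 5.0:
        err = sum((s - s0 * (1 - 2 * math.exp(-ti / t1))) ** 2
                  for ti, s in zip(tis, signals))
        if err < best_err:
            best_t1, best_err = t1, err
        t1 += 0.001
    return best_t1

def concentration_from_t1(t1, t1_0, r1):
    """Tracer concentration from relaxation-rate enhancement:
    1/T1 = 1/T1_0 + r1 * C  =>  C = (1/T1 - 1/T1_0) / r1."""
    return (1.0 / t1 - 1.0 / t1_0) / r1

# Synthetic noiseless data: true T1 = 0.8 s, S0 = 100, TIs in seconds
tis = [0.05, 0.1, 0.2, 0.4, 0.8, 1.6, 3.2]
signals = [100 * (1 - 2 * math.exp(-ti / 0.8)) for ti in tis]
t1 = fit_t1(tis, signals, 100)
```

Mapping T1 voxel-by-voxel in this way, rather than relying on a single T1-weighted image, is what lets the method cover the broad concentration range mentioned in the abstract.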
Geoscientific process monitoring with positron emission tomography (GeoPET)
NASA Astrophysics Data System (ADS)
Kulenkampff, Johannes; Gründig, Marion; Zakhnini, Abdelhamid; Lippmann-Pipke, Johanna
2016-08-01
Transport processes in geomaterials can be observed with input-output experiments, which yield no direct information on the impact of heterogeneities, or they can be assessed by model simulations based on structural imaging using µ-CT. Positron emission tomography (PET) provides an alternative experimental observation method which directly and quantitatively yields the spatio-temporal distribution of tracer concentration. Process observation with PET benefits from its extremely high sensitivity together with a resolution that is acceptable in relation to standard drill core sizes. We strongly recommend applying high-resolution PET scanners in order to achieve a resolution on the order of 1 mm. We discuss the particularities of PET applications in geoscientific experiments (GeoPET), which essentially are due to high material density. Although PET is rather insensitive to matrix effects, mass attenuation and Compton scattering have to be corrected thoroughly in order to derive quantitative values. Examples of process monitoring of advection and diffusion processes with GeoPET illustrate the procedure and the experimental conditions, as well as the benefits and limits of the method.
Quantifying heterogeneity in human tumours using MRI and PET.
Asselin, Marie-Claude; O'Connor, James P B; Boellaard, Ronald; Thacker, Neil A; Jackson, Alan
2012-03-01
Most tumours, even those of the same histological type and grade, demonstrate considerable biological heterogeneity. Variations in genomic subtype, growth factor expression and local microenvironmental factors can result in regional variations within individual tumours. For example, localised variations in tumour cell proliferation, cell death, metabolic activity and vascular structure will be accompanied by variations in oxygenation status, pH and drug delivery that may directly affect therapeutic response. Documenting and quantifying regional heterogeneity within the tumour requires histological or imaging techniques. There is increasing evidence that quantitative imaging biomarkers can be used in vivo to provide important, reproducible and repeatable estimates of tumoural heterogeneity. In this article we review the imaging methods available to provide appropriate biomarkers of tumour structure and function. We also discuss the significant technical issues involved in the quantitative estimation of heterogeneity and the range of descriptive metrics that can be derived. Finally, we have reviewed the existing clinical evidence that heterogeneity metrics provide additional useful information in drug discovery and development and in clinical practice. Copyright © 2012 Elsevier Ltd. All rights reserved.
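The descriptive metrics the review surveys are often first-order (histogram) statistics of the voxel values in a region of interest. Below is a sketch of that kind of calculation: variance, skewness, excess kurtosis, and the Shannon entropy of the binned intensity distribution. The binning scheme and the sample data are hypothetical, not from a specific method in the review.

```python
import math
import statistics

def heterogeneity_metrics(values, bins=8):
    """First-order (histogram) heterogeneity descriptors of a sample of
    voxel values: mean, variance, skewness, excess kurtosis, and the
    Shannon entropy (bits) of the binned intensity distribution."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    n = len(values)
    skew = sum(((v - mean) / sd) ** 3 for v in values) / n
    kurt = sum(((v - mean) / sd) ** 4 for v in values) / n - 3.0
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0   # guard against constant data
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    probs = [c / n for c in counts if c]
    entropy = -sum(p * math.log2(p) for p in probs)
    return {"mean": mean, "variance": sd ** 2, "skewness": skew,
            "kurtosis": kurt, "entropy": entropy}

# Hypothetical voxel intensities from a tumour ROI with two populations
metrics = heterogeneity_metrics([10.0] * 50 + [10.5] * 50)
```

Such metrics are simple to compute, but as the review stresses, their repeatability depends strongly on acquisition and binning choices, which is why standardized protocols matter.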
Yin, Hongyao; Feng, Yujun; Liu, Hanbin; Mu, Meng; Fei, Chenhong
2014-08-26
Owing to its wide availability, nontoxicity, and low cost, CO2 working as a trigger to reversibly switch material properties, including polarity, ionic strength, hydrophilicity, viscosity, surface charge, and degree of polymerization or cross-linking, has attracted increasing attention in recent years. However, a quantitative correlation between the basicity of these materials and their CO2 switchability has been less documented, though it is of great importance for fabricating switchable systems. In this work, the "switch-on" and "switch-off" abilities of melamine and its amino-substituted derivatives upon introducing and removing CO2 are studied, and their quantitative relationship with basicity is established, so that the performance of other organobases can be quantitatively predicted. These findings are beneficial for forecasting the CO2 stimuli-responsive behavior of other organobases and for the design of CO2-switchable materials.
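The basicity-switchability link can be illustrated with the Henderson-Hasselbalch relation: at the pH reached under CO2 (roughly 3.9 for CO2-saturated water at 1 atm), the protonated fraction of a base depends only on its pKaH. The pKaH values below are hypothetical, and this single-equilibrium sketch ignores the carbonate speciation that a full treatment would include.

```python
def protonated_fraction(pkah, ph):
    """Henderson-Hasselbalch estimate of the fraction of a base that
    is protonated at a given pH: f = 1 / (1 + 10**(pH - pKaH)).
    Under CO2 (carbonated water, pH about 3.9 at 1 atm), a base with
    higher pKaH switches 'on' more completely."""
    return 1.0 / (1.0 + 10.0 ** (ph - pkah))

CO2_PH = 3.9                              # approximate, 1 atm CO2
weak = protonated_fraction(5.0, CO2_PH)   # hypothetical weaker base
strong = protonated_fraction(9.0, CO2_PH) # hypothetical stronger base
```

The same relation run in reverse (pH rising back toward neutral as CO2 is sparged out) gives the "switch-off" branch, which is why basicity correlates with both directions of the switching described above.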
Knold, Lone; Reitov, Marianne; Mortensen, Anna Birthe; Hansen-Møller, Jens
2002-01-01
A rapid and quantitative method for the extraction, derivatization, and liquid chromatography with fluorescence detection of ivermectin (IVM) and doramectin (DOM) residues in porcine liver was developed and validated. IVM and DOM were extracted from the liver samples with acetonitrile, the supernatant was evaporated to dryness at 37 degrees C under nitrogen, and the residue was reconstituted in 1-methylimidazole solution. After 2 min at room temperature, IVM and DOM were converted to a fluorescent derivative and then separated on a Hypersil ODS column. The derivatives of IVM and DOM were detected and quantitated with high specificity by fluorescence (excitation: 365 nm, emission: 475 nm). Abamectin was used as an internal standard. The mean extraction efficiencies from fortified samples (15 ng/g) were 75% for IVM and 70% for DOM. The limit of detection was 0.8 ng/g for both IVM and DOM.
Tiyip, Tashpolat; Ding, Jianli; Zhang, Dong; Liu, Wei; Wang, Fei; Tashpolat, Nigara
2017-01-01
Effective pretreatment of spectral reflectance is vital to model accuracy in soil parameter estimation. However, the classic integer derivative has some disadvantages, including spectral information loss and the introduction of high-frequency noise. In this paper, the fractional order derivative algorithm was applied to the pretreatment, and partial least squares regression (PLSR) was used to assess the clay content of desert soils. Overall, 103 soil samples were collected from the Ebinur Lake basin in the Xinjiang Uighur Autonomous Region of China, and used as data sets for calibration and validation. Following laboratory measurements of spectral reflectance and clay content, the raw spectral reflectance and absorbance data were treated using fractional derivative orders from 0.0 to 2.0 (order interval: 0.2). The ratio of performance to deviation (RPD), determination coefficients of calibration (Rc2), root mean square errors of calibration (RMSEC), determination coefficients of prediction (Rp2), and root mean square errors of prediction (RMSEP) were applied to assess the performance of the prediction models. The results showed that models built on the fractional derivative orders performed better than those using the classic integer derivative. Comparison of the predictive effects of 22 models for estimating clay content, calibrated by PLSR, showed that those models based on the fractional derivative 1.8 order of spectral reflectance (Rc2 = 0.907, RMSEC = 0.425%, Rp2 = 0.916, RMSEP = 0.364%, and RPD = 2.484 ≥ 2.000) and absorbance (Rc2 = 0.888, RMSEC = 0.446%, Rp2 = 0.918, RMSEP = 0.383% and RPD = 2.511 ≥ 2.000) were most effective. Furthermore, they performed well in quantitative estimations of the clay content of soils in the study area. PMID:28934274
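A common discrete form of the fractional derivative used in spectral pretreatment is the Grünwald-Letnikov scheme, whose weights are generated by a simple recursion. The sketch below is one standard formulation under that assumption (the paper does not specify its exact discretization), and the signal is a toy sequence rather than a soil spectrum.

```python
def fractional_derivative(y, alpha, h=1.0):
    """Grünwald-Letnikov fractional derivative of order alpha for a
    uniformly sampled signal with spacing h. For alpha = 1 this
    reduces to a backward first difference; for alpha = 0 it is the
    identity, matching the 0.0-2.0 order range in the abstract."""
    n = len(y)
    # Recursive GL binomial weights: w0 = 1, wk = w(k-1) * (k-1-alpha) / k
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - alpha) / k)
    # Convolve each point with the weights over all earlier samples
    return [sum(w[k] * y[i - k] for k in range(i + 1)) / h ** alpha
            for i in range(n)]

y = [1.0, 2.0, 4.0, 7.0]
d1 = fractional_derivative(y, 1.0)   # backward first differences
d0 = fractional_derivative(y, 0.0)   # identity
dfrac = fractional_derivative(y, 1.8)
```

Intermediate orders blend the smoothing of low orders with the baseline removal of high orders, which is the behavior exploited when scanning orders in 0.2 steps.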
Are Neurodynamic Organizations A Fundamental Property of Teamwork?
Stevens, Ronald H.; Galloway, Trysha L.
2017-01-01
When performing a task it is important for teams to optimize their strategies and actions to maximize value and avoid the cost of surprise. The decisions teams make sometimes have unintended consequences and they must then reorganize their thinking, roles and/or configuration into corrective structures more appropriate for the situation. In this study we ask: What are the neurodynamic properties of these reorganizations and how do they relate to the moment-by-moment and longer-term performance outcomes of teams? We describe an information-organization approach for detecting and quantitating the fluctuating neurodynamic organizations in teams. Neurodynamic organization is the propensity of team members to enter into prolonged (minutes) metastable neurodynamic relationships as they encounter and resolve disturbances to their normal rhythms. Team neurodynamic organizations were detected and modeled by transforming the physical units of each team member's EEG power levels into Shannon entropy-derived information units about the team's organization and synchronization. Entropy is a measure of the variability or uncertainty of information in a data stream. This physical unit to information unit transformation bridges micro level social coordination events with macro level expert observations of team behavior allowing multimodal comparisons across the neural, cognitive and behavioral time scales of teamwork. The measures included the entropy of each team member's data stream, the overall team entropy and the mutual information between dyad pairs of the team. Mutual information can be thought of as periods related to team member synchrony.
Comparisons between individual entropy and mutual information levels for the dyad combinations of three-person teams provided quantitative estimates of the proportion of a person's neurodynamic organizations that represented periods of synchrony with other team members, which in aggregate provided measures of the overall degree of neurodynamic interactions of the team. We propose that increased neurodynamic organization occurs when a team's operating rhythm can no longer support the complexity of the task and the team needs to expend energy to re-organize into structures that better minimize the “surprise” in the environment. Consistent with this hypothesis, the frequency and magnitude of neurodynamic organizations were less in experienced military and healthcare teams than they were in more junior teams. Similar dynamical properties of neurodynamic organization were observed in models of the EEG data streams of military, healthcare and high school science teams suggesting that neurodynamic organization may be a common property of teamwork. The innovation of this study is the potential it raises for developing globally applicable quantitative models of team dynamics that will allow comparisons to be made across teams, tasks and training protocols. PMID:28512438
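The core quantities above, Shannon entropy of each member's symbolized data stream and the mutual information between dyad pairs, can be computed directly from discretized sequences via I(X;Y) = H(X) + H(Y) − H(X,Y). The symbolized streams below are hypothetical stand-ins for binned EEG power levels.

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy in bits of a discretized data stream."""
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y): the information shared between
    two discretized streams -- the dyad-synchrony quantity above."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Hypothetical symbolized EEG-power streams for three team members
a = [0, 1, 0, 1, 0, 1, 0, 1]
b = [0, 1, 0, 1, 0, 1, 0, 1]   # perfectly synchronized with a
c = [0, 0, 1, 1, 0, 0, 1, 1]   # statistically independent of a
print(mutual_information(a, b))   # 1.0 bit: identical streams
```

Identical streams share all of their one bit of entropy, while the independent pair shares none, which is the contrast the dyad-level analysis exploits when aggregating synchrony across a team.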
76 FR 12960 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... population to which generalizations will be made, the sampling frame, the sample design (including... for submission for other generic mechanisms that are designed to yield quantitative results. The... generic clearance for qualitative information will not be used for quantitative information collections...
Quantitative confirmation of diffusion-limited oxidation theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillen, K.T.; Clough, R.L.
1990-01-01
Diffusion-limited (heterogeneous) oxidation effects are often important for studies of polymer degradation. Such effects are common in polymers subjected to ionizing radiation at relatively high dose rate. To better understand the underlying oxidation processes and to aid in the planning of accelerated aging studies, it would be desirable to be able to monitor and quantitatively understand these effects. In this paper, we briefly review a theoretical diffusion approach which derives model profiles for oxygen-surrounded sheets of material by combining oxygen permeation rates with kinetically based oxygen consumption expressions. The theory leads to a simple governing expression involving the oxygen consumption and permeation rates together with two model parameters α and β. To test the theory, gamma-initiated oxidation of a sheet of commercially formulated EPDM rubber was performed under conditions which led to diffusion-limited oxidation. Profile shapes from the theoretical treatments are shown to accurately fit experimentally derived oxidation profiles. In addition, direct measurements on the same EPDM material of the oxygen consumption and permeation rates, together with values of α and β derived from the fitting procedure, allow us to quantitatively confirm for the first time the governing theoretical relationship. 17 refs., 3 figs.
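The qualitative shape of such diffusion-limited profiles can be reproduced with a simplified model: steady-state diffusion with first-order consumption, D·C'' = k·C, with the sheet surfaces held at the ambient oxygen concentration. This is a stand-in with simpler kinetics than the two-parameter (α, β) expression in the paper; the finite-difference solver and all numbers below are illustrative.

```python
import math

def oxygen_profile(thickness, diffusivity, rate, nodes=101, sweeps=5000):
    """Steady-state oxygen concentration across a sheet with diffusion
    and first-order consumption, D C'' = k C, with C = 1 held at both
    surfaces. Solved by Jacobi iteration on a uniform grid; units are
    arbitrary. A simplified stand-in for the paper's kinetic model."""
    h = thickness / (nodes - 1)
    lam = rate * h * h / diffusivity
    c = [1.0] * nodes
    for _ in range(sweeps):
        new = c[:]
        for i in range(1, nodes - 1):
            # Discrete balance: c[i-1] + c[i+1] = (2 + k h^2 / D) c[i]
            new[i] = (c[i - 1] + c[i + 1]) / (2.0 + lam)
        c = new
    return c

# Strong consumption relative to diffusion -> oxygen-starved interior
profile = oxygen_profile(thickness=1.0, diffusivity=1.0, rate=25.0)
```

For this linear case the exact solution is C(x) = cosh(m(x − L/2)) / cosh(mL/2) with m = √(k/D), so the numerical center value can be checked against 1/cosh(2.5) ≈ 0.163: the interior is oxygen-starved, the heterogeneous-oxidation signature the paper quantifies.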
Derivative spectrophotometric analysis of benzophenone (as an impurity) in phenytoin
2011-01-01
Three simple and rapid spectrophotometric methods were developed for detection and trace determination of benzophenone (the main impurity) in phenytoin bulk powder and pharmaceutical formulations. The first method, zero-crossing first derivative spectrophotometry, depends on measuring the first derivative trough values at 257.6 nm for benzophenone. The second method, zero-crossing third derivative spectrophotometry, depends on measuring the third derivative peak values at 263.2 nm. The third method, ratio first derivative spectrophotometry, depends on measuring the peak amplitudes of the first derivative of the ratio spectra (the spectra of benzophenone divided by the spectrum of 5.0 μg/mL phenytoin solution) at 272 nm. The calibration graphs were linear over the range of 1-10 μg/mL. The detection limits of the first and the third derivative methods were found to be 0.04 μg/mL and 0.11 μg/mL and the quantitation limits were 0.13 μg/mL and 0.34 μg/mL, respectively, while for the ratio derivative method, the detection limit was 0.06 μg/mL and the quantitation limit was 0.18 μg/mL. The proposed methods were applied successfully to the assay of the studied drug in phenytoin bulk powder and certain pharmaceutical preparations. The results were statistically compared to those obtained using a polarographic method and were found to be in good agreement. PMID:22152156
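The ratio first-derivative method above rests on a simple cancellation: dividing the mixture spectrum by the interferent's spectrum turns the interferent's contribution into a constant, which a derivative then removes. The sketch below demonstrates this with central differences on hypothetical spectra; it is a generic illustration, not the validated procedure.

```python
import math

def ratio_first_derivative(mixture, divisor, wavelengths):
    """Ratio-spectra first derivative: divide the mixture spectrum by
    the divisor (interferent) spectrum, then differentiate numerically
    with central differences. The interferent contributes a constant
    to the ratio, so it vanishes from the derivative."""
    ratio = [m / d for m, d in zip(mixture, divisor)]
    return [(ratio[i + 1] - ratio[i - 1])
            / (wavelengths[i + 1] - wavelengths[i - 1])
            for i in range(1, len(ratio) - 1)]

# Hypothetical spectra: a Gaussian analyte band over a sloping
# interferent, mixed at two different interferent concentrations
wavelengths = list(range(10))
divisor = [2.0 + 0.1 * w for w in wavelengths]
analyte = [math.exp(-((w - 5) ** 2) / 4.0) for w in wavelengths]
mix1 = [a + 1.0 * d for a, d in zip(analyte, divisor)]
mix2 = [a + 3.0 * d for a, d in zip(analyte, divisor)]
deriv1 = ratio_first_derivative(mix1, divisor, wavelengths)
deriv2 = ratio_first_derivative(mix2, divisor, wavelengths)
```

Because the two mixtures differ only in interferent concentration, their ratio first derivatives coincide, so the peak amplitude read off the derivative tracks the analyte alone, which is what makes the calibration graphs in the abstract linear.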
Zhang, Jun; Hao, Qing-Qing; Liu, Xin; Jing, Zhi; Jia, Wen-Qing; Wang, Shu-Qing; Xu, Wei-Ren; Cheng, Xian-Chao; Wang, Run-Ling
2017-01-01
Telmisartan, a bifunctional agent of blood pressure lowering and glycemia reduction, was previously reported to antagonize the angiotensin II type 1 (AT1) receptor and partially activate peroxisome proliferator-activated receptor γ (PPARγ) simultaneously. Through modification of telmisartan, researchers designed and obtained imidazo-pyridine derivatives with IC50s of 0.49∼94.1 nM against AT1 and EC50s of 20∼3640 nM towards PPARγ partial activation. To examine the interaction modes with the relevant receptors in detail and analyze the structure-activity relationships, molecular docking and 3D-QSAR (quantitative structure-activity relationship) analysis of these imidazo-pyridines on the dual targets were conducted in this work. Docking of these derivatives with both receptors provided explicit interaction behaviors and an excellent degree of matching with the binding pockets. The best CoMFA (Comparative Molecular Field Analysis) models exhibited predictive results of q2=0.553, r2=0.954, SEE=0.127, r2pred=0.779 for AT1 and q2=0.503, r2=1.00, SEE=0.019, r2pred=0.604 for PPARγ, respectively. The contour maps from the optimal model showed detailed information on structural features (steric and electrostatic fields) related to biological activity. Combining bioisosterism with the valuable information from the above studies, we designed six molecules with better predicted activities towards AT1 and PPARγ partial activation. Overall, these results could be useful for designing potential dual AT1 antagonists and partial PPARγ agonists. PMID:28445965
Natsch, Andreas; Emter, Roger; Haupt, Tina; Ellis, Graham
2018-06-01
Cosmetic regulations prohibit animal testing for the purpose of safety assessment and recent REACH guidance states that the local lymph node assay (LLNA) in mice shall only be conducted if in vitro data cannot give sufficient information for classification and labelling. However, Quantitative Risk Assessment (QRA) for fragrance ingredients requires a NESIL, a dose not expected to cause induction of skin sensitization in humans. In absence of human data, this is derived from the LLNA and it remains a key challenge for risk assessors to derive this value from non-animal data. Here we present a workflow using structural information, reactivity data and KeratinoSens results to predict a LLNA result as a point of departure. Specific additional tests (metabolic activation, complementary reactivity tests) are applied in selected cases depending on the chemical domain of a molecule. Finally, in vitro and in vivo data on close analogues are used to estimate uncertainty of the prediction in the specific chemical domain. This approach was applied to three molecules which were subsequently tested in the LLNA and 22 molecules with available and sometimes discordant human and LLNA data. Four additional case studies illustrate how this approach is being applied to recently developed molecules in the absence of animal data. Estimation of uncertainty and how this can be applied to determine a final NESIL for risk assessment is discussed. We conclude that, in the data-rich domain of fragrance ingredients, sensitization risk assessment without animal testing is possible in most cases by this integrated approach.
Development of absorbing aerosol index simulator based on TM5-M7
NASA Astrophysics Data System (ADS)
Sun, Jiyunting; van Velthoven, Peter; Veefkind, Pepijn
2017-04-01
Aerosols alter the Earth's radiation budget directly by scattering and absorbing solar and thermal radiation, or indirectly by perturbing cloud formation and lifetime. These mechanisms offset the positive radiative forcing ascribed to greenhouse gases. In particular, absorbing aerosols such as black carbon and dust strongly enhance global warming. Quantifying the impact of absorbing aerosol on global radiative forcing is challenging. In spite of their wide spatial and temporal coverage, space-borne instruments (we will use the Ozone Monitoring Instrument, OMI) are unable to derive complete information on aerosol distribution, composition, etc. The retrieval of aerosol optical properties also partly depends on additional information derived from other measurements or global atmospheric chemistry models. Common quantities of great interest representing the amount of absorbing aerosol are AAOD (absorbing aerosol optical depth), the extinction due to absorption by aerosols under cloud-free conditions, and AAI (absorbing aerosol index), a measure of aerosol absorption more directly derivable from UV-band observations than AAOD. When model simulations and satellite observations are compared, agreement is good in terms of the spatial distribution of both parameters. However, the quantitative discrepancy is considerable, indicating possible underestimation of simulated AAI by a factor of 2 to 3. Our research has therefore started by evaluating to what extent aerosol models, such as our TM5-M7 model, represent the satellite measurements and by identifying the reasons for discrepancies. As a next step, a transparent methodology for the comparison between model simulations and satellite observations is under development in the form of an AAI simulator based on TM5-M7.
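The AAI referred to above is conventionally computed as a UV spectral residue between measured and Rayleigh-only radiances. A hedged sketch assuming the OMI 354/388 nm wavelength pair and the TOMS-style sign convention (the radiance values are invented, not real retrievals):

```python
import numpy as np

def absorbing_aerosol_index(i354_meas, i388_meas,
                            i354_rayleigh, i388_rayleigh):
    """UV residue: AAI = -100 * [log10(I354/I388)_measured
                                 - log10(I354/I388)_Rayleigh].
    Positive values indicate absorbing aerosol (smoke, dust)."""
    return -100.0 * (np.log10(i354_meas / i388_meas)
                     - np.log10(i354_rayleigh / i388_rayleigh))

# Illustrative radiances: absorption darkens the shorter UV wavelength
# more strongly than Rayleigh scattering alone would.
aai = absorbing_aerosol_index(0.90, 1.20, 1.00, 1.25)
print(f"AAI = {aai:.2f}")
```

An AAI simulator like the one described would generate the "Rayleigh" pair from the model atmosphere at the satellite viewing geometry, so that the simulated and observed residues are directly comparable.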
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-01
.... Currie, Program Analyst, Office of policy for Extramural Research Administration, 6705 Rockledge Drive... perceptions and opinions, but are not statistical surveys that yield quantitative results that can be... generic clearance for qualitative information will not be used for quantitative information collections...
77 FR 72831 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... consumers to make better informed financial decisions. Together with the CFPB's Office of Research, OFE is also responsible for conducting ``research related to consumer financial education and counseling... quantitative data through in-person and telephone surveys. The information collected through quantitative...
Atherton, K; Young, B; Salmon, P
2017-11-01
Clinical practice in haematological oncology often involves difficult diagnostic and treatment decisions. In this context, understanding patients' information needs and the functions that information serves for them is particularly important. We systematically reviewed qualitative and quantitative evidence on haematological oncology patients' information needs to inform how these needs can best be addressed in clinical practice. PsycINFO, Medline and CINAHL Plus electronic databases were searched for relevant empirical papers published from January 2003 to July 2016. Synthesis of the findings drew on meta-ethnography and meta-study. Most quantitative studies used a survey design and indicated that patients are largely content with the information they receive from physicians, regardless of how much or how little they actually receive, although a minority of patients are not content with the information. Qualitative studies suggest that a sense of being in a caring relationship with a physician allows patients to feel content with the information they have been given, whereas patients who lack such a relationship want more information. The qualitative evidence can help explain the lack of association between the amount of information received and contentment with it in the quantitative research. Trusting relationships are integral to helping patients feel that their information needs have been met. © 2017 John Wiley & Sons Ltd.
Quantitative computed tomography and aerosol morphometry in COPD and alpha1-antitrypsin deficiency.
Shaker, S B; Maltbaek, N; Brand, P; Haeussermann, S; Dirksen, A
2005-01-01
Relative area of emphysema below -910 Hounsfield units (RA-910) and 15th percentile density (PD15) are quantitative computed tomography (CT) parameters used in the diagnosis of emphysema. New concepts for noninvasive diagnosis of emphysema are aerosol-derived airway morphometry, which measures effective airspace dimensions (EAD) and aerosol bolus dispersion (ABD). Quantitative CT, ABD and EAD were compared in 20 smokers with chronic obstructive pulmonary disease (COPD) and 22 patients with alpha1-antitrypsin deficiency (AAD) with a similar degree of airway obstruction and reduced diffusion capacity. In both groups, there was a significant correlation between RA-910 and PD15 and pulmonary function tests (PFTs). A significant correlation was also found between EAD, RA-910 and PD15 in the study population as a whole. Upon separation into two groups, the significance disappeared for the smokers with COPD and strengthened for those with AAD, where EAD correlated significantly with RA-910 and PD15. ABD was similar in the two groups and did not correlate with PFT and quantitative CT in either group. In conclusion, based on quantitative computed tomography and aerosol-derived airway morphometry, emphysema was significantly more severe in patients with alpha1-antitrypsin deficiency compared with patients with usual emphysema, despite similar measures of pulmonary function tests.
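RA-910 and PD15 are both simple functions of the lung CT voxel histogram. A minimal sketch on a synthetic Gaussian histogram (the -910 HU threshold and 15th percentile follow the abstract; the voxel values are invented):

```python
import numpy as np

def emphysema_indices(hu_values, threshold=-910.0, percentile=15):
    """Density-mask indices from lung CT voxel values (Hounsfield units).
    RA: relative area, i.e. the fraction of voxels below the threshold.
    PD15: the 15th percentile of the HU histogram."""
    hu = np.asarray(hu_values, dtype=float)
    ra = float((hu < threshold).mean())
    pd15 = float(np.percentile(hu, percentile))
    return ra, pd15

# Synthetic lung histogram, illustrative only: mean -860 HU, sigma 40 HU.
rng = np.random.default_rng(1)
hu = rng.normal(loc=-860.0, scale=40.0, size=100_000)
ra910, pd15 = emphysema_indices(hu)
print(f"RA-910 = {100 * ra910:.1f}%, PD15 = {pd15:.0f} HU")
```

More emphysema shifts the histogram toward lower densities, so RA-910 rises while PD15 falls; the two indices track the same shift from opposite ends of the distribution.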
Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina
2018-01-01
The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
Theoretical foundations for a quantitative approach to paleogenetics. I, II.
NASA Technical Reports Server (NTRS)
Holmquist, R.
1972-01-01
It is shown that by neglecting the phenomena of multiple hits, back mutation, and chance coincidence, errors larger than 100% can be introduced in the calculated value of the average number of nucleotide base differences to be expected between two homologous polynucleotides. Mathematical formulas are derived to correct quantitatively for these effects. It is pointed out that the effects materially change the quantitative aspects of phylogenies, such as the lengths of the branches of the trees. A number of problems are solved without approximation.
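Holmquist's own formulas are not reproduced here, but a standard one-parameter correction in the same spirit (the Jukes-Cantor form) illustrates how neglecting multiple hits and back mutation understates the true divergence between two sequences:

```python
import numpy as np

def corrected_distance(p):
    """Jukes-Cantor-style correction of the observed fraction p of
    differing nucleotide sites for multiple hits and back mutation:
    d = -(3/4) * ln(1 - 4p/3), the expected substitutions per site.
    Valid for p < 0.75 (the saturation limit under this model)."""
    p = np.asarray(p, dtype=float)
    return -0.75 * np.log(1.0 - 4.0 * p / 3.0)

# For small p the correction is negligible; near saturation it is large,
# which is the source of the >100% errors the abstract describes.
for p in (0.05, 0.30, 0.60):
    print(f"observed {p:.2f} -> corrected {float(corrected_distance(p)):.3f}")
```

At 5% observed differences the corrected distance barely moves, but at 60% the true expected number of substitutions is roughly double the observed count, so uncorrected branch lengths would be badly compressed.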
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-18
... Genevieve deAlmeida-Morris, Health Research Evaluator, Office of Science Policy and Communications, National... opinions, but are not statistical surveys that yield quantitative results that can be generalized to the... generic clearance for qualitative information will not be used for quantitative information collections...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-10
... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...), Office of Research and Development, U.S. Environmental Protection Agency, Washington, DC 20460; telephone... human health assessment program that evaluates quantitative and qualitative risk information on effects...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-06
... quantitative and qualitative risk information on effects that may result from exposure to specific chemical... Deputy Director, National Center for Environmental Assessment, (mail code: 8601D), Office of Research and... program that evaluates quantitative and qualitative risk information on effects that may result from...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-16
... health assessment program that evaluates quantitative and qualitative risk information on effects that..., National Center for Environmental Assessment, (mail code: 8601P), Office of Research and Development, U.S... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...
Sullards, M. Cameron; Liu, Ying; Chen, Yanfeng; Merrill, Alfred H.
2011-01-01
Sphingolipids are a highly diverse category of molecules that serve not only as components of biological structures but also as regulators of numerous cell functions. Because so many of the structural features of sphingolipids give rise to their biological activity, there is a need for comprehensive or “sphingolipidomic” methods for identification and quantitation of as many individual subspecies as possible. This review defines sphingolipids as a class, briefly discusses classical methods for their analysis, and focuses primarily on liquid chromatography tandem mass spectrometry (LC-MS/MS) and tissue imaging mass spectrometry (TIMS). Recently, a set of evolving and expanding methods have been developed and rigorously validated for the extraction, identification, separation, and quantitation of sphingolipids by LC-MS/MS. Quantitation of these biomolecules is made possible via the use of an internal standard cocktail. The compounds that can be readily analyzed are free long-chain (sphingoid) bases, sphingoid base 1-phosphates, and more complex species such as ceramides, ceramide 1-phosphates, sphingomyelins, mono- and di-hexosylceramides, sulfatides, and novel compounds such as the 1-deoxy- and 1-(deoxymethyl)-sphingoid bases and their N-acyl-derivatives. These methods can be altered slightly to separate and quantitate isomeric species such as glucosyl/galactosylceramide. Because these techniques require the extraction of sphingolipids from their native environment, any information regarding their localization in histological slices is lost. Therefore, this review also describes methods for TIMS. This technique has been shown to be a powerful tool to determine the localization of individual molecular species of sphingolipids directly from tissue slices. PMID:21749933
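Quantitation against a spiked internal-standard cocktail reduces, per analyte, to a peak-area ratio scaled by the spiked amount. A minimal sketch with invented numbers (the response factor and the C17-ceramide standard are illustrative assumptions, not values from the review):

```python
def quantify_by_internal_standard(analyte_area, istd_area,
                                  istd_amount, response_factor=1.0):
    """Single-point internal-standard quantitation as used in
    sphingolipidomic LC-MS/MS: scale the analyte/IS peak-area ratio
    by the spiked internal-standard amount. response_factor corrects
    for unequal ionization efficiency between analyte and standard."""
    return (analyte_area / istd_area) * istd_amount / response_factor

# Illustrative numbers: a ceramide peak against a (hypothetical)
# C17-ceramide internal standard spiked at 50 pmol per sample.
amount = quantify_by_internal_standard(analyte_area=4.2e5,
                                       istd_area=2.1e5,
                                       istd_amount=50.0)
print(f"estimated analyte amount: {amount:.1f} pmol")
```

Because the standard is carried through extraction and ionization alongside the analyte, losses and matrix suppression largely cancel in the ratio, which is what makes the cocktail approach robust across the diverse subspecies listed above.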
ERIC Educational Resources Information Center
Larson-Hall, Jenifer; Plonsky, Luke
2015-01-01
This paper presents a set of guidelines for reporting on five types of quantitative data issues: (1) Descriptive statistics, (2) Effect sizes and confidence intervals, (3) Instrument reliability, (4) Visual displays of data, and (5) Raw data. Our recommendations are derived mainly from various professional sources related to L2 research but…
USDA-ARS?s Scientific Manuscript database
Fruit quality traits and dayneutrality are two major foci of several strawberry breeding programs. The identification of quantitative trait loci (QTL) and molecular markers linked to these traits could improve breeding efficiency. In this work, an F1 population derived from the cross ‘Delmarvel’ × ...
Characterization of Amniotic Stem Cells
Koike, Chika; Zhou, Kaixuan; Takeda, Yuji; Fathy, Moustafa; Okabe, Motonori; Yoshida, Toshiko; Nakamura, Yukio; Kato, Yukio
2014-01-01
The amnion membrane is developed from embryo-derived cells, and amniotic cells have been shown to exhibit multidifferentiation potential. These cells represent a desirable source for stem cells for a variety of reasons. However, to date very few molecular analyses of amnion-derived cells have been reported, and efficient markers for isolating the stem cells remain unclear. This paper assesses the characterization of amnion-derived cells as stem cells by examining stemness marker expression in amnion-derived epithelial cells and mesenchymal cells by flow cytometry, immunocytochemistry, and quantitative PCR. Flow cytometry revealed that amnion epithelial cells expressed CD133, CD271, and TRA-1-60, whereas mesenchymal cells expressed CD44, CD73, CD90, and CD105. Immunohistochemistry showed that both cell types expressed the stemness markers Oct3/4, Sox2, Klf4, and SSEA4. Stemness gene expression in amnion epithelial cells, mesenchymal cells, fibroblasts, bone marrow–derived mesenchymal stem cells (MSCs), and induced pluripotent stem cells (iPSCs) was compared by quantitative reverse-transcription polymerase chain reaction (RT-PCR). Amnion-derived epithelial cells and mesenchymal cells expressed Oct3/4, Nanog, and Klf4 more strongly than bone marrow–derived MSCs. The sorted TRA-1-60–positive cells expressed Oct3/4, Nanog, and Klf4 more strongly than unsorted cells or TRA-1-60–negative cells. TRA-1-60 can therefore serve as a marker for isolating amnion epithelial stem cells. PMID:25068631
Pet Imaging Of The Chemistry Of The Brain
NASA Astrophysics Data System (ADS)
Wagner, Henry N., Jr.
1986-06-01
Advances in neurobiology today are as important as the advances in atomic physics at the turn of the century and molecular genetics in the 1950's. Positron-emission tomography is participating in these advances by making it possible for the first time to measure the chemistry of the living human brain in health and disease and to relate the changes at the molecular level to the functioning of the human mind. The amount of data generated requires modern data processing, display, and archiving capabilities. To achieve maximum benefit from the PET imaging and the derived quantitative measurements, the data must be combined with information, usually of a structural nature, from other imaging modalities, chiefly computed tomography and magnetic resonance imaging.
Unsteady aerodynamic modeling and active aeroelastic control
NASA Technical Reports Server (NTRS)
Edwards, J. W.
1977-01-01
Unsteady aerodynamic modeling techniques are developed and applied to the study of active control of elastic vehicles. The problem of active control of a supercritical flutter mode poses a definite design goal, stability, and is treated in detail. The transfer functions relating arbitrary airfoil motions to the airloads are derived from the Laplace transforms of the linearized airload expressions for incompressible two-dimensional flow. The transfer function relating the motions to the circulatory part of these loads is recognized as the Theodorsen function extended to complex values of reduced frequency, and is termed the generalized Theodorsen function. Inversion of the Laplace transforms yields exact transient airloads and airfoil motions. Exact root loci of aeroelastic modes are calculated, providing quantitative information regarding subcritical and supercritical flutter conditions.
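The classical Theodorsen function that this work generalizes can be evaluated directly from Hankel functions of the second kind; a sketch for real reduced frequency k (the extension to complex k described in the abstract is not attempted here):

```python
import numpy as np
from scipy.special import hankel2

def theodorsen(k):
    """Classical Theodorsen function for real reduced frequency k:
    C(k) = H1^(2)(k) / (H1^(2)(k) + i*H0^(2)(k)).
    It attenuates and phase-lags the circulatory airloads."""
    h0 = hankel2(0, k)
    h1 = hankel2(1, k)
    return h1 / (h1 + 1j * h0)

# Known limits: C(k) -> 1 as k -> 0 (quasi-steady) and
# C(k) -> 0.5 as k -> infinity.
for k in (0.01, 0.1, 1.0, 10.0):
    c = theodorsen(k)
    print(f"k = {k:5.2f}  C(k) = {c.real:.3f} {c.imag:+.3f}j")
```

The quasi-steady and high-frequency limits above are the standard sanity checks for any implementation; the generalized function reduces to this one on the imaginary axis of the Laplace variable.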
What is in a contour map? A region-based logical formalization of contour semantics
Usery, E. Lynn; Hahmann, Torsten
2015-01-01
This paper analyses and formalizes contour semantics in a first-order logic ontology that forms the basis for enabling computational common sense reasoning about contour information. The elicited contour semantics comprises four key concepts – contour regions, contour lines, contour values, and contour sets – and their subclasses and associated relations, which are grounded in an existing qualitative spatial ontology. All concepts and relations are illustrated and motivated by physical-geographic features identifiable on topographic contour maps. The encoding of the semantics of contour concepts in first-order logic and a derived conceptual model as basis for an OWL ontology lay the foundation for fully automated, semantically-aware qualitative and quantitative reasoning about contours.
NASA Technical Reports Server (NTRS)
Bizzell, R. M.; Feiveson, A. H.; Hall, F. G.; Bauer, M. E.; Davis, B. J.; Malila, W. A.; Rice, D. P.
1975-01-01
The CITARS was an experiment designed to quantitatively evaluate crop identification performance for corn and soybeans in various environments using a well-defined set of automatic data processing (ADP) techniques. Each technique was applied to data acquired to recognize and estimate proportions of corn and soybeans. The CITARS documentation summarizes, interprets, and discusses the crop identification performances obtained using (1) different ADP procedures; (2) a linear versus a quadratic classifier; (3) prior probability information derived from historic data; (4) local versus nonlocal recognition training statistics and the associated use of preprocessing; (5) multitemporal data; (6) classification bias and mixed pixels in proportion estimation; and (7) data with different site characteristics, including crop, soil, atmospheric effects, and stages of crop maturity.
NASA Technical Reports Server (NTRS)
Lang, H. R.; Conel, J. E.; Paylor, E. D.
1984-01-01
A LIDQA evaluation for geologic applications of a LANDSAT TM scene covering the Wind River/Bighorn Basin area, Wyoming, is examined. This involves a quantitative assessment of data quality including spatial and spectral characteristics. Analysis is concentrated on the 6 visible, near infrared, and short wavelength infrared bands. Preliminary analysis demonstrates that: (1) principal component images derived from the correlation matrix provide the most useful geologic information; and (2) to extract surface spectral reflectance, the TM radiance data must be calibrated. Scatterplots demonstrate that TM data can be calibrated and that sensor response is essentially linear. Low instrumental offset and gain settings result in spectral data that do not utilize the full dynamic range of the TM system.
Some strategies for quantitative scanning Auger electron microscopy
NASA Technical Reports Server (NTRS)
Browning, R.; Peacock, D. C.; Prutton, M.
1985-01-01
The general applicability of power law forms of the background in electron spectra is pointed out and exploited for background removal from under Auger peaks. This form of B(E) is found to be extremely sensitive to instrumental alignment and to fault-free construction - an observation which can be used to set up analyser configurations in an accurate way. Also, differences between N(E) and B(E) can be used to derive a spectrometer transmission function T(E). The questions of information density in an energy-analysing spatially-resolving instrument are addressed after reliable instrumental characterization has been established. Strategies involving ratio histograms, showing the population distribution of the ratio of a pair of Auger peak heights, composition scatter diagrams and windowed imaging are discussed and illustrated.
Quantitative optical diagnostics in pathology recognition and monitoring of tissue reaction to PDT
NASA Astrophysics Data System (ADS)
Kirillin, Mikhail; Shakhova, Maria; Meller, Alina; Sapunov, Dmitry; Agrba, Pavel; Khilov, Alexander; Pasukhin, Mikhail; Kondratieva, Olga; Chikalova, Ksenia; Motovilova, Tatiana; Sergeeva, Ekaterina; Turchin, Ilya; Shakhova, Natalia
2017-07-01
Optical coherence tomography (OCT) is currently being actively introduced into clinical practice. Besides diagnostics, it can be efficiently employed for treatment monitoring, allowing for timely correction of the treatment procedure. In monitoring of photodynamic therapy (PDT), the traditionally employed fluorescence imaging (FI) can benefit from complementary use of OCT. Additional diagnostic efficiency can be derived from numerical processing of optical diagnostics data, which provides more information than visual evaluation. In this paper we report on the application of OCT together with numerical processing for clinical diagnostics in gynecology and otolaryngology, on monitoring of PDT in otolaryngology, and on OCT and FI applications in clinical and aesthetic dermatology. Numerical image processing and quantification increase diagnostic accuracy. Keywords: optical coherence tomography, fluorescence imaging, photodynamic therapy
Quantitative Agent Based Model of User Behavior in an Internet Discussion Forum
Sobkowicz, Pawel
2013-01-01
The paper applies an agent-based simulation of opinion evolution, based on nonlinear emotion/information/opinion (E/I/O) individual dynamics, to an actual Internet discussion forum. The goal is to reproduce the results of two-year-long observations and analyses of user communication behavior and of the expressed opinions and emotions via simulations using an agent-based model. The model allowed various characteristics of the forum to be derived, including the distribution of user activity and popularity (outdegree and indegree), the distribution of the length of dialogs between participants, their political sympathies, and the emotional content and purpose of the comments. The parameters used in the model have intuitive meanings and can be translated into psychological observables. PMID:24324606
Diamantides, N D; Constantinou, S T
1989-07-01
"A model is presented of international migration that is based on the concept of a pool of potential emigrants at the origin created by push-pull forces and by the establishment of information feedback between origin and destination. The forces can be economic, political, or both, and are analytically expressed by the 'mediating factor'. The model is macrodynamic in nature and provides both for the main secular component of the migratory flow and for transient components caused by extraordinary events. The model is expressed in a Bernoulli-type differential equation through which quantitative weights can be derived for each of the operating causes. Out-migration from the Republic of Cyprus is used to test the tenets of the model." excerpt
NASA Astrophysics Data System (ADS)
Smith, P. J.; Popelier, P. L. A.
2004-02-01
The present day abundance of cheap computing power enables the use of quantum chemical ab initio data in Quantitative Structure-Activity Relationships (QSARs). Optimised bond lengths are a new such class of descriptors, which we have successfully used previously in representing electronic effects in medicinal and ecological QSARs (enzyme inhibitory activity, hydrolysis rate constants and pKas). Here we use AM1 and HF/3-21G* bond lengths in conjunction with Partial Least Squares (PLS) and a Genetic Algorithm (GA) to predict the Corticosteroid-Binding Globulin (CBG) binding activity of the classic steroid data set, and the antibacterial activity of nitrofuran derivatives. The current procedure, which does not require molecular alignment, produces good r2 and q2 values. Moreover, it highlights regions in the common steroid skeleton deemed relevant to the active regions of the steroids and nitrofuran derivatives.
NASA Technical Reports Server (NTRS)
Johnson, H.; Kenley, R. A.; Rynard, C.; Golub, M. A.
1985-01-01
Quantitative structure-activity relationships were derived for acetyl- and butyrylcholinesterase inhibition by various organophosphorus esters. Bimolecular inhibition rate constants correlate well with hydrophobic substituent constants, and with the presence or absence of cationic groups on the inhibitor, but not with steric substituent constants. CNDO/2 calculations were performed on a separate set of organophosphorus esters, RR-primeP(O)X, where R and R-prime are alkyl and/or alkoxy groups and X is fluorine, chlorine or a phenoxy group. For each subset with the same X, the CNDO-derived net atomic charge at the central phosphorus atom in the ester correlates well with the alkaline hydrolysis rate constant. For the whole set of esters with different X, two equations were derived that relate either charge and leaving group steric bulk, or orbital energy and bond order to the hydrolysis rate constant.
NASA Astrophysics Data System (ADS)
Cho, Sehyeon; Choi, Min Ji; Kim, Minju; Lee, Sunhoe; Lee, Jinsung; Lee, Seok Joon; Cho, Haelim; Lee, Kyung-Tae; Lee, Jae Yeol
2015-03-01
A series of 3,4-dihydroquinazoline derivatives with anti-cancer activities against human lung cancer A549 cells were subjected to three-dimensional quantitative structure-activity relationship (3D-QSAR) studies using the comparative molecular similarity indices analysis (CoMSIA) approaches. The most potent compound, 1 was used to align the molecules. As a result, the best prediction was obtained with CoMSIA combined the steric, electrostatic, hydrophobic, hydrogen bond donor, and hydrogen bond acceptor fields (q2 = 0.720, r2 = 0.897). This model was validated by an external test set of 6 compounds giving satisfactory predictive r2 value of 0.923 as well as the scrambling stability test. This model would guide the design of potent 3,4-dihydroquinazoline derivatives as anti-cancer agent for the treatment of human lung cancer.
NASA Astrophysics Data System (ADS)
Dinç, Erdal; Kanbur, Murat; Baleanu, Dumitru
2007-10-01
Comparative simultaneous determination of chlortetracycline and benzocaine in a commercial veterinary powder product was carried out by continuous wavelet transform (CWT) and classical derivative spectrophotometry. In this quantitative spectral analysis, the two proposed analytical methods require no chemical separation step. In the first step, several wavelet families were tested to find an optimal CWT for processing the overlapping signals of the analyzed compounds. We observed that the coiflet (COIF-CWT) method with dilation parameter a = 400 gives suitable results for this analytical application. For comparison, the classical derivative spectrophotometry (CDS) approach was also applied to the simultaneous quantitative resolution of the same analytical problem. Calibration functions were obtained by measuring the transform amplitudes corresponding to zero-crossing points for both the CWT and CDS methods. The utility of these two analytical approaches was verified by analyzing various synthetic mixtures of chlortetracycline and benzocaine, and they were applied to real samples of the veterinary powder formulation. The experimental results obtained from the COIF-CWT approach were statistically compared with those obtained by classical derivative spectrophotometry, and successful results were reported.
Yeast Metabolites of Glycated Amino Acids in Beer.
Hellwig, Michael; Beer, Falco; Witte, Sophia; Henle, Thomas
2018-06-01
Glycation reactions (Maillard reactions) during the malting and brewing processes are important for the development of the characteristic color and flavor of beer. Recently, free and protein-bound Maillard reaction products (MRPs) such as pyrraline, formyline, and maltosine were found in beer. Furthermore, these amino acid derivatives are metabolized by Saccharomyces cerevisiae via the Ehrlich pathway. In this study, a method was developed for quantitation of individual Ehrlich intermediates derived from pyrraline, formyline, and maltosine. Following synthesis of the corresponding reference material, the MRP-derived new Ehrlich alcohols pyrralinol (up to 207 μg/L), formylinol (up to 50 μg/L), and maltosinol (up to 6.9 μg/L) were quantitated for the first time in commercial beer samples by reverse phase high performance liquid chromatography tandem mass spectrometry in the multiple reaction monitoring mode. This is equivalent to ca. 20-40% of the concentrations of the parent glycated amino acids. The metabolites were almost absent from alcohol-free beers and malt-based beverages. Two previously unknown valine-derived pyrrole derivatives were characterized and qualitatively identified in beer. The metabolites investigated represent new process-induced alkaloids that may influence brewing yeast performance due to structural similarities to quorum sensing and metal-binding molecules.
Šoškić, Milan; Porobić, Ivana
2016-01-01
Retention factors for 31 indole derivatives, most of them with auxin activity, were determined by high-performance liquid chromatography, using bonded β-cyclodextrin as a stationary phase. A three-parameter QSPR (quantitative structure-property relationship) model, based on physico-chemical and structural descriptors, was derived, which accounted for about 98% of the variation in the retention factors. The model suggests that the indole nucleus occupies the relatively apolar cavity of β-cyclodextrin while the carboxyl group of the indole-3-carboxylic acids forms hydrogen bonds with the hydroxyl groups of β-cyclodextrin. The length and flexibility of the carboxyl-containing side chain strongly affect the binding of these compounds to β-cyclodextrin. Non-acidic derivatives, unlike the indole-3-carboxylic acids, are poorly retained on the column. A reasonably good correlation was found between the retention factors of the indole-3-acetic acids and their relative binding affinities for human serum albumin, a carrier protein in blood plasma. A less satisfactory correlation was obtained when the retention factors of the indole derivatives were compared with their affinities for auxin-binding protein 1, a plant auxin receptor. PMID:27124734
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-08
...] Guidance for Industry on Chemistry, Manufacturing, and Controls Information--Fermentation-Derived... (CMC) Information-- Fermentation-Derived Intermediates, Drug Substances, and Related Drug Products for... to submit to support the CMC information for fermentation-derived intermediates, drug substances, and...
Multiplexed MRM-based assays for the quantitation of proteins in mouse plasma and heart tissue.
Percy, Andrew J; Michaud, Sarah A; Jardim, Armando; Sinclair, Nicholas J; Zhang, Suping; Mohammed, Yassene; Palmer, Andrea L; Hardie, Darryl B; Yang, Juncong; LeBlanc, Andre M; Borchers, Christoph H
2017-04-01
The mouse is the most commonly used laboratory animal, with more than 14 million mice being used for research each year in North America alone. The number and diversity of mouse models is increasing rapidly through genetic engineering strategies, but detailed characterization of these models is still challenging because most phenotypic information is derived from time-consuming histological and biochemical analyses. To expand the biochemists' toolkit, we generated a set of targeted proteomic assays for mouse plasma and heart tissue, utilizing bottom-up LC/MRM-MS with isotope-labeled peptides as internal standards. Protein quantitation was performed using reverse standard curves, with LC-MS platform and curve performance evaluated by quality control standards. The assays comprising the final panel (101 peptides for 81 proteins in plasma; 227 peptides for 159 proteins in heart tissue) have been rigorously developed under a fit-for-purpose approach and utilize stable-isotope-labeled peptides for every analyte to provide high-quality, precise relative quantitation. In addition, the peptides have been tested and shown to be interference-free, and the assay is highly multiplexed, with reproducibly determined protein concentrations spanning >4 orders of magnitude. The developed assays have been used in a small pilot study to demonstrate their application to molecular phenotyping or biomarker discovery/verification studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
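The core arithmetic of quantitation against a stable-isotope-labeled (SIS) internal standard can be sketched as follows. The peak areas and standard-curve points below are invented for illustration, not values from the assay panel; the idea is that each unknown's light/heavy area ratio is interpolated on a curve built from standards measured at a fixed SIS spike.

```python
import numpy as np

# Standard curve: light/heavy area ratios measured for a dilution
# series of the light standard at a fixed heavy (SIS) spike level.
std_conc  = np.array([1.0, 5.0, 25.0, 125.0, 625.0])    # fmol/uL, illustrative
std_ratio = np.array([0.021, 0.098, 0.51, 2.48, 12.6])  # light/heavy areas

# Fit in log-log space, a common choice over a wide dynamic range
slope, intercept = np.polyfit(np.log10(std_conc), np.log10(std_ratio), 1)

def quantify(light_area, heavy_area):
    """Map an unknown's light/heavy area ratio back to concentration."""
    r = light_area / heavy_area
    return 10 ** ((np.log10(r) - intercept) / slope)

# Unknown sample: peak areas from the extracted ion chromatograms
conc = quantify(light_area=8.4e6, heavy_area=3.1e6)
```

Because light and heavy forms co-elute and ionize nearly identically, the ratio cancels most run-to-run variation, which is what makes the relative quantitation precise.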
Peptide-binding motifs of two common equine class I MHC molecules in Thoroughbred horses.
Bergmann, Tobias; Lindvall, Mikaela; Moore, Erin; Moore, Eugene; Sidney, John; Miller, Donald; Tallmadge, Rebecca L; Myers, Paisley T; Malaker, Stacy A; Shabanowitz, Jeffrey; Osterrieder, Nikolaus; Peters, Bjoern; Hunt, Donald F; Antczak, Douglas F; Sette, Alessandro
2017-05-01
Quantitative peptide-binding motifs of MHC class I alleles provide a valuable tool to efficiently identify putative T cell epitopes. Detailed information on equine MHC class I alleles is still very limited, and to date, only a single equine MHC class I allele, Eqca-1*00101 (ELA-A3 haplotype), has been characterized. The present study extends the number of characterized ELA class I specificities in two additional haplotypes found commonly in the Thoroughbred breed. Accordingly, we here report quantitative binding motifs for the ELA-A2 allele Eqca-16*00101 and the ELA-A9 allele Eqca-1*00201. Utilizing analyses of endogenously bound and eluted ligands and the screening of positional scanning combinatorial libraries, detailed and quantitative peptide-binding motifs were derived for both alleles. Eqca-16*00101 preferentially binds peptides with aliphatic/hydrophobic residues in position 2 and at the C-terminus, and Eqca-1*00201 has a preference for peptides with arginine in position 2 and hydrophobic/aliphatic residues at the C-terminus. Interestingly, the Eqca-16*00101 motif resembles that of the human HLA A02-supertype, while the Eqca-1*00201 motif resembles that of the HLA B27-supertype and two macaque class I alleles. It is expected that the identified motifs will facilitate the selection of candidate epitopes for the study of immune responses in horses.
Aronoff, Justin M; Yoon, Yang-soo; Soli, Sigfrid D
2010-06-01
Stratified sampling plans can increase the accuracy and facilitate the interpretation of a dataset characterizing a large population. However, such sampling plans have found minimal use in hearing aid (HA) research, in part because of a paucity of quantitative data on the characteristics of HA users. The goal of this study was to devise a quantitatively derived stratified sampling plan for HA research, so that such studies will be more representative and generalizable, and the results obtained using this method are more easily reinterpreted as the population changes. Pure-tone average (PTA) and age information were collected for 84,200 HAs acquired in 2006 and 2007. The distribution of PTA and age was quantified for each HA type and for a composite of all HA users. Based on their respective distributions, PTA and age were each divided into three groups, the combination of which defined the stratification plan. The most populous PTA and age group was also subdivided, allowing greater homogeneity within strata. Finally, the percentage of users in each stratum was calculated. This article provides a stratified sampling plan for HA research, based on a quantitative analysis of the distribution of PTA and age for HA users. Adopting such a sampling plan will make HA research results more representative and generalizable. In addition, data acquired using such plans can be reinterpreted as the HA population changes.
NASA Astrophysics Data System (ADS)
Fahey, R. T.; Tallant, J.; Gough, C. M.; Hardiman, B. S.; Atkins, J.; Scheuermann, C. M.
2016-12-01
Canopy structure can be an important driver of forest ecosystem functioning - affecting factors such as radiative transfer and light use efficiency, and consequently net primary production (NPP). Both above- (aerial) and below-canopy (terrestrial) remote sensing techniques are used to assess canopy structure and each has advantages and disadvantages. Aerial techniques can cover large geographical areas and provide detailed information on canopy surface and canopy height, but are generally unable to quantitatively assess interior canopy structure. Terrestrial methods provide high resolution information on interior canopy structure and can be cost-effectively repeated, but are limited to very small footprints. Although these methods are often utilized to derive similar metrics (e.g., rugosity, LAI) and to address equivalent ecological questions and relationships (e.g., link between LAI and productivity), rarely are inter-comparisons made between techniques. Our objective is to compare methods for deriving canopy structural complexity (CSC) metrics and to assess the capacity of commonly available aerial remote sensing products (and combinations) to match terrestrially-sensed data. We also assess the potential to combine CSC metrics with image-based analysis to predict plot-based NPP measurements in forests of different ages and different levels of complexity. We use combinations of data from drone-based imagery (RGB, NIR, Red Edge), aerial LiDAR (commonly available medium-density leaf-off), terrestrial scanning LiDAR, portable canopy LiDAR, and a permanent plot network - all collected at the University of Michigan Biological Station. Our results will highlight the potential for deriving functionally meaningful CSC metrics from aerial imagery, LiDAR, and combinations of data sources. 
We will also present results of modeling focused on predicting plot-level NPP from combinations of image-based vegetation indices (e.g., NDVI, EVI) with LiDAR- or image-derived metrics of CSC (e.g., rugosity, porosity), canopy density, (e.g., LAI), and forest structure (e.g., canopy height). This work builds toward future efforts that will use other data combinations, such as those available at NEON sites, and could be used to inform and test popular ecosystem models (e.g., ED2) incorporating structure.
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
Nazari, Fatemeh; Parham, Abbas; Maleki, Adham Fani
2015-01-01
Quantitative real-time reverse transcription PCR (qRT-PCR) is one of the most important techniques for gene-expression analysis in molecular studies. Selecting a proper internal control gene for normalizing data is a crucial step in gene expression analysis via this method. The expression levels of reference genes should remain constant among cells in different tissues. However, it seems that the location of cells in different tissues might influence their expression. The purpose of this study was to determine whether the source of mesenchymal stem cells (MSCs) has any effect on the expression level of three common reference genes (GAPDH, β-actin and β2-microglobulin) in equine marrow- and adipose-derived undifferentiated MSCs and, consequently, on their reliability for comparative qRT-PCR. Adipose tissue (AT) and bone marrow (BM) samples were harvested from 3 mares. MSCs were isolated and cultured until passage 3 (P3). Total RNA of P3 cells was extracted for cDNA synthesis. The generated cDNAs were analyzed by quantitative real-time PCR. The PCR reactions were ended with a melting curve analysis to verify the specificity of the amplicon. The expression levels of GAPDH were significantly different between AT- and BM-derived MSCs (p < 0.05). Differences in the expression levels of β-actin (p < 0.001) and B2M (p < 0.006) between MSCs derived from AT and BM were substantially larger than for GAPDH. In addition, the fold changes in expression levels of GAPDH, β-actin and B2M in AT-derived MSCs compared to BM-derived MSCs were 2.38, 6.76 and 7.76, respectively. This study demonstrated that GAPDH and especially β-actin and B2M are expressed at different levels in equine AT- and BM-derived MSCs. Thus, they cannot be considered reliable reference genes for comparative quantitative gene expression analysis in MSCs derived from equine bone marrow and adipose tissue.
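The fold-change comparisons of the kind reported here are conventionally computed with the 2^(-ΔΔCt) method. A minimal sketch, with Ct values that are invented for illustration rather than taken from the study, shows why an unstable reference gene is a problem:

```python
# Relative expression by the 2^(-ddCt) method.
# All Ct values below are illustrative, not the study's measurements.

def fold_change(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Fold change of a target gene in 'test' vs. 'control' cells,
    each normalized to a reference gene measured in the same cells."""
    d_ct_test = ct_target_test - ct_ref_test   # dCt in test cells
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl   # dCt in control cells
    return 2.0 ** -(d_ct_test - d_ct_ctrl)     # 2^(-ddCt)

# If the target is truly unchanged (Ct 24 in both tissues) but the
# "reference" gene shifts by one cycle between tissues, the target
# appears 2-fold changed purely because of the reference:
fc = fold_change(24.0, 19.0, 24.0, 18.0)
```

This is the quantitative sense in which a reference gene with tissue-dependent expression (like the 2.4- to 7.8-fold shifts reported above) invalidates comparative qRT-PCR.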
van der Westhuizen, Rina; Ajam, Mariam; De Coning, Piet; Beens, Jan; de Villiers, André; Sandra, Pat
2011-07-15
Fully synthetic jet fuel (FSJF) produced via Fischer-Tropsch (FT) technology was recently approved by the international aviation fuel authorities. To receive approval, comparison of the qualitative and quantitative hydrocarbon composition of FSJF with that of crude-derived fuel and blends was of utmost importance. This was performed by comprehensive two-dimensional gas chromatography (GC×GC) in the reversed-phase mode. The hydrocarbon composition of synthetic and crude-derived jet fuels is very similar, and all compounds detected in the synthetic product are also present in crude-derived fuels. Quantitatively, the synthetic fuel shows a higher degree of aliphatic branching with less than half the aromatic content of the crude-derived fuel. GC×GC analyses also indicated the presence of trace levels of hetero-atomic impurities in the crude-derived product that were absent in the synthetic product. While clay treatment removed some of the impurities and improved the fuel stability, the crude-derived product still contained traces of cyclic and aromatic S-containing compounds afterwards. The lower level of aromatics and the absence of sulphur are among the factors that contribute to the better fuel stability and environmental properties of the synthetic fuel. GC×GC was further applied for the analysis of products during Jet Fuel Thermal Oxidation Testing (JFTOT), which measures deposit formation of a fuel under simulated engine conditions. JFTOT showed the synthetic fuel to be much more stable than the crude-derived fuel. Copyright © 2011 Elsevier B.V. All rights reserved.
Machine learning approaches to diagnosis and laterality effects in semantic dementia discourse.
Garrard, Peter; Rentoumi, Vassiliki; Gesierich, Benno; Miller, Bruce; Gorno-Tempini, Maria Luisa
2014-06-01
Advances in automatic text classification have been necessitated by the rapid increase in the availability of digital documents. Machine learning (ML) algorithms can 'learn' from data: for instance a ML system can be trained on a set of features derived from written texts belonging to known categories, and learn to distinguish between them. Such a trained system can then be used to classify unseen texts. In this paper, we explore the potential of the technique to classify transcribed speech samples along clinical dimensions, using vocabulary data alone. We report the accuracy with which two related ML algorithms [naive Bayes Gaussian (NBG) and naive Bayes multinomial (NBM)] categorized picture descriptions produced by: 32 semantic dementia (SD) patients versus 10 healthy, age-matched controls; and SD patients with left- (n = 21) versus right-predominant (n = 11) patterns of temporal lobe atrophy. We used information gain (IG) to identify the vocabulary features that were most informative to each of these two distinctions. In the SD versus control classification task, both algorithms achieved accuracies of greater than 90%. In the right- versus left-temporal lobe predominant classification, NBM achieved a high level of accuracy (88%), but this was achieved by both NBM and NBG when the features used in the training set were restricted to those with high values of IG. The most informative features for the patient versus control task were low frequency content words, generic terms and components of metanarrative statements. For the right versus left task the number of informative lexical features was too small to support any specific inferences. An enriched feature set, including values derived from Quantitative Production Analysis (QPA) may shed further light on this little understood distinction. Copyright © 2013 Elsevier Ltd. All rights reserved.
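The information gain (IG) criterion used to select informative vocabulary features can be computed directly for a single binary feature (word present/absent) against a binary class label. The toy counts below are illustrative, not the study's data:

```python
import math

def entropy(counts):
    """Shannon entropy (bits) of a class-count distribution."""
    n = sum(counts)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def information_gain(n_pos_with, n_pos_without, n_neg_with, n_neg_without):
    """IG of a binary feature w.r.t. a binary class label:
    H(class) minus the expected entropy after splitting on the feature."""
    n_with = n_pos_with + n_neg_with          # documents containing the word
    n_without = n_pos_without + n_neg_without # documents lacking it
    n = n_with + n_without
    h = entropy([n_pos_with + n_pos_without, n_neg_with + n_neg_without])
    h_cond = (n_with / n) * entropy([n_pos_with, n_neg_with]) \
           + (n_without / n) * entropy([n_pos_without, n_neg_without])
    return h - h_cond

# A word used by 9 of 10 patients but only 1 of 10 controls is
# highly informative; one used equally by both groups carries none.
ig = information_gain(9, 1, 1, 9)
```

Restricting the training vocabulary to high-IG features, as the study did for the left- vs. right-predominant task, discards words whose presence tells the classifier nothing about the class.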
The NOAA Carbon America Program A Focus on Products for Decision- Support
NASA Astrophysics Data System (ADS)
Butler, J. H.; Hofmann, D. J.; Tans, P. P.; Peters, W.; Andrews, A. E.; Sweeny, C.; Montzka, S. A.
2006-12-01
If society is to manage or reduce carbon emissions in the future, reliable and accurate information on atmospheric carbon dioxide levels for verification of emission reductions will be needed on local, regional, and global scales. The current global carbon dioxide observing network operated by NOAA/ESRL provides a foundation for monitoring and understanding carbon dioxide. For example, atmospheric measurements in Europe suggest that emissions inventories of methane are substantial underestimates. An expanded U.S. Carbon Cycle Atmospheric Observing System is being implemented. Carbon America will consist of approximately 24 aircraft and 12 tall towers obtaining concentrations of carbon gases and other trace species. This observing system needs to be capable of quantitative attribution of all major contributors to the carbon budget of the continent, both manmade and natural. Successful mitigation strategies need independent and credible assessments of their efficacy. Managing carbon emissions will require the involvement of industry, financial markets, and governments at all levels. Without good information, governments will be slow to act, private investments will likely be less than optimal, and financial markets will not develop as they might need to. The atmospheric data and the methods used to derive sources and sinks will be fully open and available in up-to-date form to scientists, the general public, and policymakers. This presentation will provide an overview of NOAA's role in the North American Carbon Program, our current accomplishments, our plans for the future network, and the currently expected products, services, and information that derive from these and other associated studies. Today's products, while useful, will be eclipsed by those of tomorrow, which will focus heavily on regional emissions expressed on seasonal or shorter time-scales, and will provide needed information for improved predictions in the future.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-02
... quantitative studies. Focus groups serve the narrowly defined need for direct and informal opinion on a specific topic and as a qualitative research tool have three major purposes: To obtain information that is useful for developing variables and measures for quantitative studies, To better understand people's...
ERIC Educational Resources Information Center
Yu, Wei
2013-01-01
This dissertation applied the quantitative approach to the data gathered from online survey questionnaires regarding the three objects: Information Technology (IT) Portfolio Management, IT-Business Alignment, and IT Project Deliverables. By studying this data, this dissertation uncovered the underlying relationships that exist between the…
NASA Astrophysics Data System (ADS)
Koma, Zsófia; Székely, Balázs; Dorninger, Peter; Rasztovits, Sascha; Roncat, Andreas; Zámolyi, András; Krawczyk, Dominik; Pfeifer, Norbert
2014-05-01
Aerial imagery derivatives collected by Unmanned Aerial Vehicle (UAV) technology can be used as input for the generation of high-resolution digital terrain model (DTM) data, alongside the Terrestrial Laser Scanning (TLS) method. Both types of datasets are suitable for detailed geological and geomorphometric analysis, because the data provide micro-topographical and structural geological information. Our study focuses on comparing the geological information that can be extracted from the resulting high-resolution DTMs, and attempts to determine which technology is more effective for geological and geomorphological analysis. The measurements were taken at the Doren landslide (Vorarlberg, Austria), a complex rotational landslide situated in the Alpine molasse foreland. Several formations (Kojen Formation, Würmian glacial moraine sediments, Weissach Formation) were tectonized there in the course of the alpine orogeny (Oberhauser et al, 2007). The typical fault direction is WSW-ENE. The UAV measurements, carried out simultaneously with the TLS campaign, focused on the landslide scarp. The original image resolution was 4 mm/pixel. Image matching was implemented at pyramid level 2 and the achieved resolution of the DTM was 0.05 m. The TLS dataset includes 18 scan positions and more than 300 million points for the whole landslide area. The achieved DTM has 0.2 m resolution. The steps of the geological and geomorphological analysis were: (1) visual interpretation based on field work and geological maps, and (2) quantitative DTM analysis. In the quantitative analysis, the different DTMs served as input for further parameter calculations (e.g. slope, aspect, sigmaZ). In the next step an automatic classification method was used for the detection of faults and the classification of different parts of the landslide.
The conclusion was that UAV datasets are better for visual geological interpretation, because the high-resolution texture information allows the extraction of digital geomorphological indicators. For quantitative analysis both datasets are informative, but the TLS DTM has the advantage of providing additional information on faults beneath the vegetation cover. These studies were carried out partly in the framework of the Hybrid 3D project financed by the Austrian Research Promotion Agency (FFG) and by Von-Oben and 4D-IT; the contribution of ZsK was partly funded by Campus Hungary Internship TÁMOP-424B1; BSz contributed partly as an Alexander von Humboldt Research Fellow.
Quantitative volumetric Raman imaging of three dimensional cell cultures
NASA Astrophysics Data System (ADS)
Kallepitis, Charalambos; Bergholt, Mads S.; Mazo, Manuel M.; Leonardo, Vincent; Skaalure, Stacey C.; Maynard, Stephanie A.; Stevens, Molly M.
2017-03-01
The ability to simultaneously image multiple biomolecules in biologically relevant three-dimensional (3D) cell culture environments would contribute greatly to the understanding of complex cellular mechanisms and cell-material interactions. Here, we present a computational framework for label-free quantitative volumetric Raman imaging (qVRI). We apply qVRI to a selection of biological systems: human pluripotent stem cells with their cardiac derivatives, monocytes and monocyte-derived macrophages in conventional cell culture systems and mesenchymal stem cells inside biomimetic hydrogels that supplied a 3D cell culture environment. We demonstrate visualization and quantification of fine details in cell shape, cytoplasm, nucleus, lipid bodies and cytoskeletal structures in 3D with unprecedented biomolecular specificity for vibrational microspectroscopy.
Origin of platelet-derived growth factor in megakaryocytes in guinea pigs.
Chernoff, A; Levine, R F; Goodman, D S
1980-01-01
Growth factor activity, as determined by the stimulation of [3H]thymidine incorporation into the DNA of quiescent 3T3 cells in culture, was found in lysates of guinea pig platelets and megakaryocytes. Quantitative dilution studies demonstrated that, of the cells present in the guinea pig bone marrow, only the megakaryocyte possessed quantitatively significant growth factor activity. The amount of activity present in one megakaryocyte was equivalent to that present in 1,000-5,000 platelets, a value approximately comparable to the number of platelets shed from a single megakaryocyte. It is suggested that guinea pig platelet-derived growth factor has its origin in the megakaryocyte. PMID:7358851
Two EST-derived marker systems for cultivar identification in tree peony.
Zhang, J J; Shu, Q Y; Liu, Z A; Ren, H X; Wang, L S; De Keyser, E
2012-02-01
Tree peony (Paeonia suffruticosa Andrews), a woody deciduous shrub, belongs to the section Moutan DC. in the genus of Paeonia of the Paeoniaceae family. To increase the efficiency of breeding, two EST-derived marker systems were developed based on a tree peony expressed sequence tag (EST) database. Using target region amplification polymorphism (TRAP), 19 of 39 primer pairs showed good amplification for 56 accessions with amplicons ranging from 120 to 3,000 bp long, among which 99.3% were polymorphic. In contrast, 7 of 21 primer pairs demonstrated adequate amplification with clear bands for simple sequence repeats (SSRs) developed from ESTs, and a total of 33 alleles were found in 56 accessions. The similarity matrices generated by TRAP and EST-SSR markers were compared, and the Mantel test (r = 0.57778, P = 0.0020) showed a moderate correlation between the two types of molecular markers. TRAP markers were suitable for DNA fingerprinting and EST-SSR markers were more appropriate for discriminating synonyms (the same cultivars with different names due to limited information exchanged among different geographic areas). The two sets of EST-derived markers will be used further for genetic linkage map construction and quantitative trait locus detection in tree peony.
Monitoring liver damage using hepatocyte-specific methylation markers in cell-free circulating DNA.
Lehmann-Werman, Roni; Magenheim, Judith; Moss, Joshua; Neiman, Daniel; Abraham, Ofri; Piyanzin, Sheina; Zemmour, Hai; Fox, Ilana; Dor, Talya; Grompe, Markus; Landesberg, Giora; Loza, Bao-Li; Shaked, Abraham; Olthoff, Kim; Glaser, Benjamin; Shemer, Ruth; Dor, Yuval
2018-06-21
Liver damage is typically inferred from serum measurements of cytoplasmic liver enzymes. DNA molecules released from dying hepatocytes are an alternative biomarker, unexplored so far, potentially allowing for quantitative assessment of liver cell death. Here we describe a method for detecting acute hepatocyte death, based on quantification of circulating, cell-free DNA (cfDNA) fragments carrying hepatocyte-specific methylation patterns. We identified 3 genomic loci that are unmethylated specifically in hepatocytes, and used bisulfite conversion, PCR, and massively parallel sequencing to quantify the concentration of hepatocyte-derived DNA in mixed samples. Healthy donors had, on average, 30 hepatocyte genomes/ml plasma, reflective of basal cell turnover in the liver. We identified elevations of hepatocyte cfDNA in patients shortly after liver transplantation, during acute rejection of an established liver transplant, and also in healthy individuals after partial hepatectomy. Furthermore, patients with sepsis had high levels of hepatocyte cfDNA, which correlated with levels of liver enzymes aspartate aminotransferase (AST) and alanine aminotransferase (ALT). Duchenne muscular dystrophy patients, in which elevated AST and ALT derive from damaged muscle rather than liver, did not have elevated hepatocyte cfDNA. We conclude that measurements of hepatocyte-derived cfDNA can provide specific and sensitive information on hepatocyte death, for monitoring human liver dynamics, disease, and toxicity.
Synthesising quantitative and qualitative research in evidence-based patient information.
Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan
2007-03-01
Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies, but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (e.g., "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. 
A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review.
Kim, David M.; Zhang, Hairong; Zhou, Haiying; Du, Tommy; Wu, Qian; Mockler, Todd C.; Berezin, Mikhail Y.
2015-01-01
The optical signature of leaves is an important monitoring and predictive parameter for a variety of biotic and abiotic stresses, including drought. Such signatures derived from spectroscopic measurements provide vegetation indices – a quantitative method for assessing plant health. However, the commonly used metrics suffer from low sensitivity. Relatively small changes in water content in moderately stressed plants demand high-contrast imaging to distinguish affected plants. We present a new approach in deriving sensitive indices using hyperspectral imaging in a short-wave infrared range from 800 nm to 1600 nm. Our method, based on high spectral resolution (1.56 nm) instrumentation and image processing algorithms (quantitative histogram analysis), enables us to distinguish a moderate water stress equivalent of 20% relative water content (RWC). The identified image-derived indices 15XX nm/14XX nm (i.e. 1529 nm/1416 nm) were superior to common vegetation indices, such as WBI, MSI, and NDWI, with significantly better sensitivity, enabling early diagnostics of plant health. PMID:26531782
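A per-pixel band-ratio index of the kind described (reflectance at one SWIR band divided by another) can be sketched with numpy. The hyperspectral cube below is a random synthetic stand-in, and the 1.56 nm sampling mirrors the stated spectral resolution; only the 1529/1416 band pair comes from the abstract:

```python
import numpy as np

# Synthetic hyperspectral cube: (rows, cols, bands), reflectance in [0, 1].
rng = np.random.default_rng(0)
wavelengths = np.arange(800.0, 1600.0, 1.56)     # nm, ~1.56 nm sampling
cube = rng.uniform(0.2, 0.6, size=(64, 64, wavelengths.size))

def band_ratio(cube, wavelengths, num_nm, den_nm):
    """Per-pixel ratio of the bands nearest two target wavelengths (nm)."""
    i = int(np.argmin(np.abs(wavelengths - num_nm)))
    j = int(np.argmin(np.abs(wavelengths - den_nm)))
    return cube[:, :, i] / cube[:, :, j]

index = band_ratio(cube, wavelengths, 1529.0, 1416.0)  # reported index pair

# Histogram of the index image: the starting point for the quantitative
# histogram analysis used to separate stress levels across plants.
hist, edges = np.histogram(index, bins=50)
```

On real data the histogram of such an index image shifts as leaf water content drops, which is what makes the per-pixel ratio more sensitive than a single whole-leaf average.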
Cielecka-Piontek, Judyta
2013-07-01
A simple and selective derivative spectrophotometric method was developed for the quantitative determination of faropenem in pure form and in a pharmaceutical dosage form. The method is based on the zero-crossing effect of first-derivative spectrophotometry (λ = 324 nm), which eliminates the overlapping effect caused by the excipients present in the pharmaceutical preparation, as well as by degradation products formed during hydrolysis, oxidation, photolysis, and thermolysis. The method was linear in the concentration range 2.5-300 μg/mL (r = 0.9989) at λ = 341 nm; the limits of detection and quantitation were 0.16 and 0.46 μg/mL, respectively. The method had good precision (relative standard deviation from 0.68 to 2.13%). Recovery of faropenem ranged from 97.9 to 101.3%. The first-order rate constants of the degradation of faropenem in pure form and in the pharmaceutical dosage form were determined by using first-derivative spectrophotometry. A statistical comparison of the validation results and the observed rate constants for faropenem degradation with those obtained with the high-performance liquid chromatography method demonstrated that the two methods were compatible.
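Limits of detection and quantitation of the kind reported are conventionally estimated from the calibration curve as 3.3·σ/S and 10·σ/S, where σ is the residual standard deviation of the regression and S its slope. A sketch with invented calibration data (not the faropenem measurements):

```python
import numpy as np

# Illustrative calibration data: concentration (ug/mL) vs. derivative amplitude
conc = np.array([2.5, 10.0, 50.0, 100.0, 200.0, 300.0])
resp = np.array([0.030, 0.118, 0.601, 1.19, 2.41, 3.62])

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = np.std(residuals, ddof=2)   # residual SD of the linear regression

lod = 3.3 * sigma / slope           # limit of detection
loq = 10.0 * sigma / slope          # limit of quantitation
```

The 3.3 and 10 multipliers are the standard calibration-curve convention (e.g., ICH Q2), so LOQ is always ~3x LOD regardless of the dataset.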
Wang, Haiqin; Liu, Wenlong; He, Fuyuan; Chen, Zuohong; Zhang, Xili; Xie, Xianggui; Zeng, Jiaoli; Duan, Xiaopeng
2012-02-01
To explore single-sampling quantitation of Houttuynia cordata based on the information entropy carried by its polymorphic DNA bands, as an expression of the genetic polymorphism of traditional Chinese medicine. The inter simple sequence repeat (ISSR) technique was applied to analyze the genetic polymorphism of H. cordata samples from the same GAP producing area; the DNA band patterns were transformed into information entropy, and the minimum single-sampling quantity was calculated with a mathematical model. One hundred and thirty-four DNA bands were obtained by using 9 screened ISSR primers to amplify DNA samples of 46 strains of H. cordata from the same GAP area; the information entropy was H = 0.3656-0.9786, with an RSD of 14.75%. The single-sampling quantity was W = 11.22 kg (863 strains). The "minimum single-sampling quantity" was thus calculated from the perspective of the genetic polymorphism of H. cordata, and differed greatly from the amount derived from the fingerprint perspective.
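The information entropy of each polymorphic band follows from its presence frequency across the sampled strains. The band frequencies below are illustrative (chosen to fall within the reported H range), not the study's data:

```python
import math

def band_entropy(p_present):
    """Shannon information entropy (bits) of a binary ISSR band that is
    present with frequency p across the sampled strains."""
    p, q = p_present, 1.0 - p_present
    return -sum(x * math.log2(x) for x in (p, q) if x > 0)

# Presence frequencies of a few bands across 46 strains (illustrative)
freqs = [40 / 46, 30 / 46, 12 / 46]
entropies = [band_entropy(p) for p in freqs]

# Bands near p = 0.5 carry the most information (H -> 1 bit);
# nearly fixed bands (p near 0 or 1) carry almost none.
```

Averaging such per-band entropies over all polymorphic bands gives a single diversity measure per sample set, which a sampling model can then relate to the minimum quantity of material needed to represent the population.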
Kinetically limited weathering at low denudation rates in semi-arid climates
NASA Astrophysics Data System (ADS)
Vanacker, V.; Schoonejans, J.; Opfergelt, S.; Ameijeiras-Marino, Y.; Christl, M.
2016-12-01
On Earth, the Critical Zone supports terrestrial life, being the near-surface environment where interactions between the atmosphere, lithosphere, hydrosphere, and biosphere take place. Quantitative understanding of the interaction between mechanical rock breakdown, chemical weathering, and physical erosion is essential for unraveling Earth's biogeochemical cycles. In this study, we explore the role of the soil water balance in regulating soil chemical weathering under water deficit regimes. Weathering rates and intensities were evaluated for nine soil profiles located on convex ridge crests of three mountain ranges in the Spanish Betic Cordillera. We present and compare quantitative information on soil weathering, chemical depletion and total denudation derived from geochemical mass balance, 10Be cosmogenic nuclides and U-series disequilibria. Soil production rates determined from U-series isotopes (238U, 234U, 230Th and 226Ra) are of the same order of magnitude as 10Be-derived denudation rates, suggesting steady-state soil thickness, in two out of three sampling sites. The chemical weathering intensities are relatively low (˜5 to 30% of the total denudation of the soil) and negatively correlated with the magnitude of the water deficit in soils. Soil weathering extents increase (nonlinearly) with soil thickness and decrease with increasing surface denudation rates, consistent with kinetically limited or controlled weathering. Our study suggests that soil residence time and water availability limit weathering processes in semi-arid climates, which had not previously been validated with field data. An important implication of this finding is that climatic regimes may strongly regulate soil weathering by modulating soil solute fluxes.
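The geochemical mass-balance bookkeeping behind "chemical depletion" can be sketched in a few lines. The immobile-element concentrations and the denudation rate below are illustrative placeholders, not the Betic Cordillera measurements:

```python
# Geochemical mass-balance sketch: the chemical depletion fraction (CDF)
# uses an immobile element (Zr here) to partition total denudation into
# chemical weathering and physical erosion. All numbers are illustrative.

def chemical_depletion_fraction(zr_rock_ppm, zr_soil_ppm):
    """CDF = 1 - [Zr]_rock / [Zr]_soil (0 = no chemical depletion)."""
    return 1.0 - zr_rock_ppm / zr_soil_ppm

def partition_denudation(total_denudation, cdf):
    """Split total denudation D into chemical (W = D * CDF) and physical
    (E = D - W) components, all in the same units (e.g. t km^-2 yr^-1)."""
    weathering = total_denudation * cdf
    erosion = total_denudation - weathering
    return weathering, erosion

cdf = chemical_depletion_fraction(zr_rock_ppm=150.0, zr_soil_ppm=180.0)
W, E = partition_denudation(total_denudation=60.0, cdf=cdf)
```

Because Zr is (approximately) immobile, its enrichment in the soil relative to the parent rock records how much mass was lost in solution, which is what ties the CDF to the "5 to 30% of total denudation" weathering intensities reported above.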
Disease quantification on PET/CT images without object delineation
NASA Astrophysics Data System (ADS)
Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.
2017-03-01
The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty in establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions, where ground truth is difficult to establish, and smaller deviations for larger lesions, where ground truth can be established more reliably. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and other functional forms of disease maps.
Lewis, Richard L; Shvartsman, Michael; Singh, Satinder
2013-07-01
We explore the idea that eye-movement strategies in reading are precisely adapted to the joint constraints of task structure, task payoff, and processing architecture. We present a model of saccadic control that separates a parametric control policy space from a parametric machine architecture, the latter based on a small set of assumptions derived from research on eye movements in reading (Engbert, Nuthmann, Richter, & Kliegl, 2005; Reichle, Warren, & McConnell, 2009). The eye-control model is embedded in a decision architecture (a machine and policy space) that is capable of performing a simple linguistic task integrating information across saccades. Model predictions are derived by jointly optimizing the control of eye movements and task decisions under payoffs that quantitatively express different desired speed-accuracy trade-offs. The model yields distinct eye-movement predictions for the same task under different payoffs, including single-fixation durations, frequency effects, accuracy effects, and list position effects, and their modulation by task payoff. The predictions are compared to - and found to accord with - eye-movement data obtained from human participants performing the same task under the same payoffs, but they are found not to accord as well when the assumptions concerning payoff optimization and processing architecture are varied. These results extend work on rational analysis of oculomotor control and adaptation of reading strategy (Bicknell & Levy; McConkie, Rayner, & Wilson, 1973; Norris, 2009; Wotschack, 2009) by providing evidence for adaptation at low levels of saccadic control that is shaped by quantitatively varying task demands and the dynamics of processing architecture. Copyright © 2013 Cognitive Science Society, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaotong; Liu, Jiaen; Van de Moortele, Pierre-Francois
2014-12-15
Electrical Properties Tomography (EPT) utilizes measurable radio frequency (RF) coil induced magnetic fields (B1 fields) in a Magnetic Resonance Imaging (MRI) system to quantitatively reconstruct the local electrical properties (EP) of biological tissues. Information derived from the same data set, e.g., complex B1 distributions used for electric field calculation, can be used to estimate, on a subject-specific basis, the local Specific Absorption Rate (SAR). SAR plays a significant role in RF pulse design for high-field MRI applications, where maximum local tissue heating remains one of the most constraining limits. The purpose of the present work is to investigate the feasibility of such B1-based local SAR estimation, expanding on previously proposed EPT approaches. To this end, B1 calibration was obtained in a gelatin phantom at 7 T with a multi-channel transmit coil, under a particular multi-channel B1-shim setting (B1-shim I). Using this unique set of B1 calibration data, the local SAR distribution was subsequently predicted for B1-shim I, as well as for another B1-shim setting (B1-shim II), considering a specific set of parameters for a heating MRI protocol consisting of RF pulses applied at a 1% duty cycle. The local SAR results, which could not be directly measured with MRI, were subsequently converted into temperature changes, which in turn were validated against temperature changes measured by MRI thermometry based on the proton chemical shift.
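For orientation, the SAR definition and its conversion to a temperature change can be sketched as follows. The tissue conductivity, density, heat capacity and field strength below are assumed illustrative values, and the adiabatic conversion deliberately ignores perfusion and heat conduction, which a real validation must account for:

```python
# Sketch of local SAR from the electric field and its adiabatic
# conversion to a short-duration temperature change. All tissue
# properties and the E-field magnitude are illustrative assumptions.

def local_sar(sigma, e_rms, rho):
    """Local SAR (W/kg) = sigma * |E_rms|^2 / rho, for conductivity
    sigma (S/m), RMS electric field (V/m) and density rho (kg/m^3)."""
    return sigma * e_rms ** 2 / rho

def temperature_rise(sar, duty_cycle, duration_s, heat_capacity):
    """Adiabatic estimate: dT = SAR * duty * t / c (c in J kg^-1 K^-1),
    neglecting perfusion and conduction."""
    return sar * duty_cycle * duration_s / heat_capacity

sar = local_sar(sigma=0.7, e_rms=120.0, rho=1040.0)
dT = temperature_rise(sar, duty_cycle=0.01, duration_s=600.0,
                      heat_capacity=3600.0)
```

The 1% duty cycle factor is why a local SAR of several W/kg produces only a small ΔT over a ten-minute protocol in this simplified picture.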
What Are We Doing When We Translate from Quantitative Models?
Critchfield, Thomas S; Reed, Derek D
2009-01-01
Although quantitative analysis (in which behavior principles are defined in terms of equations) has become common in basic behavior analysis, translational efforts often examine everyday events through the lens of narrative versions of laboratory-derived principles. This approach to translation, although useful, is incomplete because equations may convey concepts that are difficult to capture in words. To support this point, we provide a nontechnical introduction to selected aspects of quantitative analysis; consider some issues that translational investigators (and, potentially, practitioners) confront when attempting to translate from quantitative models; and discuss examples of relevant translational studies. We conclude that, where behavior-science translation is concerned, the quantitative features of quantitative models cannot be ignored without sacrificing conceptual precision, scientific and practical insights, and the capacity of the basic and applied wings of behavior analysis to communicate effectively. PMID:22478533
Sherrouse, Benson C.; Semmens, Darius J.; Clement, Jessica M.
2014-01-01
Despite widespread recognition that social-value information is needed to inform stakeholders and decision makers regarding trade-offs in environmental management, it too often remains absent from ecosystem service assessments. Although quantitative indicators of social values need to be explicitly accounted for in the decision-making process, they need not be monetary. Ongoing efforts to map such values demonstrate how they can also be made spatially explicit and relatable to underlying ecological information. We originally developed Social Values for Ecosystem Services (SolVES) as a tool to assess, map, and quantify nonmarket values perceived by various groups of ecosystem stakeholders. With SolVES 2.0 we have extended the functionality by integrating SolVES with Maxent maximum entropy modeling software to generate more complete social-value maps from available value and preference survey data and to produce more robust models describing the relationship between social values and ecosystems. The current study has two objectives: (1) evaluate how effectively the value index, a quantitative, nonmonetary social-value indicator calculated by SolVES, reproduces results from more common statistical methods of social-survey data analysis and (2) examine how the spatial results produced by SolVES provide additional information that could be used by managers and stakeholders to better understand more complex relationships among stakeholder values, attitudes, and preferences. To achieve these objectives, we applied SolVES to value and preference survey data collected for three national forests, the Pike and San Isabel in Colorado and the Bridger–Teton and the Shoshone in Wyoming. Value index results were generally consistent with results found through more common statistical analyses of the survey data such as frequency, discriminant function, and correlation analyses. 
In addition, spatial analysis of the social-value maps produced by SolVES provided information that was useful for explaining relationships between stakeholder values and forest uses. Our results suggest that SolVES can effectively reproduce information derived from traditional statistical analyses while adding spatially explicit, social-value information that can contribute to integrated resource assessment, planning, and management of forests and other ecosystems.
NASA Astrophysics Data System (ADS)
Ivanova, Bojidarka; Spiteller, Michael
2018-04-01
The problem considered in this paper concerns quantitative correlation model equations between experimental kinetic and thermodynamic parameters obtained from electrospray ionization (ESI) mass spectrometry (MS) or atmospheric pressure chemical ionization (APCI) mass spectrometry coupled with collision-induced dissociation mass spectrometry, accounting for the fact that the physical phenomena and mechanisms of ESI- and APCI-ion formation are completely different. Forty-two fragment reactions of three analytes under independent ESI- and APCI-measurements are described. The newly developed quantitative models allow us to study the reaction kinetics and thermodynamics correlatively using mass spectrometric methods, whose complementary application with quantum chemical methods provides 3D structural information on the analytes. Both static and dynamic quantum chemical computations are carried out. The objects of analysis are [2,3-dimethyl-4-(4-methyl-benzoyl)-2,3-di-p-tolyl-cyclobutyl]-p-tolyl-methanone (1) and the polycyclic aromatic hydrocarbon derivatives of dibenzoperylene (2) and tetrabenzo[a,c,fg,op]naphthacene (3), respectively. Since (1) is known to be a product of [2π+2π] cycloaddition reactions of chalcone (1,3-di-p-tolyl-propenone), which produce cyclic derivatives with different stereoselectivity, the study provides crucial data on the capability of mass spectrometry to determine the stereoselectivity of the analytes. This work also provides the first quantitative treatment of the relations '3D molecular/electronic structure'-'quantum chemical diffusion coefficient'-'mass spectrometric diffusion coefficient', thus extending the capability of mass spectrometry for determination of the exact 3D structure of analytes using independent measurements and computations of the diffusion coefficients.
The experimental diffusion parameters are determined with the 'current monitoring method', evaluating the translational diffusion of the charged analytes, while the theoretical modelling of MS ions and the computation of theoretical diffusion coefficients are based on the Arrhenius-type behavior of the charged species under ESI- and APCI-conditions. Although the study provides sound considerations on the quantitative relations between reaction kinetics and thermodynamics and the 3D structure of the analytes, together with the correlations between 3D molecular/electronic structures, quantum chemical diffusion coefficients and mass spectrometric diffusion coefficients, which contribute significantly to structural analytical chemistry, the results are also of importance to other areas such as organic synthesis and catalysis.
Clinical applications of a quantitative analysis of regional left ventricular wall motion
NASA Technical Reports Server (NTRS)
Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.
1975-01-01
Observations were summarized which may have clinical application. These were obtained from a quantitative analysis of wall motion that was used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients who were found not to have heart disease.
USDA-ARS?s Scientific Manuscript database
A genome-wide scan for quantitative trait loci (QTL) affecting gastrointestinal (GI) nematode resistance was completed using a double backcross sheep population derived from Red Maasai and Dorper ewes bred to F1 rams. These breeds were chosen, because Red Maasai sheep are known to be more tolerant ...
Autonomous quantum Maxwell's demon based on two exchange-coupled quantum dots
NASA Astrophysics Data System (ADS)
Ptaszyński, Krzysztof
2018-01-01
I study an autonomous quantum Maxwell's demon based on two exchange-coupled quantum dots attached to the spin-polarized leads. The principle of operation of the demon is based on the coherent oscillations between the spin states of the system which act as a quantum iSWAP gate. Due to the operation of the iSWAP gate, one of the dots acts as a feedback controller which blocks the transport with the bias in the other dot, thus inducing the electron pumping against the bias; this leads to the locally negative entropy production. Operation of the demon is associated with the information transfer between the dots, which is studied quantitatively by mapping the analyzed setup onto the thermodynamically equivalent auxiliary system. The calculated entropy production in a single subsystem and information flow between the subsystems are shown to obey a local form of the second law of thermodynamics, similar to the one previously derived for classical bipartite systems.
Yao, Yan; Lenhoff, Abraham M
2004-05-28
The macroscopic properties of porous chromatographic adsorbents are directly influenced by the pore structure, with the pore size distribution (PSD) playing a major role beyond simply the mean pore size. Inverse size-exclusion chromatography (ISEC), a widely used chromatographic method for determining the PSD of porous media, provides more relevant information on liquid chromatographic materials in situ than traditional methods, such as gas sorption and mercury intrusion. The fundamentals and applications of ISEC in the characterization of the pore structure are reviewed. The description of the probe solutes and the pore space, as well as theoretical models for deriving the PSD from solute partitioning behavior, are discussed. Precautions to ensure integrity of the experiments are also outlined, including accounting for probe polydispersity and minimization of solute-adsorbent interactions. The results that emerge are necessarily model-dependent, but ISEC nonetheless represents a powerful and non-destructive source of quantitative pore structure information that can help to elucidate chromatographic performance observations covering both retention and rate aspects.
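A minimal forward-model sketch of ISEC partitioning helps make the review's point concrete: for the commonly assumed model of spherical probes in cylindrical pores, the partition coefficient is K = (1 - r_s/r_p)². The bimodal PSD and probe radii below are hypothetical; inverting measured K values for a PSD is the (model-dependent) harder step discussed in the review:

```python
# ISEC forward model sketch: equilibrium partition coefficient of a
# spherical probe (radius r_probe) in a cylindrical pore (radius r_pore)
# is K = (1 - r_probe/r_pore)^2, zero once the probe exceeds the pore.
# The two-pore PSD and probe sizes are hypothetical examples.

def partition_coefficient(r_probe, r_pore):
    lam = r_probe / r_pore
    return (1.0 - lam) ** 2 if lam < 1.0 else 0.0

def observed_K(r_probe, pore_radii, pore_fractions):
    """PSD-weighted partition coefficient seen by one probe size."""
    return sum(f * partition_coefficient(r_probe, r)
               for r, f in zip(pore_radii, pore_fractions))

pore_radii = [10.0, 40.0]          # nm, hypothetical bimodal PSD
pore_fractions = [0.3, 0.7]        # pore-volume fractions, sum to 1
probes = [1.0, 5.0, 15.0, 35.0]    # nm, dextran-like probe radii
K = [observed_K(r, pore_radii, pore_fractions) for r in probes]
```

Larger probes sample progressively less pore volume, so K decreases monotonically with probe size; fitting that decay across a probe ladder is what lets ISEC recover a PSD.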
Configuration of Pluto's Volatile Ices
NASA Astrophysics Data System (ADS)
Grundy, William M.; Binzel, R. P.; Cook, J. C.; Cruikshank, D. P.; Dalle Ore, C. M.; Earle, A. M.; Ennico, K.; Jennings, D. E.; Howett, C. J. A.; Linscott, I. R.; Lunsford, A. W.; Olkin, C. B.; Parker, A. H.; Parker, J. Wm; Protopapa, S.; Reuter, D. C.; Singer, K. N.; Spencer, J. R.; Stern, S. A.; Tsang, C. C. C.; Verbiscer, A. J.; Weaver, H. A.; Young, L. A.; Berry, K.; Buie, M. W.; Stansberry, J. A.
2015-11-01
We report on near-infrared remote sensing by New Horizons' Ralph instrument (Reuter et al. 2008, Space Sci. Rev. 140, 129-154) of Pluto's N2, CO, and CH4 ices. These especially volatile ices are mobile even at Pluto's cryogenic surface temperatures. Sunlight reflected from these ices becomes imprinted with their characteristic spectral absorption bands. The detailed appearance of these absorption features depends on many aspects of local composition, thermodynamic state, and texture. Multiple-scattering radiative transfer models are used to retrieve quantitative information about these properties and to map how they vary across Pluto's surface. Using parameter maps derived from New Horizons observations, we investigate the striking regional differences in the abundances and scattering properties of Pluto's volatile ices. Comparing these spatial patterns with the underlying geology provides valuable constraints on processes actively modifying the planet's surface, over a variety of spatial scales ranging from global latitudinal patterns to more regional and local processes within and around the feature informally known as Sputnik Planum. This work was supported by the NASA New Horizons Project.
Barrett, Christian L.; Cho, Byung-Kwan
2011-01-01
Immuno-precipitation of protein–DNA complexes followed by microarray hybridization is a powerful and cost-effective technology for discovering protein–DNA binding events at the genome scale. It is still an unresolved challenge to comprehensively, accurately and sensitively extract binding event information from the produced data. We have developed a novel strategy composed of an information-preserving signal-smoothing procedure, higher order derivative analysis and application of the principle of maximum entropy to address this challenge. Importantly, our method does not require any input parameters to be specified by the user. Using genome-scale binding data of two Escherichia coli global transcription regulators for which a relatively large number of experimentally supported sites are known, we show that ∼90% of known sites were resolved to within four probes, or ∼88 bp. Over half of the sites were resolved to within two probes, or ∼38 bp. Furthermore, we demonstrate that our strategy delivers significant quantitative and qualitative performance gains over available methods. Such accurate and sensitive binding site resolution has important consequences for accurately reconstructing transcriptional regulatory networks, for motif discovery, for furthering our understanding of local and non-local factors in protein–DNA interactions and for extending the usefulness horizon of the ChIP-chip platform. PMID:21051353
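The smoothing-plus-derivative idea can be illustrated with a simplified sketch: a synthetic probe signal, a plain moving-average smoother, and first-derivative zero crossings (with negative second derivative) as candidate binding-site centers. This is only the general principle; the authors' actual procedure is parameter-free, information-preserving, and uses maximum entropy, none of which is reproduced here:

```python
import numpy as np

# Derivative-based peak localization on a synthetic tiling-array signal:
# two Gaussian "binding events" plus noise; smooth, then find down-going
# zero crossings of the first derivative where the curve is concave.
x = np.arange(0, 500)                       # probe index
signal = (np.exp(-((x - 120) ** 2) / (2 * 8 ** 2))
          + 0.8 * np.exp(-((x - 310) ** 2) / (2 * 10 ** 2)))
rng = np.random.default_rng(1)
noisy = signal + rng.normal(0, 0.03, x.size)

def smooth(y, window=9):
    """Simple moving-average smoother (a stand-in for the paper's
    information-preserving signal-smoothing procedure)."""
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

s = smooth(noisy)
d1 = np.gradient(s)
d2 = np.gradient(d1)
# Down-going zero crossings of d1 with d2 < 0 mark local maxima.
peaks = np.where((d1[:-1] > 0) & (d1[1:] <= 0) & (d2[:-1] < 0))[0]
peaks = peaks[s[peaks] > 0.3]               # drop noise-level crossings
```

Both planted events are localized to within a few probes, mirroring the resolution the paper reports (most known sites within two to four probes).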
TRIPATHI, ASHUTOSH; DURRANT, DAVID; LEE, RAY M.; BARUCHELLO, RICCARDO; ROMAGNOLI, ROMEO; SIMONI, DANIELE; KELLOGG, GLEN E.
2009-01-01
The crucial role of the microtubule in the cell division has identified tubulin as a target for the development of therapeutics for cancer; in particular tubulin is a target for antineoplastic agents that act by interfering with the dynamic stability of microtubules. A molecular modeling study was carried out to accurately represent the complex structure and the binding mode of a new class of stilbene-based tubulin inhibitors that bind at the αβ-tubulin colchicine site. Computational docking along with HINT score analysis fitted these inhibitors into the colchicine site and revealed detailed structure-activity information useful for inhibitor design. Quantitative analysis of the results was in good agreement with the in vitro antiproliferative activity of these derivatives (ranging from 3 nM to 100 μM) such that calculated and measured free energies of binding correlate with an r2 of 0.89 (standard error ± 0.85 kcal mol−1). This correlation suggests that the activity of unknown compounds may be predicted. PMID:19912057
Dark matter vs. astrophysics in the interpretation of AMS-02 electron and positron data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mauro, Mattia Di; Donato, Fiorenza; Fornengo, Nicolao
We perform a detailed quantitative analysis of the recent AMS-02 electron and positron data. We investigate the interplay between the emission from primary astrophysical sources, namely Supernova Remnants and Pulsar Wind Nebulae, and the contribution from a dark matter annihilation or decay signal. Our aim is to assess the information that can be derived on dark matter properties when both dark matter and primary astrophysical sources are assumed to jointly contribute to the leptonic observables measured by the AMS-02 experiment. We investigate both the possibility to set robust constraints on the dark matter annihilation/decay rate and the possibility to look for dark matter signals within realistic models that take into account the full complexity of the astrophysical background. Our results show that AMS-02 data make it possible to efficiently probe vast regions of the dark matter parameter space and, in some cases, to set constraints on the dark matter annihilation/decay rate that are comparable to or even stronger than the ones derived from other indirect detection channels.
NASA Astrophysics Data System (ADS)
Pan, Y.; Shen, W.; Hwang, C.
2015-12-01
Because the Earth is elastic, its surface deforms vertically in response to hydrological mass changes on or near the surface. Continuous GPS (CGPS) records capture these surface vertical deformations, which provide significant information for estimating the variation of terrestrial water storage. We compute the loading deformations at GPS stations based on synthetic models of seasonal water load distribution and then invert the synthetic GPS data for surface mass distribution. We use GRACE gravity observations and hydrology models to evaluate seasonal water storage variability in Nepal and the Himalayas. The coherence among GPS inversion results, GRACE and hydrology models indicates that GPS can provide quantitative estimates of terrestrial water storage variations by inverting surface deformation observations. The annual peak-to-peak surface mass changes derived from GPS and GRACE reveal seasonal load oscillations of water, snow and ice. Meanwhile, the present uplift of Nepal and the Himalayas indicates hydrological mass loss. This study is supported by National 973 Project China (grant Nos. 2013CB733302 and 2013CB733305), NSFC (grant Nos. 41174011, 41429401, 41210006, 41128003, 41021061).
NASA Astrophysics Data System (ADS)
Marra, Francesco; Morin, Efrat
2018-02-01
Small scale rainfall variability is a key factor driving runoff response in fast responding systems, such as mountainous, urban and arid catchments. In this paper, the spatial-temporal autocorrelation structure of convective rainfall is derived with extremely high resolutions (60 m, 1 min) using estimates from an X-Band weather radar recently installed in a semiarid-arid area. The 2-dimensional spatial autocorrelation of convective rainfall fields and the temporal autocorrelation of point-wise and distributed rainfall fields are examined. The autocorrelation structures are characterized by spatial anisotropy, correlation distances 1.5-2.8 km and rarely exceeding 5 km, and time-correlation distances 1.8-6.4 min and rarely exceeding 10 min. The observed spatial variability is expected to negatively affect estimates from rain gauges and microwave links rather than satellite and C-/S-Band radars; conversely, the temporal variability is expected to negatively affect remote sensing estimates rather than rain gauges. The presented results provide quantitative information for stochastic weather generators, cloud-resolving models, dryland hydrologic and agricultural models, and multi-sensor merging techniques.
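Estimating a 2-D spatial autocorrelation structure of the kind described above is commonly done via the Wiener-Khinchin theorem (inverse FFT of the field's power spectrum). The sketch below applies this to a synthetic field with a known correlation scale rather than to radar data; the grid size and smoothing scale are illustrative:

```python
import numpy as np

# 2-D spatial autocorrelation of a synthetic "rainfall" field via the
# Wiener-Khinchin theorem. The field is white noise smoothed in Fourier
# space to a chosen correlation scale (in grid cells).
rng = np.random.default_rng(2)
n = 128                                   # grid cells (e.g. 60 m each)
noise = rng.normal(size=(n, n))
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
scale = 6.0                               # Gaussian smoothing std, cells
kernel = np.exp(-2 * (np.pi * scale) ** 2 * (kx ** 2 + ky ** 2))
field = np.fft.ifft2(np.fft.fft2(noise) * kernel).real

f = field - field.mean()
power = np.abs(np.fft.fft2(f)) ** 2
acf = np.fft.ifft2(power).real
acf /= acf[0, 0]                          # normalize so acf(0) = 1
acf = np.fft.fftshift(acf)

# 1-D profile from the center; correlation distance taken here as the
# first lag where correlation falls below 1/e.
center = n // 2
profile = acf[center, center:]
corr_dist = int(np.argmax(profile < 1 / np.e))
```

Converted to physical units (60 m cells here), such 1/e lags are directly comparable to the correlation distances reported above, and the full 2-D ACF exposes the anisotropy the paper describes.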
A Direct Approach to In-Plane Stress Separation using Photoelastic Ptychography
NASA Astrophysics Data System (ADS)
Anthony, Nicholas; Cadenazzi, Guido; Kirkwood, Henry; Huwald, Eric; Nugent, Keith; Abbey, Brian
2016-08-01
The elastic properties of materials, either under external load or in a relaxed state, influence their mechanical behaviour. Conventional optical approaches based on techniques such as photoelasticity or thermoelasticity can be used for full-field analysis of the stress distribution within a specimen. The circular polariscope in combination with holographic photoelasticity allows the sum and difference of principal stress components to be determined by exploiting the temporary birefringent properties of materials under load. Phase stepping and interferometric techniques have been proposed as a method for separating the in-plane stress components in two-dimensional photoelasticity experiments. In this paper we describe and demonstrate an alternative approach based on photoelastic ptychography which is able to obtain quantitative stress information from far fewer measurements than is required for interferometric based approaches. The complex light intensity equations based on Jones calculus for this setup are derived. We then apply this approach to the problem of a disc under diametrical compression. The experimental results are validated against the analytical solution derived by Hertz for the theoretical displacement fields for an elastic disc subject to point loading.
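The Jones-calculus machinery referred to above can be sketched for the textbook dark-field circular polariscope, where the emerging intensity reduces to sin²(δ/2) independent of the isoclinic angle. This is the standard configuration, not the paper's full ptychographic setup:

```python
import numpy as np

# Jones-calculus sketch of a dark-field circular polariscope:
# polarizer -> quarter-wave plate at +45 deg -> stressed specimen
# (retardation delta, principal-axis angle theta) -> quarter-wave plate
# at -45 deg -> crossed analyzer. Output intensity: I = sin^2(delta/2).

def rot(t):
    return np.array([[np.cos(t), np.sin(t)], [-np.sin(t), np.cos(t)]])

def retarder(delta, theta):
    """Linear retarder with retardation delta, fast axis at theta."""
    d = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
    return rot(-theta) @ d @ rot(theta)

def polariscope_intensity(delta, theta):
    polarizer = np.array([[1, 0], [0, 0]])       # transmission axis x
    analyzer = np.array([[0, 0], [0, 1]])        # crossed, axis y
    qwp_in = retarder(np.pi / 2, np.pi / 4)
    qwp_out = retarder(np.pi / 2, -np.pi / 4)
    e_in = np.array([1.0, 0.0])
    e_out = (analyzer @ qwp_out @ retarder(delta, theta)
             @ qwp_in @ polarizer @ e_in)
    return float(np.sum(np.abs(e_out) ** 2))

# Isochromatic fringe pattern sampled over retardation, theta-free.
fringe = [polariscope_intensity(d, 0.3) for d in np.linspace(0, np.pi, 5)]
```

Because the circular polariscope removes the isoclinic-angle dependence, the fringes encode only the principal stress difference, which is the quantity the photoelastic approaches above recover before the stress components are separated.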
Yu, Qingyue; Hao, Guodong; Zhou, Jianxin; Wang, Jingying; Evivie, Ejiroghene Ruona; Li, Jing
2018-06-22
Glucosinolates are a class of amino acid-derived specialized metabolites characteristic of the Brassicales order. Trp-derived indolic glucosinolates are essential for effective plant defense responses to a wide range of pathogens and herbivores. In Arabidopsis, MYB51 is the key transcription factor that positively regulates indolic glucosinolate production by activating certain biosynthetic genes. In this study, we report the isolation and identification of a MYB51 gene from broccoli, designated BoMYB51. Overexpression of BoMYB51 in Arabidopsis increased indolic glucosinolate production by upregulating biosynthetic genes and resulted in enhanced flagellin22 (Flg22)-induced callose deposition. The spatial expression pattern of BoMYB51 and its responsive expression to several hormone and stress treatments were investigated by expressing the β-glucuronidase (GUS) reporter gene driven by the BoMYB51 promoter in Arabidopsis and by quantitative real-time PCR analysis in broccoli. Our study provides information on the molecular characteristics of BoMYB51 and the physiological processes in which it may be involved. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zeng, Chao; Long, Di; Shen, Huanfeng; Wu, Penghai; Cui, Yaokui; Hong, Yang
2018-07-01
Land surface temperature (LST) is one of the most important parameters in land surface processes. Although satellite-derived LST can provide valuable information, its value is often limited by cloud contamination. In this paper, a two-step satellite-derived LST reconstruction framework is proposed. First, a multi-temporal reconstruction algorithm is introduced to recover invalid LST values using multiple LST images with reference to a corresponding remotely sensed vegetation index; all cloud-contaminated areas are thereby temporally filled with hypothetical clear-sky LST values. Second, a surface energy balance equation-based procedure is used to correct the filled values. With shortwave irradiation data, the clear-sky LST is corrected to the real LST under cloudy conditions. A series of experiments has been performed to demonstrate the effectiveness of the developed approach. Quantitative evaluation results indicate that the proposed method can recover LST over different surface types with mean errors of 3-6 K. The experiments also indicate that the time interval between the multi-temporal LST images has a greater impact on the results than the size of the contaminated area.
Lehnert, Teresa; Figge, Marc Thilo
2017-01-01
Mathematical modeling and computer simulations have become an integral part of modern biological research. The strength of theoretical approaches is in the simplification of complex biological systems. We here consider the general problem of receptor-ligand binding in the context of antibody-antigen binding. On the one hand, we establish a quantitative mapping between macroscopic binding rates of a deterministic differential equation model and their microscopic equivalents as obtained from simulating the spatiotemporal binding kinetics by stochastic agent-based models. On the other hand, we investigate the impact of various properties of B cell-derived receptors-such as their dimensionality of motion, morphology, and binding valency-on the receptor-ligand binding kinetics. To this end, we implemented an algorithm that simulates antigen binding by B cell-derived receptors with a Y-shaped morphology that can move in different dimensionalities, i.e., either as membrane-anchored receptors or as soluble receptors. The mapping of the macroscopic and microscopic binding rates allowed us to quantitatively compare different agent-based model variants for the different types of B cell-derived receptors. Our results indicate that the dimensionality of motion governs the binding kinetics and that this predominant impact is quantitatively compensated by the bivalency of these receptors.
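The macroscopic side of the mapping described above is the deterministic rate-equation model of receptor-ligand binding, dC/dt = k_on·R·L - k_off·C. The sketch below integrates it with forward Euler and checks the equilibrium against K_d; the rate constants and concentrations are illustrative, not the paper's fitted values:

```python
# Macroscopic receptor-ligand binding sketch: the deterministic ODE
# counterpart of the agent-based simulations, integrated with forward
# Euler. All rate constants and concentrations are illustrative.
k_on, k_off = 1.0e6, 1.0e-2        # M^-1 s^-1, s^-1
R0, L0 = 1.0e-7, 5.0e-7            # M, total receptor / ligand

dt, steps = 1.0e-2, 20_000         # 200 s of simulated binding
C = 0.0                            # bound complex concentration, M
for _ in range(steps):
    R, L = R0 - C, L0 - C          # free species by mass conservation
    C += dt * (k_on * R * L - k_off * C)

# At equilibrium, (R0 - C)(L0 - C) / C should equal K_d = k_off / k_on.
Kd = k_off / k_on                  # 1e-8 M
```

Matching this macroscopic equilibrium and relaxation against the microscopic, agent-based rates is the kind of quantitative mapping the study performs, with the agent-based side additionally resolving dimensionality, morphology and valency effects that the ODE collapses into effective k_on and k_off.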
Di Tullio, Maurizio; Maccallini, Cristina; Ammazzalorso, Alessandra; Giampietro, Letizia; Amoroso, Rosa; De Filippis, Barbara; Fantacuzzi, Marialuigia; Wiczling, Paweł; Kaliszan, Roman
2012-07-01
A series of 27 analogues of clofibric acid, mostly heteroarylalkanoic derivatives, was analyzed by a novel high-throughput reversed-phase HPLC method employing a combined gradient of eluent pH and organic modifier content. The hydrophobicity (lipophilicity) parameters, log kw, and acidity constants, pKa, determined in this way were subjected to multiple regression analysis to obtain a QSRR (Quantitative Structure-Retention Relationships) and a QSPR (Quantitative Structure-Property Relationships) equation, respectively, describing these pharmacokinetics-determining physicochemical parameters in terms of structural descriptors derived from computational chemistry. The previously determined in vitro log EC50 values - transactivation activity towards PPARα (human Peroxisome Proliferator-Activated Receptor α) - have also been described by a QSAR (Quantitative Structure-Activity Relationships) equation in terms of the 3D-MoRSE descriptors (3D-Molecule Representation of Structures based on Electron diffraction). The derived QSAR model can serve for a priori prediction of the in vitro bioactivity of any designed analogue, whereas the QSRR and QSPR models can be used to evaluate lipophilicity and acidity, respectively, of the compounds, and hence to rationally guide the selection of structures with proper pharmacokinetics. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
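A QSRR or QSPR fit of the kind described above reduces to multiple linear regression of a measured property (e.g. log kw) on structural descriptors. A minimal sketch with synthetic data; the function names are illustrative and the authors' actual descriptor set is not reproduced here.

```python
import numpy as np

def fit_qsrr(descriptors, log_kw):
    """Ordinary least-squares QSRR fit: log kw ≈ b0 + X·b.

    descriptors: (n_compounds, n_descriptors) array.
    Returns [b0, b1, ..., bk].
    """
    X = np.column_stack([np.ones(len(log_kw)), descriptors])
    coef, *_ = np.linalg.lstsq(X, log_kw, rcond=None)
    return coef

def predict_qsrr(coef, descriptors):
    """Predict log kw for new compounds from a fitted coefficient vector."""
    X = np.column_stack([np.ones(len(descriptors)), descriptors])
    return X @ coef
```

The same machinery fits the pKa (QSPR) and log EC50 (QSAR) equations, with different descriptor matrices on the right-hand side.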
Goertz, Ruediger S; Klett, Daniel; Wildner, Dane; Atreya, Raja; Neurath, Markus F; Strobel, Deike
2018-01-01
Background Microvascularization of the bowel wall can be visualized and quantified non-invasively by software-assisted analysis of derived time-intensity curves. Purpose To perform software-based quantification of bowel wall perfusion using quantitative contrast-enhanced ultrasound (CEUS) according to clinical response in patients with inflammatory bowel disease treated with vedolizumab. Material and Methods In a prospective study, in 18 out of 34 patients (11 with Crohn's disease and seven with ulcerative colitis), high-frequency ultrasound of bowel wall thickness using color Doppler flow combined with CEUS was performed at baseline and after 14 weeks of treatment with vedolizumab. Clinical activity scores at week 14 were used to differentiate between responders and non-responders. CEUS parameters were calculated by software analysis of the video loops. Results Nine of 18 patients showed response to treatment with vedolizumab. Overall, the responder group showed a significant decrease in the semi-quantitative color Doppler vascularization score. Amplitude-derived CEUS parameters of mural microvascularization such as peak enhancement or wash-in rate decreased in responders, in contrast with non-responders. Time-derived parameters remained stable or increased during treatment in all patients. Conclusion Analysis of bowel microvascularization by CEUS shows statistically significant changes in the wash-in rate related to response to vedolizumab therapy.
Harn, Nicholas R; Hunt, Suzanne L; Hill, Jacqueline; Vidoni, Eric; Perry, Mark; Burns, Jeffrey M
2017-08-01
Establishing reliable methods for interpreting elevated cerebral amyloid-β plaque on PET scans is increasingly important for radiologists, as availability of PET imaging in clinical practice increases. We examined a 3-step method to detect plaque in cognitively normal older adults, focusing on the additive value of quantitative information during the PET scan interpretation process. Fifty-five 18F-florbetapir PET scans were evaluated by 3 experienced raters. Scans were first visually interpreted as having "elevated" or "nonelevated" plaque burden ("Visual Read"). Images were then processed using standardized quantitative analysis software (MIMneuro) to generate whole brain and region of interest SUV ratios. This "Quantitative Read" was considered elevated if at least 2 of 6 regions of interest had an SUV ratio of more than 1.1. The final interpretation combined both visual and quantitative data together ("VisQ Read"). Cohen kappa values were assessed as a measure of interpretation agreement. Plaque was elevated in 25.5% to 29.1% of the 165 total Visual Reads. Interrater agreement was strong (kappa = 0.73-0.82) and consistent with reported values. Quantitative Reads were elevated in 45.5% of participants. Final VisQ Reads changed from initial Visual Reads in 16 interpretations (9.7%), with most changing from "nonelevated" Visual Reads to "elevated." These changed interpretations demonstrated lower plaque quantification than those initially read as "elevated" that remained unchanged. Interrater variability improved for VisQ Reads with the addition of quantitative information (kappa = 0.88-0.96). Inclusion of quantitative information increases consistency of PET scan interpretations for early detection of cerebral amyloid-β plaque accumulation.
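The "Quantitative Read" decision rule described above (elevated if at least 2 of 6 regions of interest have an SUV ratio above 1.1) is simple to express directly. A sketch with hypothetical region names; the thresholds are the ones stated in the abstract.

```python
def quantitative_read(suvr_by_roi, threshold=1.1, min_regions=2):
    """Classify a scan as 'elevated' or 'nonelevated' from per-ROI SUV ratios.

    suvr_by_roi: dict mapping region-of-interest name -> SUV ratio.
    Elevated when at least `min_regions` ROIs exceed `threshold`.
    """
    n_above = sum(1 for v in suvr_by_roi.values() if v > threshold)
    return "elevated" if n_above >= min_regions else "nonelevated"
```

In the study this quantitative call was combined with the visual interpretation to form the final "VisQ Read".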
Integrating animal movement with habitat suitability for estimating dynamic landscape connectivity
van Toor, Mariëlle L.; Kranstauber, Bart; Newman, Scott H.; Prosser, Diann J.; Takekawa, John Y.; Technitis, Georgios; Weibel, Robert; Wikelski, Martin; Safi, Kamran
2018-01-01
Context High-resolution animal movement data are becoming increasingly available, yet having a multitude of empirical trajectories alone does not allow us to easily predict animal movement. To answer ecological and evolutionary questions at a population level, quantitative estimates of a species’ potential to link patches or populations are of importance. Objectives We introduce an approach that combines movement-informed simulated trajectories with an environment-informed estimate of the trajectories’ plausibility to derive connectivity. Using the example of bar-headed geese we estimated migratory connectivity at a landscape level throughout the annual cycle in their native range. Methods We used tracking data of bar-headed geese to develop a multi-state movement model and to estimate temporally explicit habitat suitability within the species’ range. We simulated migratory movements between range fragments, and calculated a measure we called route viability. The results are compared to expectations derived from published literature. Results Simulated migrations matched empirical trajectories in key characteristics such as stopover duration. The viability of the simulated trajectories was similar to that of the empirical trajectories. We found that, overall, the migratory connectivity was higher within the breeding than in wintering areas, corroborating previous findings for this species. Conclusions We show how empirical tracking data and environmental information can be fused for meaningful predictions of animal movements throughout the year and even outside the spatial range of the available data. Beyond predicting migratory connectivity, our framework will prove useful for modelling ecological processes facilitated by animal movement, such as seed dispersal or disease ecology.
Three-dimensional diffuse optical mammography with ultrasound localization in a human subject
NASA Astrophysics Data System (ADS)
Holboke, Monica J.; Tromberg, Bruce J.; Li, Xingde; Shah, Natasha; Fishkin, Joshua B.; Kidney, D.; Butler, J.; Chance, Britton; Yodh, Arjun G.
2000-04-01
We describe an approach that combines clinical ultrasound and photon migration techniques to enhance the sensitivity and information content of diffuse optical tomography. Measurements were performed on a postmenopausal woman with a single 1.8 × 0.9 cm malignant ductal carcinoma in situ approximately 7.4 mm beneath the skin surface (UCI IRB protocol 95-563). The ultrasound-derived information about tumor geometry enabled us to segment the breast tissue into tumor and background regions. Optical data were obtained with a multifrequency, multiwavelength hand-held frequency-domain photon migration backscattering probe. The optical properties of the tumor and background were then computed using the ultrasound-derived geometrical constraints. An iterative perturbative approach, using parallel processing, provided quantitative information about scattering and absorption simultaneously with the ability to incorporate and resolve complex boundary conditions and geometries. A three- to fourfold increase in the tumor absorption coefficient and a nearly 50% reduction in the scattering coefficient relative to background were observed (λ = 674, 782, 803, and 849 nm). Calculations of the mean physiological parameters reveal fourfold greater tumor total hemoglobin concentration [Hbtot] than normal breast (67 μM vs 16 μM) and tumor hemoglobin oxygen saturation (SOx) values of 63% (vs 73% and 68% in the region surrounding the tumor and the opposite normal tissue, respectively). Comparison of semi-infinite to heterogeneous models shows superior tumor/background contrast for the latter in both absorption and scattering. Sensitivity studies assessing the impact of tumor size and refractive index assumptions, as well as scan direction, demonstrate modest effects on recovered properties.
NASA Astrophysics Data System (ADS)
Chen, Shichao; Zhu, Yizheng
2017-02-01
Sensitivity is a critical index of the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, a major category of on-axis interferometry techniques in quantitative phase imaging. Based on these derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
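The scaling behind such bounds can be summarized in a standard textbook form. For a shot-noise-limited interferometric phase measurement with unit fringe visibility and $N$ detected photoelectrons in total, the Cramér-Rao bound on the phase estimate and the corresponding optical pathlength sensitivity are (this is the generic two-beam result, not the authors' specific derivation for wavelength shifting interferometry):

$$\sigma_\phi \;\ge\; \frac{1}{\sqrt{N}}, \qquad \sigma_L \;=\; \frac{\lambda}{2\pi}\,\sigma_\phi \;\ge\; \frac{\lambda}{2\pi\sqrt{N}}$$

For example, at $\lambda = 633$ nm with $N = 10^{6}$ photoelectrons, $\sigma_L \gtrsim 0.1$ nm, which is why well-designed phase imaging systems reach sub-nanometer pathlength sensitivity.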
De Medeiros, R C G; Soares, J D; De Sousa, F B
2012-05-01
Lesion area measurement of enamel caries using polarized light microscopy (PLM) is currently performed in a large number of studies, but measurements are based mainly on a misleading qualitative interpretation of enamel birefringence in a single immersion medium. Here, five natural enamel caries lesions are analysed by microradiography and PLM, and the differences in their histopathological features derived from a qualitative versus a quantitative interpretation of enamel birefringence are described. Enamel birefringence in different immersion media (air, water and quinoline) is interpreted by both qualitative and quantitative approaches, the former leading to an underestimation of the depth of enamel caries, mainly when the criterion of validating sound enamel as a negatively birefringent area under immersion in water is used (currently a common practice in dental research). Procedures to avoid the shortcomings of a qualitative interpretation of enamel birefringence are presented and discussed. © 2012 The Authors. Journal of Microscopy © 2012 Royal Microscopical Society.
Silady, Rebecca A; Effgen, Sigi; Koornneef, Maarten; Reymond, Matthieu
2011-01-01
A Quantitative Trait Locus (QTL) analysis was performed using two novel Recombinant Inbred Line (RIL) populations, derived from the progeny between two Arabidopsis thaliana genotypes collected at the same site in Kyoto (Japan) crossed with the reference laboratory strain Landsberg erecta (Ler). We used these two RIL populations to determine the genetic basis of seed dormancy and flowering time, which are assumed to be the main traits controlling life history variation in Arabidopsis. The analysis revealed quantitative variation for seed dormancy that is associated with allelic variation at the seed dormancy QTL DOG1 (for Delay Of Germination 1) in one population and at DOG6 in both. These DOG QTL have been previously identified using mapping populations derived from accessions collected at different sites around the world. Genetic variation within a population may enhance its ability to respond accurately to variation within and between seasons. In contrast, variation for flowering time, which also segregated within each mapping population, is mainly governed by the same QTL.
McCord, Layne K; Scarfe, William C; Naylor, Rachel H; Scheetz, James P; Silveira, Anibal; Gillespie, Kevin R
2007-05-01
The objectives of this study were to compare the effect of JPEG 2000 compression of hand-wrist radiographs on observers' qualitative assessment of image quality and to compare this with a software-derived quantitative image quality index. Fifteen hand-wrist radiographs were digitized and saved as TIFF and JPEG 2000 images at 4 levels of compression (20:1, 40:1, 60:1, and 80:1). The images, including rereads, were viewed by 13 orthodontic residents who rated image quality on a scale of 1 to 5. A quantitative analysis was also performed by using readily available software based on the human visual system (Image Quality Measure Computer Program, version 6.2, Mitre, Bedford, Mass). ANOVA was used to determine the optimal compression level (P ≤ .05). When we compared subjective indexes, JPEG compression greater than 60:1 significantly reduced image quality. When we used quantitative indexes, the JPEG 2000 images had lower quality at all compression ratios compared with the original TIFF images. There was excellent correlation (R² > 0.92) between qualitative and quantitative indexes. Image Quality Measure indexes are more sensitive than subjective image quality assessments in quantifying image degradation with compression. There is potential for this software-based quantitative method in determining the optimal compression ratio for any image without the use of subjective raters.
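The Image Quality Measure program cited above is a specific MITRE tool; as a generic stand-in, a simple quantitative degradation index such as peak signal-to-noise ratio (PSNR) illustrates the idea of scoring a compressed image against its original without human raters. This sketch is not the metric used in the study.

```python
import math

def psnr(original, compressed, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equally sized images,
    given as flat sequences of pixel values. Higher is better; identical
    images score infinity."""
    if len(original) != len(compressed):
        raise ValueError("images must have the same size")
    mse = sum((a - b) ** 2 for a, b in zip(original, compressed)) / len(original)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(max_val ** 2 / mse)
```

Ranking compression levels by such an index, and correlating the ranking with observer scores, mirrors the qualitative-versus-quantitative comparison the study performs.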
The potential of satellite data to study individual wildfire events
NASA Astrophysics Data System (ADS)
Benali, Akli; López-Saldana, Gerardo; Russo, Ana; Sá, Ana C. L.; Pinto, Renata M. S.; Nikos, Koutsias; Owen, Price; Pereira, Jose M. C.
2014-05-01
Large wildfires have important social, economic and environmental impacts. In order to minimize their impacts, understand their main drivers and study their dynamics, different approaches have been used. The reconstruction of individual wildfire events is usually done by collecting field data, conducting interviews and implementing fire spread simulations. All these methods have clear limitations in terms of spatial and temporal coverage, accuracy, subjectivity of the collected information and lack of objective independent validation information. In this sense, remote sensing is a promising tool with the potential to provide relevant information for stakeholders and the research community, by complementing or filling gaps in existing information and providing independent, accurate quantitative information. In this work we show the potential of satellite data to provide relevant information regarding the dynamics of individual large wildfire events, filling an important gap in wildfire research. We show how MODIS active-fire data, acquired up to four times per day, and satellite-derived burnt perimeters can be combined to extract relevant information about wildfire events, by describing the methods involved and presenting results for four regions of the world: Portugal, Greece, SE Australia and California. The information that can be retrieved encompasses the start and end date of a wildfire event and its ignition area. We evaluate the retrieved information by comparing the satellite-derived parameters with national databases, highlighting the strengths and weaknesses of both and showing how the former can complement the latter, leading to more complete and accurate datasets. We also show how the spatio-temporal distribution of wildfire spread dynamics can be reconstructed using satellite-derived active fires and how relevant descriptors can be extracted.
Applying graph theory to satellite active-fire data, we define the major fire spread paths, which yield information about the major spatial corridors through which fires spread and their relative importance in the full fire event. These major fire paths are then used to extract relevant descriptors, such as the distribution of fire spread direction, rate of spread and fire intensity (i.e. energy emitted). The reconstruction of fire spread is shown for several case studies in Portugal and is also compared with fire progressions obtained by air-borne sensors for SE Australia. The approach shows solid results, providing a valuable tool for reconstructing individual fire events and understanding their complex spread patterns and the main drivers of fire propagation. The major fire paths and the spatio-temporal distribution of active fires are currently being combined with fire spread simulations within the scope of the FIRE-MODSAT project, to provide useful information to support and improve fire suppression strategies.
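The graph-based idea above can be caricatured as linking each active-fire detection to an earlier neighbour and reading a rate of spread off each link. This is a deliberately simplified sketch (nearest-earlier-neighbour linking with hypothetical parameters), not the authors' full graph-theoretic method.

```python
import math

def fire_spread_links(detections, max_hours=24.0):
    """Link each active-fire detection to its nearest earlier neighbour
    within a time window, and derive a rate of spread for each link.

    detections: iterable of (t_hours, x_km, y_km) tuples, any order.
    Returns a list of (parent_xy, child_xy, rate_km_per_h) tuples.
    """
    dets = sorted(detections)  # chronological order
    links = []
    for i, (t, x, y) in enumerate(dets):
        # candidate parents: strictly earlier detections within the window
        parents = [(pt, px, py) for pt, px, py in dets[:i]
                   if 0 < t - pt <= max_hours]
        if not parents:
            continue  # likely an ignition point
        pt, px, py = min(parents,
                         key=lambda p: math.hypot(x - p[1], y - p[2]))
        dist = math.hypot(x - px, y - py)
        links.append(((px, py), (x, y), dist / (t - pt)))
    return links
```

Aggregating the link directions and rates over a whole event gives descriptors of the kind the abstract mentions (spread direction distribution, rate of spread along major corridors).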
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-19
... panel will be employed to collect this information, which serves the need for direct and quantitative measurement of our target population, and which, as a quantitative research tool has some major benefits: To...
Kellman, Philip J; Mnookin, Jennifer L; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E
2014-01-01
Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. 
The results indicate the plausibility of using objective image metrics to predict expert performance and subjective assessment of difficulty in fingerprint comparisons.
28 CFR 17.26 - Derivative classification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Derivative classification. 17.26 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.26 Derivative classification. (a) Persons need not possess original classification authority to derivatively classify information based on source...
Biswal, Ajaya K; Tan, Li; Atmodjo, Melani A; DeMartini, Jaclyn; Gelineo-Albersheim, Ivana; Hunt, Kimberly; Black, Ian M; Mohanty, Sushree S; Ryno, David; Wyman, Charles E; Mohnen, Debra
2017-01-01
The effective use of plant biomass for biofuel and bioproduct production requires a comprehensive glycosyl residue composition analysis to understand the different cell wall polysaccharides present in the different biomass sources. Here we compared four methods side-by-side for their ability to measure the neutral and acidic sugar composition of cell walls from herbaceous, grass, and woody model plants and bioenergy feedstocks. Arabidopsis, Populus, rice, and switchgrass leaf cell walls, as well as cell walls from Populus wood, rice stems, and switchgrass tillers, were analyzed by (1) gas chromatography-mass spectrometry (GC-MS) of alditol acetates combined with a total uronic acid assay; (2) carbodiimide reduction of uronic acids followed by GC-MS of alditol acetates; (3) GC-MS of trimethylsilyl (TMS) derivatives; and (4) high-pressure anion-exchange chromatography (HPAEC). All four methods gave comparable abundance rankings of the seven neutral sugars, and three of the methods were able to quantify unique acidic sugars. The TMS, HPAEC, and carbodiimide methods provided comparable quantitative results for the specific neutral and acidic sugar content of the biomass, with the TMS method providing slightly greater yields of specific acidic sugars and high total sugar yields. The alditol acetate method, while providing comparable information on the major neutral sugars, did not provide the requisite quantitative information on the specific acidic sugars in plant biomass, and is thus the least informative of the four methods. This work provides a side-by-side comparison of the efficacy of four different established glycosyl residue composition analysis methods in the analysis of cell walls from both dicot (Arabidopsis and Populus) and grass (rice and switchgrass) species.
Both primary wall-enriched leaf tissues and secondary wall-enriched wood/stem tissues were analyzed for mol% and mass yield of the non-cellulosic sugars. The TMS, HPAEC, and carbodiimide methods were shown to provide comparable quantitative data on the nine neutral and acidic sugars present in all plant cell walls.
A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime.
Fitterer, Jessica L; Nelson, Trisalyn A
2015-01-01
Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers, we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime, to aid method selection and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the predominant modelling choice (n = 78), though many variations existed depending on the data. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks).
Strong ion calculator--a practical bedside application of modern quantitative acid-base physiology.
Lloyd, P
2004-12-01
To review acid-base balance by considering the physical effects of ions in solution, and to describe the use of a calculator that derives the strong ion difference, Atot and the strong ion gap. A review of articles reporting on the use of the strong ion difference and Atot in the interpretation of acid-base balance. Tremendous progress has been made in the last decade in our understanding of acid-base physiology. We now have a quantitative understanding of the mechanisms underlying the acidity of an aqueous solution. We can now predict the acidity given information about the concentration of the various ion-forming species within it. We can predict changes in acid-base status caused by disturbance of these factors, and finally, we can detect unmeasured anions with greater sensitivity than was previously possible with the anion gap, using either arterial or venous blood sampling. Acid-base interpretation has ceased to be an intuitive and arcane art. Much of it is now an exact computation that can be automated and incorporated into an online hospital laboratory information system. All diseases and all therapies can affect a patient's acid-base status only through the final common pathway of one or more of the three independent factors. With Constable's equations we can now accurately predict the acidity of plasma. When there is a discrepancy between the observed and predicted acidity, we can deduce the net concentration of unmeasured ions needed to account for the difference.
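The core of such a calculator is straightforward arithmetic on measured electrolytes. A simplified bedside-style sketch: the apparent strong ion difference is the charge-weighted sum of strong cations minus strong anions, and the strong ion gap compares it with an effective SID built from bicarbonate plus the commonly cited Figge approximations for albumin and phosphate charge. The function names are illustrative, the coefficients are the standard published approximations, and this is not the specific calculator the article describes.

```python
def apparent_sid(na, k, ca_ionized, mg, cl, lactate):
    """Apparent strong ion difference (mEq/L).

    Inputs in mmol/L; divalent cations (ionized Ca, Mg) contribute
    two charge equivalents each.
    """
    return na + k + 2 * ca_ionized + 2 * mg - cl - lactate

def strong_ion_gap(sid_apparent, hco3, albumin_g_l, phosphate_mmol, ph):
    """SIG = apparent SID - effective SID (Figge-style approximations).

    Effective SID = HCO3- plus the pH-dependent negative charge carried
    by albumin (g/L) and phosphate (mmol/L).
    """
    alb_charge = albumin_g_l * (0.123 * ph - 0.631)
    phos_charge = phosphate_mmol * (0.309 * ph - 0.469)
    return sid_apparent - (hco3 + alb_charge + phos_charge)
```

A positive strong ion gap flags unmeasured anions with greater sensitivity than the classical anion gap, which is the clinical point of the abstract.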
Kβ Mainline X-ray Emission Spectroscopy as an Experimental Probe of Metal–Ligand Covalency
2015-01-01
The mainline feature in metal Kβ X-ray emission spectroscopy (XES) has long been recognized as an experimental marker for the spin state of the metal center. However, even within a series of metal compounds with the same nominal oxidation and spin state, significant changes are observed that cannot be explained on the basis of overall spin. In this work, the origin of these effects is explored, both experimentally and theoretically, in order to develop the chemical information content of Kβ mainline XES. Ligand field expressions are derived that describe the behavior of Kβ mainlines for first row transition metals with any dⁿ count, allowing for a detailed analysis of the factors governing mainline shape. Further, due to limitations associated with existing computational approaches, we have developed a new methodology for calculating Kβ mainlines using restricted active space configuration interaction (RAS-CI) calculations. This approach eliminates the need for empirical parameters and provides a powerful tool for investigating the effects that chemical environment exerts on the mainline spectra. On the basis of a detailed analysis of the intermediate and final states involved in these transitions, we confirm the known sensitivity of Kβ mainlines to metal spin state via the 3p-3d exchange coupling. Further, a quantitative relationship between the splitting of the Kβ mainline features and the metal-ligand covalency is established. Thus, this study furthers the quantitative electronic structural information that can be extracted from Kβ mainline spectroscopy. PMID:24914450
NASA Astrophysics Data System (ADS)
Prat, O. P.; Nelson, B. R.; Stevens, S. E.; Nickl, E.; Seo, D. J.; Kim, B.; Zhang, J.; Qi, Y.
2015-12-01
The processing of radar-only precipitation via the reanalysis from the National Mosaic and Multi-Sensor Quantitative (NMQ/Q2) based on the WSR-88D Next-generation Radar (Nexrad) network over the Continental United States (CONUS) is completed for the period from 2002 to 2011. While this constitutes a unique opportunity to study precipitation processes at higher resolution than conventionally possible (1-km, 5-min), the long-term radar-only product needs to be merged with in-situ information in order to be suitable for hydrological, meteorological and climatological applications. The radar-gauge merging is performed by using rain gauge information at daily (Global Historical Climatology Network-Daily: GHCN-D), hourly (Hydrometeorological Automated Data System: HADS), and 5-min (Automated Surface Observing Systems: ASOS; Climate Reference Network: CRN) resolution. The challenges related to incorporating networks of differing resolution and quality to generate long-term, large-scale gridded estimates of precipitation are considerable. To that end, we implement techniques for merging the rain gauge datasets and the radar-only estimates, such as Inverse Distance Weighting (IDW), Simple Kriging (SK), Ordinary Kriging (OK), and Conditional Bias-Penalized Kriging (CBPK). An evaluation of the different radar-gauge merging techniques is presented, and we provide an estimate of uncertainty for the gridded estimates. In addition, comparisons with a suite of lower resolution QPEs derived from ground-based radar measurements (Stage IV) are provided in order to give a detailed picture of the improvements and remaining challenges.
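The simplest of the four merging techniques listed, Inverse Distance Weighting, can be sketched in a few lines. This is an illustrative sketch only, not the NMQ/Q2 implementation; the function name, the power parameter, and the array layout are assumptions:

```python
import numpy as np

def idw(gauge_xy, gauge_vals, grid_xy, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: estimate precipitation at grid points
    from scattered gauge observations; weights decay as 1/d**power."""
    gauge_xy = np.asarray(gauge_xy, float)
    grid_xy = np.asarray(grid_xy, float)
    # Pairwise distances: (n_grid, n_gauge)
    d = np.linalg.norm(grid_xy[:, None, :] - gauge_xy[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power          # eps avoids division by zero
    return (w @ np.asarray(gauge_vals, float)) / w.sum(axis=1)
```

A grid point midway between two gauges receives their average; a grid point at a gauge location reproduces that gauge's value almost exactly.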
Cai, Congbo; Chen, Zhong; van Zijl, Peter C.M.
2017-01-01
The reconstruction of MR quantitative susceptibility mapping (QSM) from local phase measurements is an ill-posed inverse problem, and different regularization strategies incorporating a priori information extracted from magnitude and phase images have been proposed. However, the anatomy observed in magnitude and phase images does not always coincide spatially with that in susceptibility maps, which can lead to erroneous estimates in the reconstructed susceptibility map. In this paper, we develop a structural feature based collaborative reconstruction (SFCR) method for QSM that includes both magnitude- and susceptibility-based information. The SFCR algorithm is composed of two consecutive steps corresponding to complementary reconstruction models, each with a structural feature based l1 norm constraint and a voxel fidelity based l2 norm constraint, which allows both structure edges and tiny features to be recovered while noise and artifacts are reduced. In the M-step, the initial susceptibility map is reconstructed by employing a k-space based compressed sensing model incorporating the magnitude prior. In the S-step, the susceptibility map is fitted in the spatial domain using weighted constraints derived from the initial susceptibility map from the M-step. Simulations and in vivo human experiments at 7T MRI show that the SFCR method provides high quality susceptibility maps with improved RMSE and MSSIM. Finally, the susceptibility values of deep gray matter are analyzed in multiple head positions, with the supine position closest to the gold standard COSMOS result. PMID:27019480
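l1-norm constraints of the kind used in both SFCR steps are typically enforced through a proximal (soft-thresholding) operation inside an iterative solver. A minimal sketch of that operator, illustrative only (the paper's actual solver, weighting, and fidelity terms are not reproduced here):

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of lam * ||x||_1: shrink each coefficient
    toward zero by lam, zeroing anything with magnitude below lam."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
```

Large coefficients (e.g. structure edges) survive with reduced magnitude, while small coefficients (e.g. noise) are set exactly to zero, which is the mechanism by which l1 regularization suppresses artifacts while retaining edges.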
Towards flash-flood prediction in the dry Dead Sea region utilizing radar rainfall information
NASA Astrophysics Data System (ADS)
Morin, Efrat; Jacoby, Yael; Navon, Shilo; Bet-Halachmi, Erez
2009-07-01
Flash-flood warning models can save lives and protect various kinds of infrastructure. In dry climate regions, rainfall is highly variable and can be of high intensity. Since rain gauge networks in such areas are sparse, rainfall information derived from weather radar systems can provide useful input for flash-flood models. This paper presents a flash-flood warning model which utilizes radar rainfall data and applies it to two catchments that drain into the dry Dead Sea region. Radar-based quantitative precipitation estimates (QPEs) were derived using a rain gauge adjustment approach, either on a daily basis (allowing the adjustment factor to change over time, assuming available real-time gauge data) or using a constant factor value (derived from rain gauge data) over the entire period of the analysis. The QPEs served as input for a continuous hydrological model that represents the main hydrological processes in the region, namely infiltration, flow routing and transmission losses. The infiltration function is applied in a distributed mode while the routing and transmission loss functions are applied in a lumped mode. Model parameters were found by calibration based on 5 years of data for one of the catchments. Validation was performed for a subsequent 5-year period for the same catchment and then for an entire 10-year record for the second catchment. The probability of detection and false alarm rates for the validation cases were reasonable. Probabilistic flash-flood prediction is presented applying Monte Carlo simulations with an uncertainty range for the QPEs and model parameters. With low probability thresholds, one can maintain more than 70% detection with no more than 30% false alarms. The study demonstrates that a flash-flood warning model is feasible for catchments in the area studied.
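The detection and false-alarm figures quoted above follow from a standard warning contingency table. A small sketch under the usual definitions (POD = hits/(hits + misses), false-alarm ratio = false alarms/(hits + false alarms)); the counts below are illustrative, not the paper's data:

```python
def pod_far(hits, misses, false_alarms):
    """Probability of detection and false-alarm ratio from a
    warning-vs-observation contingency table."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far
```

For example, 7 detected floods, 3 missed floods, and 3 false warnings give POD = 0.7 and FAR = 0.3, matching the "more than 70% detection with no more than 30% false alarms" operating point.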
Towards flash flood prediction in the dry Dead Sea region utilizing radar rainfall information
NASA Astrophysics Data System (ADS)
Morin, E.; Jacoby, Y.; Navon, S.; Bet-Halachmi, E.
2009-04-01
Flash-flood warning models can save lives and protect various kinds of infrastructure. In dry climate regions, rainfall is highly variable and can be of high-intensity. Since rain gauge networks in such areas are sparse, rainfall information derived from weather radar systems can provide useful input for flash-flood models. This paper presents a flash-flood warning model utilizing radar rainfall data and applies it to two catchments that drain into the dry Dead Sea region. Radar-based quantitative precipitation estimates (QPEs) were derived using a rain gauge adjustment approach, either on a daily basis (allowing the adjustment factor to change over time, assuming available real-time gauge data) or using a constant factor value (derived from rain gauge data) over the entire period of the analysis. The QPEs served as input for a continuous hydrological model that represents the main hydrological processes in the region, namely infiltration, flow routing and transmission losses. The infiltration function is applied in a distributed mode while the routing and transmission loss functions are applied in a lumped mode. Model parameters were found by calibration based on five years of data for one of the catchments. Validation was performed for a subsequent five-year period for the same catchment and then for an entire ten year record for the second catchment. The probability of detection and false alarm rates for the validation cases were reasonable. Probabilistic flash-flood prediction is presented applying Monte Carlo simulations with an uncertainty range for the QPEs and model parameters. With low probability thresholds, one can maintain more than 70% detection with no more than 30% false alarms. The study demonstrates that a flash-flood-warning model is feasible for catchments in the area studied.
USDA-ARS?s Scientific Manuscript database
High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiological amino acids in selected fruits and vegetables. This method was found to be particularly useful because the dabsyl derivatives of glutamine and citrulline were sufficiently se...
Hahn, Rosane Christine; Hamdan, Júnia Soares
2000-01-01
Yeast cells of five different strains of Paracoccidioides brasiliensis were obtained for partial analysis of lipid composition, and sterol content was determined quantitatively and qualitatively. The determinations were conducted with cells cultured in the presence and absence of amphotericin B and azole derivatives at levels below the MIC. PMID:10858371
NASA Astrophysics Data System (ADS)
Sunardi, O.
2017-12-01
Medium-sized food manufacturing enterprises in Indonesia are significant in a number of contexts, in terms of their contribution to national production (GDP) and to employment. The manufacturing sector contributes the largest share of GDP (85%), and within it the food manufacturing subsector contributes the most. Nevertheless, these enterprises face common problems: the quality of their human capital and sustainability issues. Previous government supplementary programs were established to expand human capital capability among medium enterprises, and adequate funds were apportioned to develop human capital; nevertheless, the sustainability of medium enterprises is still in question. This study proposes and examines the role of human capital from an informal knowledge sharing perspective. Using a qualitative approach based on interviews with four informants in Indonesian medium-sized food manufacturing enterprises, a set of hypotheses is derived for future quantitative study. This study indicates that human capital traits (diverse educational backgrounds, employee skills, and employee experience) can leverage the practice of informal knowledge sharing. Constructs such as mutual trust and reciprocal intention may act as mediating variables, and a cultural interpretation perspective may act as a moderating factor in informal knowledge sharing effectiveness. Finally, informal knowledge sharing is indicated to act as a moderating variable for human capital policy and practice in support of enterprise sustainability.
Ulgen, Ayse; Han, Zhihua; Li, Wentian
2003-12-31
We address the question of whether statistical correlations among quantitative traits lead to correlation of linkage results of these traits. Five measured quantitative traits (total cholesterol, fasting glucose, HDL cholesterol, blood pressure, and triglycerides), and one derived quantitative trait (total cholesterol divided by the HDL cholesterol) are used for phenotype correlation studies. Four of them are used for linkage analysis. We show that although correlation among phenotypes partially reflects the correlation among linkage analysis results, the LOD-score correlations are on average low. The most significant peaks found by using different traits do not often overlap. Studying covariances at specific locations in LOD scores may provide clues for further bivariate linkage analyses.
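The phenotype-correlation step described above amounts to a Pearson correlation matrix computed across individuals. A minimal sketch (the function name and data layout are illustrative; rows are individuals, columns are traits such as total cholesterol, glucose, HDL, blood pressure, and triglycerides):

```python
import numpy as np

def trait_correlations(traits):
    """Pairwise Pearson correlations among quantitative traits.
    `traits` is an (individuals x traits) array; returns the
    (traits x traits) correlation matrix."""
    return np.corrcoef(np.asarray(traits, float), rowvar=False)
```

The same matrix can be computed on per-marker LOD scores instead of raw phenotypes to compare phenotype correlation with linkage-result correlation, which is the comparison the study makes.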
Lightning charge moment changes estimated by high speed photometric observations from ISS
NASA Astrophysics Data System (ADS)
Hobara, Y.; Kono, S.; Suzuki, K.; Sato, M.; Takahashi, Y.; Adachi, T.; Ushio, T.; Suzuki, M.
2017-12-01
Optical observations by CCD cameras on orbiting satellites are generally used to derive the spatio-temporal global distributions of cloud-to-ground (CG) and intracloud (IC) lightning. However, electrical properties of lightning such as peak current and lightning charge are difficult to obtain from space. In particular, CGs with considerably large lightning charge moment changes (CMCs) and peak currents are the crucial parameters for generating red sprites and elves, respectively, so it would be useful to obtain these parameters from space. In this paper, we obtained lightning optical signatures using high-speed photometric observations from the International Space Station GLIMS (Global Lightning and Sprite MeasurementS, JEM-EF) mission. These optical signatures were compared quantitatively with radio signatures, regarded as ground truth, derived from ELF electromagnetic wave observations on the ground, in order to verify the accuracy of the optically derived values. High correlation (R > 0.9) was obtained between lightning optical irradiance and current moment, and a quantitative relational expression between these two parameters was derived. A rather high correlation (R > 0.7) was also obtained between the integrated irradiance and the lightning CMC. Our results indicate the possibility of deriving lightning electrical properties (current moment and CMC) from optical measurements in space. Moreover, we hope that these results will also contribute to the forthcoming French microsatellite mission TARANIS.
Zhong, Xuefei; Hao, Ling; Lu, Jianfeng; Ye, Hui; Zhang, Su-Chun; Li, Lingjun
2016-04-01
A CE-ESI-MRM-based assay was developed for targeted analysis of serotonin released by human embryonic stem cell-derived serotonergic neurons in a chemically defined environment. A discontinuous electrolyte system was optimized for pH-mediated online stacking of serotonin. Combined with a liquid-liquid extraction procedure, the LOD of serotonin in Krebs-Ringer's solution by CE-ESI-MS/MS on a 3D ion trap MS was 0.15 ng/mL. The quantitative results confirmed the serotonergic identity of the in vitro developed neurons and the capacity of these neurons to release serotonin in response to stimulus. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Saavedra, Laura M; Romanelli, Gustavo P; Rozo, Ciro E; Duchowicz, Pablo R
2018-01-01
The insecticidal activity of a series of 62 plant derived molecules against the chikungunya, dengue and zika vector, the Aedes aegypti (Diptera:Culicidae) mosquito, is subjected to a Quantitative Structure-Activity Relationships (QSAR) analysis. The Replacement Method (RM) variable subset selection technique based on Multivariable Linear Regression (MLR) proves to be successful for exploring 4885 molecular descriptors calculated with Dragon 6. The predictive capability of the obtained models is confirmed through an external test set of compounds, Leave-One-Out (LOO) cross-validation and Y-Randomization. The present study constitutes a first necessary computational step for designing less toxic insecticides. Copyright © 2017 Elsevier B.V. All rights reserved.
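The Leave-One-Out cross-validation used to confirm predictivity can be sketched for a plain least-squares MLR model. This is illustrative only and does not reproduce the Replacement Method descriptor selection; Q² = 1 − PRESS/SS_tot is the standard LOO statistic:

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated Q^2 for an MLR model fitted by
    ordinary least squares. X: (n_samples, n_descriptors), y: (n_samples,)."""
    Xb = np.column_stack([np.ones(len(y)), X])   # add intercept column
    preds = np.empty(len(y))
    for i in range(len(y)):
        mask = np.arange(len(y)) != i            # leave sample i out
        beta, *_ = np.linalg.lstsq(Xb[mask], y[mask], rcond=None)
        preds[i] = Xb[i] @ beta                  # predict the held-out sample
    press = np.sum((y - preds) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - press / ss_tot
```

A Q² near 1 indicates that each held-out compound is well predicted by a model trained on the rest; QSAR practice usually also demands external test-set validation and Y-randomization, as the abstract notes.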
A Bayesian spatial model for neuroimaging data based on biologically informed basis functions.
Huertas, Ismael; Oldehinkel, Marianne; van Oort, Erik S B; Garcia-Solis, David; Mir, Pablo; Beckmann, Christian F; Marquand, Andre F
2017-11-01
The dominant approach to neuroimaging data analysis employs the voxel as the unit of computation. While convenient, voxels lack biological meaning and their size is arbitrarily determined by the resolution of the image. Here, we propose a multivariate spatial model in which neuroimaging data are characterised as a linearly weighted combination of multiscale basis functions which map onto underlying brain nuclei or networks. In this model, the elementary building blocks are derived to reflect the functional anatomy of the brain during the resting state. This model is estimated using a Bayesian framework which accurately quantifies uncertainty and automatically finds the most accurate and parsimonious combination of basis functions describing the data. We demonstrate the utility of this framework by predicting quantitative SPECT images of striatal dopamine function, and we compare a variety of basis sets including generic isotropic functions, anatomical representations of the striatum derived from structural MRI, and two different soft functional parcellations of the striatum derived from resting-state fMRI (rfMRI). We found that a combination of ∼50 multiscale functional basis functions accurately represented the striatal dopamine activity, and that functional basis functions derived from an advanced parcellation technique known as Instantaneous Connectivity Parcellation (ICP) provided the most parsimonious models of dopamine function. Importantly, functional basis functions derived from resting fMRI were more accurate than both structural and generic basis sets in representing dopamine function in the striatum for a fixed model order. We demonstrate the translational validity of our framework by constructing classification models for discriminating parkinsonian disorders and their subtypes. Here, we show that the ICP approach is the only basis set that performs well across all comparisons and performs better overall than the classical voxel-based approach.
This spatial model constitutes an elegant alternative to voxel-based approaches in neuroimaging studies; not only are their atoms biologically informed, they are also adaptive to high resolutions, represent high dimensions efficiently, and capture long-range spatial dependencies, which are important and challenging objectives for neuroimaging data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Optical Ptychographic Microscope for Quantitative Bio-Mechanical Imaging
NASA Astrophysics Data System (ADS)
Anthony, Nicholas; Cadenazzi, Guido; Nugent, Keith; Abbey, Brian
The role that mechanical forces play in biological processes such as cell movement and death is becoming of significant interest as we further develop our understanding of the inner workings of cells. The most common method used to obtain stress information is photoelasticity, which maps a sample's birefringence, or its direction-dependent refractive indices, using polarized light. However, this method only provides qualitative data, and for stress information to be useful, quantitative data are required. Ptychography is a method for quantitatively determining the phase of a sample's complex transmission function. The technique relies upon the collection of multiple overlapping coherent diffraction patterns from laterally displaced points on the sample. The overlap of measurement points provides complementary information that significantly aids in the reconstruction of the complex wavefield exiting the sample and allows for quantitative imaging of weakly interacting specimens. Here we describe recent advances at La Trobe University Melbourne on achieving quantitative birefringence mapping using polarized light ptychography, with applications in cell mechanics. Australian Synchrotron, ARC Centre of Excellence for Advanced Molecular Imaging.
Looking at cell mechanics with atomic force microscopy: experiment and theory.
Benitez, Rafael; Toca-Herrera, José L
2014-11-01
This review reports on the use of atomic force microscopy in the investigation of the mechanical properties of cells. It is shown that the technique is able to deliver information about the cell surface properties (e.g., topography), the Young's modulus, the viscosity, and the cell relaxation times. Another aspect that this short review points out is the utilization of the atomic force microscope to investigate basic questions related to materials physics, biology, and medicine. The review is written in a chronological way to offer the reader an overview of phenomenological facts and quantitative results. The final section discusses in detail the advantages and disadvantages of the Hertz and JKR models. A new implementation of the JKR model derived by Dufresne is presented. © 2014 Wiley Periodicals, Inc.
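The Hertz model discussed in the final section gives, for a spherical tip of radius R indenting an elastic half-space by depth δ, F = (4/3) · E/(1 − ν²) · √R · δ^{3/2}. A minimal sketch (SI units; the parameter values in the usage note are illustrative, not from the review):

```python
import numpy as np

def hertz_force(delta_m, E_pa, nu, R_m):
    """Hertz contact force (N) for a spherical indenter on a flat
    elastic sample: F = (4/3) * E/(1-nu^2) * sqrt(R) * delta^(3/2).
    delta_m: indentation depth (m), E_pa: Young's modulus (Pa),
    nu: Poisson ratio, R_m: tip radius (m)."""
    e_eff = E_pa / (1.0 - nu ** 2)
    return (4.0 / 3.0) * e_eff * np.sqrt(R_m) * delta_m ** 1.5
```

Fitting this F(δ) relation to an AFM force-indentation curve yields the Young's modulus; the JKR model adds an adhesion term that the Hertz form neglects.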
Self-organized global control of carbon emissions
NASA Astrophysics Data System (ADS)
Zhao, Zhenyuan; Fenn, Daniel J.; Hui, Pak Ming; Johnson, Neil F.
2010-09-01
There is much disagreement concerning how best to control global carbon emissions. We explore quantitatively how different control schemes affect the collective emission dynamics of a population of emitting entities. We uncover a complex trade-off which arises between average emissions (affecting the global climate), peak pollution levels (affecting citizens’ everyday health), industrial efficiency (affecting the nation’s economy), frequency of institutional intervention (affecting governmental costs), common information (affecting trading behavior) and market volatility (affecting financial stability). Our findings predict that a self-organized free-market approach at the level of a sector, state, country or continent can provide better control than a top-down regulated scheme in terms of market volatility and monthly pollution peaks. The control of volatility also has important implications for any future derivative carbon emissions market.
Ionospheric chemistry. [minor neutrals and ionized constituents of thermosphere
NASA Technical Reports Server (NTRS)
Torr, D. G.
1979-01-01
This report deals primarily with progress in the chemistry of minor neutrals and ionized constituents of the thermosphere. Significant progress was made over the last few years in quantitative studies of many chemical processes. This success was primarily due to the advent of multiparameter multisatellite programs which permitted accurate simultaneous measurements to be made of many important parameters. In many cases studies of chemical reactions were made with laboratory-like precision. Rate coefficients have been derived as functions of temperature for a number of important reactions. New information has been acquired on nearly every major process which occurs in the thermosphere, including the recombination rates of all major molecular ions, charge transfer reactions, ion atom interchange reactions, and reactions of neutral and ionized metastable atoms and molecules.
Uncertainty Analysis for Angle Calibrations Using Circle Closure
Estler, W. Tyler
1998-01-01
We analyze two types of full-circle angle calibrations: a simple closure in which a single set of unknown angular segments is sequentially compared with an unknown reference angle, and a dual closure in which two divided circles are simultaneously calibrated by intercomparison. In each case, the constraint of circle closure provides auxiliary information that (1) enables a complete calibration process without reference to separately calibrated reference artifacts, and (2) serves to reduce measurement uncertainty. We derive closed-form expressions for the combined standard uncertainties of angle calibrations, following guidelines published by the International Organization for Standardization (ISO) and NIST. The analysis includes methods for the quantitative evaluation of the standard uncertainty of small angle measurement using electronic autocollimators, including the effects of calibration uncertainty and air turbulence. PMID:28009359
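The closure constraint, that the unknown segments of a full circle must sum to exactly 360°, allows self-calibration without external references by distributing the misclosure over the segments. A minimal sketch of the simple-closure idea, assuming equal-weight distribution of the error (the paper's full uncertainty treatment, including autocollimator and turbulence terms, is not reproduced):

```python
import numpy as np

def closure_adjust(measured_deg):
    """Adjust measured angular segments so they satisfy circle closure
    (sum = 360 deg) by spreading the misclosure equally."""
    measured = np.asarray(measured_deg, float)
    misclosure = measured.sum() - 360.0
    return measured - misclosure / measured.size
```

Because the closure condition is exact, the adjustment also reduces the standard uncertainty of each calibrated segment relative to the raw comparisons.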
A primer on thermodynamic-based models for deciphering transcriptional regulatory logic.
Dresch, Jacqueline M; Richards, Megan; Ay, Ahmet
2013-09-01
A rigorous analysis of transcriptional regulation at the DNA level is crucial to the understanding of many biological systems. Mathematical modeling has offered researchers a new approach to understanding this central process. In particular, thermodynamic-based modeling represents the most biophysically informed approach aimed at connecting DNA level regulatory sequences to the expression of specific genes. The goal of this review is to give biologists a thorough description of the steps involved in building, analyzing, and implementing a thermodynamic-based model of transcriptional regulation. The data requirements for this modeling approach are described, the derivation for a specific regulatory region is shown, and the challenges and future directions for the quantitative modeling of gene regulation are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
Hötzel, Fabian; Seino, Kaori; Huck, Christian; Skibbe, Olaf; Bechstedt, Friedhelm; Pucci, Annemarie
2015-06-10
The metal-atom chains on the Si(111) - 5 × 2 - Au surface represent an exceedingly interesting system for the understanding of one-dimensional electrical interconnects. While other metal-atom chain structures on silicon suffer from metal-to-insulator transitions, Si(111) - 5 × 2 - Au stays metallic at least down to 20 K as we have proven by the anisotropic absorption from localized plasmon polaritons in the infrared. A quantitative analysis of the infrared plasmonic signal done here for the first time yields valuable band structure information in agreement with the theoretically derived data. The experimental and theoretical results are consistently explained in the framework of the atomic geometry, electronic structure, and IR spectra of the recent Kwon-Kang model.
The Integral Method, a new approach to quantify bactericidal activity.
Gottardi, Waldemar; Pfleiderer, Jörg; Nagl, Markus
2015-08-01
The bactericidal activity (BA) of antimicrobial agents is generally derived from the results of killing assays. A reliable quantitative characterization and particularly a comparison of these substances, however, are impossible with this information. We here propose a new method that takes into account the course of the complete killing curve for assaying BA and that allows a clear-cut quantitative comparison of antimicrobial agents with only one number. The new Integral Method, based on the reciprocal area below the killing curve, reliably calculates an average BA [log10 CFU/min] and, by implementation of the agent's concentration C, the average specific bactericidal activity SBA=BA/C [log10 CFU/min/mM]. Based on experimental killing data, the pertaining BA and SBA values of exemplary active halogen compounds were established, allowing quantitative assertions. N-chlorotaurine (NCT), chloramine T (CAT), monochloramine (NH2Cl), and iodine (I2) showed extremely diverging SBA values of 0.0020±0.0005, 1.11±0.15, 3.49±0.22, and 291±137log10 CFU/min/mM, respectively, against Staphylococcus aureus. This immediately demonstrates an approximately 550-fold stronger activity of CAT, 1730-fold of NH2Cl, and 150,000-fold of I2 compared to NCT. The inferred quantitative assertions and conclusions prove the new method suitable for characterizing bactericidal activity. Its application comprises the effect of defined agents on various bacteria, the consequence of temperature shifts, the influence of varying drug structure, dose-effect relationships, ranking of isosteric agents, comparison of competing commercial antimicrobial formulations, and the effect of additives. Copyright © 2015 Elsevier B.V. All rights reserved.
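The reciprocal-area idea can be sketched numerically. Caveat: the exact normalization below is an assumption, not taken from the paper. With A the trapezoid-rule area under the killing curve log10 CFU(t), the form BA = lg0²/(2A) is one plausible reading, chosen because for a straight-line kill from lg0 to 0 over time T it reduces to the familiar slope lg0/T:

```python
import numpy as np

def integral_method_ba(t_min, log_cfu, conc_mM=None):
    """Average bactericidal activity from the whole killing curve via
    the reciprocal of the area A under log10 CFU(t).
    BA = lg0^2 / (2A) is an assumed normalization (see lead-in);
    SBA = BA / C gives the specific activity per mM of agent."""
    t = np.asarray(t_min, float)
    lg = np.asarray(log_cfu, float)
    area = np.sum((lg[1:] + lg[:-1]) / 2.0 * np.diff(t))   # trapezoid rule
    ba = lg[0] ** 2 / (2.0 * area)                         # log10 CFU/min
    sba = ba / conc_mM if conc_mM is not None else None    # log10 CFU/min/mM
    return ba, sba
```

Unlike an endpoint slope, the area term weights the entire course of the killing curve, which is the point of the Integral Method.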
Critical appraisal of emergency medicine education research: the best publications of 2012.
Lin, Michelle; Fisher, Jonathan; Coates, Wendy C; Farrell, Susan E; Shayne, Philip; Maggio, Lauren; Kuhn, Gloria
2014-03-01
The objective was to critically appraise and highlight medical education research published in 2012 that was methodologically superior and whose outcomes were pertinent to teaching and education in emergency medicine (EM). A search of the English language literature in 2012 querying Education Resources Information Center (ERIC), PsycINFO, PubMed, and Scopus identified EM studies using hypothesis-testing or observational investigations of educational interventions. Two reviewers independently screened all of the publications and removed articles using established exclusion criteria. This year, publications limited to a single-site survey design that measured satisfaction or self-assessment on unvalidated instruments were not formally reviewed. Six reviewers then independently ranked all remaining publications using one of two scoring systems depending on whether the study methodology was primarily qualitative or quantitative. Each scoring system had nine criteria, including four related to methodology, that were chosen a priori, to standardize evaluation by reviewers. The quantitative study scoring system was used previously to appraise medical education published annually in 2008 through 2011, while a separate, new qualitative study scoring system was derived and implemented consisting of parallel metrics. Forty-eight medical education research papers met the a priori criteria for inclusion, and 33 (30 quantitative and three qualitative studies) were reviewed. Seven quantitative and two qualitative studies met the criteria for inclusion as exemplary and are summarized in this article. This critical appraisal series aims to promote superior education research by reviewing and highlighting nine of the 48 major education research studies with relevance to EM published in 2012. Current trends and common methodologic pitfalls in the 2012 papers are noted. © 2014 by the Society for Academic Emergency Medicine.
Guo, Hailin; Ding, Wanwen; Chen, Jingbo; Chen, Xuan; Zheng, Yiqi; Wang, Zhiyong; Liu, Jianxiu
2014-01-01
Zoysiagrass (Zoysia Willd.) is an important warm season turfgrass that is grown in many parts of the world. Salt tolerance is an important trait in zoysiagrass breeding programs. In this study, a genetic linkage map was constructed using sequence-related amplified polymorphism markers and random amplified polymorphic DNA markers based on an F1 population comprising 120 progeny derived from a cross between Zoysia japonica Z105 (salt-tolerant accession) and Z061 (salt-sensitive accession). The linkage map covered 1211 cM with an average marker distance of 5.0 cM and contained 24 linkage groups with 242 marker loci (217 sequence-related amplified polymorphism markers and 25 random amplified polymorphic DNA markers). Quantitative trait loci affecting the salt tolerance of zoysiagrass were identified using the constructed genetic linkage map. Two significant quantitative trait loci (qLF-1 and qLF-2) for leaf firing percentage were detected: qLF-1 at 36.3 cM on linkage group LG4, with a logarithm of odds value of 3.27, which explained 13.1% of the total variation of leaf firing, and qLF-2 at 42.3 cM on LG5, with a logarithm of odds value of 2.88, which explained 29.7% of the total variation of leaf firing. A significant quantitative trait locus (qSCW-1) for reduced percentage of dry shoot clipping weight was detected at 44.1 cM on LG5 with a logarithm of odds value of 4.0, which explained 65.6% of the total variation. This study provides important information for further functional analysis of salt-tolerance genes in zoysiagrass. Molecular markers linked with quantitative trait loci for salt tolerance will be useful in zoysiagrass breeding programs using marker-assisted selection.
Sender–receiver systems and applying information theory for quantitative synthetic biology
Barcena Menendez, Diego; Senthivel, Vivek Raj; Isalan, Mark
2015-01-01
Sender–receiver (S–R) systems abound in biology, with communication systems sending information in various forms. Information theory provides a quantitative basis for analysing these processes and is being applied to study natural genetic, enzymatic and neural networks. Recent advances in synthetic biology are providing us with a wealth of artificial S–R systems, giving us quantitative control over networks with a finite number of well-characterised components. Combining the two approaches can help to predict how to maximise signalling robustness, and will allow us to make increasingly complex biological computers. Ultimately, pushing the boundaries of synthetic biology will require moving beyond engineering the flow of information and towards building more sophisticated circuits that interpret biological meaning. PMID:25282688
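The core information-theoretic quantity for an S–R system is the mutual information between sent and received signals. A minimal sketch for a discrete channel, assuming the joint distribution p(s, r) has been estimated from the characterized components (the joint tables in the usage note are illustrative):

```python
import numpy as np

def mutual_information(joint):
    """I(S;R) in bits from a joint probability table p(s, r),
    rows indexed by sender symbol s, columns by receiver symbol r."""
    joint = np.asarray(joint, dtype=float)
    ps = joint.sum(axis=1, keepdims=True)   # marginal p(s)
    pr = joint.sum(axis=0, keepdims=True)   # marginal p(r)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (ps * pr))
    return np.nansum(terms)                 # 0*log(0) terms contribute 0
```

A noiseless binary channel carries 1 bit per symbol; an S–R pair whose output is independent of the input carries 0 bits, so maximizing this quantity over component choices is one way to "maximise signalling robustness".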
Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zhe; Wu, Chaochao; Xie, Fang
Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high-throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Additionally, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. In conclusion, peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.
Braid Entropy of Two-Dimensional Turbulence
NASA Astrophysics Data System (ADS)
Francois, Nicolas; Xia, Hua; Punzmann, Horst; Faber, Benjamin; Shats, Michael
2015-12-01
The evolving shape of material fluid lines in a flow underlies the quantitative prediction of dissipation and material transport in many industrial and natural processes. However, collecting quantitative data on these dynamics remains an experimental challenge, particularly in turbulent flows. Indeed, the deformation of a fluid line, induced by its successive stretching and folding, can be difficult to determine because such a description ultimately relies on often inaccessible multi-particle information. Here we report laboratory measurements in two-dimensional turbulence that offer an alternative topological viewpoint on this issue. This approach characterizes the dynamics of a braid of Lagrangian trajectories through a global measure of their entanglement. The topological length of material fluid lines can be derived from these braids. This length is found to grow exponentially with time, giving access to the braid topological entropy. The entropy increases as the square root of the turbulent kinetic energy and is directly related to the single-particle dispersion coefficient. At long times, the probability distribution of the braid entropy is positively skewed and shows strong exponential tails. Our results suggest that the braid entropy may serve as a measure of the irreversibility of turbulence based on minimal principles and sparse Lagrangian data. PMID:26689261
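The exponential growth of the topological length is what gives access to the entropy: the braid topological entropy is the growth rate of log L(t). A minimal sketch of that estimation step, using synthetic line-length data with an assumed growth rate of 0.8 rather than lengths derived from measured Lagrangian trajectories:

```python
import numpy as np

# Synthetic material-line lengths L(t) ~ L0 * exp(h * t); in the experiment
# L(t) would come from the braid of trajectories. h_true = 0.8 is an
# assumed value for illustration, not a measured one.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
h_true = 0.8
log_length = h_true * t + rng.normal(0.0, 0.05, t.size)  # log L0 = 0

# The braid topological entropy is the slope of log L(t) versus t.
h_est, intercept = np.polyfit(t, log_length, 1)
print(f"estimated entropy: {h_est:.3f}")
```

A linear fit on the log of the length is all that is needed once the braid has been reduced to a length time series; the measurement noise added above only perturbs the slope slightly.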
Cerebral capillary velocimetry based on temporal OCT speckle contrast.
Choi, Woo June; Li, Yuandong; Qin, Wan; Wang, Ruikang K
2016-12-01
We propose a new optical coherence tomography (OCT) based method to measure red blood cell (RBC) velocities in single capillaries in the cortex of the rodent brain. This OCT capillary velocimetry exploits quantitative laser speckle contrast analysis to estimate the speckle decorrelation rate from the measured temporal OCT speckle signals, which is related to microcirculatory flow velocity. We hypothesize that the OCT signal due to sub-surface capillary flow can be treated as a speckle signal in the single-scattering regime, and thus the time scale of its speckle fluctuations can be subjected to single-scattering laser speckle contrast analysis to derive a characteristic decorrelation time. To validate this hypothesis, OCT measurements are conducted on a single-capillary flow phantom operating at preset velocities, in which M-mode B-frames are acquired using a high-speed OCT system. Analysis is then performed on the time-varying OCT signals extracted at the capillary flow, exhibiting a typical inverse relationship between the estimated decorrelation time and absolute RBC velocity, which is then used to deduce the capillary velocities. We apply the method to in vivo measurements of mouse brain, demonstrating that the proposed approach provides additional useful information in the quantitative assessment of capillary hemodynamics, complementary to that of OCT angiography.
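The core computation is extracting a decorrelation time from a temporal speckle signal and reading velocity off its inverse. A minimal sketch (the AR(1) surrogate signal, sampling interval, and 1/e threshold are illustrative assumptions, not the paper's processing chain):

```python
import numpy as np

def decorrelation_time(signal, dt):
    """Lag at which the normalized autocovariance first drops below 1/e."""
    s = signal - signal.mean()
    acf = np.correlate(s, s, mode="full")[s.size - 1:]  # non-negative lags
    acf /= acf[0]                                       # normalize zero lag
    below = np.nonzero(acf < 1.0 / np.e)[0]
    return below[0] * dt if below.size else np.inf

# Surrogate temporal speckle with an exponential correlation (tau = 5 ms);
# real data would be the OCT signal at a capillary across M-mode B-frames.
rng = np.random.default_rng(1)
dt, tau, n = 1e-4, 5e-3, 10000
alpha = np.exp(-dt / tau)
noise = rng.normal(size=n)
sig = np.empty(n)
sig[0] = noise[0]
for i in range(1, n):  # AR(1) process with correlation time tau
    sig[i] = alpha * sig[i - 1] + np.sqrt(1 - alpha**2) * noise[i]

tau_est = decorrelation_time(sig, dt)
# Velocity is then inferred from the inverse relationship v ~ 1/tau_est.
print(f"estimated decorrelation time: {tau_est * 1e3:.1f} ms")
```

Faster flow shortens the decorrelation time, so once a phantom calibration fixes the proportionality constant, 1/tau maps onto absolute RBC velocity.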
Gong, Kuang; Yang, Jaewon; Kim, Kyungsang; El Fakhri, Georges; Seo, Youngho; Li, Quanzheng
2018-05-23
Positron Emission Tomography (PET) is a functional imaging modality widely used in neuroscience studies. To obtain meaningful quantitative results from PET images, attenuation correction is necessary during image reconstruction. For PET/MR hybrid systems, PET attenuation correction is challenging because Magnetic Resonance (MR) images do not reflect attenuation coefficients directly. To address this issue, we present deep neural network methods to derive the continuous attenuation coefficients for brain PET imaging from MR images. With only Dixon MR images as the network input, the existing U-net structure was adopted, and analysis using forty patient data sets shows it is superior to other Dixon-based methods. When both Dixon and zero echo time (ZTE) images are available, we have proposed a modified U-net structure, named GroupU-net, to efficiently make use of both Dixon and ZTE information through group convolution modules as the network goes deeper. Quantitative analysis based on fourteen real patient data sets demonstrates that both network approaches perform better than the standard methods, and the proposed network structure can further reduce the PET quantification error compared to the U-net structure. © 2018 Institute of Physics and Engineering in Medicine.
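The group convolution idea behind GroupU-net, separate filter banks for separate channel groups (e.g. Dixon-derived channels in one group, ZTE in another), can be sketched without a deep-learning framework. This is an illustrative NumPy implementation of a grouped 2D convolution, not the paper's network, and the channel assignment is a hypothetical example:

```python
import numpy as np

def grouped_conv2d(x, weights, groups):
    """Naive grouped 2D convolution (no padding, stride 1).
    x: (C_in, H, W); weights: (C_out, C_in // groups, kH, kW)."""
    c_in, h, w = x.shape
    c_out, cg, kh, kw = weights.shape
    assert c_in % groups == 0 and c_out % groups == 0 and cg == c_in // groups
    out = np.zeros((c_out, h - kh + 1, w - kw + 1))
    for o in range(c_out):
        g = o // (c_out // groups)       # which group this filter belongs to
        xs = x[g * cg:(g + 1) * cg]      # only that group's input channels
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[o, i, j] = np.sum(xs[:, i:i + kh, j:j + kw] * weights[o])
    return out

# Two groups: say, 2 Dixon channels and 2 ZTE channels, each processed by
# an independent filter bank (the grouping here is purely illustrative).
x = np.random.default_rng(2).normal(size=(4, 8, 8))
w = np.random.default_rng(3).normal(size=(4, 2, 3, 3))
y = grouped_conv2d(x, w, groups=2)
print(y.shape)   # (4, 6, 6)
```

The key property is that each output group sees only its own input group, so information from the two modalities is kept in separate streams within the module.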
NASA Astrophysics Data System (ADS)
Devès, Guillaume; Cohen-Bouhacina, Touria; Ortega, Richard
2004-10-01
We used the nuclear microprobe techniques micro-PIXE (particle-induced X-ray emission), micro-RBS (Rutherford backscattering spectrometry) and scanning transmission ion microscopy (STIM) to characterize the trace element content and spatial distribution within biological samples (dehydrated cultured cells, tissues). The normalization of PIXE results is usually expressed in terms of sample dry mass as determined by micro-RBS recorded simultaneously with micro-PIXE. However, the main limitation of RBS mass measurement is the sample mass loss occurring during irradiation, which can be up to 30% of the initial sample mass. We present here a new methodology for PIXE normalization and quantitative analysis of trace elements within biological samples, based on dry mass measurement performed by means of STIM. The validation of STIM cell mass measurements was obtained by comparison with AFM sample thickness measurements. Results indicated the reliability of STIM mass measurement performed on biological samples and suggested that STIM should be used for PIXE normalization. Further information can be obtained from the direct comparison of AFM and STIM analyses, such as in situ measurement of cell specific gravity within cell compartments (nucleolus and cytoplasm).
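The normalization step itself amounts to dividing the calibrated PIXE elemental signal by the STIM-derived dry mass. A minimal sketch with entirely hypothetical numbers (the counts, counts-to-mass calibration factor and cell dry mass below are illustrative, not values from the study):

```python
def pixe_concentration(element_counts, calibration_factor, dry_mass_ug):
    """Elemental mass fraction (ug/g) from PIXE counts normalized to the
    STIM-derived dry mass. calibration_factor (ng per count) is a
    hypothetical, setup-specific value."""
    element_ng = element_counts * calibration_factor
    return element_ng * 1e3 / dry_mass_ug   # ng/ug -> ug/g

# Illustrative only: 1.2e4 Fe counts, 1e-4 ng/count, 2.5 ug cell dry mass.
print(round(pixe_concentration(1.2e4, 1e-4, 2.5), 1))   # 480.0 ug/g
```

Because the STIM mass is measured without the beam-induced mass loss that affects RBS, the denominator, and hence the reported concentration, is more stable against irradiation damage.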
Henshall, John M; Dierens, Leanne; Sellars, Melony J
2014-09-02
While much attention has focused on the development of high-density single nucleotide polymorphism (SNP) assays, the costs of developing and running low-density assays have fallen dramatically. This makes it feasible to develop and apply SNP assays for agricultural species beyond the major livestock species. Although low-cost low-density assays may not have the accuracy of the high-density assays widely used in human and livestock species, we show that when combined with statistical analysis approaches that use quantitative instead of discrete genotypes, their utility may be improved. The data used in this study are from a 63-SNP marker Sequenom® iPLEX Platinum panel for the Black Tiger shrimp, for which high-density SNP assays are not currently available. For quantitative genotypes that could be estimated, in 5% of cases the most likely genotype for an individual at a SNP had a probability of less than 0.99. Matrix formulations of maximum likelihood equations for parentage assignment were developed for the quantitative genotypes and also for discrete genotypes perturbed by an assumed error term. Assignment rates that were based on maximum likelihood with quantitative genotypes were similar to those based on maximum likelihood with perturbed genotypes but, for more than 50% of cases, the two methods resulted in individuals being assigned to different families. Treating genotypes as quantitative values allows the same analysis framework to be used for pooled samples of DNA from multiple individuals. Resulting correlations between allele frequency estimates from pooled DNA and individual samples were consistently greater than 0.90, and as high as 0.97 for some pools. Estimates of family contributions to the pools based on quantitative genotypes in pooled DNA had a correlation of 0.85 with estimates of contributions from DNA-derived pedigree. 
Even with low numbers of SNPs of variable quality, parentage testing and family assignment from pooled samples are sufficiently accurate to provide useful information for a breeding program. Treating genotypes as quantitative values is an alternative to perturbing genotypes using an assumed error distribution, but can produce very different results. An understanding of the distribution of the error is required for SNP genotyping platforms.
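Treating genotypes as quantitative values means carrying each individual's genotype probabilities into the analysis rather than a single called genotype. A minimal sketch of deriving expected allele dosages and an allele-frequency estimate from such probabilities (the probability values are illustrative, not data from the 63-SNP shrimp panel):

```python
import numpy as np

# Per-individual probabilities over reference-allele dosages {0, 1, 2}
# at one SNP; illustrative values only.
geno_probs = np.array([
    [0.98, 0.02, 0.00],
    [0.05, 0.90, 0.05],
    [0.00, 0.10, 0.90],
    [0.70, 0.25, 0.05],
])

dosage = geno_probs @ np.array([0.0, 1.0, 2.0])  # expected allele count
allele_freq = dosage.mean() / 2.0                # diploid frequency estimate

# Discrete calling rounds each individual to its most likely genotype,
# discarding the uncertainty that the quantitative treatment retains.
called = geno_probs.argmax(axis=1)
print(dosage, round(allele_freq, 3), called)
```

The same dosage-based estimator applies directly to pooled DNA, where only an aggregate allele frequency is observable, which is why the quantitative framework covers both individual and pooled samples.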