A quantitative witness for Greenberger-Horne-Zeilinger entanglement
Eltschka, Christopher; Siewert, Jens
2012-01-01
Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger–type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties. PMID:23267431
Exploration of Action Figure Appeals Using Evaluation Grid Method and Quantification Theory Type I
ERIC Educational Resources Information Center
Chang, Hua-Cheng; Chen, Hung-Yuan
2017-01-01
Contemporary toys are characterized by accelerating social, cultural and technological change. An attractive action figure can grab consumers' attention, influence latent consumer preferences and evoke pleasure. However, traditional action figure design depends largely on the designer's opinion, subjective experience and preference. It…
Electrical detection and quantification of single and mixed DNA nucleotides in suspension
NASA Astrophysics Data System (ADS)
Ahmad, Mahmoud Al; Panicker, Neena G.; Rizvi, Tahir A.; Mustafa, Farah
2016-09-01
High-speed sequential identification of the building blocks of DNA (deoxyribonucleotides, or nucleotides for short) without labeling or processing in long reads of DNA is the need of the hour. This can be accomplished by exploiting their unique electrical properties. In this study, the four different types of nucleotides that constitute a DNA molecule were suspended in a buffer, followed by several types of electrical measurements. These electrical parameters were then used to quantify the suspended DNA nucleotides. Thus, we present a purely electrical counting scheme based on semiconductor theory that allows one to determine the number of nucleotides in a solution by measuring their capacitance-voltage dependency. The nucleotide count was observed to be similar to the product of the corresponding dopant concentration and the Debye volume after de-embedding the buffer contribution. The presented approach allows for fast and label-free quantification of single and mixed nucleotides in a solution.
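A minimal sketch of the counting relation described above (count ≈ dopant-like concentration × Debye volume) is given below. The Debye-length expression is the standard semiconductor/electrolyte form and is an assumption here; all numerical values are illustrative, not taken from the study.

    import math

    k_B = 1.380649e-23       # J/K
    q = 1.602176634e-19      # C
    eps0 = 8.8541878128e-12  # F/m
    eps_r = 80.0             # relative permittivity of the aqueous buffer (assumed)
    T = 300.0                # K
    N = 1.0e21               # m^-3, concentration extracted from C-V analysis (hypothetical)

    # Debye length and Debye volume (spherical screening volume)
    debye_length = math.sqrt(eps_r * eps0 * k_B * T / (q**2 * N))   # m
    debye_volume = (4.0 / 3.0) * math.pi * debye_length**3          # m^3

    # Estimated number of nucleotides per Debye volume
    count = N * debye_volume
    print(f"Debye length = {debye_length*1e9:.1f} nm, count per Debye volume = {count:.2e}")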
Game Theory and Uncertainty Quantification for Cyber Defense Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna
Cyber-system defenders face the challenging task of protecting critical assets and information continually against multiple types of malicious attackers. Defenders typically operate within resource constraints while attackers operate at relatively low costs. As a result, design and development of resilient cyber-systems that can support mission goals under attack while accounting for the dynamics between attackers and defenders is an important research problem.
Information theoretic quantification of diagnostic uncertainty.
Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T
2012-01-01
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
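The two quantitative steps discussed above can be illustrated with a short numerical sketch: Bayes' rule gives the post-test probability, and Shannon entropy quantifies the diagnostic uncertainty before and after the test. All numbers below are hypothetical.

    import math

    def post_test_probability(pretest, sensitivity, specificity, positive=True):
        """Bayes' rule for a binary test result."""
        if positive:
            num = sensitivity * pretest
            den = num + (1.0 - specificity) * (1.0 - pretest)
        else:
            num = (1.0 - sensitivity) * pretest
            den = num + specificity * (1.0 - pretest)
        return num / den

    def entropy_bits(p):
        """Shannon entropy (bits) of a binary disease-present/absent distribution."""
        if p in (0.0, 1.0):
            return 0.0
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    # Hypothetical case: pre-test probability 0.30, sensitivity 0.90, specificity 0.80
    pre = 0.30
    post = post_test_probability(pre, 0.90, 0.80, positive=True)
    print(f"post-test probability = {post:.2f}")
    print(f"uncertainty before = {entropy_bits(pre):.2f} bits, after = {entropy_bits(post):.2f} bits")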
Interpreting quantum coherence through a quantum measurement process
NASA Astrophysics Data System (ADS)
Yao, Yao; Dong, G. H.; Xiao, Xing; Li, Mo; Sun, C. P.
2017-11-01
Recently, there has been a renewed interest in the quantification of coherence or other coherencelike concepts within the framework of quantum resource theory. However, rigorously defined or not, the notion of coherence or decoherence has already been used by the community for decades since the advent of quantum theory. Intuitively, the definitions of coherence and decoherence should be two sides of the same coin. Therefore, a natural question is raised: How can the conventional decoherence processes, such as the von Neumann-Lüders (projective) measurement postulation or partially dephasing channels, fit into the bigger picture of the recently established theoretical framework? Here we show that the state collapse rules of the von Neumann or Lüders-type measurements, as special cases of genuinely incoherent operations (GIOs), are consistent with the resource theories of quantum coherence. New hierarchical measures of coherence are proposed for the Lüders-type measurement and their relationship with measurement-dependent discord is addressed. Moreover, utilizing the fixed-point theory for C* algebra, we prove that GIOs indeed represent a particular type of partially dephasing (phase-damping) channels which have a matrix representation based on the Schur product. By virtue of the Stinespring dilation theorem, the physical realizations of incoherent operations are investigated in detail and we find that GIOs in fact constitute the core of strictly incoherent operations and generally incoherent operations and the unspeakable notion of coherence induced by GIOs can be transferred to the theories of speakable coherence by the corresponding permutation or relabeling operators.
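The Schur-product (matrix) representation of partially dephasing channels mentioned in the abstract can be stated compactly; the notation below is assumed here rather than quoted from the paper. In the reference (incoherent) basis,

\[
\Lambda(\rho) = \rho \circ B, \qquad B \succeq 0, \quad B_{ii} = 1 \ \ \forall i,
\]
\[
\langle i|\Lambda(\rho)|j\rangle = B_{ij}\,\langle i|\rho|j\rangle ,
\]

so that populations are left untouched while off-diagonal coherences are damped by the factors $B_{ij}$ of a positive semidefinite matrix with unit diagonal.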
Cherif, Alhaji; Barley, Kamal
2010-01-01
Quantification of historical sociological processes has recently gained attention among theoreticians in the effort to provide a solid theoretical understanding of the behaviors and regularities present in socio-political dynamics. Here we present a reliability theory of polity processes with emphasis on the individual political dynamics of African countries. We found that the structural properties of polity failure rates successfully capture the risk of political vulnerability and instability, in which certain fractions of the countries with monotonically increasing, unimodal, U-shaped and monotonically decreasing polity failure rates, respectively, have high levels of state fragility indices. The quasi-U-shaped relationship between average polity duration and regime types corroborates historical precedents and explains the stability of autocracies and democracies. PMID:21206911
Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin
2015-01-01
Background Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. Objectives In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. Method We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. Findings We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Conclusions Self-quantification in personal health maintenance appears promising and exciting. However, more studies are needed to support its use in this field. The proposed model will in the future lead to developing a measure for assessing the effectiveness of interventions to support using SQS for health self-management (e.g., assessing the complexity of self-quantification activities, and activation of the individuals). PMID:26019809
MOTIVATION INTERNALIZATION AND SIMPLEX STRUCTURE IN SELF-DETERMINATION THEORY.
Ünlü, Ali; Dettweiler, Ulrich
2015-12-01
Self-determination theory, as proposed by Deci and Ryan, postulated different types of motivation regulation. As to the introjected and identified regulation of extrinsic motivation, their internalizations were described as "somewhat external" and "somewhat internal" and remained undetermined in the theory. This paper introduces a constrained regression analysis that allows these vaguely expressed motivations to be estimated in an "optimal" manner, in any given empirical context. The approach was even generalized and applied for simplex structure analysis in self-determination theory. The technique was exemplified with an empirical study comparing science teaching in a classical school class versus an expeditionary outdoor program. Based on a sample of 84 German pupils (43 girls, 41 boys, 10 to 12 years old), data were collected using the German version of the Academic Self-Regulation Questionnaire. The science-teaching format was seen to not influence the pupils' internalization of identified regulation. The internalization of introjected regulation differed and shifted more toward the external pole in the outdoor teaching format. The quantification approach supported the simplex structure of self-determination theory, whereas correlations may disconfirm the simplex structure.
39 CFR 3050.1 - Definitions applicable to this part.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., mathematical, or statistical theory, precept, or assumption applied by the Postal Service in producing a..., or statistical theory, precept, or assumption. A change in quantification technique should not change...
39 CFR 3050.1 - Definitions applicable to this part.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., mathematical, or statistical theory, precept, or assumption applied by the Postal Service in producing a..., or statistical theory, precept, or assumption. A change in quantification technique should not change...
Kelava, Augustin; Muma, Michael; Deja, Marlene; Dagdagan, Jack Y.; Zoubir, Abdelhak M.
2015-01-01
Emotion eliciting situations are accompanied by changes of multiple variables associated with subjective, physiological and behavioral responses. The quantification of the overall simultaneous synchrony of psychophysiological reactions plays a major role in emotion theories and has received increased attention in recent years. From a psychometric perspective, the reactions represent multivariate non-stationary intra-individual time series. In this paper, a new time-frequency based latent variable approach for the quantification of the synchrony of the responses is presented. The approach is applied to empirical data, collected during an emotion eliciting situation. The results are compared with a complementary inter-individual approach of Hsieh et al. (2011). Finally, the proposed approach is discussed in the context of emotion theories, and possible future applications and limitations are provided. PMID:25653624
Traditional Chinese medicine: potential approaches from modern dynamical complexity theories.
Ma, Yan; Zhou, Kehua; Fan, Jing; Sun, Shuchen
2016-03-01
Despite the widespread use of traditional Chinese medicine (TCM) in clinical settings, proving its effectiveness via scientific trials is still a challenge. TCM views the human body as a complex dynamical system, and focuses on the balance of the human body, both internally and with its external environment. Such fundamental concepts require investigations using system-level quantification approaches, which are beyond conventional reductionism. Only methods that quantify dynamical complexity can bring new insights into the evaluation of TCM. In a previous article, we briefly introduced the potential value of Multiscale Entropy (MSE) analysis in TCM. This article aims to explain the existing challenges in TCM quantification, to introduce the consistency of dynamical complexity theories and TCM theories, and to inspire future system-level research on health and disease.
Almalki, Manal; Gray, Kathleen; Martin-Sanchez, Fernando
2016-05-27
Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers' activities and their contextual components or constructs to pursue these health related goals. Establishing such framework is important because it is the first step to operationalize health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers' health systematically. The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and constructs of health SQ activity. A Health Self-Quantification Activity Framework is presented, which shows SQ tool use in context, in relation to the goals, plans, and competence of the user. This makes it easier to analyze issues affecting SQ activity, and thereby makes it more feasible to address them. This review makes two significant contributions to research in this field: it explores health SQ work and its constructs thoroughly and it adapts Activity Theory to describe health SQ activity systematically.
39 CFR 3050.1 - Definitions applicable to this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., mathematical, or statistical theory, precept, or assumption applied by the Postal Service in producing a... manipulation technique whose validity does not require the acceptance of a particular economic, mathematical, or statistical theory, precept, or assumption. A change in quantification technique should not change...
Practicing universal design to actual hand tool design process.
Lin, Kai-Chieh; Wu, Chih-Fu
2015-09-01
UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, and differences among product types are taken into account. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items judged inappropriate and to examine the relationship between the evaluation items and product design factors. Product design specifications were then established for verification. The results showed that converting user evaluations into crucial design verification factors, via a generalized evaluation scale based on product attributes, and applying the resulting design factors in product design can improve users' UD evaluations. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.
Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel
2013-09-01
In this short communication, UV-Vis was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was found between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making it a time-saving analytical method that could be used as a Process Analytical Tool (PAT) in biorefineries employing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Schwabe, O.; Shehab, E.; Erkoyuncu, J.
2015-08-01
The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, drawing on a literature review, an evaluation of publicly funded projects such as those within the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). The insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of the available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for selecting uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis for future work in this field.
Assessing the effects of threonyl-tRNA synthetase on angiogenesis-related responses.
Mirando, Adam C; Abdi, Khadar; Wo, Peibin; Lounsbury, Karen M
2017-01-15
Several recent reports have found a connection between specific aminoacyl-tRNA synthetases and the regulation of angiogenesis. As this new area of research is explored, it is important to have reliable assays to assess the specific angiogenesis functions of these enzymes. This review provides information about specific in vitro and in vivo methods that were used to assess the angiogenic functions of threonyl-tRNA synthetase including endothelial cell migration and tube assays as well as chorioallantoic membrane and tumor vascularization assays. The theory and discussion include best methods of analysis and quantification along with the advantages and limitations of each type of assay. Copyright © 2016 Elsevier Inc. All rights reserved.
[DNA quantification of blood samples pre-treated with pyramidon].
Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan
2014-06-01
To study DNA quantification and STR typing of samples pre-treated with pyramidon. Blood samples from ten unrelated individuals were anticoagulated with EDTA and blood stains were made on filter paper. The samples were divided into six groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given extraction method, the amount of sample DNA decreased gradually with storage time after pre-treatment with pyramidon. For a given storage time, the DNA quantities obtained with the different extraction methods differed significantly. Sixteen-locus DNA typing was obtained for 90.56% of samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can still be achieved within 24 h. Magnetic bead-based extraction is the best method for DNA extraction and STR profiling.
A quantification model for the structure of clay materials.
Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian
2016-07-04
In this paper, the quantification of clay structure is explicitly explained, and the approach and goals of quantification are discussed. The authors consider the purpose of quantifying clay structure to be the determination of parameters that quantitatively characterize the impact of clay structure on macro-mechanical behaviour. Based on system theory and the law of energy conservation, a quantification model for the structural characteristics of clay materials is established, three quantitative parameters (deformation structure potential, strength structure potential and comprehensive structure potential) are proposed, and the corresponding tests are conducted. The experimental results show that these quantitative parameters can accurately reflect the influence of clay structure on deformation behaviour and strength behaviour, as well as the relative magnitude of the structural influence on these two aspects. The quantitative parameters have explicit mechanical meanings and can be used to characterize the structural influence of clay on its mechanical behaviour.
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2005-01-01
Probability theory governs the outcome of a game; there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy one must use a loss function (external to the game's players). This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.
Determining the optimal forensic DNA analysis procedure following investigation of sample quality.
Hedell, Ronny; Hedman, Johannes; Mostad, Petter
2018-07-01
Crime scene traces of various types are routinely sent to forensic laboratories for analysis, generally with the aim of addressing questions about the source of the trace. The laboratory may choose to analyse the samples in different ways depending on the type and quality of the sample, the importance of the case, and the cost and performance of the available analysis methods. Theoretically well-founded guidelines for the choice of analysis method are, however, lacking in most situations. In this paper, it is shown how such guidelines can be created using Bayesian decision theory. The theory is applied to forensic DNA analysis, showing how the information from the initial qPCR analysis can be utilized. The alternatives considered are to analyse with a standard short tandem repeat (STR) DNA assay, to analyse with the standard assay plus a complementary assay, or to cancel the analysis following quantification. The decision is based on information about the DNA amount and level of DNA degradation of the forensic sample, as well as case circumstances and the cost of analysis. Semi-continuous electropherogram models are used for simulation of DNA profiles and for computation of likelihood ratios. It is shown how tables and graphs, prepared beforehand, can be used to quickly find the optimal decision in forensic casework.
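A minimal, hypothetical expected-utility sketch of the decision problem described above is given below. The probabilities, costs, and case weight are made-up placeholders and do not reproduce the paper's models or numbers.

    # Hypothetical expected-utility comparison of the three analysis options.
    # p_useful: probability (given qPCR DNA amount and degradation) that the option
    # yields a profile useful for the case; cost values are illustrative.
    options = {
        "standard STR assay":              (0.60, 1.0),
        "standard + complementary assay":  (0.75, 2.2),
        "cancel after quantification":     (0.00, 0.0),
    }
    value_of_useful_profile = 10.0  # case-importance weight (hypothetical)

    def expected_utility(p_useful, cost):
        return value_of_useful_profile * p_useful - cost

    for name, (p, c) in options.items():
        print(f"{name:35s} EU = {expected_utility(p, c):5.2f}")
    best = max(options, key=lambda k: expected_utility(*options[k]))
    print("optimal decision:", best)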
Quantification of DNA using the luminescent oxygen channeling assay.
Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S
2000-09-01
Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI(TM)) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.
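The linear relationship between threshold cycle and the logarithm of the number of DNA targets mentioned above is the basis of standard-curve quantification. The following sketch uses hypothetical calibration data to fit such a curve and invert it for an unknown sample.

    import numpy as np

    # Hypothetical standard curve: threshold cycle (Ct) vs log10 of input target copies
    log10_copies = np.array([2, 3, 4, 5, 6, 7], dtype=float)
    ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3])

    slope, intercept = np.polyfit(log10_copies, ct, 1)   # Ct = slope*log10(N0) + intercept
    efficiency = 10 ** (-1.0 / slope) - 1.0              # amplification efficiency from the slope

    def copies_from_ct(ct_obs):
        return 10 ** ((ct_obs - intercept) / slope)

    print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
    print(f"unknown with Ct 24.9 -> {copies_from_ct(24.9):.2e} copies")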
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schunert, Sebastian; Wang, Congjian; Wang, Yaqi
Rattlesnake and MAMMOTH are the designated TREAT analysis tools currently being developed at the Idaho National Laboratory. Concurrent with the development of the multi-physics, multi-scale capabilities, sensitivity analysis and uncertainty quantification (SA/UQ) capabilities are required for predictive modeling of the TREAT reactor. For steady-state SA/UQ, which is essential for setting initial conditions for the transients, generalized perturbation theory (GPT) will be used. This work describes the implementation of a PETSc-based solver for the generalized adjoint equations, which constitute an inhomogeneous, rank-deficient problem. The standard approach is to use an outer iteration strategy with repeated removal of the fundamental-mode contamination. The described GPT algorithm directly solves the GPT equations without the need for an outer iteration procedure by using Krylov subspaces that are orthogonal to the operator's nullspace. Three test problems are solved and provide sufficient verification of Rattlesnake's GPT capability. We conclude with a preliminary example evaluating the impact of the boron distribution in the TREAT reactor using perturbation theory.
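A small, generic sketch (not Rattlesnake code) of one way to solve a singular, consistent linear system while keeping the Krylov iterates orthogonal to a known null-space vector is shown below; the toy matrix stands in for the GPT operator, and the construction of the null vector is purely illustrative.

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    rng = np.random.default_rng(0)
    n = 6
    v = rng.standard_normal(n); v /= np.linalg.norm(v)   # known "fundamental mode" (null vector)
    P = np.eye(n) - np.outer(v, v)                        # projector onto the orthogonal complement

    M = rng.standard_normal((n, n))
    A = P @ (M + M.T + n * np.eye(n)) @ P                 # toy symmetric operator with A @ v = 0
    b = P @ rng.standard_normal(n)                        # consistent source (orthogonal to the null space)

    # Every matrix-vector product is projected, so the Krylov space stays orthogonal to v
    op = LinearOperator((n, n), matvec=lambda x: P @ (A @ (P @ x)))
    x, info = gmres(op, b, atol=1e-12)
    x = P @ x                                             # strip any residual null-space component

    print("residual:", np.linalg.norm(A @ x - b), "null-space content:", abs(v @ x))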
Prediction of autosomal STR typing success in ancient and Second World War bone samples.
Zupanič Pajnič, Irena; Zupanc, Tomaž; Balažic, Jože; Geršak, Živa Miriam; Stojković, Oliver; Skadrić, Ivan; Črešnar, Matija
2017-03-01
Human-specific quantitative PCR (qPCR) has been developed for forensic use in the last 10 years and is the preferred DNA quantification technique since it is very accurate, sensitive, objective, time-effective and automatable. The amount of information that can be gleaned from a single quantification reaction using commercially available quantification kits has increased from the quantity of nuclear DNA to the amount of male DNA, the presence of inhibitors and, most recently, the degree of DNA degradation. In skeletal remains from disaster victims, missing persons and war conflict victims, the DNA is usually degraded. Therefore the new commercial qPCR kits able to assess the degree of degradation are potentially able to predict the success of downstream short tandem repeat (STR) typing. The goal of this study was to verify the quantification step using the PowerQuant kit with regard to its suitability as a screening method for autosomal STR typing success on ancient and Second World War (WWII) skeletal remains. We analysed 60 skeletons excavated from five archaeological sites and four WWII mass graves in Slovenia. The bones were cleaned, surface contamination was removed and the bones were ground to a powder. Genomic DNA was obtained from 0.5 g of bone powder after total demineralization. The DNA was purified using a Biorobot EZ1 device. Following PowerQuant quantification, DNA samples were subjected to autosomal STR amplification using the NGM kit. Up to 2.51 ng of DNA per gram of powder was extracted. No inhibition was detected in any of the bones analysed. 82% of the WWII bones gave full profiles, while 73% of the ancient bones gave profiles not suitable for interpretation. Four bone extracts yielded no detectable amplification or zero quantification results, and no profiles were obtained from any of them. Full or useful partial profiles were produced only from bone extracts where the short autosomal (Auto) and long degradation (Deg) PowerQuant targets were both detected. It is concluded that STR typing of old bones after quantification with the PowerQuant kit should be performed only when both Auto and Deg targets are detected simultaneously, regardless of the [Auto]/[Deg] ratio. Prediction of STR typing success could be made according to successful amplification of the Deg fragment. The PowerQuant kit is capable of identifying bone DNA samples that will not yield useful STR profiles using the NGM kit, and it can be used as a predictor of autosomal STR typing success of bone extracts obtained from ancient and WWII skeletal remains. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Kim, Jeong Hyun; Yoo, Seung Min; Sohn, Yong Hak; Jin, Chan Hee; Yang, Yun Suk; Hwang, In Taek; Oh, Kwan Young
2017-10-01
To investigate the predominant Lactobacillus species types (LSTs) of the vaginal microbiota in pregnant Korean women by quantifying five Lactobacillus species and two anaerobes. In all, 168 pregnant Korean women under antenatal care at Eulji University Hospital and local clinics were enrolled in the prospective cohort study during pregnancy (10-14 weeks). Vaginal samples were collected with Eswab for quantitative polymerase chain reaction (qPCR) and stored in a -80 °C freezer. qPCR was performed for five Lactobacillus species and two anaerobes. To identify the predominant LSTs, the quantifications were analyzed with the Cluster and Tree View programs of the Eisen Lab, and the quantifications were also compared among the classified groups. L. crispatus and L. iners were most commonly found in pregnant Korean women, followed by L. gasseri and L. jensenii; L. vaginalis was nearly absent. Five types (four predominant LSTs and one predominant anaerobe type without a predominant Lactobacillus species) were classified. Five predominant LSTs were identified in the vaginal microbiota of pregnant Korean women. The L. crispatus- and L. iners-predominant types comprised a large proportion.
Martinon, Alice; Cronin, Ultan P; Wilkinson, Martin G
2012-01-01
In this article, four types of standards were assessed in a SYBR Green-based real-time PCR procedure for the quantification of Staphylococcus aureus (S. aureus) in DNA samples. The standards were purified S. aureus genomic DNA (type A), circular plasmid DNA containing a thermonuclease (nuc) gene fragment (type B), and DNA extracted from defined populations of S. aureus cells generated by Fluorescence-Activated Cell Sorting (FACS) technology with (type C) or without (type D) purification of DNA by boiling. The optimal efficiency of 2.016 was obtained with Roche LightCycler® 4.1 software for type C standards, whereas the lowest efficiency (1.682) corresponded to type D standards. Type C standards appeared to be more suitable for quantitative real-time PCR because of the use of defined populations for construction of standard curves. Overall, the Fieller Confidence Interval algorithm may be improved for replicates having a low standard deviation in Cycle Threshold values, such as those found for type B and C standards. The stabilities of diluted PCR standards stored at -20°C were compared after 0, 7, 14 and 30 days and were lower for type A and C standards compared with type B standards. However, FACS-generated standards may be useful for bacterial quantification in real-time PCR assays once optimal storage and temperature conditions are defined.
Xia, Yun; Yan, Shuangqian; Zhang, Xian; Ma, Peng; Du, Wei; Feng, Xiaojun; Liu, Bi-Feng
2017-03-21
Digital loop-mediated isothermal amplification (dLAMP) is an attractive approach for absolute quantification of nucleic acids with high sensitivity and selectivity. Theoretical and numerical analysis of dLAMP provides necessary guidance for the design and analysis of dLAMP devices. In this work, a mathematical model was proposed on the basis of the Monte Carlo method and the theories of Poisson statistics and chemometrics. To examine the established model, we fabricated a spiral chip with 1200 uniform and discrete reaction chambers (9.6 nL) for absolute quantification of pathogenic DNA samples by dLAMP. Under the optimized conditions, dLAMP analysis on the spiral chip realized quantification of nucleic acids spanning over 4 orders of magnitude in concentration with sensitivity as low as 8.7 × 10⁻² copies/μL in 40 min. The experimental results were consistent with the proposed mathematical model, which could provide useful guidelines for future development of dLAMP devices.
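The Poisson-statistics step underlying absolute quantification in digital assays can be written down directly. The sketch below uses the chip geometry stated in the abstract (1200 chambers of 9.6 nL); the number of positive chambers is a hypothetical value for illustration.

    import math

    n_chambers = 1200        # chambers on the spiral chip (from the abstract)
    v_chamber_uL = 9.6e-3    # 9.6 nL expressed in microliters
    positives = 380          # hypothetical number of chambers scored positive

    p = positives / n_chambers
    lam = -math.log(1.0 - p)              # mean copies per chamber (Poisson correction)
    concentration = lam / v_chamber_uL    # copies per microliter of loaded sample

    # Approximate 95% CI via the delta-method standard error of lambda
    se_lam = math.sqrt(p / (n_chambers * (1.0 - p)))
    lo, hi = (lam - 1.96 * se_lam) / v_chamber_uL, (lam + 1.96 * se_lam) / v_chamber_uL
    print(f"lambda = {lam:.3f} copies/chamber -> {concentration:.1f} copies/uL (95% CI {lo:.1f}-{hi:.1f})")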
Digital Droplet PCR: CNV Analysis and Other Applications.
Mazaika, Erica; Homsy, Jason
2014-07-14
Digital droplet PCR (ddPCR) is an assay that combines state-of-the-art microfluidics technology with TaqMan-based PCR to achieve precise target DNA quantification at high levels of sensitivity and specificity. Because quantification is achieved without the need for standard assays in an easy to interpret, unambiguous digital readout, ddPCR is far simpler, faster, and less error prone than real-time qPCR. The basic protocol can be modified with minor adjustments to suit a wide range of applications, such as CNV analysis, rare variant detection, SNP genotyping, and transcript quantification. This unit describes the ddPCR workflow in detail for the Bio-Rad QX100 system, but the theory and data interpretation are generalizable to any ddPCR system. Copyright © 2014 John Wiley & Sons, Inc.
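For the CNV application mentioned above, the same Poisson correction is applied to the target and reference channels, and copy number is commonly reported relative to a reference assumed present at two copies per genome. The droplet counts below are hypothetical.

    import math

    def copies_per_droplet(positives, total):
        return -math.log(1.0 - positives / total)

    total_droplets = 15000                      # hypothetical accepted-droplet count
    lam_target = copies_per_droplet(5200, total_droplets)
    lam_reference = copies_per_droplet(3600, total_droplets)

    cnv = 2.0 * lam_target / lam_reference      # reference locus assumed diploid (2 copies/genome)
    print(f"estimated copy number = {cnv:.2f}")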
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Tujin; Qian, Weijun
2013-02-01
Highly sensitive technologies for multiplexed quantification of a large number of candidate proteins will play an increasingly important role in clinical biomarker discovery, systems biology, and general biomedical research. Herein we introduce the new PRISM-SRM technology, a highly sensitive multiplexed quantification approach capable of simultaneously quantifying many low-abundance proteins without the need for affinity reagents. The versatility of antibody-free PRISM-SRM for quantifying various types of targets, including protein isoforms, protein modifications, metabolites, and others, offers new competition with immunoassays.
2010-01-01
throughout the entire 3D volume, which made quantification of the different tissues in the breast possible. The peaks representing glandular and fat in... coefficients. Keywords: tissue quantification, absolute attenuation coefficient, scatter correction, computed tomography, tomography... tissue types [1-4]. Accurate measurements of the quantification and differentiation of numerous tissues can be useful to identify disease from...
Martinez, G T; Rosenauer, A; De Backer, A; Verbeeck, J; Van Aert, S
2014-02-01
High angle annular dark field scanning transmission electron microscopy (HAADF STEM) images provide sample information which is sensitive to the chemical composition. The image intensities indeed scale with the mean atomic number Z. To some extent, chemically different atomic column types can therefore be visually distinguished. However, in order to quantify the atomic column composition with high accuracy and precision, model-based methods are necessary. For this purpose, an empirical incoherent parametric imaging model can be used, of which the unknown parameters are determined using statistical parameter estimation theory (Van Aert et al., 2009, [1]). In this paper, it will be shown how this method can be combined with frozen lattice multislice simulations in order to evolve from a relative toward an absolute quantification of the composition of single atomic columns with mixed atom types. Furthermore, the validity of the model assumptions is explored and discussed. © 2013 Published by Elsevier B.V. All rights reserved.
A python framework for environmental model uncertainty analysis
White, Jeremy; Fienen, Michael N.; Doherty, John E.
2016-01-01
We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses, an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
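The linear (FOSM) machinery referred to above can be illustrated with a generic numpy sketch; this is not pyEMU's actual API, and all matrices below are assumed placeholders. The prior parameter covariance is conditioned on the observations via the Schur complement and then propagated to a scalar forecast.

    import numpy as np

    rng = np.random.default_rng(1)
    n_par, n_obs = 4, 6
    J = rng.standard_normal((n_obs, n_par))   # Jacobian of observations w.r.t. parameters (assumed)
    C_prior = np.diag([1.0, 0.5, 2.0, 1.0])   # prior parameter covariance (assumed)
    C_noise = 0.1 * np.eye(n_obs)             # observation noise covariance (assumed)

    # FOSM / Schur-complement conditioning of the prior on the observations
    S = J @ C_prior @ J.T + C_noise
    C_post = C_prior - C_prior @ J.T @ np.linalg.solve(S, J @ C_prior)

    # Propagate to a scalar forecast with sensitivity vector y (assumed)
    y = np.array([0.2, -1.0, 0.5, 0.0])
    var_prior = y @ C_prior @ y
    var_post = y @ C_post @ y
    print(f"forecast variance: prior {var_prior:.3f} -> posterior {var_post:.3f}")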
Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.
Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C
2016-07-21
Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
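The per-particle heat generation being compared in such studies is conventionally taken as the absorption cross-section times the laser irradiance. The sketch below uses hypothetical placeholder values, not numbers from the study.

    # Minimal sketch of per-particle plasmonic heat generation, q = sigma_abs * I.
    sigma_abs = 1.0e-15      # absorption cross-section, m^2 (hypothetical GNP value)
    irradiance = 1.0e4       # laser irradiance, W/m^2 (i.e., 1 W/cm^2)

    q_single = sigma_abs * irradiance            # W per particle
    n_particles = 1.0e15                         # particles per m^3 of suspension (hypothetical)
    volumetric_heating = q_single * n_particles  # W/m^3
    print(f"q = {q_single:.2e} W/particle, volumetric heating = {volumetric_heating:.2e} W/m^3")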
Quantification of Cannabinoid Content in Cannabis
NASA Astrophysics Data System (ADS)
Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.
2015-09-01
Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
Resource Theory of Superposition
NASA Astrophysics Data System (ADS)
Theurer, T.; Killoran, N.; Egloff, D.; Plenio, M. B.
2017-12-01
The superposition principle lies at the heart of many nonclassical properties of quantum mechanics. Motivated by this, we introduce a rigorous resource theory framework for the quantification of superposition of a finite number of linearly independent states. This theory is a generalization of resource theories of coherence. We determine the general structure of operations which do not create superposition, find a fundamental connection to unambiguous state discrimination, and propose several quantitative superposition measures. Using this theory, we show that trace decreasing operations can be completed for free which, when specialized to the theory of coherence, resolves an outstanding open question and is used to address the free probabilistic transformation between pure states. Finally, we prove that linearly independent superposition is a necessary and sufficient condition for the faithful creation of entanglement in discrete settings, establishing a strong structural connection between our theory of superposition and entanglement theory.
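A compact way to state the free (superposition-free) states of such a theory, with notation assumed here rather than quoted from the paper, is

\[
\mathcal{F} \;=\; \Bigl\{\, \sigma = \sum_{i=1}^{d} p_i\, |c_i\rangle\langle c_i| \;:\; p_i \ge 0,\ \sum_i p_i = 1 \Bigr\},
\]

where $\{|c_1\rangle,\dots,|c_d\rangle\}$ is the fixed set of linearly independent (not necessarily orthogonal) states; free operations map $\mathcal{F}$ into itself, and any state outside $\mathcal{F}$ carries superposition as a resource. For an orthonormal set $\{|c_i\rangle\}$ this reduces to the incoherent states of coherence theory.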
A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.
Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect within multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and mathematical intervals. For cyber-systems, probability distributions may help address statistical (aleatory) uncertainties where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker's payoff generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system's state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and encourage further advancements in this area.
Quantification of three-dimensional cell-mediated collagen remodeling using graph theory.
Bilgin, Cemal Cagatay; Lund, Amanda W; Can, Ali; Plopper, George E; Yener, Bülent
2010-09-30
Cell cooperation is a critical event during tissue development. We present the first precise metrics to quantify the interaction between mesenchymal stem cells (MSCs) and the extracellular matrix (ECM). In particular, we describe the cooperative collagen alignment process with respect to the spatio-temporal organization and function of mesenchymal stem cells in three dimensions. We defined two precise metrics, the Collagen Alignment Index and the Cell Dissatisfaction Level, for quantitatively tracking type I collagen and fibrillogenesis remodeling by mesenchymal stem cells over time. Computation of these metrics was based on graph theory and vector calculus. The cells and their three-dimensional type I collagen microenvironment were modeled by three-dimensional cell-graphs, and collagen fiber organization was calculated from gradient vectors. With the enhancement of mesenchymal stem cell differentiation, acceleration through different phases was quantitatively demonstrated. The phases were clustered in a statistically significant manner based on collagen organization, with late phases of remodeling by untreated cells clustering strongly with early phases of remodeling by differentiating cells. The experiments were repeated three times to conclude that the metrics could successfully identify critical phases of collagen remodeling that were dependent upon cooperativity within the cell population. Definition of early metrics that are able to predict long-term functionality by linking engineered tissue structure to function is an important step toward optimizing biomaterials for the purposes of regenerative medicine.
Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.
2015-01-01
Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636
The quantification of pattern is a key element of landscape analyses. One aspect of this quantification of particular importance to landscape ecologists regards the classification of continuous variables to produce categorical variables such as land-cover type or elevation strat...
NASA Astrophysics Data System (ADS)
Dupéy, Lauren Nicole; Smith, Jordan W.
2018-06-01
Social science research from a variety of disciplines has generated a collective understanding of how individuals prepare for, and respond to, the risks associated with prescribed burning and wildfire. We provide a systematic compilation, review, and quantification of dominant trends in this literature by collecting all empirical research conducted within the U.S. that has addressed perceptions and behaviors surrounding various aspects of prescribed burning and wildfire. We reviewed and quantified this literature using four thematic categories covering: (1) the theory and methods that have been used in previous research; (2) the psychosocial aspects of prescribed burning and wildfire that have been studied; (3) the biophysical characteristics of the fires which have been studied; and (4) the types of fire and management approaches that have been examined. Our integrative review builds on previous literature reviews on the subject by offering new insight on the dominant trends, underutilized approaches, and under-studied topics within each thematic category. For example, we found that a select set of theories (e.g., Protection Motivation Theory, Attribution Theory, etc.) and approaches (e.g., mixed-methods) have only been used sparingly in previous research, even though these theories and approaches can produce insightful results that can readily be implemented by fire-management professionals and decision makers. By identifying trends and gaps in the literature across the thematic categories, we were able to answer four questions that address how future research can make the greatest contribution to our understanding of perceptions and behaviors related to prescribed burning and wildfire.
Dupéy, Lauren Nicole; Smith, Jordan W
2018-06-01
Social science research from a variety of disciplines has generated a collective understanding of how individuals prepare for, and respond to, the risks associated with prescribed burning and wildfire. We provide a systematic compilation, review, and quantification of dominant trends in this literature by collecting all empirical research conducted within the U.S. that has addressed perceptions and behaviors surrounding various aspects of prescribed burning and wildfire. We reviewed and quantified this literature using four thematic categories covering: (1) the theory and methods that have been used in previous research; (2) the psychosocial aspects of prescribed burning and wildfire that have been studied; (3) the biophysical characteristics of the fires which have been studied; and (4) the types of fire and management approaches that have been examined. Our integrative review builds on previous literature reviews on the subject by offering new insight on the dominant trends, underutilized approaches, and under-studied topics within each thematic category. For example, we found that a select set of theories (e.g., Protection Motivation Theory, Attribution Theory, etc.) and approaches (e.g., mixed-methods) have only been used sparingly in previous research, even though these theories and approaches can produce insightful results that can readily be implemented by fire-management professionals and decision makers. By identifying trends and gaps in the literature across the thematic categories, we were able to answer four questions that address how future research can make the greatest contribution to our understanding of perceptions and behaviors related to prescribed burning and wildfire.
Papers in Semantics. Working Papers in Linguistics No. 49.
ERIC Educational Resources Information Center
Yoon, Jae-Hak, Ed.; Kathol, Andreas, Ed.
1996-01-01
Papers on semantic theory and research include: "Presupposition, Congruence, and Adverbs of Quantification" (Mike Calcagno); "A Unified Account of '(Ta)myen'-Conditionals in Korean" (Chan Chung); "Spanish 'imperfecto' and 'preterito': Truth Conditions and Aktionsart Effects in a Situation Semantics" (Alicia Cipria,…
NASA Astrophysics Data System (ADS)
Doyle, Laurance R.; McCowan, Brenda; Hanser, Sean F.
2002-01-01
Information theory allows a quantification of the complexity of a given signaling system. We are applying information theory to dolphin whistle vocalizations, humpback whale songs, squirrel monkey chuck calls, and several other animal communication systems' in order to develop a quantitative and objective way to compare inter species communication systems' complexity. Once signaling units have been correctly classified the communication system must obey certain statistical distributions in order to contain complexity whether it is human languages, dolphin whistle vocalizations, or even a system of communication signals received from an extraterrestrial source.
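As a rough illustration of the kind of quantification described, the sketch below computes two standard information-theoretic summaries of a classified sequence of signalling units: the Shannon entropy of the unit distribution and the slope of the Zipf rank-frequency plot. The toy "whistle" sequence is invented for illustration.

```python
# Sketch: two information-theoretic summaries of a classified signal sequence,
# the Shannon entropy of the unit distribution and the slope of the Zipf
# rank-frequency plot. The toy sequence of signalling units is invented.
import numpy as np
from collections import Counter

sequence = list("ABABCABDABACABABEABC")      # classified signalling units

counts = np.array(sorted(Counter(sequence).values(), reverse=True), dtype=float)
probs = counts / counts.sum()

entropy = -np.sum(probs * np.log2(probs))    # bits per unit (zeroth-order)

# Zipf slope: linear fit of log(frequency) against log(rank); human languages
# are often reported near -1, which is the comparison the abstract alludes to.
ranks = np.arange(1, len(counts) + 1)
slope, intercept = np.polyfit(np.log10(ranks), np.log10(counts), 1)

print(f"repertoire size: {len(counts)} unit types")
print(f"Shannon entropy: {entropy:.2f} bits/unit")
print(f"Zipf slope: {slope:.2f}")
```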
Convex geometry of quantum resource quantification
NASA Astrophysics Data System (ADS)
Regula, Bartosz
2018-01-01
We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, a measure of k-coherence which generalises the \
Creative Stories: A Storytelling Game Fostering Creativity
ERIC Educational Resources Information Center
Koukourikos, Antonis; Karampiperis, Pythagoras; Panagopoulos, George
2014-01-01
The process of identifying techniques for fostering creativity, and applying these theoretical constructs in real-world educational activities, is, by nature, multifaceted and not straightforward, pertaining to several fields such as cognitive theory and psychology. Furthermore, the quantification of the impact of different activities on…
NASA Astrophysics Data System (ADS)
Fitkov-Norris, Elena; Yeghiazarian, Ara
2016-11-01
The analytical tools available to social scientists have traditionally been adapted from tools originally designed for analysis of natural science phenomena. This article discusses the applicability of systems dynamics - a qualitative based modelling approach, as a possible analysis and simulation tool that bridges the gap between social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as limiting factors of systems dynamics to the potential applications in the field of social sciences and human interactions are discussed. The issues arise with regards to operationalization and quantification of latent constructs at the simulation building stage of the systems dynamics methodology and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view of improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.
NASA Astrophysics Data System (ADS)
Louarroudi, E.; Pintelon, R.; Lataire, J.
2014-10-01
Time-periodic (TP) phenomena occurring, for instance, in wind turbines, helicopters, anisotropic shaft-bearing systems, and cardiovascular/respiratory systems, are often not addressed when classical frequency response function (FRF) measurements are performed. As the traditional FRF concept is based on the linear time-invariant (LTI) system theory, it is only approximately valid for systems with varying dynamics. Accordingly, the quantification of any deviation from this ideal LTI framework is more than welcome. The “measure of deviation” allows us to define the notion of the best LTI (BLTI) approximation, which yields the best - in mean square sense - LTI description of a linear time-periodic LTP system. By taking into consideration the TP effects, it is shown in this paper that the variability of the BLTI measurement can be reduced significantly compared with that of classical FRF estimators. From a single experiment, the proposed identification methods can handle (non-)linear time-periodic [(N)LTP] systems in open-loop with a quantification of (i) the noise and/or the NL distortions, (ii) the TP distortions and (iii) the transient (leakage) errors. Besides, a geometrical interpretation of the BLTI approximation is provided, leading to a framework called vector FRF analysis. The theory presented is supported by numerical simulations as well as real measurements mimicking the well-known mechanical Mathieu oscillator.
Prodinger, Birgit; Fellinghauer, Carolina Saskia; Tennant, Alan
2018-01-01
Objective To examine the use of the term ‘metric’ in health and social sciences’ literature, focusing on the interval scale implication of the term in Modern Test Theory (MTT). Materials and methods A systematic search and review on MTT studies including ‘metric’ or ‘interval scale’ was performed in the health and social sciences literature. The search was restricted to 2001–2005 and 2011–2015. A Text Mining algorithm was employed to operationalize the eligibility criteria and to explore the uses of ‘metric’. The paradigm of each included article (Rasch Measurement Theory (RMT), Item Response Theory (IRT) or both), as well as its type (Theoretical, Methodological, Teaching, Application, Miscellaneous) were determined. An inductive thematic analysis on the first three types was performed. Results 70.6% of the 1337 included articles were allocated to RMT, and 68.4% were application papers. Among the number of uses of ‘metric’, it was predominantly a synonym of ‘scale’; as adjective, it referred to measurement or quantification. Three incompatible themes ‘only RMT/all MTT/no MTT models can provide interval measures’ were identified, but ‘interval scale’ was considerably more mentioned in RMT than in IRT. Conclusion ‘Metric’ is used in many different ways, and there is no consensus on which MTT metric has interval scale properties. Nevertheless, when using the term ‘metric’, the authors should specify the level of the metric being used (ordinal, ordered, interval, ratio), and justify why according to them the metric is at that level. PMID:29509813
Adjoint-Based Uncertainty Quantification with MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seifried, Jeffrey E.
2011-09-01
This work serves to quantify the instantaneous uncertainties in neutron transport simulations borne from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
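The sensitivities produced by such an adjoint calculation are typically folded with nuclear-data covariances through the first-order "sandwich rule"; a minimal numerical sketch is shown below, with all sensitivity and covariance values invented for illustration rather than taken from MCNP6 or the LIFE blanket study.

```python
# Sketch: first-order ("sandwich rule") uncertainty propagation, the kind of
# combination an adjoint-based sensitivity calculation feeds: var(R) = s^T C s,
# where s holds relative sensitivities of a response R to nuclear data and C is
# the relative covariance matrix of those data. Numbers are illustrative only.
import numpy as np

# Relative sensitivities dR/R per dx/x for three nuclear-data parameters.
s = np.array([0.9, -0.3, 0.15])

# Relative covariance matrix of the nuclear data (1-sigma ~ 1-2 %, mild correlation).
C = np.array([
    [0.0001,  0.00002, 0.0],
    [0.00002, 0.0004,  0.00005],
    [0.0,     0.00005, 0.000225],
])

rel_var = s @ C @ s
print(f"relative uncertainty on the response: {np.sqrt(rel_var) * 100:.2f} %")
```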
pyQms enables universal and accurate quantification of mass spectrometry data.
Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian
2017-10-01
Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching that offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well to accurately quantify partially labeled proteomes in large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
Wang, Yue; Adalý, Tülay; Kung, Sun-Yuan; Szabo, Zsolt
2007-01-01
This paper presents a probabilistic neural network based technique for unsupervised quantification and segmentation of brain tissues from magnetic resonance images. It is shown that this problem can be solved by distribution learning and relaxation labeling, resulting in an efficient method that may be particularly useful in quantifying and segmenting abnormal brain tissues where the number of tissue types is unknown and the distributions of tissue types heavily overlap. The new technique uses suitable statistical models for both the pixel and context images and formulates the problem in terms of model-histogram fitting and global consistency labeling. The quantification is achieved by probabilistic self-organizing mixtures and the segmentation by a probabilistic constraint relaxation network. The experimental results show the efficient and robust performance of the new algorithm and that it outperforms the conventional classification based approaches. PMID:18172510
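A minimal sketch of the distribution-learning step is given below, using an ordinary Gaussian mixture from scikit-learn as a stand-in for the probabilistic self-organizing mixtures of the paper; the relaxation-labeling segmentation stage is not reproduced, and the intensity data are synthetic.

```python
# Sketch: the distribution-learning step only, using a standard Gaussian mixture
# (scikit-learn) as a stand-in for the paper's probabilistic self-organizing
# mixtures; the relaxation-labeling stage is not reproduced. Data are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)

# Synthetic MR intensity histogram: three overlapping "tissue" classes.
intensities = np.concatenate([
    rng.normal(60, 8, 4000),     # e.g. CSF-like
    rng.normal(110, 10, 7000),   # e.g. grey-matter-like
    rng.normal(150, 9, 5000),    # e.g. white-matter-like
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(intensities)

order = np.argsort(gmm.means_.ravel())
for k in order:
    print(f"class mean {gmm.means_[k, 0]:6.1f}  "
          f"sd {np.sqrt(gmm.covariances_[k, 0, 0]):5.1f}  "
          f"volume fraction {gmm.weights_[k]:.3f}")
```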
A Variational Statistical-Field Theory for Polar Liquid Mixtures
NASA Astrophysics Data System (ADS)
Zhuang, Bilin; Wang, Zhen-Gang
Using a variational field-theoretic approach, we derive a molecularly-based theory for polar liquid mixtures. The resulting theory consists of simple algebraic expressions for the free energy of mixing and the dielectric constant as functions of mixture composition. Using only the dielectric constants and the molar volumes of the pure liquid constituents, the theory evaluates the mixture dielectric constants in good agreement with the experimental values for a wide range of liquid mixtures, without using adjustable parameters. In addition, the theory predicts that liquids with similar dielectric constants and molar volumes dissolve well in each other, while sufficient disparity in these parameters result in phase separation. The calculated miscibility map on the dielectric constant-molar volume axes agrees well with known experimental observations for a large number of liquid pairs. Thus the theory provides a quantification for the well-known empirical ``like-dissolves-like'' rule. Bz acknowledges the A-STAR fellowship for the financial support.
NASA Technical Reports Server (NTRS)
Dill, Loren H.; Choo, Yung K. (Technical Monitor)
2004-01-01
Software was developed to construct approximating NURBS curves for iced airfoil geometries. Users specify a tolerance that determines the extent to which the approximating curve follows the rough ice. The user can therefore smooth the ice geometry in a controlled manner, thereby enabling the generation of grids suitable for numerical aerodynamic simulations. Ultimately, this ability to smooth the ice geometry will permit studies of the effects of smoothing upon the aerodynamics of iced airfoils. The software was applied to several different types of iced airfoil data collected in the Icing Research Tunnel at NASA Glenn Research Center, and in all cases was found to efficiently generate suitable approximating NURBS curves. This method is an improvement over the current "control point formulation" of Smaggice (v.1.2). In this report, we present the relevant theory of approximating NURBS curves and discuss typical results of the software.
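The NURBS construction itself is not reproduced here; the sketch below uses a parametric B-spline fit (scipy's splprep) with a smoothing condition playing the role of the user tolerance, which illustrates the controlled-smoothing idea on a synthetic rough profile.

```python
# Sketch: tolerance-controlled smoothing of a rough 2-D profile with a parametric
# B-spline (scipy's splprep), as a stand-in for the NURBS approximation described
# in the abstract. The "ice roughness" here is synthetic noise on a circular arc.
import numpy as np
from scipy.interpolate import splprep, splev

rng = np.random.default_rng(3)

theta = np.linspace(0, np.pi, 200)
x = np.cos(theta) + rng.normal(0, 0.01, theta.size)   # rough "iced" surface
y = np.sin(theta) + rng.normal(0, 0.01, theta.size)

# The smoothing condition s plays the role of the user tolerance: larger s lets
# the curve deviate more from the raw points, yielding a smoother geometry.
for s in (0.0, 0.01, 0.1):
    tck, u = splprep([x, y], s=s)
    xs, ys = splev(u, tck)                             # evaluate at the data parameters
    residual = np.hypot(xs - x, ys - y).max()
    print(f"s = {s:<4} -> max deviation from the raw points ~ {residual:.4f}")
```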
Spatial Uncertainty Modeling of Fuzzy Information in Images for Pattern Classification
Pham, Tuan D.
2014-01-01
The modeling of the spatial distribution of image properties is important for many pattern recognition problems in science and engineering. Mathematical methods are needed to quantify the variability of this spatial distribution based on which a decision of classification can be made in an optimal sense. However, image properties are often subject to uncertainty due to both incomplete and imprecise information. This paper presents an integrated approach for estimating the spatial uncertainty of vagueness in images using the theory of geostatistics and the calculus of probability measures of fuzzy events. Such a model for the quantification of spatial uncertainty is utilized as a new image feature extraction method, based on which classifiers can be trained to perform the task of pattern recognition. Applications of the proposed algorithm to the classification of various types of image data suggest the usefulness of the proposed uncertainty modeling technique for texture feature extraction. PMID:25157744
ERIC Educational Resources Information Center
Yeung, Brendan; Ng, Tuck Wah; Tan, Han Yen; Liew, Oi Wah
2012-01-01
The use of different types of stains in the quantification of proteins separated on gels using electrophoresis offers the capability of deriving good outcomes in terms of linear dynamic range, sensitivity, and compatibility with specific proteins. An inexpensive, simple, and versatile lighting system based on liquid crystal display backlighting is…
Bayesian analysis of the dynamic cosmic web in the SDSS galaxy survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leclercq, Florent; Wandelt, Benjamin; Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: jasche@iap.fr, E-mail: wandelt@iap.fr
Recent application of the Bayesian algorithm BORG to the Sloan Digital Sky Survey (SDSS) main sample galaxies resulted in the physical inference of the formation history of the observed large-scale structure from its origin to the present epoch. In this work, we use these inferences as inputs for a detailed probabilistic cosmic web-type analysis. To do so, we generate a large set of data-constrained realizations of the large-scale structure using a fast, fully non-linear gravitational model. We then perform a dynamic classification of the cosmic web into four distinct components (voids, sheets, filaments, and clusters) on the basis of the tidal field. Our inference framework automatically and self-consistently propagates typical observational uncertainties to web-type classification. As a result, this study produces accurate cosmographic classification of large-scale structure elements in the SDSS volume. By also providing the history of these structure maps, the approach allows an analysis of the origin and growth of the early traces of the cosmic web present in the initial density field and of the evolution of global quantities such as the volume and mass filling fractions of different structures. For the problem of web-type classification, the results described in this work constitute the first connection between theory and observations at non-linear scales including a physical model of structure formation and the demonstrated capability of uncertainty quantification. A connection between cosmology and information theory using real data also naturally emerges from our probabilistic approach. Our results constitute quantitative chrono-cosmography of the complex web-like patterns underlying the observed galaxy distribution.
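The decision rule of tidal-field ("T-web") classifications is simple enough to state in a few lines: count how many eigenvalues of the tidal tensor exceed a threshold. The sketch below shows only that final rule with illustrative eigenvalues and threshold; computing the tidal field from the inferred density is not reproduced.

```python
# Sketch: the web-type decision rule used in tidal-field ("T-web") classifications:
# count how many eigenvalues of the tidal tensor at a point exceed a threshold
# lambda_th (0 -> void, 1 -> sheet, 2 -> filament, 3 -> cluster). The eigenvalues
# and the threshold below are illustrative inputs.
import numpy as np

WEB_TYPES = {0: "void", 1: "sheet", 2: "filament", 3: "cluster"}

def classify(eigenvalues, lambda_th=0.0):
    """Map the three tidal-tensor eigenvalues at a point to a web type."""
    eigenvalues = np.asarray(eigenvalues)
    return WEB_TYPES[int((eigenvalues > lambda_th).sum())]

samples = {
    "underdense region": [-0.8, -0.5, -0.2],
    "wall-like region":  [-0.4, -0.1, 0.6],
    "filament region":   [-0.2, 0.3, 0.9],
    "cluster region":    [0.4, 0.7, 1.5],
}
for name, lam in samples.items():
    print(f"{name:18s} -> {classify(lam)}")
```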
Analyzing the management and disturbance in European forest based on self-thinning theory
NASA Astrophysics Data System (ADS)
Yan, Y.; Gielen, B.; Schelhaas, M.; Mohren, F.; Luyssaert, S.; Janssens, I. A.
2012-04-01
There is increasing awareness that natural and anthropogenic disturbance in forests affects the exchange of CO2, H2O and energy between the ecosystem and the atmosphere. Consequently, quantification of land use and disturbance intensity is one of the next steps needed to improve our understanding of the carbon cycle, its interactions with the atmosphere and its main drivers at local as well as global levels. The conventional NPP-based approaches to quantify the intensity of land management are limited because they lack a sound ecological basis. Here we apply a new way of characterising the degree of management and disturbance in forests using the self-thinning theory and observations of diameter at breast height and stand density. We used plot-level information on dominant tree species, diameter at breast height, stand density and soil type from the French national forest inventory from 2005 to 2010. Stand density and diameter at breast height were used to parameterize the intercept of the self-thinning relationship and combined with the theoretical slope to obtain an upper boundary for stand productivity given its density. Subsequently, we tested the sensitivity of the self-thinning relationship for tree species, soil type, climate and other environmental characteristics. We could find statistical differences in the self-thinning relationship between species and soil types, mainly due to the large uncertainty of the parameter estimates. Deviation from the theoretical self-thinning line, defined as DBH = αN^(-3/4), was used as a proxy for disturbances, allowing us to make spatially explicit maps of forest disturbance over France. The same framework was used to quantify the density-DBH trajectory of even-aged stand management of beech and oak over France. These trajectories will be used as a driver of forest management in the land surface model ORCHIDEE.
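Using the DBH = αN^(-3/4) relation quoted in the abstract, a minimal sketch of the deviation-from-boundary idea is shown below; the upper boundary is estimated from a high quantile of per-plot intercepts, which is a simplification of the study's fitting procedure, and the plot data are synthetic.

```python
# Sketch: quantify departure from the self-thinning boundary DBH = alpha * N^(-3/4)
# stated in the abstract. The boundary intercept alpha is estimated here from an
# upper quantile of the inventory plots, a simplification of the study's fitting;
# plot data are synthetic.
import numpy as np

rng = np.random.default_rng(4)

# Synthetic inventory: stand density N (stems/ha) and quadratic-mean DBH (cm).
N = rng.uniform(200, 3000, 500)
alpha_true = 700.0
# Managed/disturbed stands sit below the boundary (multiplicative deficit).
dbh = alpha_true * N ** (-0.75) * np.exp(rng.normal(-0.3, 0.25, N.size))

# With the theoretical slope -3/4 fixed, each plot gives an intercept estimate.
log_alpha_plot = np.log(dbh) + 0.75 * np.log(N)

# Upper boundary: take a high quantile of the per-plot intercepts.
log_alpha_hat = np.quantile(log_alpha_plot, 0.95)

# Deviation below the boundary (0 = fully stocked, larger = more disturbed/managed).
deviation = log_alpha_hat - log_alpha_plot
print(f"estimated boundary alpha: {np.exp(log_alpha_hat):.0f}")
print(f"median relative deficit below the boundary: {1 - np.exp(-np.median(deviation)):.2f}")
```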
Other Historical and Philosophical Perspectives on Invariance in Measurement
ERIC Educational Resources Information Center
Fisher, William P., Jr.
2008-01-01
Engelhard draws out the similarities and differences in Guttman's, Rasch's, and Mokken's perspectives on invariance in measurement. He provides a valuable model in evaluating the extent to which different measurement theories and methods serve as a basis for achieving the fundamental goals of quantification. The full extent of this point will…
Strauss, Daniel J; Delb, Wolfgang; D'Amelio, Roberto; Low, Yin Fen; Falkai, Peter
2008-02-01
Large-scale neural correlates of the tinnitus decompensation might be used for an objective evaluation of therapies and neurofeedback based therapeutic approaches. In this study, we try to identify large-scale neural correlates of the tinnitus decompensation using wavelet phase stability criteria of single sweep sequences of late auditory evoked potentials as synchronization stability measure. The extracted measure provided an objective quantification of the tinnitus decompensation and allowed for a reliable discrimination between a group of compensated and decompensated tinnitus patients. We provide an interpretation for our results by a neural model of top-down projections based on the Jastreboff tinnitus model combined with the adaptive resonance theory which has not been applied to model tinnitus so far. Using this model, our stability measure of evoked potentials can be linked to the focus of attention on the tinnitus signal. It is concluded that the wavelet phase stability of late auditory evoked potential single sweeps might be used as objective tinnitus decompensation measure and can be interpreted in the framework of the Jastreboff tinnitus model and adaptive resonance theory.
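A simplified stand-in for the wavelet phase-stability criterion is sketched below: single sweeps are convolved with a complex Morlet wavelet at one centre frequency and the inter-sweep phase stability is taken as the length of the mean phase vector. The sampling rate, frequency, number of cycles and synthetic sweeps are illustrative assumptions.

```python
# Sketch: an inter-sweep phase-stability index for single-sweep evoked potentials,
# computed with a complex Morlet wavelet at one centre frequency. This is a
# simplified stand-in for the paper's wavelet phase-stability criterion; the
# synthetic sweeps below mix a phase-locked component with noise.
import numpy as np

rng = np.random.default_rng(5)

fs = 500.0                                   # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)                # 1 s sweeps
n_sweeps, f0 = 60, 8.0                       # number of sweeps, analysis frequency

# Synthetic single sweeps: a weak phase-locked 8 Hz wave plus noise.
sweeps = (0.4 * np.sin(2 * np.pi * f0 * t)[None, :]
          + rng.normal(0, 1.0, (n_sweeps, t.size)))

# Complex Morlet wavelet at f0 (6 cycles).
sigma = 6.0 / (2 * np.pi * f0)
tw = np.arange(-3 * sigma, 3 * sigma, 1 / fs)
wavelet = np.exp(2j * np.pi * f0 * tw) * np.exp(-tw ** 2 / (2 * sigma ** 2))
wavelet /= np.abs(wavelet).sum()

# Instantaneous phase of every sweep, then the phase-stability index across sweeps
# (1 = identical phases in every sweep, ~0 = random phases).
phases = np.array([np.angle(np.convolve(s, wavelet, mode="same")) for s in sweeps])
stability = np.abs(np.exp(1j * phases).mean(axis=0))

print(f"mean phase-stability index: {stability.mean():.2f}")
print(f"chance level ~ 1/sqrt(n_sweeps) = {1 / np.sqrt(n_sweeps):.2f}")
```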
An information theory account of cognitive control.
Fan, Jin
2014-01-01
Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.
Quantification of lithium at ppm level in geological samples using nuclear reaction analysis.
De La Rosa, Nathaly; Kristiansson, Per; Nilsson, E J Charlotta; Ros, Linus; Pallon, Jan; Skogby, Henrik
2018-01-01
Proton-induced reaction (p,α) is one type of nuclear reaction analysis (NRA) suitable especially for light element quantification. In the case of lithium quantification presented in this work, accelerated protons with an energy about of 850 keV were used to induce the 7 Li(p,α) 4 He reaction in standard reference and geological samples such as tourmaline and other Li-minerals. It is shown that this technique for lithium quantification allowed for measurement of concentrations down below one ppm. The possibility to relate the lithium content with the boron content in a single analysis was also demonstrated using tourmaline samples, both in absolute concentration and in lateral distribution. In addition, Particle induced X-ray emission (PIXE) was utilized as a complementary IBA technique for simultaneous mapping of elements heavier than sodium.
Renormalization group theory for percolation in time-varying networks.
Karschau, Jens; Zimmerling, Marco; Friedrich, Benjamin M
2018-05-22
Motivated by multi-hop communication in unreliable wireless networks, we present a percolation theory for time-varying networks. We develop a renormalization group theory for a prototypical network on a regular grid, where individual links switch stochastically between active and inactive states. The question whether a given source node can communicate with a destination node along paths of active links is equivalent to a percolation problem. Our theory maps the temporal existence of multi-hop paths on an effective two-state Markov process. We show analytically how this Markov process converges towards a memoryless Bernoulli process as the hop distance between source and destination node increases. Our work extends classical percolation theory to the dynamic case and elucidates temporal correlations of message losses. Quantification of temporal correlations has implications for the design of wireless communication and control protocols, e.g. in cyber-physical systems such as self-organized swarms of drones or smart traffic networks.
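A direct Monte-Carlo version of the prototypical model can be written in a few lines: links of a regular grid switch between active and inactive states as independent two-state Markov chains, and source-destination connectivity is checked at each step. Transition probabilities, grid size and node labels below are illustrative.

```python
# Sketch: simulate the prototypical model from the abstract. Links of a regular grid
# switch between active and inactive states as independent two-state Markov chains;
# at each step we check whether the source still reaches the destination through
# active links, then estimate the lag-1 correlation of the loss process.
import numpy as np
import networkx as nx

rng = np.random.default_rng(6)

grid = nx.grid_2d_graph(6, 6)
edges = list(grid.edges())
source, destination = (0, 0), (5, 5)

p_up, p_down = 0.2, 0.1        # P(inactive -> active), P(active -> inactive)
steps = 5000

state = rng.random(len(edges)) < p_up / (p_up + p_down)   # start near stationarity
connected = np.empty(steps, dtype=bool)

for k in range(steps):
    # One Markov step per link.
    r = rng.random(len(edges))
    state = np.where(state, r >= p_down, r < p_up)
    active = nx.Graph([e for e, s in zip(edges, state) if s])
    connected[k] = (source in active and destination in active
                    and nx.has_path(active, source, destination))

loss = ~connected
lag1 = np.corrcoef(loss[:-1], loss[1:])[0, 1]
print(f"message delivery ratio: {connected.mean():.3f}")
print(f"lag-1 correlation of losses: {lag1:.3f}")
```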
NASA Astrophysics Data System (ADS)
Akram, Muhammad Farooq Bin
The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject-matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that, in the case of a lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions made during the elicitation process, when experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems encountered when deterministic or traditional probabilistic approaches are used for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSMs). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A test case for quantification of epistemic uncertainty on a large-scale problem, a combined-cycle power generation system, was selected, and a detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence-theory-based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability-theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing the higher-order technology interactions and the improvement in predicted system performance.
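As an illustration of the evidence-theory ingredient, the sketch below applies Dempster's rule of combination to two expert basic probability assignments over intervals of a technology's performance impact and reads off belief and plausibility bounds. The focal elements, masses and the queried interval are invented for illustration and are not the thesis's TSM formulation.

```python
# Sketch: Dempster's rule of combination for two experts whose basic probability
# assignments are given over intervals of a technology's performance impact (in %).
# Focal elements and masses are illustrative; the point is only how conflicting
# interval-valued expert opinions are fused and how belief/plausibility bound the
# result in an evidence-theory-based valuation.
from itertools import product

def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

def combine(m1, m2):
    """Dempster's rule for interval-valued focal elements."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = intersect(a, b)
        if inter is None:
            conflict += wa * wb
        else:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}, conflict

def belief(m, interval):
    return sum(w for a, w in m.items() if a[0] >= interval[0] and a[1] <= interval[1])

def plausibility(m, interval):
    return sum(w for a, w in m.items() if intersect(a, interval) is not None)

expert_1 = {(1.0, 4.0): 0.6, (3.0, 8.0): 0.3, (0.0, 10.0): 0.1}
expert_2 = {(5.0, 9.0): 0.7, (3.0, 6.0): 0.2, (0.0, 10.0): 0.1}

fused, conflict = combine(expert_1, expert_2)
target = (3.0, 7.0)   # query: "impact between 3 % and 7 %"
print(f"conflict mass: {conflict:.3f}")
print(f"Bel{target} = {belief(fused, target):.3f}, Pl{target} = {plausibility(fused, target):.3f}")
```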
Bauzá, Antonio; Quiñonero, David; Frontera, Antonio; Ballester, Pablo
2015-01-01
In this manuscript we consider from a theoretical point of view the recently reported experimental quantification of anion–π interactions (the attractive force between electron deficient aromatic rings and anions) in solution using aryl extended calix[4]pyrrole receptors as model systems. Experimentally, two series of calix[4]pyrrole receptors functionalized, respectively, with two and four aryl rings at the meso positions, were used to assess the strength of chloride–π interactions in acetonitrile solution. As a result of these studies the contribution of each individual chloride–π interaction was quantified to be very small (<1 kcal/mol). This result is in contrast with the values derived from most theoretical calculations. Herein we report a theoretical study using high-level density functional theory (DFT) calculations that provides a plausible explanation for the observed disagreement between theory and experiment. The study reveals the existence of molecular interactions between solvent molecules and the aromatic walls of the receptors that strongly modulate the chloride–π interaction. In addition, the obtained theoretical results also suggest that the chloride-calix[4]pyrrole complex used as reference to dissect experimentally the contribution of the chloride–π interactions to the total binding energy for both the two and four-wall aryl-extended calix[4]pyrrole model systems is probably not ideal. PMID:25913375
McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W
2015-03-27
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
Toyota, Akie; Akiyama, Hiroshi; Sugimura, Mitsunori; Watanabe, Takahiro; Kikuchi, Hiroyuki; Kanamori, Hisayuki; Hino, Akihiro; Esaka, Muneharu; Maitani, Tamio
2006-04-01
Because the labeling of grains and feed- and foodstuffs is mandatory if the genetically modified organism (GMO) content exceeds a certain level of approved genetically modified varieties in many countries, there is a need for a rapid and useful method of GMO quantification in food samples. In this study, a rapid detection system was developed for Roundup Ready Soybean (RRS) quantification using a combination of a capillary-type real-time PCR system, a LightCycler real-time PCR system, and plasmid DNA as the reference standard. In addition, we showed for the first time that the plasmid and genomic DNA should be similar in the established detection system because the PCR efficiencies of using plasmid DNA and using genomic DNA were not significantly different. The conversion factor (Cf) to calculate RRS content (%) was further determined from the average value analyzed in three laboratories. The accuracy and reproducibility of this system for RRS quantification at a level of 5.0% were within a range from 4.46 to 5.07% for RRS content and within a range from 2.0% to 7.0% for the relative standard deviation (RSD) value, respectively. This system rapidly monitored the labeling system and had allowable levels of accuracy and precision.
2016-01-01
Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to matrix effect. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample as well as in a series of samples added with melamine standards was calculated and then the Euclidean norms of series standards were used to build a straightforward univariate regression model. The analysis results of 10 different brands/types of milk powders with melamine levels 0~0.12% (w/w) indicate that SANAS obtained accurate results with the root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is to visualize and control the possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders. PMID:27525154
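The NAS computation is not reproduced here; the sketch below shows only the standard-addition step, in which the norm of the net analyte signal grows linearly with the added analyte and the x-axis intercept of the fitted line returns the native content. The spiking levels and signal values are synthetic.

```python
# Sketch: the standard-addition step of the approach. After the net analyte signal
# (NAS) of the unknown and of each spiked sample is computed (not reproduced here),
# its Euclidean norm grows linearly with the added analyte; extrapolating the line
# to zero signal gives the native content. Numbers below are synthetic.
import numpy as np

rng = np.random.default_rng(7)

# Added melamine (% w/w) and the corresponding NAS norms (synthetic, linear + noise).
added = np.array([0.00, 0.02, 0.04, 0.06, 0.08])
true_native = 0.03
signal = 50.0 * (true_native + added) + rng.normal(0, 0.05, added.size)

slope, intercept = np.polyfit(added, signal, 1)

# Standard addition: the x-axis intercept (signal = 0) sits at minus the native content.
native_estimate = intercept / slope
print(f"estimated native melamine content: {native_estimate:.3f} % (w/w)")
```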
An information theory account of cognitive control
Fan, Jin
2014-01-01
Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory. PMID:25228875
Quantification of brain lipids by FTIR spectroscopy and partial least squares regression
NASA Astrophysics Data System (ADS)
Dreissig, Isabell; Machill, Susanne; Salzer, Reiner; Krafft, Christoph
2009-01-01
Brain tissue is characterized by high lipid content. Its lipid content decreases and the lipid composition changes during the transformation from normal brain tissue to tumors. Therefore, the analysis of brain lipids might complement the existing diagnostic tools to determine the tumor type and tumor grade. The objective of this work is to extract lipids from gray matter and white matter of porcine brain tissue, record infrared (IR) spectra of these extracts and develop a quantification model for the main lipids based on partial least squares (PLS) regression. IR spectra of the pure lipids cholesterol, cholesterol ester, phosphatidic acid, phosphatidylcholine, phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, sphingomyelin, galactocerebroside and sulfatide were used as references. Two lipid mixtures were prepared for training and validation of the quantification model. The composition of the lipid extracts predicted by PLS regression of the IR spectra was compared with lipid quantification by thin-layer chromatography.
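A minimal PLS calibration of the kind described can be set up as below; the "spectra" are synthetic mixtures of two Gaussian bands standing in for measured IR spectra of the lipid extracts, and the band positions and component count are illustrative.

```python
# Sketch: a PLS calibration of component concentrations from spectra, the kind of
# model described in the abstract. The "spectra" are synthetic mixtures of two
# Gaussian reference bands plus noise; real IR spectra of the ten lipid classes
# would replace them.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)

wavenumber = np.linspace(800, 1800, 300)
ref_a = np.exp(-((wavenumber - 1080) / 30) ** 2)   # e.g. phospholipid-like band
ref_b = np.exp(-((wavenumber - 1465) / 25) ** 2)   # e.g. cholesterol-like band

conc = rng.uniform(0, 1, size=(80, 2))             # two-component concentrations
spectra = conc @ np.vstack([ref_a, ref_b]) + rng.normal(0, 0.01, (80, wavenumber.size))

X_train, X_test, y_train, y_test = train_test_split(spectra, conc, random_state=0)

pls = PLSRegression(n_components=4).fit(X_train, y_train)
rmsep = np.sqrt(((pls.predict(X_test) - y_test) ** 2).mean(axis=0))
print(f"RMSEP per component: {np.round(rmsep, 4)}")
```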
Uncertainty quantification and propagation in nuclear density functional theory
Schunck, N.; McDonnell, J. D.; Higdon, D.; ...
2015-12-23
Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While on-going efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this study, we review recent efforts to quantify the related uncertainties, and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties and Bayesian inference methods. Illustrative examples are taken from the literature.
Salvadó, Humbert
2016-09-01
Bulking and foaming phenomena in activated sludge wastewater treatment plants are in most cases related to the abundance of filamentous microorganisms. Quantifying these microorganisms should be a preliminary stage in their control. In this paper, the simplicity of quantifying them based on the intersection method is demonstrated, by redescribing the theory and applying a new improved protocol; new data of interest are also provided. The improved method allows us to use it with stained smears, including epifluorescence techniques. The error that could be made, when considering the distribution of filamentous bacteria in fresh microscope preparations in two dimensions rather than three is negligible. The effect of the different types of filamentous microorganisms on the settleability was also studied. The effect of the total extended filament length on the sludge settleability was shown to depend on the type of filamentous organism and how it aggregates. When these groups of filamentous organisms are found in small aggregations and there is an increase in the number of filamentous organisms, the sludge volume index (SVI) increases proportionally to the filament length. However, when aggregation increases, the impact on the SVI is significantly lower.
Quantification of Noise Sources in EMI Surveys
2012-04-09
Naval Research Laboratory, Code 6110, 4555 Overlook Avenue, SW, Washington, DC 20375-5320. Report NRL/MR/6110--12-9400, ESTCP MR-0508 Final Guidance Report: Quantification of Noise Sources in EMI Surveys. Authors include Barrow, Jonathan T. Miller, and Thomas H. Bell. Only the report cover-page information survives in this record.
Gallizzi, Michael A; Khazai, Ravand S; Gagnon, Christine M; Bruehl, Stephen; Harden, R Norman
2015-03-01
To correlate the amount and types of pain medications prescribed to CRPS patients, using the Medication Quantification Scale, and patients' subjective pain levels. An international, multisite, retrospective review. University medical centers in the United States, Israel, Germany, and the Netherlands. A total of 89 subjects were enrolled from four different countries: 27 from the United States, 20 Germany, 18 Netherlands, and 24 Israel. The main outcome measures used were the Medication Quantification Scale III and numerical analog pain scale. There was no statistically significant correlation noted between the medication quantification scale and the visual analog scale for any site except for a moderate positive correlation at German sites. The medication quantification scale mean differences between the United States and Germany, the Netherlands, and Israel were 9.793 (P < 0.002), 10.389 (P < 0.001), and 4.984 (P = 0.303), respectively. There appears to be only a weak correlation between amount of pain medication prescribed and patients' reported subjective pain intensity within this limited patient population. The Medication Quantification Scale is a viable tool for the analysis of pharmaceutical treatment of CRPS patients and would be useful in further prospective studies of pain medication prescription practices in the CRPS population worldwide. Wiley Periodicals, Inc.
Champion, Christophe; Quinto, Michele A.; Bug, Marion U.; ...
2014-07-29
Electron-induced ionization of the commonly used surrogate of the DNA sugar-phosphate backbone, namely, the tetrahydrofuran molecule, is here theoretically described within the 1 st Born approximation by means of quantum-mechanical approach. Comparisons between theory and recent experiments are reported in terms of doubly and singly differential cross sections.
NASA Astrophysics Data System (ADS)
Chang, Li-Na; Luo, Shun-Long; Sun, Yuan
2017-11-01
The principle of superposition is universal and lies at the heart of quantum theory. Although ever since the inception of quantum mechanics a century ago, superposition has occupied a central and pivotal place, rigorous and systematic studies of the quantification issue have attracted significant interests only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by Science Challenge Project under Grant No. TZ2016002, Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, Grant under No. 2008DP173182
NASA Astrophysics Data System (ADS)
Vrugt, J. A.
2012-12-01
In the past decade much progress has been made in the treatment of uncertainty in earth systems modeling. Whereas initial approaches have focused mostly on quantification of parameter and predictive uncertainty, recent methods attempt to disentangle the effects of parameter, forcing (input) data, model structural and calibration data errors. In this talk I will highlight some of our recent work involving theory, concepts and applications of Bayesian parameter and/or state estimation. In particular, new methods for sequential Monte Carlo (SMC) and Markov Chain Monte Carlo (MCMC) simulation will be presented with emphasis on massively parallel distributed computing and quantification of model structural errors. The theoretical and numerical developments will be illustrated using model-data synthesis problems in hydrology, hydrogeology and geophysics.
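To make the MCMC ingredient concrete, the sketch below implements a plain random-walk Metropolis sampler for a two-parameter linear model with Gaussian noise; it illustrates only the basic machinery behind Bayesian parameter estimation, not the parallel multi-chain algorithms referred to in the talk.

```python
# Sketch: a minimal random-walk Metropolis sampler for a two-parameter model
# y = a*x + b with Gaussian measurement noise. This only illustrates the basic
# MCMC machinery behind Bayesian parameter estimation; it is not the massively
# parallel / multi-chain machinery referenced in the talk.
import numpy as np

rng = np.random.default_rng(9)

# Synthetic calibration data.
x = np.linspace(0, 10, 50)
a_true, b_true, sigma = 1.5, 2.0, 0.5
y = a_true * x + b_true + rng.normal(0, sigma, x.size)

def log_posterior(theta):
    a, b = theta
    resid = y - (a * x + b)
    log_like = -0.5 * np.sum((resid / sigma) ** 2)
    log_prior = 0.0 if (-10 < a < 10 and -10 < b < 10) else -np.inf  # flat box prior
    return log_like + log_prior

n_steps, step = 20000, 0.05
chain = np.empty((n_steps, 2))
theta = np.array([0.0, 0.0])
logp = log_posterior(theta)
accepted = 0

for k in range(n_steps):
    proposal = theta + rng.normal(0, step, 2)
    logp_new = log_posterior(proposal)
    if np.log(rng.random()) < logp_new - logp:   # Metropolis acceptance rule
        theta, logp = proposal, logp_new
        accepted += 1
    chain[k] = theta

burned = chain[n_steps // 2:]                    # discard burn-in
print(f"acceptance rate: {accepted / n_steps:.2f}")
print(f"a = {burned[:, 0].mean():.2f} +/- {burned[:, 0].std():.2f}")
print(f"b = {burned[:, 1].mean():.2f} +/- {burned[:, 1].std():.2f}")
```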
McDonnell, J. D.; Schunck, N.; Higdon, D.; ...
2015-03-24
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonnell, J. D.; Schunck, N.; Higdon, D.
2015-03-24
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
DOT National Transportation Integrated Search
2005-09-01
The Long-Term Pavement Performance (LTPP) program was designed as a 20-year study of pavement performance. A major data collection effort at LTPP test sections is the collection of longitudinal profile data using inertial profilers. Three types of in...
Sensor Selection for Aircraft Engine Performance Estimation and Gas Path Fault Diagnostics
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan W.
2015-01-01
This paper presents analytical techniques for aiding system designers in making aircraft engine health management sensor selection decisions. The presented techniques, which are based on linear estimation and probability theory, are tailored for gas turbine engine performance estimation and gas path fault diagnostics applications. They enable quantification of the performance estimation and diagnostic accuracy offered by different candidate sensor suites. For performance estimation, sensor selection metrics are presented for two types of estimators including a Kalman filter and a maximum a posteriori estimator. For each type of performance estimator, sensor selection is based on minimizing the theoretical sum of squared estimation errors in health parameters representing performance deterioration in the major rotating modules of the engine. For gas path fault diagnostics, the sensor selection metric is set up to maximize correct classification rate for a diagnostic strategy that performs fault classification by identifying the fault type that most closely matches the observed measurement signature in a weighted least squares sense. Results from the application of the sensor selection metrics to a linear engine model are presented and discussed. Given a baseline sensor suite and a candidate list of optional sensors, an exhaustive search is performed to determine the optimal sensor suites for performance estimation and fault diagnostics. For any given sensor suite, Monte Carlo simulation results are found to exhibit good agreement with theoretical predictions of estimation and diagnostic accuracies.
Sensor Selection for Aircraft Engine Performance Estimation and Gas Path Fault Diagnostics
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan W.
2016-01-01
This paper presents analytical techniques for aiding system designers in making aircraft engine health management sensor selection decisions. The presented techniques, which are based on linear estimation and probability theory, are tailored for gas turbine engine performance estimation and gas path fault diagnostics applications. They enable quantification of the performance estimation and diagnostic accuracy offered by different candidate sensor suites. For performance estimation, sensor selection metrics are presented for two types of estimators including a Kalman filter and a maximum a posteriori estimator. For each type of performance estimator, sensor selection is based on minimizing the theoretical sum of squared estimation errors in health parameters representing performance deterioration in the major rotating modules of the engine. For gas path fault diagnostics, the sensor selection metric is set up to maximize correct classification rate for a diagnostic strategy that performs fault classification by identifying the fault type that most closely matches the observed measurement signature in a weighted least squares sense. Results from the application of the sensor selection metrics to a linear engine model are presented and discussed. Given a baseline sensor suite and a candidate list of optional sensors, an exhaustive search is performed to determine the optimal sensor suites for performance estimation and fault diagnostics. For any given sensor suite, Monte Carlo simulation results are found to exhibit good agreement with theoretical predictions of estimation and diagnostic accuracies.
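A minimal version of the exhaustive-search selection described above can be written down for the MAP estimator: each candidate suite is scored by the trace of its posterior (estimation-error) covariance. The influence coefficients, noise levels, sensor names and prior below are illustrative placeholders, not an actual engine model.

```python
# Sketch: exhaustive sensor-suite selection scored by the theoretical sum of squared
# estimation errors of a maximum a posteriori (MAP) estimator, i.e. the trace of
# (H^T R^-1 H + P0^-1)^-1 for the sensors H in the suite. The influence matrix,
# noise levels and sensor names are illustrative, not an actual engine model.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(10)

n_health = 4                                   # health parameters to estimate
sensors = ["N1", "N2", "P24", "T24", "P50", "T50", "Wf"]
H_full = rng.normal(0, 1, (len(sensors), n_health))   # sensor influence coefficients
noise_sd = np.array([0.5, 0.5, 1.0, 1.0, 1.5, 1.5, 0.8])
P0 = np.eye(n_health) * 4.0                    # prior covariance of health parameters

def map_error_trace(idx):
    """Sum of squared estimation errors for the sensor subset with indices idx."""
    H = H_full[list(idx)]
    R_inv = np.diag(1.0 / noise_sd[list(idx)] ** 2)
    post_cov = np.linalg.inv(H.T @ R_inv @ H + np.linalg.inv(P0))
    return np.trace(post_cov)

baseline = (0, 1)                              # always-installed sensors
optional = range(2, len(sensors))

# Exhaustive search over which two optional sensors to add to the baseline suite.
best = min(combinations(optional, 2), key=lambda extra: map_error_trace(baseline + extra))

print("best two sensors to add:", [sensors[i] for i in best])
print(f"resulting sum of squared estimation errors: {map_error_trace(baseline + best):.3f}")
```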
Winpenny, David; Clark, Mellissa
2016-01-01
Background and Purpose Biased GPCR ligands are able to engage with their target receptor in a manner that preferentially activates distinct downstream signalling and offers potential for next generation therapeutics. However, accurate quantification of ligand bias in vitro is complex, and current best practice is not amenable for testing large numbers of compound. We have therefore sought to apply ligand bias theory to an industrial scale screening campaign for the identification of new biased μ receptor agonists. Experimental Approach μ receptor assays with appropriate dynamic range were developed for both Gαi‐dependent signalling and β‐arrestin2 recruitment. Δlog(Emax/EC50) analysis was validated as an alternative for the operational model of agonism in calculating pathway bias towards Gαi‐dependent signalling. The analysis was applied to a high throughput screen to characterize the prevalence and nature of pathway bias among a diverse set of compounds with μ receptor agonist activity. Key Results A high throughput screening campaign yielded 440 hits with greater than 10‐fold bias relative to DAMGO. To validate these results, we quantified pathway bias of a subset of hits using the operational model of agonism. The high degree of correlation across these biased hits confirmed that Δlog(Emax/EC50) was a suitable method for identifying genuine biased ligands within a large collection of diverse compounds. Conclusions and Implications This work demonstrates that using Δlog(Emax/EC50), drug discovery can apply the concept of biased ligand quantification on a large scale and accelerate the deliberate discovery of novel therapeutics acting via this complex pharmacology. PMID:26791140
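The Δlog(Emax/EC50) calculation validated in the study can be sketched directly: concentration-response curves for a reference and a test agonist are fitted in two pathways and the double difference of log(Emax/EC50) gives the bias factor. The Hill-type model, the synthetic responses and the potency values below are illustrative assumptions.

```python
# Sketch: the Delta-log(Emax/EC50) bias calculation that the abstract validates.
# Concentration-response curves for a reference agonist and a test compound are
# fitted in two pathways (G-protein signalling, beta-arrestin2 recruitment);
# bias is the double difference of log(Emax/EC50). Data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, emax, ec50):
    return emax * conc / (ec50 + conc)

def log_emax_over_ec50(conc, resp):
    (emax, ec50), _ = curve_fit(hill, conc, resp, p0=[resp.max(), np.median(conc)])
    return np.log10(emax / ec50)

rng = np.random.default_rng(11)
conc = np.logspace(-10, -5, 12)                      # molar

def simulate(emax, ec50):
    return hill(conc, emax, ec50) + rng.normal(0, 2, conc.size)

# Synthetic responses (% of maximum) in the two pathways.
data = {
    "reference": {"G": simulate(100, 3e-8), "arrestin": simulate(100, 3e-8)},
    "test":      {"G": simulate(95, 2e-8),  "arrestin": simulate(60, 3e-7)},
}

delta = {}
for ligand, pathways in data.items():
    delta[ligand] = (log_emax_over_ec50(conc, pathways["G"])
                     - log_emax_over_ec50(conc, pathways["arrestin"]))

ddlog = delta["test"] - delta["reference"]
print(f"Delta-Delta-log(Emax/EC50): {ddlog:.2f} "
      f"(~{10 ** ddlog:.0f}-fold bias toward G-protein signalling)")
```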
1985-12-01
Confirmation/Quantification, Moody AFB, GA. Steinberg, J.A. and Thiess, W.G. Only fragments of the scanned report remain legible: the surface soils on the high western portion of the base are mostly in the Tifton series, and an abbreviation list defines FDER (Florida Department of Environmental Regulation), FWQS (Florida Water Quality Standards), gpd (gallons per day), gpm (gallons per minute) and GC (gas chromatograph).
Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi
2008-07-23
A novel type of quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of the Roundup Ready soybean (RRS) was developed. This system was designed to take advantage of a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor plasmid for the detection and quantification of genetically modified soy, RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), and both carried a 21 bp oligonucleotide insertion. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to precisely and stably adjust the copy number of targets. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism contents.
Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S
2016-03-01
The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).
Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David
2016-01-01
The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer–based qPCR test for P. destructans to refine quantification capabilities of this assay.
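The underlying qPCR quantification step can be illustrated with a standard-curve calculation; the curve points and the sample Cq below are invented for the sketch and are not values from this assay.

    import numpy as np

    # Hypothetical standard curve: log10(copies) vs. quantification cycle (Cq).
    log_copies = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    cq         = np.array([36.1, 32.8, 29.4, 26.1, 22.7])

    slope, intercept = np.polyfit(log_copies, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0       # amplification efficiency implied by the slope

    def copies_from_cq(sample_cq):
        """Interpolate target copy number from a sample Cq using the fitted standard curve."""
        return 10 ** ((sample_cq - intercept) / slope)

    print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}, copies at Cq 30 = {copies_from_cq(30.0):.0f}")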
Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.
Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew
2018-02-01
Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, the new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of a reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The results obtained suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Mujumdar, Pradeep P.
2014-05-01
Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns convert into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been assessment of impacts of climate change at regional scales. An important research issue addressed in this context deals with responses of water fluxes on a catchment scale to the global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by the General Circulation Models (GCMs) for specified emission scenarios in conjunction with the process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem arising because of the large spatial scales at which the GCMs operate compared to those required in distributed hydrologic models, and their inability to satisfactorily simulate the variables of interest to hydrology are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, small samples of historical data against which the models are calibrated, downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change addressing scale issues and quantification of uncertainties is provided. Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.
Krishnaiah, Yellela S R; Katragadda, Usha; Khan, Mansoor A
2014-05-01
Cold flow is a phenomenon occurring in drug-in-adhesive type of transdermal drug delivery systems (DIA-TDDS) because of the migration of DIA coat beyond the edge. Excessive cold flow can affect their therapeutic effectiveness, make removal of DIA-TDDS difficult from the pouch, and potentially decrease available dose if any drug remains adhered to pouch. There are no compendial or noncompendial methods available for quantification of this critical quality attribute. The objective was to develop a method for quantification of cold flow using stereomicroscopic imaging technique. Cold flow was induced by applying 1 kg force on punched-out samples of marketed estradiol DIA-TDDS (model product) stored at 25°C, 32°C, and 40°C/60% relative humidity (RH) for 1, 2, or 3 days. At the end of testing period, dimensional change in the area of DIA-TDDS samples was measured using image analysis software, and expressed as percent of cold flow. The percent of cold flow significantly decreased (p < 0.001) with increase in size of punched-out DIA-TDDS samples and increased (p < 0.001) with increase in cold flow induction temperature and time. This first ever report suggests that dimensional change in the area of punched-out samples stored at 32°C/60%RH for 2 days applied with 1 kg force could be used for quantification of cold flow in DIA-TDDS. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
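Under the reading that the percent of cold flow is the relative increase in the punched-out sample's area after loading (an assumption; the study's exact formula is not reproduced in the abstract), the metric reduces to a one-line calculation:

    def percent_cold_flow(area_before_mm2, area_after_mm2):
        """Percent cold flow as the relative increase in punched-out sample area."""
        return 100.0 * (area_after_mm2 - area_before_mm2) / area_before_mm2

    # Hypothetical image-analysis areas in mm^2, not values from the study.
    print(f"{percent_cold_flow(254.5, 271.3):.1f} % cold flow")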
Morariu, Cosmin Adrian; Terheiden, Tobias; Dohle, Daniel Sebastian; Tsagakis, Konstantinos; Pauli, Josef
2016-02-01
Our goal is to provide precise measurements of the aortic dimensions in the case of dissection pathologies. Quantification of surface lengths and aortic radii/diameters together with the visualization of the dissection membrane represent crucial prerequisites for enabling minimally invasive treatment of type A dissections, which always also involve the ascending aorta. We seek a measure invariant to luminance and contrast for aortic outer wall segmentation. Therefore, we propose a 2D graph-based approach using phase congruency combined with additional features. Phase congruency is extended to 3D by designing a novel conic directional filter and adding a lowpass component to the 3D Log-Gabor filterbank for extracting the fine dissection membrane, which separates the true lumen from the false one within the aorta. The result of the outer wall segmentation is compared with manually annotated axial slices belonging to 11 CTA datasets. Quantitative assessment of our novel 2D/3D membrane extraction algorithms has been obtained for 10 datasets and reveals subvoxel accuracy in all cases. Aortic inner and outer surface lengths, determined within 2 cadaveric CT datasets, are validated against manual measurements performed by a vascular surgeon on excised aortas of the body donors. This contribution proposes a complete pipeline for segmentation and quantification of aortic dissections. Validation against ground truth of the 3D contour lengths quantification represents a significant step toward custom-designed stent-grafts.
Higuchi equation: derivation, applications, use and misuse.
Siepmann, Juergen; Peppas, Nicholas A
2011-10-10
Fifty years ago, the legendary Professor Takeru Higuchi published the derivation of an equation that allowed for the quantification of drug release into a perfect sink from thin ointment films containing finely dispersed drug. This became the famous Higuchi equation whose fiftieth anniversary we celebrate this year. Despite the complexity of the involved mass transport processes, Higuchi derived a very simple equation, which is easy to use. Based on a pseudo-steady-state approach, a direct proportionality between the cumulative amount of drug released and the square root of time can be demonstrated. In contrast to various other "square root of time" release kinetics, the constant of proportionality in the classical Higuchi equation has a specific, physically realistic meaning. The major benefits of this equation include the possibility to (i) facilitate device optimization and (ii) better understand the underlying drug release mechanisms. The equation can also be applied to types of drug delivery systems other than thin ointment films, e.g., controlled release transdermal patches or films for oral controlled drug delivery. Later, the equation was extended to other geometries and related theories have been proposed. The aim of this review is to highlight the assumptions the derivation of the classical Higuchi equation is based on and to give an overview of the use and potential misuse of this equation as well as of related theories. Copyright © 2011 Elsevier B.V. All rights reserved.
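For reference, the classical Higuchi relation discussed in this review is commonly written, in the notation assumed here (Q the cumulative drug released per unit film area at time t, D the drug diffusion coefficient in the matrix, c_0 the initial drug loading and c_s the drug solubility in the matrix, with c_0 > c_s), as

\[
Q(t) \;=\; \sqrt{D\,(2c_0 - c_s)\,c_s\,t}
\;\;\approx\;\; \sqrt{2\,D\,c_0\,c_s\,t} \quad (c_0 \gg c_s),
\]

which makes explicit both the square-root-of-time dependence and the physically meaningful proportionality constant referred to above.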
Russek, Natanya S; Jensen, Matthew B
2014-03-01
Ischemic stroke is a leading cause of death and disability, and current treatments to limit tissue injury and improve recovery are limited. Cerebral infarction is accompanied by intense brain tissue inflammation involving many inflammatory cell types that may cause both negative and positive effects on outcomes. Many potential neuroprotective and neurorestorative treatments may affect, and be affected by, this inflammatory cell infiltration, so that accurate quantification of this tissue response is needed. We performed a systematic review of histological methods to quantify brain tissue inflammatory cell infiltration after cerebral infarction. We found reports of multiple techniques to quantify different inflammatory cell types. We found no direct comparison studies and conclude that more research is needed to optimize the assessment of this important stroke outcome.
van Dooren, Ines; Foubert, Kenn; Theunis, Mart; Naessens, Tania; Pieters, Luc; Apers, Sandra
2018-01-30
The berries of Vaccinium macrocarpon, cranberry, are widely used for the prevention of urinary tract infections. This species contains A-type proanthocyanidins (PACs), which intervene in the initial phase of the development of urinary tract infections by preventing the adherence of Escherichia coli by their P-type fimbriae to uroepithelial cells. Unfortunately, the existing clinical studies used different cranberry preparations, which were poorly standardized. Because of this, the results were hard to compare, which sometimes led to conflicting results. Currently, PACs are quantified using the rather non-specific spectrophotometric 4-dimethylaminocinnamaldehyde (DMAC) method. In addition, a normal phase HPTLC-densitometric method, a HPLC-UV method and three LC-MS/MS methods for quantification of procyanidin A2 were recently published. All these methods contain some shortcomings and errors. Hence, the development and validation of a fast and sensitive standard addition LC-MS/MS method for the simultaneous quantification of A-type dimers and trimers in a cranberry dry extract was carried out. A linear calibration model could be adopted for dimers and, after logarithmic transformation, for trimers. The maximal interday and interconcentration precision was found to be 4.86% and 4.28% for procyanidin A2, and 5.61% and 7.65% for trimeric PACs, which are all acceptable values for an analytical method using LC-MS/MS. In addition, twelve different cranberry extracts were analyzed by means of the newly validated method and other widely used methods. There appeared to be an enormous variation in dimeric and trimeric PAC content. Comparison of these results with LC-MS/MS analysis without standard addition showed the presence of matrix effects for some of the extracts and proved the necessity of standard addition. A comparison of the well-known and widely used DMAC method, the butanol-HCl assay and this newly developed LC-MS/MS method clearly indicated the need for a reliable method able to quantify A-type PACs, which are considered to be the pharmacologically active constituents of cranberry, since neither the DMAC nor the butanol-HCl assay is capable of distinguishing between A- and B-type PACs and therefore cannot detect adulterations with, for example, extracts with a high B-type PAC content. Hence, the combination of the DMAC method or butanol-HCl assay with this more specific LC-MS/MS assay could overcome these shortcomings. Copyright © 2017 Elsevier B.V. All rights reserved.
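The standard-addition quantification at the heart of the validated method can be sketched as follows; the spiked concentrations and peak areas are invented for illustration. The magnitude of the x-axis intercept of the regression of signal on added analyte gives the endogenous concentration in the extract.

    import numpy as np

    # Hypothetical standard-addition series: added procyanidin A2 (µg/mL) vs. LC-MS/MS peak area.
    added  = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
    signal = np.array([1520.0, 2280.0, 3050.0, 4590.0, 7620.0])

    slope, intercept = np.polyfit(added, signal, 1)
    endogenous = intercept / slope   # magnitude of the x-axis intercept
    print(f"estimated endogenous concentration = {endogenous:.2f} µg/mL")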
Quantification of complex modular architecture in plants.
Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain
2018-04-01
Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
Learning and Information Approaches for Inference in Dynamic Data-Driven Geophysical Applications
NASA Astrophysics Data System (ADS)
Ravela, S.
2015-12-01
Many geophysical inference problems are characterized by non-linear processes, high-dimensional models and complex uncertainties. A dynamic coupling between models, estimation, and sampling is typically sought to efficiently characterize and reduce uncertainty. This process is however fraught with several difficulties. Among them, the key difficulties are dealing with model errors and the efficacy of uncertainty quantification and data assimilation. In this presentation, we present three key ideas from learning and intelligent systems theory and apply them to two geophysical applications. The first idea is the use of Ensemble Learning to compensate for model error, the second is to develop tractable Information Theoretic Learning to deal with non-Gaussianity in inference, and the third is a Manifold Resampling technique for effective uncertainty quantification. We first apply these methods to the development of a cooperative autonomous observing system using sUAS for studying coherent structures. Second, we apply them to the problem of quantifying risk from hurricanes and storm surges in a changing climate. Results indicate that learning approaches can enable new effectiveness in cases where standard approaches to model reduction, uncertainty quantification and data assimilation fail.
Uncertainty quantification of effective nuclear interactions
Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz
2016-03-02
We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean-field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.
Uncertainty quantification of effective nuclear interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz
We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean-field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.
Quantification of correlations in quantum many-particle systems.
Byczuk, Krzysztof; Kuneš, Jan; Hofstetter, Walter; Vollhardt, Dieter
2012-02-24
We introduce a well-defined and unbiased measure of the strength of correlations in quantum many-particle systems which is based on the relative von Neumann entropy computed from the density operator of correlated and uncorrelated states. The usefulness of this general concept is demonstrated by quantifying correlations of interacting electrons in the Hubbard model and in a series of transition-metal oxides using dynamical mean-field theory.
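A measure of this kind is commonly built on the quantum relative entropy between the correlated density operator ρ and an uncorrelated reference ρ₀ (the particular reference state used by the authors is not reproduced here):

\[
S(\rho \,\|\, \rho_0) \;=\; \mathrm{Tr}\!\left[\rho\,(\ln\rho - \ln\rho_0)\right] \;\ge\; 0 ,
\]

which vanishes if and only if ρ = ρ₀, so that larger values indicate stronger correlations relative to the uncorrelated reference.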
Theory and data for simulating fine-scale human movement in an urban environment
Perkins, T. Alex; Garcia, Andres J.; Paz-Soldán, Valerie A.; Stoddard, Steven T.; Reiner, Robert C.; Vazquez-Prokopec, Gonzalo; Bisanzio, Donal; Morrison, Amy C.; Halsey, Eric S.; Kochel, Tadeusz J.; Smith, David L.; Kitron, Uriel; Scott, Thomas W.; Tatem, Andrew J.
2014-01-01
Individual-based models of infectious disease transmission depend on accurate quantification of fine-scale patterns of human movement. Existing models of movement either pertain to overly coarse scales, simulate some aspects of movement but not others, or were designed specifically for populations in developed countries. Here, we propose a generalizable framework for simulating the locations that an individual visits, time allocation across those locations, and population-level variation therein. As a case study, we fit alternative models for each of five aspects of movement (number, distance from home and types of locations visited; frequency and duration of visits) to interview data from 157 residents of the city of Iquitos, Peru. Comparison of alternative models showed that location type and distance from home were significant determinants of the locations that individuals visited and how much time they spent there. We also found that for most locations, residents of two neighbourhoods displayed indistinguishable preferences for visiting locations at various distances, despite differing distributions of locations around those neighbourhoods. Finally, simulated patterns of time allocation matched the interview data in a number of ways, suggesting that our framework constitutes a sound basis for simulating fine-scale movement and for investigating factors that influence it. PMID:25142528
NASA Astrophysics Data System (ADS)
Köhler, Reinhard
2014-12-01
We have long been used to the domination of qualitative methods in modern linguistics. Indeed, qualitative methods have advantages such as ease of use and wide applicability to many types of linguistic phenomena. However, this shall not overshadow the fact that a great part of human language is amenable to quantification. Moreover, qualitative methods may lead to over-simplification by employing the rigid yes/no scale. When variability and vagueness of human language must be taken into account, qualitative methods will prove inadequate and give way to quantitative methods [1, p. 11]. In addition to such advantages as exactness and precision, quantitative concepts and methods make it possible to find laws of human language which are just like those in natural sciences. These laws are fundamental elements of linguistic theories in the spirit of the philosophy of science [2,3]. Theorization effort of this type is what quantitative linguistics [1,4,5] is devoted to. The review of Cong and Liu [6] has provided an informative and insightful survey of linguistic complex networks as a young field of quantitative linguistics, including the basic concepts and measures, the major lines of research with linguistic motivation, and suggestions for future research.
Quantification of dichlorvos released from kill strips used in boll weevil eradication programs
USDA-ARS?s Scientific Manuscript database
Two types of kill strips, Hercon Vaportape II and Plato Insecticide Strip, are used by boll weevil, Anthonomus grandis (Boheman), eradication programs in the U.S. Both types utilize dichlorvos as the killing agent and are marketed to last up to a month in traps. Consequently, programs typically re...
Applications of Jungian Type Theory to Counselor Education.
ERIC Educational Resources Information Center
Dilley, Josiah S.
1987-01-01
Describes Carl Jung's theory of psychological type and the Myers-Briggs Type Indicator (MBTI), an instrument to assess Jungian type. Cites sources of information on the research and application of the theory and the MBTI. Explores how knowledge of type theory can be useful to counselor educators. (Author)
Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method
NASA Astrophysics Data System (ADS)
Zhang, Xiangnan
2018-03-01
Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal design parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
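In the notation assumed here (d the design variables, X the random variables, C a cost function and g_i limit-state functions with g_i ≤ 0 denoting failure), the RBDO problem referred to above is typically stated as

\[
\min_{\mathbf d}\; C(\mathbf d)
\quad \text{subject to} \quad
\Pr\!\left[g_i(\mathbf d,\mathbf X)\le 0\right] \;\le\; P_{f,i}^{\mathrm{target}},
\qquad i=1,\dots,m,
\]

and the Bayesian integration described in the abstract then amounts to replacing the assumed distribution of X with a posterior updated from whatever incomplete information is available.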
Error and Uncertainty Quantification in the Numerical Simulation of Complex Fluid Flows
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2010-01-01
The failure of numerical simulation to predict physical reality is often a direct consequence of the compounding effects of numerical error arising from finite-dimensional approximation and physical model uncertainty resulting from inexact knowledge and/or statistical representation. In this topical lecture, we briefly review systematic theories for quantifying numerical errors and restricted forms of model uncertainty occurring in simulations of fluid flow. A goal of this lecture is to elucidate both positive and negative aspects of applying these theories to practical fluid flow problems. Finite-element and finite-volume calculations of subsonic and hypersonic fluid flow are presented to contrast the differing roles of numerical error and model uncertainty for these problems.
Fundamentals of multiplexing with digital PCR.
Whale, Alexandra S; Huggett, Jim F; Tzonev, Svilen
2016-12-01
Over the past decade numerous publications have demonstrated how digital PCR (dPCR) enables precise and sensitive quantification of nucleic acids in a wide range of applications in both healthcare and environmental analysis. This has occurred in parallel with the advances in partitioning fluidics that enable a reaction to be subdivided into an increasing number of partitions. As the majority of dPCR systems are based on detection in two discrete optical channels, most research to date has focused on quantification of one or two targets within a single reaction. Here we describe 'higher order multiplexing' that is the unique ability of dPCR to precisely measure more than two targets in the same reaction. Using examples, we describe the different types of duplex and multiplex reactions that can be achieved. We also describe essential experimental considerations to ensure accurate quantification of multiple targets.
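The precise quantification referred to above rests on the Poisson correction applied to the fraction of positive partitions. A minimal sketch, with invented droplet counts and an assumed partition volume of 0.85 nl:

    import math

    def dpcr_copies(positive, total, partition_volume_nl=0.85):
        """Mean target copies per partition (Poisson-corrected) and concentration per µL."""
        p = positive / total
        lam = -math.log(1.0 - p)                    # copies per partition
        copies_per_ul = lam / (partition_volume_nl * 1e-3)
        return lam, copies_per_ul

    lam, conc = dpcr_copies(positive=4200, total=18000)
    print(f"lambda = {lam:.3f} copies/partition, concentration = {conc:.0f} copies/µL")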
NASA Astrophysics Data System (ADS)
He, Jingjing; Wang, Dengjiang; Zhang, Weifang
2015-03-01
This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufactures and under different loading conditions.
NASA Astrophysics Data System (ADS)
Benítez, Hernán D.; Ibarra-Castanedo, Clemente; Bendada, AbdelHakim; Maldague, Xavier; Loaiza, Humberto; Caicedo, Eduardo
2008-01-01
It is well known that the methods of thermographic non-destructive testing based on the thermal contrast are strongly affected by non-uniform heating at the surface. Hence, the results obtained from these methods considerably depend on the chosen reference point. The differential absolute contrast (DAC) method was developed to eliminate the need to determine a reference point that defines the thermal contrast with respect to an ideal sound area. Although very useful at early times, the DAC accuracy decreases when the heat front approaches the sample rear face. We propose a new DAC version by explicitly introducing the sample thickness using the thermal quadrupoles theory and showing that the new DAC range of validity increases for long times while preserving the validity for short times. This new contrast is used for defect quantification in composite, Plexiglas™ and aluminum samples.
Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong
2017-01-01
The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in the inference of complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of information theory. The experimental results obtained from the analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. Our algorithm is an open-source implementation of MaxInfo. PMID:28911101
Rupert, Déborah L M; Claudio, Virginia; Lässer, Cecilia; Bally, Marta
2017-01-01
Our body fluids contain a multitude of cell-derived vesicles, secreted by most cell types, commonly referred to as extracellular vesicles. They have attracted considerable attention for their function as intercellular communication vehicles in a broad range of physiological processes and pathological conditions. Extracellular vesicles and especially the smallest type, exosomes, have also generated a lot of excitement in view of their potential as disease biomarkers or as carriers for drug delivery. In this context, state-of-the-art techniques capable of comprehensively characterizing vesicles in biological fluids are urgently needed. This review presents the arsenal of techniques available for quantification and characterization of physical properties of extracellular vesicles, summarizes their working principles, discusses their advantages and limitations and further illustrates their implementation in extracellular vesicle research. The small size and physicochemical heterogeneity of extracellular vesicles make their physical characterization and quantification an extremely challenging task. Currently, structure, size, buoyant density, optical properties and zeta potential have most commonly been studied. The concentration of vesicles in suspension can be expressed in terms of biomolecular or particle content depending on the method at hand. In addition, common quantification methods may either provide a direct quantitative measurement of vesicle concentration or solely allow for relative comparison between samples. The combination of complementary methods capable of detecting, characterizing and quantifying extracellular vesicles at a single particle level promises to provide new exciting insights into their modes of action and to reveal the existence of vesicle subpopulations fulfilling key biological tasks. Copyright © 2016 Elsevier B.V. All rights reserved.
Use of multiple competitors for quantification of human immunodeficiency virus type 1 RNA in plasma.
Vener, T; Nygren, M; Andersson, A; Uhlén, M; Albert, J; Lundeberg, J
1998-07-01
Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing identical PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions resulting in exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens.
Defect identification in semiconductors with positron annihilation: experiment and theory
NASA Astrophysics Data System (ADS)
Tuomisto, Filip
2015-03-01
Positron annihilation spectroscopy is a very powerful technique for the detection, identification and quantification of vacancy-type defects in semiconductors. In the past decades, it has been used to reveal the relationship between opto-electronic properties and specific defects in a wide variety of materials - examples include parasitic yellow luminescence in GaN, dominant acceptor defects in ZnO and broad-band absorption causing brown coloration in natural diamond. In typical binary compound semiconductors, the selective sensitivity of the technique is rather strongly limited to cation vacancies that possess significant open volume and suitable charge (negative or neutral). On the other hand, oxygen vacancies in oxide semiconductors are a widely debated topic. The properties attributed to oxygen vacancies include the inherent n-type conduction, poor p-type dopability, coloration (absorption), deep level luminescence and non-radiative recombination, while the only direct experimental evidence of their existence has been obtained on the crystal surface. We will present recent advances in combining state-of-the-art positron annihilation experiments and ab initio computational approaches. The latter can be used to model both the positron lifetime and the electron-positron momentum distribution - quantities that can be directly compared with experimental results. We have applied these methods to study vacancy-type defects in III-nitride semiconductors (GaN, AlN, InN) and oxides such as ZnO, SnO2, In2O3 and Ga2O3. We will show that cation-vacancy-related defects are important compensating centers in all these materials when they are n-type. In addition, we will show that anion (N, O) vacancies can be detected when they appear as complexes with cation vacancies.
Modern Instrumental Methods in Forensic Toxicology*
Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.
2009-01-01
This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968
RNA-Skim: a rapid method for RNA-Seq quantification at transcript level
Zhang, Zhaojun; Wang, Wei
2014-01-01
Motivation: RNA-Seq technique has been demonstrated as a revolutionary means for exploring transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification is proven to be an efficient alternative to Microarray technique in gene expression study, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignments of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has been recently proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable with any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents >100-fold speedup over the state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
2011-01-01
Purpose Eddy-current-induced velocity offsets are of concern for accuracy in cardiovascular magnetic resonance (CMR) volume flow quantification. However, currently known theoretical aspects of eddy current behavior have not led to effective guidelines for the optimization of flow quantification sequences. This study is aimed at identifying correlations between protocol parameters and the resulting velocity error in clinical CMR flow measurements in a multi-vendor study. Methods Nine 1.5T scanners of three different types/vendors were studied. Measurements were performed on a large stationary phantom. Starting from a clinical breath-hold flow protocol, several protocol parameters were varied. Acquisitions were made in three clinically relevant orientations. Additionally, a time delay between the bipolar gradient and read-out, asymmetric versus symmetric velocity encoding, and gradient amplitude and slew rate were studied in adapted sequences as exploratory measurements beyond the protocol. Image analysis determined the worst-case offset for a typical great-vessel flow measurement. Results The results showed a great variation in offset behavior among scanners (standard deviation among samples of 0.3, 0.4, and 0.9 cm/s for the three different scanner types), even for small changes in the protocol. Considering the absolute values, none of the tested protocol settings consistently reduced the velocity offsets below the critical level of 0.6 cm/s for all three orientations or for all three scanner types. Using multilevel linear model analysis, oblique aortic and pulmonary slices showed systematically higher offsets than the transverse aortic slices (oblique aortic 0.6 cm/s, and pulmonary 1.8 cm/s higher than transverse aortic). The exploratory measurements beyond the protocol yielded some new leads for further sequence development towards reduction of velocity offsets; however, those protocols were not always compatible with the time-constraints of breath-hold imaging and flow-related artefacts. Conclusions This study showed that with current systems there was no generic protocol that resulted in acceptable flow offset values. Protocol optimization would have to be performed on a per scanner and per protocol basis. Proper optimization might make accurate (transverse) aortic flow quantification possible for most scanners. Pulmonary flow quantification would still need further (offline) correction. PMID:21388521
Bimanual coordination: A missing piece of arm rehabilitation after stroke.
Kantak, Shailesh; Jax, Steven; Wittenberg, George
2017-01-01
Inability to use the arm in daily actions significantly lowers quality of life after stroke. Most contemporary post-stroke arm rehabilitation strategies that aspire to re-engage the weaker arm in functional activities have been greatly limited in their effectiveness. Most actions of daily life engage the two arms in a highly coordinated manner. In contrast, most rehabilitation approaches predominantly focus on restitution of the impairments and unilateral practice of the weaker hand alone. We present a perspective that this misalignment between real world requirements and intervention strategies may limit the transfer of unimanual capability to spontaneous arm use and functional recovery. We propose that if improving spontaneous engagement and use of the weaker arm in real life is the goal, arm rehabilitation research and treatment need to address the coordinated interaction between arms in targeted theory-guided interventions. Current narrow focus on unimanual deficits alone, difficulty in quantifying bimanual coordination in real-world actions and limited theory-guided focus on control and remediation of different coordination modes are some of the biggest obstacles to successful implementation of effective interventions to improve bimanual coordination in the real world. We present a theory-guided taxonomy of bimanual actions that will facilitate quantification of coordination for different real-world tasks and provide treatment targets for addressing coordination deficits. We then present evidence in the literature that points to bimanual coordination deficits in stroke survivors and demonstrate how current rehabilitation approaches are limited in their impact on bimanual coordination. Importantly, we suggest theory-based areas of future investigation that may assist quantification, identification of neural mechanisms and scientifically-based training/remediation approaches for bimanual coordination deficits post-stroke. Advancing the science and practice of arm rehabilitation to incorporate bimanual coordination will lead to a more complete functional recovery of the weaker arm, thus improving the effectiveness of rehabilitation interventions and augmenting quality of life after stroke.
Nemirovskiy, Olga; Li, Wenlin Wendy; Szekely-Klepser, Gabriella
2010-01-01
Biomarkers play an increasingly important role for drug efficacy and safety evaluation in all stages of drug development. It is especially important to develop and validate sensitive and selective biomarkers for diseases where the onset of the disease is very slow and/or the disease progression is hard to follow, i.e., osteoarthritis (OA). The degradation of Type II collagen has been associated with the disease state of OA. Matrix metalloproteinases (MMPs) are enzymes that catalyze the degradation of collagen and therefore pursued as potential targets for the treatment of OA. Peptide biomarkers of MMP activity related to type II collagen degradation were identified and the presence of these peptides in MMP digests of human articular cartilage (HAC) explants and human urine were confirmed. An immunoaffinity LC/MS/MS assay for the quantification of the most abundant urinary type II collagen neoepitope (uTIINE) peptide, a 45-mer with 5 HO-proline residues was developed and clinically validated. The assay has subsequently been applied to analyze human urine samples from clinical studies. We have shown that the assay is able to differentiate between symptomatic OA and normal subjects, indicating that uTIINE can be used as potential biomarker for OA. This chapter discusses the assay procedure and provides information on the validation experiments used to evaluate the accuracy, precision, and selectivity data with attention to the specific challenges related to the quantification of endogenous protein/peptide biomarker analytes. The generalized approach can be used as a follow-up to studies whereby proteomics-based urinary biomarkers are identified and an assay needs to be developed. Considerations for the validation of such an assay are described.
Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won
2015-01-01
Background: Maltol, as a type of phenolic compounds, is produced by the browning reaction during the high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification coupled with a liquid-liquid extraction (LLE) method was developed and validated in terms of linearity, precision, and accuracy. An HPLC separation was performed on a C18 column. Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of the maltol exhibited good linearity (R2 = 1.00). The limit of detection value of maltol was 0.26 μg/mL, and the limit of quantification value was 0.79 μg/mL. The relative standard deviations (RSDs) of the data of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD value of 0.21–1.65%. The developed method was applied successfully to quantify the maltol in three ginseng products manufactured by different methods. Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method was useful for the quantification of maltol in various ginseng products. PMID:26246746
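The reported limits are consistent with the ICH-style convention that relates LOD and LOQ to the standard deviation of the response (σ) and the slope of the calibration curve (S); whether the authors used exactly this convention is an assumption made here:

\[
\mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad \mathrm{LOQ} = \frac{10\,\sigma}{S},
\]

which implies LOQ/LOD ≈ 3, in line with the reported 0.79/0.26 ≈ 3.0.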
NASA Astrophysics Data System (ADS)
Spleiss, Martin; Weber, Lothar W.; Meier, Thomas H.; Treffler, Bernd
1995-01-01
Liver and muscle tissue have been irradiated with a surgical CO2-laser. The prefiltered fumes were adsorbed on different sorbents (activated charcoal type NIOSH and Carbotrap) and desorbed with different solvents (carbon disulphide and acetone). Analysis was done by gas chromatography/mass spectrometry. An updated list of identified substances is shown. Typical Maillard reaction products, as found in warmed-over flavour, such as aldehydes, aromatics, heterocyclic and sulphur compounds, were detected. Quantification of some toxicologically relevant substances is presented. The amounts of these substances are given in relation to the laser parameters and different tissues for further toxicological assessment.
Quantification of confocal images of biofilms grown on irregular surfaces
Ross, Stacy Sommerfeld; Tu, Mai Han; Falsetta, Megan L.; Ketterer, Margaret R.; Kiedrowski, Megan R.; Horswill, Alexander R.; Apicella, Michael A.; Reinhardt, Joseph M.; Fiegel, Jennifer
2014-01-01
Bacterial biofilms grow on many types of surfaces, including flat surfaces such as glass and metal and irregular surfaces such as rocks, biological tissues and polymers. While laser scanning confocal microscopy can provide high-resolution images of biofilms grown on any surface, quantification of biofilm-associated bacteria is currently limited to bacteria grown on flat surfaces. This can limit researchers studying irregular surfaces to qualitative analysis or quantification of only the total bacteria in an image. In this work, we introduce a new algorithm called modified connected volume filtration (MCVF) to quantify bacteria grown on top of an irregular surface that is fluorescently labeled or reflective. Using the MCVF algorithm, two new quantification parameters are introduced. The modified substratum coverage parameter enables quantification of the connected-biofilm bacteria on top of the surface and on the imaging substratum. The utility of MCVF and the modified substratum coverage parameter were shown with Pseudomonas aeruginosa and Staphylococcus aureus biofilms grown on human airway epithelial cells. A second parameter, the percent association, provides quantified data on the colocalization of the bacteria with a labeled component, including bacteria within a labeled tissue. The utility of quantifying the bacteria associated with the cell cytoplasm was demonstrated with Neisseria gonorrhoeae biofilms grown on cervical epithelial cells. This algorithm provides more flexibility and quantitative ability to researchers studying biofilms grown on a variety of irregular substrata. PMID:24632515
A universal real-time PCR assay for the quantification of group-M HIV-1 proviral load.
Malnati, Mauro S; Scarlatti, Gabriella; Gatto, Francesca; Salvatori, Francesca; Cassina, Giulia; Rutigliano, Teresa; Volpi, Rosy; Lusso, Paolo
2008-01-01
Quantification of human immunodeficiency virus type-1 (HIV-1) proviral DNA is increasingly used to measure the HIV-1 cellular reservoirs, a helpful marker to evaluate the efficacy of antiretroviral therapeutic regimens in HIV-1-infected individuals. Furthermore, the proviral DNA load represents a specific marker for the early diagnosis of perinatal HIV-1 infection and might be predictive of HIV-1 disease progression independently of plasma HIV-1 RNA levels and CD4(+) T-cell counts. The high degree of genetic variability of HIV-1 poses a serious challenge for the design of a universal quantitative assay capable of detecting all the genetic subtypes within the main (M) HIV-1 group with similar efficiency. Here, we describe a highly sensitive real-time PCR protocol that allows for the correct quantification of virtually all group-M HIV-1 strains with a higher degree of accuracy compared with other methods. The protocol involves three stages, namely DNA extraction/lysis, cellular DNA quantification and HIV-1 proviral load assessment. Owing to the robustness of the PCR design, this assay can be performed on crude cellular extracts, and therefore it may be suitable for the routine analysis of clinical samples even in developing countries. An accurate quantification of the HIV-1 proviral load can be achieved within 1 d from blood withdrawal.
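The final proviral-load figure combines the cellular DNA and HIV-1 DNA quantification stages described above. A minimal sketch of that normalization, assuming a diploid single-copy cellular reference gene and using invented copy numbers:

    def proviral_load_per_million_cells(hiv_copies, reference_gene_copies, copies_per_genome=2):
        """HIV-1 DNA copies per 1e6 cells, assuming a diploid single-copy cellular reference gene."""
        cells = reference_gene_copies / copies_per_genome
        return hiv_copies / cells * 1e6

    # Hypothetical readouts from the same crude lysate, not values from the protocol.
    print(f"{proviral_load_per_million_cells(hiv_copies=85, reference_gene_copies=3.2e5):.0f} copies per million cells")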
New approach for the quantification of processed animal proteins in feed using light microscopy.
Veys, P; Baeten, V
2010-07-01
A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.
Rempp, K A; Brix, G; Wenz, F; Becker, C R; Gückel, F; Lorenz, W J
1994-12-01
Quantification of regional cerebral blood flow (rCBF) and volume (rCBV) with dynamic magnetic resonance (MR) imaging. After bolus administration of a paramagnetic contrast medium, rapid T2*-weighted gradient-echo images of two sections were acquired for the simultaneous creation of concentration-time curves in the brain-feeding arteries and in brain tissue. Absolute rCBF and rCBV values were determined for gray and white brain matter in 12 subjects with use of principles of the indicator dilution theory. The mean rCBF value in gray matter was 69.7 mL/min +/- 29.7 per 100 g tissue and in white matter, 33.6 mL/min +/- 11.5 per 100 g tissue; the average rCBV was 8.0 mL +/- 3.1 per 100 g tissue and 4.2 mL +/- 1.0 per 100 g tissue, respectively. An age-related decrease in rCBF and rCBV for gray and white matter was observed. Preliminary data demonstrate that the proposed technique allows the quantification of rCBF and rCBV. Although the results are in good agreement with data from positron emission tomography studies, further evaluation is needed to establish the validity of method.
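In the notation assumed here (C_tissue and C_AIF the tissue and arterial concentration-time curves, MTT the mean transit time), the indicator-dilution relations underlying such rCBV and rCBF estimates are commonly written as

\[
\mathrm{rCBV} \;\propto\; \frac{\displaystyle\int_0^{\infty} C_{\mathrm{tissue}}(t)\,dt}{\displaystyle\int_0^{\infty} C_{\mathrm{AIF}}(t)\,dt},
\qquad
\mathrm{rCBF} \;=\; \frac{\mathrm{rCBV}}{\mathrm{MTT}},
\]

with the proportionality constant absorbing tissue density and hematocrit corrections, and the second relation being the central volume theorem.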
Dependence of quantitative accuracy of CT perfusion imaging on system parameters
NASA Astrophysics Data System (ADS)
Li, Ke; Chen, Guang-Hong
2017-03-01
Deconvolution is a popular method to calculate parametric perfusion parameters from four dimensional CT perfusion (CTP) source images. During the deconvolution process, the four dimensional space is squeezed into three-dimensional space by removing the temporal dimension, and a prior knowledge is often used to suppress noise associated with the process. These additional complexities confound the understanding about deconvolution-based CTP imaging system and how its quantitative accuracy depends on parameters and sub-operations involved in the image formation process. Meanwhile, there has been a strong clinical need in answering this question, as physicians often rely heavily on the quantitative values of perfusion parameters to make diagnostic decisions, particularly during an emergent clinical situation (e.g. diagnosis of acute ischemic stroke). The purpose of this work was to develop a theoretical framework that quantitatively relates the quantification accuracy of parametric perfusion parameters with CTP acquisition and post-processing parameters. This goal was achieved with the help of a cascaded systems analysis for deconvolution-based CTP imaging systems. Based on the cascaded systems analysis, the quantitative relationship between regularization strength, source image noise, arterial input function, and the quantification accuracy of perfusion parameters was established. The theory could potentially be used to guide developments of CTP imaging technology for better quantification accuracy and lower radiation dose.
Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir
2018-05-01
In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining in popularity. Among many other applications, this technique can also be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR) by overcoming problems related to PCR inhibition and the requirement for certified reference materials to be used as calibrants. In theory, validated qPCR methods can be easily transferred to the dPCR platform. However, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods were verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification." (2011) Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). The digital PCR methods performed as well as or better than the qPCR methods. The optimized ddPCR methods confirmed their suitability for GMO determination in food and feed.
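As a reminder of the arithmetic that makes dPCR calibrant-free, the sketch below applies the generic Poisson correction to droplet counts and reports GMO content as an event-to-reference copy ratio. The droplet volume and the counts are placeholders, not values from the DAS1507 or NK603 verifications.

```python
import math

def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_nl=0.85):
    """Poisson-corrected target concentration from droplet counts."""
    neg_fraction = (n_total - n_positive) / n_total
    copies_per_droplet = -math.log(neg_fraction)            # Poisson correction
    return copies_per_droplet / (droplet_volume_nl * 1e-3)  # copies per microliter

# GMO content expressed as the copy-number ratio of event to reference gene
event = ddpcr_copies_per_ul(n_positive=1450, n_total=15000)
reference = ddpcr_copies_per_ul(n_positive=14100, n_total=15000)
print(f"GMO content: {100 * event / reference:.2f} % (copy/copy)")
```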
Quantification of Neural Ethanol and Acetaldehyde Using Headspace GC-MS
Heit, Claire; Eriksson, Peter; Thompson, David C; Fritz, Kristofer S; Vasiliou, Vasilis
2016-01-01
BACKGROUND There is controversy regarding the active agent responsible for alcohol addiction. The theory that ethanol itself was the agent in alcohol drinking behavior was widely accepted until acetaldehyde was found in the brain. The importance of acetaldehyde formation in the brain role is still subject to speculation due to the lack of a method to accurately assay the acetaldehyde levels directly. A highly sensitive GC-MS method to reliably determine acetaldehyde concentration with certainty is needed to address whether neural acetaldehyde is indeed responsible for increased alcohol consumption. METHODS A headspace gas chromatograph coupled to selected ion monitoring mass spectrometry was utilized to develop a quantitative assay for acetaldehyde and ethanol. Our GC-MS approach was carried out using a Bruker Scion 436-GC SQ MS. RESULTS Our approach yields limits of detection of acetaldehyde in the nanomolar range and limits of quantification in the low micromolar range. Our linear calibration includes 5 concentrations with a least square regression greater than 0.99 for both acetaldehyde and ethanol. Tissue analyses using this method revealed the capacity to quantify ethanol and acetaldehyde in blood, brain, and liver tissue from mice. CONCLUSIONS By allowing quantification of very low concentrations, this method may be used to examine the formation of ethanol metabolites, specifically acetaldehyde, in murine brain tissue in alcohol research. PMID:27501276
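A generic sketch of the five-point calibration and detection/quantification limit arithmetic the abstract alludes to; the concentrations, peak areas, and the 3.3σ/S and 10σ/S conventions are illustrative assumptions, not the study's validated figures.

```python
import numpy as np

# five-point calibration: assumed spiked concentrations (uM) and instrument responses
conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
peak_area = np.array([2.1e3, 9.8e3, 2.05e4, 1.01e5, 2.02e5])

slope, intercept = np.polyfit(conc, peak_area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((peak_area - pred) ** 2) / np.sum((peak_area - peak_area.mean()) ** 2)

sigma = np.std(peak_area - pred, ddof=2)     # residual standard deviation
lod = 3.3 * sigma / slope                    # limit of detection
loq = 10.0 * sigma / slope                   # limit of quantification
print(f"R^2 = {r2:.4f}, LOD = {lod:.2f} uM, LOQ = {loq:.2f} uM")

# quantify an unknown sample from its peak area
unknown_area = 4.7e4
print(f"estimated concentration: {(unknown_area - intercept) / slope:.1f} uM")
```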
Space syntax in healthcare facilities research: a review.
Haq, Saif; Luo, Yang
2012-01-01
Space Syntax is a theory and method that has been developing for the last 40 years. Originally conceived as a theory of "society and space," it has expanded to other areas. An important aspect of this is technical; it allows the quantification of layouts, and unit spaces within a layout, so that the environment itself can produce independent variables in quantitative research. Increasingly, it is being used to study healthcare facilities. Space Syntax has thereby become relevant to healthcare facilities researchers and designers. This paper attempts to explain Space Syntax to a new audience of healthcare designers, administrators, and researchers; it provides a literature review on the use of Space Syntax in healthcare facility research and suggests some possibilities for future application.
Investigation of image archiving for pavement surface distress survey
DOT National Transportation Integrated Search
1999-07-26
The categorization and quantification of the type, severity, and extent of pavement surface distress is a primary method for assessing pavement condition. The current data collection system in the Arkansas State Highway and Transportation Department ...
Educational Challenges in Toxicology.
ERIC Educational Resources Information Center
Dixon, Robert L.
1984-01-01
Issues and topics related to educational challenges in toxicology at all levels are discussed. They include public awareness and understanding, general approach to toxicology, quantitative structure-activity relationships, epidemiological studies, quantification of risk, and the types of toxicants studied. (JN)
Großekathöfer, Ulf; Manyakov, Nikolay V.; Mihajlović, Vojkan; Pandina, Gahan; Skalkin, Andrew; Ness, Seth; Bangerter, Abigail; Goodwin, Matthew S.
2017-01-01
A number of recent studies using accelerometer features as input to machine learning classifiers show promising results for automatically detecting stereotypical motor movements (SMM) in individuals with Autism Spectrum Disorder (ASD). However, replicating these results across different types of accelerometers and their position on the body still remains a challenge. We introduce a new set of features in this domain based on recurrence plot and quantification analyses that are orientation invariant and able to capture non-linear dynamics of SMM. Applying these features to an existing published data set containing acceleration data, we achieve up to 9% average increase in accuracy compared to current state-of-the-art published results. Furthermore, we provide evidence that a single torso sensor can automatically detect multiple types of SMM in ASD, and that our approach allows recognition of SMM with high accuracy in individuals when using a person-independent classifier. PMID:28261082
Großekathöfer, Ulf; Manyakov, Nikolay V; Mihajlović, Vojkan; Pandina, Gahan; Skalkin, Andrew; Ness, Seth; Bangerter, Abigail; Goodwin, Matthew S
2017-01-01
A number of recent studies using accelerometer features as input to machine learning classifiers show promising results for automatically detecting stereotypical motor movements (SMM) in individuals with Autism Spectrum Disorder (ASD). However, replicating these results across different types of accelerometers and their position on the body still remains a challenge. We introduce a new set of features in this domain based on recurrence plot and quantification analyses that are orientation invariant and able to capture non-linear dynamics of SMM. Applying these features to an existing published data set containing acceleration data, we achieve up to 9% average increase in accuracy compared to current state-of-the-art published results. Furthermore, we provide evidence that a single torso sensor can automatically detect multiple types of SMM in ASD, and that our approach allows recognition of SMM with high accuracy in individuals when using a person-independent classifier.
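To make the feature construction concrete, the sketch below computes two basic recurrence quantification measures (recurrence rate and determinism) from the orientation-invariant acceleration magnitude. The embedding dimension, delay, threshold, and synthetic signal are illustrative and are not the parameters used in the study.

```python
import numpy as np

def recurrence_features(x, dim=3, tau=2, eps=0.2, l_min=2):
    """Recurrence rate and determinism of a z-scored time series.

    dim/tau: embedding dimension and delay; eps: recurrence threshold.
    The main diagonal is kept for simplicity.
    """
    x = (x - x.mean()) / x.std()
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    rp = (dist < eps).astype(int)
    rr = rp.mean()                        # recurrence rate
    diag_points = 0
    for k in range(-(n - 1), n):          # points on diagonal lines of length >= l_min
        run = 0
        for v in np.append(np.diag(rp, k), 0):
            if v:
                run += 1
            else:
                if run >= l_min:
                    diag_points += run
                run = 0
    det = diag_points / rp.sum()          # determinism
    return rr, det

# orientation-invariant magnitude of a synthetic 3-axis accelerometer segment
t = np.linspace(0, 10, 400)
acc = np.c_[np.sin(2 * np.pi * t), 0.3 * np.cos(2 * np.pi * t), 0.05 * np.random.randn(400)]
print(recurrence_features(np.linalg.norm(acc, axis=1)))
```

Using the magnitude rather than the raw axes is what makes such features insensitive to sensor orientation.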
Esteves Lima, Rafael Paschoal; Cota, Luis Otávio Miranda; Silva, Tarcília Aparecida; Cortelli, Sheila Cavalca; Cortelli, José Roberto; Costa, Fernando Oliveira
2017-01-01
The aim of this study was to assess the incidence of type 2 diabetes in women with previous gestational diabetes, with and without periodontitis, after a three-year interval. The initial sample of this follow-up study consisted of 90 women diagnosed with gestational diabetes who underwent periodontal examination. After three years, 49 women were subjected to a new periodontal examination, and biological, behavioral, and social data of interest were collected. Additionally, quantification of C-reactive protein in blood samples was performed. Fasting glucose and glycated hemoglobin levels were requested. Saliva samples were collected for quantification of interleukin 6 and 10, tumor necrosis factor α, and matrix metalloproteinases 2 and 9. The incidence of type 2 diabetes mellitus was 18.4% and that of periodontitis was 10.2%. There was no significant difference in the incidence of type 2 diabetes mellitus between women with and without periodontitis. An impact of C-reactive protein on the development of type 2 diabetes mellitus was observed. However, no impact of periodontitis on the development of type 2 diabetes mellitus was observed among women with previous gestational diabetes. The impact of C-reactive protein on the development of type 2 diabetes mellitus highlights the importance of an inflammatory process in diabetes pathogenesis.
NASA Astrophysics Data System (ADS)
Hui, Kai Hwee; Ambrosi, Adriano; Sofer, Zdeněk; Pumera, Martin; Bonanni, Alessandra
2015-05-01
Graphene doped with heteroatoms can show new or improved properties as compared to the original undoped material. It has been reported that the type of heteroatoms and the doping conditions can have a strong influence on the electronic and electrochemical properties of the resulting material. Here, we wish to compare the electrochemical behavior of two n-type and two p-type doped graphenes, namely boron-doped graphenes and nitrogen-doped graphenes containing different amounts of heteroatoms. We show that the boron-doped graphene containing a higher amount of dopants provides the best electroanalytical performance in terms of calibration sensitivity, selectivity and linearity of response for the detection of gallic acid normally used as the standard probe for the quantification of antioxidant activity of food and beverages. Our findings demonstrate that the type and amount of heteroatoms used for the doping have a profound influence on the electrochemical detection of gallic acid rather than the structural properties of the materials such as amounts of defects, oxygen functionalities and surface area. This finding has a profound influence on the application of doped graphenes in the field of analytical chemistry. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr01045d
Striegel, Deborah A.; Hara, Manami; Periwal, Vipul
2015-01-01
Pancreatic islets of Langerhans consist of endocrine cells, primarily α, β and δ cells, which secrete glucagon, insulin, and somatostatin, respectively, to regulate plasma glucose. β cells form irregular locally connected clusters within islets that act in concert to secrete insulin upon glucose stimulation. Due to the central functional significance of this local connectivity in the placement of β cells in an islet, it is important to characterize it quantitatively. However, quantification of the seemingly stochastic cytoarchitecture of β cells in an islet requires mathematical methods that can capture topological connectivity in the entire β-cell population in an islet. Graph theory provides such a framework. Using large-scale imaging data for thousands of islets containing hundreds of thousands of cells in human organ donor pancreata, we show that quantitative graph characteristics differ between control and type 2 diabetic islets. Further insight into the processes that shape and maintain this architecture is obtained by formulating a stochastic theory of β-cell rearrangement in whole islets, just as the normal equilibrium distribution of the Ornstein-Uhlenbeck process can be viewed as the result of the interplay between a random walk and a linear restoring force. Requiring that rearrangements maintain the observed quantitative topological graph characteristics strongly constrained possible processes. Our results suggest that β-cell rearrangement is dependent on its connectivity in order to maintain an optimal cluster size in both normal and T2D islets. PMID:26266953
Striegel, Deborah A; Hara, Manami; Periwal, Vipul
2015-08-01
Pancreatic islets of Langerhans consist of endocrine cells, primarily α, β and δ cells, which secrete glucagon, insulin, and somatostatin, respectively, to regulate plasma glucose. β cells form irregular locally connected clusters within islets that act in concert to secrete insulin upon glucose stimulation. Due to the central functional significance of this local connectivity in the placement of β cells in an islet, it is important to characterize it quantitatively. However, quantification of the seemingly stochastic cytoarchitecture of β cells in an islet requires mathematical methods that can capture topological connectivity in the entire β-cell population in an islet. Graph theory provides such a framework. Using large-scale imaging data for thousands of islets containing hundreds of thousands of cells in human organ donor pancreata, we show that quantitative graph characteristics differ between control and type 2 diabetic islets. Further insight into the processes that shape and maintain this architecture is obtained by formulating a stochastic theory of β-cell rearrangement in whole islets, just as the normal equilibrium distribution of the Ornstein-Uhlenbeck process can be viewed as the result of the interplay between a random walk and a linear restoring force. Requiring that rearrangements maintain the observed quantitative topological graph characteristics strongly constrained possible processes. Our results suggest that β-cell rearrangement is dependent on its connectivity in order to maintain an optimal cluster size in both normal and T2D islets.
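A toy version of the kind of graph construction and quantitative characteristics described above, assuming β-cell centroid coordinates and a contact-distance threshold (both invented here); it relies on the networkx package rather than the authors' own pipeline.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))   # hypothetical beta-cell centroids (um)
contact_radius = 12.0                          # assumed cell-contact distance (um)

G = nx.Graph()
G.add_nodes_from(range(len(coords)))
for i in range(len(coords)):
    for j in range(i + 1, len(coords)):
        if np.linalg.norm(coords[i] - coords[j]) < contact_radius:
            G.add_edge(i, j)

# quantitative graph characteristics of the beta-cell cluster architecture
mean_degree = 2 * G.number_of_edges() / G.number_of_nodes()
cluster_sizes = sorted((len(c) for c in nx.connected_components(G)), reverse=True)
print(f"mean degree: {mean_degree:.2f}")
print(f"largest connected cluster: {cluster_sizes[0]} cells of {G.number_of_nodes()}")
print(f"average clustering coefficient: {nx.average_clustering(G):.3f}")
```

Comparing such summaries between control and diabetic islets is the kind of analysis the abstracts describe; only the construction step is illustrated here.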
NASA Astrophysics Data System (ADS)
Bejjani, A.; Roumié, M.; Akkad, S.; El-Yazbi, F.; Nsouli, B.
2016-03-01
We have demonstrated in previous studies that Particle Induced X-ray Emission (PIXE) is one of the most rapid and accurate choices for the quantification of an active ingredient in a solid drug, from the reactions induced on its specific heteroatom, using pellets made from original tablets. In this work, PIXE is used for the first time for the simultaneous quantification of two active ingredients, amoxicillin trihydrate and potassium clavulanate, in six different commercial antibiotic drugs. Since the quality control process of a drug covers a large number of samples, the scope of this study was also to find the most rapid and lowest-cost sample preparation needed to analyze these drugs with good precision. The chosen drugs were analyzed in their tablets' "as received" form, in pellets made from the powder of the tablets, and also in pellets made from the powder of the tablets after being heated up to 70 °C until constant weight, to remove humidity while avoiding any molecular destruction. The quantification validity related to the aspects of each sample preparation (homogeneity of the drug components and humidity) is presented and discussed.
Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.
2018-02-01
In this study, a collaborative contest focused on LIBS data processing was conducted in an original way: rather than analyzing the same samples on their own LIBS set-ups, the participants all processed a set of LIBS spectra obtained from a single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. Then, a parametric study was conducted to investigate the influence of each step during the data processing. This study was based on several analytical figures of merit such as the determination coefficient, uncertainty, limit of quantification and prediction ability (i.e., trueness). It was then possible to interpret the results provided by the participants, emphasizing the fact that the type of data extraction, baseline modeling, as well as the calibration model, play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure, with the aim of optimizing the methodological steps toward the standardization of LIBS.
Nie, Hui; Evans, Alison A.; London, W. Thomas; Block, Timothy M.; Ren, Xiangdong David
2011-01-01
Hepatitis B virus (HBV) carrying the A1762T/G1764A double mutation in the basal core promoter (BCP) region is associated with HBe antigen seroconversion and increased risk of liver cirrhosis and hepatocellular carcinoma (HCC). Quantification of the mutant viruses may help in predicting the risk of HCC. However, the viral genome tends to have nucleotide polymorphism, which makes it difficult to design hybridization-based assays including real-time PCR. Ultrasensitive quantification of the mutant viruses at the early developmental stage is even more challenging, as the mutant is masked by excessive amounts of the wild-type (WT) viruses. In this study, we developed a selective inhibitory PCR (siPCR) using a locked nucleic acid-based PCR blocker to selectively inhibit the amplification of the WT viral DNA but not the mutant DNA. At the end of siPCR, the proportion of the mutant could be increased by about 10,000-fold, making the mutant more readily detectable by downstream applications such as real-time PCR and DNA sequencing. We also describe a primer-probe partial overlap approach which significantly simplified the melting curve patterns and minimized the influence of viral genome polymorphism on assay accuracy. Analysis of 62 patient samples showed a complete match of the melting curve patterns with the sequencing results. More than 97% of HBV BCP sequences in the GenBank database can be correctly identified by the melting curve analysis. The combination of siPCR and the SimpleProbe real-time PCR enabled mutant quantification in the presence of a 100,000-fold excess of the WT DNA. PMID:21562108
Use of Multiple Competitors for Quantification of Human Immunodeficiency Virus Type 1 RNA in Plasma
Vener, Tanya; Nygren, Malin; Andersson, AnnaLena; Uhlén, Mathias; Albert, Jan; Lundeberg, Joakim
1998-01-01
Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing identical PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions resulting in exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens. PMID:9650926
NASA Astrophysics Data System (ADS)
Saha, Debajyoti; Shaw, Pankaj Kumar; Ghosh, Sabuj; Janaki, M. S.; Sekar Iyengar, A. N.
2018-01-01
We have carried out a detailed study of the scaling regions identified by detrended fractal analysis, applying different types of forcing (noise, sinusoidal, and square) to the floating potential fluctuations acquired under different pressures in a DC glow discharge plasma. The transition in the dynamics is observed through recurrence plot techniques, an efficient method for observing critical regime transitions in dynamics. The complexity of the nonlinear fluctuations has been revealed with the help of recurrence quantification analysis, a suitable tool for investigating recurrence, a ubiquitous feature that provides deep insight into the dynamics of real dynamical systems. An informal test for stationarity, which checks the compatibility of nonlinear approximations to the dynamics made in different segments of a time series, has been proposed. When sinusoidal, noise, or square forcing is applied to the fluctuations acquired at P = 0.12 mbar, only one dominant scaling region is observed, whereas for the forcing applied to the fluctuations acquired at P = 0.04 mbar, two prominent scaling regions have been reliably identified using different forcing amplitudes, indicating the signature of a crossover phenomenon. Furthermore, persistent long-range behavior has been observed in one of these scaling regions. A comprehensive study of the quantification of the scaling exponents has been carried out with increasing amplitude and frequency of the sinusoidal and square forcings. The scaling exponent is envisaged as a measure of the roughness of the time series. The method provides a single quantitative index, the scaling exponent, to quantify the correlation properties of a signal.
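Since the analysis hinges on scaling exponents extracted from detrended fluctuation/fractal analysis, a compact sketch of that computation is given below; the box sizes and the white-noise test signal are illustrative, not the plasma data, and only order-1 detrending is shown.

```python
import numpy as np

def dfa_exponent(x, box_sizes=(16, 32, 64, 128, 256)):
    """Scaling exponent from detrended fluctuation analysis (linear detrending)."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    fluct = []
    for n in box_sizes:
        n_boxes = len(y) // n
        f2 = []
        for b in range(n_boxes):
            seg = y[b * n:(b + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(box_sizes), np.log(fluct), 1)
    return slope                              # exponent > 0.5 indicates persistence

# white noise should give an exponent near 0.5
rng = np.random.default_rng(1)
print(round(dfa_exponent(rng.standard_normal(4096)), 2))
```

A crossover such as the one reported above would show up as two distinct slopes when log F(n) is fitted separately over small and large box sizes.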
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsuchiya, Hikaru; Tanaka, Keiji, E-mail: tanaka-kj@igakuken.or.jp; Saeki, Yasushi, E-mail: saeki-ys@igakuken.or.jp
2013-06-28
Highlights: •The parallel reaction monitoring method was applied to ubiquitin quantification. •The ubiquitin PRM method is highly sensitive even in biological samples. •Using the method, we revealed that Ufd4 assembles the K29-linked ubiquitin chain. -- Abstract: Ubiquitylation is an essential posttranslational protein modification that is implicated in a diverse array of cellular functions. Although cells contain eight structurally distinct types of polyubiquitin chains, detailed function of several chain types including K29-linked chains has remained largely unclear. Current mass spectrometry (MS)-based quantification methods are highly inefficient for low abundant atypical chains, such as K29- and M1-linked chains, in complex mixtures that typically contain highly abundant proteins. In this study, we applied parallel reaction monitoring (PRM), a quantitative, high-resolution MS method, to quantify ubiquitin chains. The ubiquitin PRM method allows us to quantify 100 attomole amounts of all possible ubiquitin chains in cell extracts. Furthermore, we quantified ubiquitylation levels of ubiquitin-proline-β-galactosidase (Ub-P-βgal), a historically known model substrate of the ubiquitin fusion degradation (UFD) pathway. In wild-type cells, Ub-P-βgal is modified with ubiquitin chains consisting of 21% K29- and 78% K48-linked chains. In contrast, K29-linked chains are not detected in UFD4 knockout cells, suggesting that Ufd4 assembles the K29-linked ubiquitin chain(s) on Ub-P-βgal in vivo. Thus, the ubiquitin PRM is a novel, useful, quantitative method for analyzing the highly complicated ubiquitin system.
Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.
2013-01-01
A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
Damond, F; Benard, A; Balotta, Claudia; Böni, Jürg; Cotten, Matthew; Duque, Vitor; Ferns, Bridget; Garson, Jeremy; Gomes, Perpetua; Gonçalves, Fátima; Gottlieb, Geoffrey; Kupfer, Bernd; Ruelle, Jean; Rodes, Berta; Soriano, Vicente; Wainberg, Mark; Taieb, Audrey; Matheron, Sophie; Chene, Genevieve; Brun-Vezinet, Francoise
2011-10-01
Accurate HIV-2 plasma viral load quantification is crucial for adequate HIV-2 patient management and for the proper conduct of clinical trials and international cohort collaborations. This study compared the homogeneity of HIV-2 RNA quantification when using HIV-2 assays from ACHI(E)V(2E) study sites and either in-house PCR calibration standards or common viral load standards supplied to all collaborators. Each of the 12 participating laboratories quantified blinded HIV-2 samples, using its own HIV-2 viral load assay and standard as well as centrally validated and distributed common HIV-2 group A and B standards (http://www.hiv.lanl.gov/content/sequence/HelpDocs/subtypes-more.html). Aliquots of HIV-2 group A and B strains, each at 2 theoretical concentrations (2.7 and 3.7 log(10) copies/ml), were tested. Intralaboratory, interlaboratory, and overall variances of quantification results obtained with both standards were compared using F tests. For HIV-2 group A quantifications, overall and interlaboratory and/or intralaboratory variances were significantly lower when using the common standard than when using in-house standards at the concentration levels of 2.7 log(10) copies/ml and 3.7 log(10) copies/ml, respectively. For HIV-2 group B, a high heterogeneity was observed and the variances did not differ according to the type of standard used. In this international collaboration, the use of a common standard improved the homogeneity of HIV-2 group A RNA quantification only. The diversity of HIV-2 group B, particularly in PCR primer-binding regions, may explain the heterogeneity in quantification of this strain. Development of a validated HIV-2 viral load assay that accurately quantifies distinct circulating strains is needed.
Damond, F.; Benard, A.; Balotta, Claudia; Böni, Jürg; Cotten, Matthew; Duque, Vitor; Ferns, Bridget; Garson, Jeremy; Gomes, Perpetua; Gonçalves, Fátima; Gottlieb, Geoffrey; Kupfer, Bernd; Ruelle, Jean; Rodes, Berta; Soriano, Vicente; Wainberg, Mark; Taieb, Audrey; Matheron, Sophie; Chene, Genevieve; Brun-Vezinet, Francoise
2011-01-01
Accurate HIV-2 plasma viral load quantification is crucial for adequate HIV-2 patient management and for the proper conduct of clinical trials and international cohort collaborations. This study compared the homogeneity of HIV-2 RNA quantification when using HIV-2 assays from ACHIEV2E study sites and either in-house PCR calibration standards or common viral load standards supplied to all collaborators. Each of the 12 participating laboratories quantified blinded HIV-2 samples, using its own HIV-2 viral load assay and standard as well as centrally validated and distributed common HIV-2 group A and B standards (http://www.hiv.lanl.gov/content/sequence/HelpDocs/subtypes-more.html). Aliquots of HIV-2 group A and B strains, each at 2 theoretical concentrations (2.7 and 3.7 log10 copies/ml), were tested. Intralaboratory, interlaboratory, and overall variances of quantification results obtained with both standards were compared using F tests. For HIV-2 group A quantifications, overall and interlaboratory and/or intralaboratory variances were significantly lower when using the common standard than when using in-house standards at the concentration levels of 2.7 log10 copies/ml and 3.7 log10 copies/ml, respectively. For HIV-2 group B, a high heterogeneity was observed and the variances did not differ according to the type of standard used. In this international collaboration, the use of a common standard improved the homogeneity of HIV-2 group A RNA quantification only. The diversity of HIV-2 group B, particularly in PCR primer-binding regions, may explain the heterogeneity in quantification of this strain. Development of a validated HIV-2 viral load assay that accurately quantifies distinct circulating strains is needed. PMID:21813718
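The variance comparisons reported above can be reproduced with a two-sided F test on log10 quantification results; the sketch below uses invented numbers for the 12 laboratories, not the study data.

```python
import numpy as np
from scipy import stats

# hypothetical log10 copies/ml reported by 12 labs for one HIV-2 group A sample,
# once with in-house standards and once with the common standard
in_house = np.array([2.9, 2.4, 2.8, 3.1, 2.5, 2.7, 3.0, 2.3, 2.8, 2.6, 3.2, 2.5])
common   = np.array([2.7, 2.6, 2.8, 2.7, 2.9, 2.6, 2.8, 2.7, 2.6, 2.8, 2.7, 2.9])

f_stat = np.var(in_house, ddof=1) / np.var(common, ddof=1)
dof = len(in_house) - 1
# two-sided p-value for H0: equal variances
p = 2 * min(stats.f.cdf(f_stat, dof, dof), stats.f.sf(f_stat, dof, dof))
print(f"F = {f_stat:.2f}, p = {p:.4f}")
```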
Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios
2011-01-19
Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food.
Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios
2011-01-01
Background Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. Methods The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. Conclusions The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. Significance The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food. PMID:21283808
Aerosol-type retrieval and uncertainty quantification from OMI data
NASA Astrophysics Data System (ADS)
Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna
2017-11-01
We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by posterior probability distribution reflects the difficulty in model selection. The posterior probability distribution can provide a comprehensive characterisation of the uncertainty in this kind of problem for aerosol-type selection. As a result, the proposed method can account for the model error and also include the model selection uncertainty in the total uncertainty budget.
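A schematic of the Bayesian model averaging step described above: each aerosol microphysical model is weighted by its evidence, and the AOD posterior is the weighted mixture. The model names, log-evidences, and Gaussian posterior summaries are placeholders, not OMI retrieval output.

```python
import numpy as np

# hypothetical per-model results from fitting TOA reflectance with three LUTs:
# (log-evidence, posterior mean AOD, posterior std of AOD)
models = {
    "weakly_absorbing": (-10.2, 0.42, 0.05),
    "biomass_burning":  (-10.9, 0.51, 0.07),
    "desert_dust":      (-14.5, 0.30, 0.06),
}

log_ev = np.array([v[0] for v in models.values()])
weights = np.exp(log_ev - log_ev.max())
weights /= weights.sum()                        # BMA weights (equal model priors assumed)

means = np.array([v[1] for v in models.values()])
stds = np.array([v[2] for v in models.values()])

aod_bma = np.sum(weights * means)
# mixture variance = within-model variance + between-model spread
var_bma = np.sum(weights * (stds ** 2 + (means - aod_bma) ** 2))
for name, w in zip(models, weights):
    print(f"{name:18s} weight = {w:.3f}")
print(f"BMA AOD = {aod_bma:.3f} +/- {np.sqrt(var_bma):.3f}")
```

When one model dominates the evidence its weight approaches 1 and the averaged estimate collapses to that model; comparable weights signal exactly the model-selection ambiguity the posterior is meant to expose.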
Quantification of the Relationship between Surrogate Fuel Structure and Performance
2012-07-31
order to account for known deficiencies [18]. The frequencies are then used to calculate the zero point energy (ZPE). In the G3 theory HF/6-31G* was used...for the ZPE and the new procedure is likely to be more reliable. Also in contrast to previous G series composite methods, the Hartree–Fock energy...The total energy is obtained by adding the previously calculated ZPE. Durant and Rohlfing [38] reported that B3LYP density functional methods provide
Assessing Spontaneous Combustion Instability with Recurrence Quantification Analysis
NASA Technical Reports Server (NTRS)
Eberhart, Chad J.; Casiano, Matthew J.
2016-01-01
Spontaneous instabilities can pose a significant challenge to verification of combustion stability, and characterizing their onset is an important avenue of improvement for stability assessments of liquid propellant rocket engines. Recurrence Quantification Analysis (RQA) is used here to explore nonlinear combustion dynamics that might give insight into instability. Multiple types of patterns representative of different dynamical states are identified within fluctuating chamber pressure data, and markers for impending instability are found. A class of metrics which describe these patterns is also calculated. RQA metrics are compared with and interpreted against another metric from nonlinear time series analysis, the Hurst exponent, to help better distinguish between stable and unstable operation.
Jeanneau, Laurent; Faure, Pierre
2010-09-01
The quantitative multimolecular approach (QMA) based on an exhaustive identification and quantification of molecules from the extractable organic matter (EOM) has been recently developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) the comparison between natural and anthropogenic inputs, (ii) between modern and fossil organic matter and (iii) the differentiation between several anthropogenic sources. However QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses have been performed on organic extracts and decarbonated sediments. This analysis allows (i) the differentiation between modern biomass (contemporary (14)C) and fossil organic matter ((14)C-free) and (ii) the calculation of the modern carbon percentage (PMC). At the confluence between Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. It highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments. Copyright 2010 Elsevier B.V. All rights reserved.
Qian, Yiyun; Zhu, Zhenhua; Duan, Jin-Ao; Guo, Sheng; Shang, Erxin; Tao, Jinhua; Su, Shulan; Guo, Jianming
2017-01-15
A highly sensitive method using ultra-high-pressure liquid chromatography coupled with linear ion trap-Orbitrap tandem mass spectrometry (UHPLC-LTQ-Orbitrap-MS) has been developed and validated for the simultaneous identification and quantification of ginkgolic acids and semi-quantification of their metabolites in rat plasma. For the five selected ginkgolic acids, the method was found to have good linearities (r>0.9991), good intra- and inter-day precisions (RSD<15%), and good accuracies (RE, from -10.33% to 4.92%). Extraction recoveries, matrix effects and stabilities for rat plasma samples were within the required limits. The validated method was successfully applied to investigate the pharmacokinetics of the five ginkgolic acids in rat plasma after oral administration in 3 dosage groups (900 mg/kg, 300 mg/kg and 100 mg/kg). Meanwhile, six metabolites of GA (15:1) and GA (17:1) were identified by comparison of MS data with reported values. The results of validation in terms of linear ranges, precisions and stabilities were established for semi-quantification of the metabolites. The curves of relative changes of these metabolites during the metabolic process were constructed by plotting the peak area ratios of the metabolites to salicylic acid (internal standard, IS). Double peaks were observed in all 3 dose groups. Different types of metabolites and different dosages of each metabolite both resulted in different Tmax values. Copyright © 2016 Elsevier B.V. All rights reserved.
Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J
2018-01-01
There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed, and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review was conducted via Medline, PubMed, CINAHL, and the Cochrane database; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full text peer-reviewed articles in English having female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, benefits, and limitations of the technique. A total of 1586 articles were identified of which 50 met the inclusion criteria. Nine methodologies of determining PFM strength were described, including: digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are allowing for increased sensitivity. However, the most common methods of quantification remain digital palpation and perineometry; techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by the inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Frees, Edward W.; Kim, Jee-Seon
2006-01-01
Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…
Quantification of smoothness index differences related to LTPP equipment type : tech brief.
DOT National Transportation Integrated Search
2006-07-01
Researchers in the Long-Term Pavement Performance : (LTPP) program are conducting a major data collection : effort. They are using an inertial profiler to collect longitudinal : profile data at regular intervals on two wheelpaths : located along the ...
Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele
QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a historymore » of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (\\url{muq.mit.edu}).« less
Extreme Beta-Cell Deficiency in Pancreata of Dogs with Canine Diabetes
Shields, Emily J.; Lam, Carol J.; Cox, Aaron R.; Rankin, Matthew M.; Van Winkle, Thomas J.; Hess, Rebecka S.; Kushner, Jake A.
2015-01-01
The pathophysiology of canine diabetes remains poorly understood, in part due to enigmatic clinical features and the lack of detailed histopathology studies. Canine diabetes, similar to human type 1 diabetes, is frequently associated with diabetic ketoacidosis at onset or after insulin omission. However, notable differences exist. Whereas human type 1 diabetes often occurs in children, canine diabetes is typically described in middle age to elderly dogs. Many competing theories have been proposed regarding the underlying cause of canine diabetes, from pancreatic atrophy to chronic pancreatitis to autoimmune mediated β-cell destruction. It remains unclear to what extent β-cell loss contributes to canine diabetes, as precise quantifications of islet morphometry have not been performed. We used high-throughput microscopy and automated image processing to characterize islet histology in a large collection of pancreata of diabetic dogs. Diabetic pancreata displayed a profound reduction in β-cells and islet endocrine cells. Unlike humans, canine non-diabetic islets are largely comprised of β-cells. Very few β-cells remained in islets of diabetic dogs, even in pancreata from new onset cases. Similarly, total islet endocrine cell number was sharply reduced in diabetic dogs. No compensatory proliferation or lymphocyte infiltration was detected. The majority of pancreata had no evidence of pancreatitis. Thus, canine diabetes is associated with extreme β-cell deficiency in both new and longstanding disease. The β-cell predominant composition of canine islets and the near-total absence of β-cells in new onset elderly diabetic dogs strongly implies that similar to human type 1 diabetes, β-cell loss underlies the pathophysiology of canine diabetes. PMID:26057531
The small world of osteocytes: connectomics of the lacuno-canalicular network in bone
NASA Astrophysics Data System (ADS)
Kollmannsberger, Philip; Kerschnitzki, Michael; Repp, Felix; Wagermaier, Wolfgang; Weinkamer, Richard; Fratzl, Peter
2017-07-01
Osteocytes and their cell processes reside in a large, interconnected network of voids pervading the mineralized bone matrix of most vertebrates. This osteocyte lacuno-canalicular network (OLCN) is believed to play important roles in mechanosensing, mineral homeostasis, and for the mechanical properties of bone. While the extracellular matrix structure of bone is extensively studied on ultrastructural and macroscopic scales, there is a lack of quantitative knowledge on how the cellular network is organized. Using a recently introduced imaging and quantification approach, we analyze the OLCN in different bone types from mouse and sheep that exhibit different degrees of structural organization not only of the cell network but also of the fibrous matrix deposited by the cells. We define a number of robust, quantitative measures that are derived from the theory of complex networks. These measures enable us to gain insights into how efficient the network is organized with regard to intercellular transport and communication. Our analysis shows that the cell network in regularly organized, slow-growing bone tissue from sheep is less connected, but more efficiently organized compared to irregular and fast-growing bone tissue from mice. On the level of statistical topological properties (edges per node, edge length and degree distribution), both network types are indistinguishable, highlighting that despite pronounced differences at the tissue level, the topological architecture of the osteocyte canalicular network at the subcellular level may be independent of species and bone type. Our results suggest a universal mechanism underlying the self-organization of individual cells into a large, interconnected network during bone formation and mineralization.
NASA Astrophysics Data System (ADS)
Ingale, S. V.; Datta, D.
2010-10-01
The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of the exposure or dose to members of the public. Assessment of risk is routed through this dose computation. Dose computation basically depends on the underlying dose assessment model and the exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Since the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is addressed through the belief and plausibility fuzzy measures.
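A toy Dempster-Shafer calculation of the belief/plausibility bounds mentioned above, here for the probability that the ingestion dose exceeds a threshold; the focal intervals, masses, and threshold are invented for illustration and do not come from the paper's dose model.

```python
# focal elements: intervals of annual ingestion dose (mSv) with basic probability masses,
# e.g. elicited for an imprecise transfer-factor parameter
focal = [((0.05, 0.20), 0.5),
         ((0.10, 0.40), 0.3),
         ((0.30, 0.80), 0.2)]

def belief_plausibility(focal, threshold):
    """Bounds on P(dose > threshold) from interval-valued evidence."""
    bel = sum(m for (lo, hi), m in focal if lo > threshold)   # interval entirely above
    pl = sum(m for (lo, hi), m in focal if hi > threshold)    # interval intersects
    return bel, pl

bel, pl = belief_plausibility(focal, threshold=0.25)
print(f"Belief = {bel:.2f}, Plausibility = {pl:.2f}")  # the risk lies in [Bel, Pl]
```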
Recurrence quantification analysis of human postural fluctuations in older fallers and non-fallers.
Ramdani, Sofiane; Tallon, Guillaume; Bernard, Pierre Louis; Blain, Hubert
2013-08-01
We investigate postural sway data dynamics in older adult fallers and non-fallers. Center of pressure (COP) signals were recorded during quiet standing in 28 older adults. The subjects were divided in two groups: with and without history of falls. COP time series were analyzed using recurrence quantification analysis (RQA) in both anteroposterior and mediolateral (ML) directions. Classical stabilometric variables (path length and range) were also computed. The results showed that RQA outputs quantifying predictability of COP fluctuations and Shannon entropy of recurrence plot diagonal line length distribution, were significantly higher in fallers, only for ML direction. In addition, the range of ML COP signals was also significantly higher in fallers. This result is in accordance with some findings of the literature and could be interpreted as an increased hip strategy in fallers. The RQA results seem coherent with the theory of loss of complexity with aging and disease. Our results suggest that RQA is a promising approach for the investigation of COP fluctuations in a frail population.
Quantification of sensory and food quality: the R-index analysis.
Lee, Hye-Seong; van Hout, Danielle
2009-08-01
The accurate quantification of sensory difference/similarity between foods, as well as consumer acceptance/preference and concepts, is greatly needed to optimize and maintain food quality. The R-Index is one class of measures of the degree of difference/similarity, and was originally developed for sensory difference tests for food quality control, product development, and so on. The index is based on signal detection theory and is free of the response bias that can invalidate difference testing protocols, including categorization and same-different and A-Not A tests. It is also a nonparametric analysis, making no assumptions about sensory distributions, and is simple to compute and understand. The R-Index is also flexible in its application. Methods based on R-Index analysis have been used as detection and sensory difference tests, as simple alternatives to hedonic scaling, and for the measurement of consumer concepts. This review indicates the various computational strategies for the R-Index and its practical applications to consumer and sensory measurements in food science.
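A minimal R-index computation from rating-frequency data, assuming an ordered response scale from "sure different" to "sure same"; the counts are fabricated. The measure equals the proportion of signal-noise pairs in which the signal sample is rated more "different", with ties counted as half, which is equivalent to the area under the ROC curve.

```python
import numpy as np

# rating frequencies over four ordered categories, most "different" first
signal = np.array([20, 10, 6, 4])   # test sample
noise  = np.array([5, 8, 12, 15])   # control sample

def r_index(signal, noise):
    """Nonparametric degree-of-difference measure from signal detection theory."""
    wins = ties = 0
    for i, s in enumerate(signal):
        wins += s * noise[i + 1:].sum()   # signal rated more "different" than noise
        ties += s * noise[i]              # same rating category
    return (wins + 0.5 * ties) / (signal.sum() * noise.sum())

print(f"R-index = {100 * r_index(signal, noise):.1f} %")  # 50 % means no discrimination
```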
Yamashita, Shizuya; Kawase, Ryota; Nakaoka, Hajime; Nakatani, Kazuhiro; Inagaki, Miwako; Yuasa-Kawase, Miyako; Tsubakio-Yamamoto, Kazumi; Sandoval, Jose C; Masuda, Daisaku; Ohama, Tohru; Nakagawa-Toyama, Yumiko; Matsuyama, Akifumi; Nishida, Makoto; Ishigami, Masato
2009-12-01
In routine clinical laboratory testing and numerous epidemiological studies, LDL-cholesterol (LDL-C) has been estimated commonly using the Friedewald equation. We investigated the relationship between the Friedewald equation and 4 homogeneous assays for LDL-C. LDL-C was determined by 4 homogeneous assays [liquid selective detergent method: LDL-C (L), selective solubilization method: LDL-C (S), elimination method: LDL-C (E), and enzyme selective protecting method: LDL-C (P)]. Samples with discrepancies between the Friedewald equation and the 4 homogeneous assays for LDL-C were subjected to polyacrylamide gel electrophoresis and the beta-quantification method. The correlations between the Friedewald equation and the 4 homogeneous LDL-C assays were as follows: LDL-C (L) (r=0.962), LDL-C (S) (r=0.986), LDL-C (E) (r=0.946) and LDL-C (P) (r=0.963). Discrepancies were observed in sera from type III hyperlipoproteinemia patients and in sera containing large amounts of midband and small dense LDL on polyacrylamide gel electrophoresis. LDL-C (S) was most strongly correlated with the beta-quantification method even in sera from patients with type III hyperlipoproteinemia. Of the 4 homogeneous assays for LDL-C, LDL-C (S) exhibited the closest correlation with the Friedewald equation and the beta-quantification method, thus reflecting the current clinical databases for coronary heart disease.
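For reference, the Friedewald estimate used as the comparator in the study is simply LDL-C = TC - HDL-C - TG/5 (all in mg/dL). The sketch below adds the usual validity guard for high triglycerides, the regime in which the abstract notes the largest discrepancies arise; the numerical example is invented.

```python
def friedewald_ldl(total_chol, hdl, triglycerides):
    """Estimate LDL-C (mg/dL) as TC - HDL - TG/5.

    The TG/5 term approximates VLDL cholesterol and is generally considered
    unreliable when TG >= 400 mg/dL or in type III hyperlipoproteinemia,
    which is where homogeneous assays and beta-quantification tend to diverge.
    """
    if triglycerides >= 400:
        raise ValueError("Friedewald equation not valid for TG >= 400 mg/dL")
    return total_chol - hdl - triglycerides / 5.0

print(friedewald_ldl(total_chol=210, hdl=45, triglycerides=150))  # 135.0 mg/dL
```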
Qin, Hua-Li; Chen, Xiao-Qing; Huang, Yi-Zhen; Kantchev, Eric Assen B
2014-09-26
First-principles modelling of the diastereomeric transition states in the enantiodiscrimination stage of the catalytic cycle can reveal intimate details about the mechanism of enantioselection. This information can be invaluable for further improvement of the catalytic protocols by rational design. Herein, we present a density functional theory (IEFPCM/PBE0/DGDZVP level of theory) modelling of the carborhodation step for the asymmetric 1,4-arylation of cyclic α,β-unsaturated ketones mediated by a [(binap)Rh(I)] catalyst. The calculations completely support the older, qualitative, pictorial model predicting the sense of the asymmetric induction for both the chelating diphosphane (binap) and the more recent chiral diene (Phbod) ligands, while also permitting quantification of the enantiomeric excess (ee). The effect of dispersion interaction correction and basis sets has been also investigated. Dispersion-corrected functionals and solvation models significantly improve the predicted ee values. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Ren, Jie
2017-12-01
The process by which a kinesin motor couples its ATPase activity with concerted mechanical hand-over-hand steps is a foremost topic of molecular motor physics. Two major routes toward elucidating kinesin mechanisms are the motility performance characterization of velocity and run length, and single-molecular state detection experiments. However, these two sets of experimental approaches are largely uncoupled to date. Here, we introduce an integrative motility state analysis based on a theorized kinetic graph theory for kinesin, which, on one hand, is validated by a wealth of accumulated motility data, and, on the other hand, allows for rigorous quantification of state occurrences and chemomechanical cycling probabilities. An interesting linear scaling for kinesin motility performance across species is discussed as well. An integrative kinetic graph theory analysis provides a powerful tool to bridge motility and state characterization experiments, so as to forge a unified effort for the elucidation of the working mechanisms of molecular motors.
New type IIB backgrounds and aspects of their field theory duals
NASA Astrophysics Data System (ADS)
Caceres, Elena; Macpherson, Niall T.; Núñez, Carlos
2014-08-01
In this paper we study aspects of geometries in Type IIA and Type IIB String theory and elaborate on their field theory dual pairs. The backgrounds are associated with reductions to Type IIA of solutions with G2 holonomy in eleven dimensions. We classify these backgrounds according to their G-structure, perform a non-Abelian T-duality on them and find new Type IIB configurations presenting dynamical SU(2)-structure. We study some aspects of the associated field theories defined by these new backgrounds. Various technical details are clearly spelled out.
Wang, Chunyan; Zhu, Hongbin; Pi, Zifeng; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying
2013-09-15
An analytical method for quantifying underivatized amino acids (AAs) in rat urine samples was developed using liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). Classification of type 2 diabetic rats was based on urinary amino acid metabolic profiling. The LC-MS/MS analysis combined chromatographic separation with multiple reaction monitoring (MRM) transitions. Multivariate profile-wide predictive models were constructed using partial least squares discriminant analysis (PLS-DA) in the SIMCA-P software package (version 11.5) and hierarchical cluster analysis (HCA) in SPSS (version 18.0). Several amino acids in rat urine showed significant changes. The results demonstrate that this LC-MS/MS method can quantify free AAs in rat urine. The PLS-DA and HCA analyses reliably differentiated healthy rats from type 2 diabetic rats on the basis of the quantified urinary AAs. In addition, compared with the healthy group, the seven amino acids that were elevated in the urine of type 2 diabetic rats returned to normal levels under acarbose treatment. Copyright © 2013 Elsevier B.V. All rights reserved.
Morales, Juan; Alonso-Nanclares, Lidia; Rodríguez, José-Rodrigo; DeFelipe, Javier; Rodríguez, Ángel; Merchán-Pérez, Ángel
2011-01-01
The synapses in the cerebral cortex can be classified into two main types, Gray's type I and type II, which correspond to asymmetric (mostly glutamatergic excitatory) and symmetric (inhibitory GABAergic) synapses, respectively. Hence, the quantification and identification of their different types and the proportions in which they are found are extraordinarily important in terms of brain function. The ideal approach to calculate the number of synapses per unit volume is to analyze 3D samples reconstructed from serial sections. However, obtaining serial sections by transmission electron microscopy is an extremely time-consuming and technically demanding task. Using focused ion beam/scanning electron microscopy, we recently showed that virtually all synapses can be accurately identified as asymmetric or symmetric synapses when they are visualized, reconstructed, and quantified from large 3D tissue samples obtained in an automated manner. Nevertheless, the analysis, segmentation, and quantification of synapses are still labor-intensive procedures. Thus, novel solutions are currently necessary to deal with the large volume of data that is being generated by automated 3D electron microscopy. Accordingly, we have developed ESPINA, a software tool that performs the automated segmentation and counting of synapses in a reconstructed 3D volume of the cerebral cortex, and that greatly facilitates and accelerates these processes. PMID:21633491
Ghedira, Rim; Papazova, Nina; Vuylsteke, Marnik; Ruttink, Tom; Taverniers, Isabel; De Loose, Marc
2009-10-28
GMO quantification, based on real-time PCR, relies on the amplification of an event-specific transgene assay and a species-specific reference assay. The uniformity of the nucleotide sequences targeted by both assays across various transgenic varieties is an important prerequisite for correct quantification. Single nucleotide polymorphisms (SNPs) frequently occur in the maize genome and might lead to nucleotide variation in regions used to design primers and probes for reference assays. Further, they may affect the annealing of the primer to the template and reduce the efficiency of DNA amplification. We assessed the effect of a minor DNA template modification, such as a single base pair mismatch in the primer attachment site, on real-time PCR quantification. A model system was used based on the introduction of artificial mismatches between the forward primer and the DNA template in the reference assay targeting the maize starch synthase (SSIIb) gene. The results show that the presence of a mismatch between the primer and the DNA template causes partial to complete failure of the amplification of the initial DNA template depending on the type and location of the nucleotide mismatch. With this study, we show that the presence of a primer/template mismatch affects the estimated total DNA quantity to a varying degree.
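The quantitative consequence of a reduced amplification efficiency can be illustrated with the standard exponential PCR model N_n = N_0(1+E)^n: a primer/template mismatch that lowers E delays the threshold cycle, and a standard curve built at the nominal efficiency then underestimates the reference target (and hence biases the transgene/reference ratio). A sketch with purely illustrative efficiencies, not values from this study:

```python
import math

def ct(n0, e, threshold=1e10):
    """Cycle at which n0*(1+e)**n crosses a (arbitrary) detection threshold."""
    return math.log(threshold / n0) / math.log(1.0 + e)

n0 = 1e4                      # true starting copies of the reference gene
ct_match = ct(n0, 0.95)       # well-matched primer, assumed ~95% efficiency
ct_mismatch = ct(n0, 0.70)    # mismatch lowers efficiency (illustrative value)

# Back-calculating against a standard curve that assumes the matched efficiency
apparent_n0 = 1e10 / (1.95 ** ct_mismatch)
print(f"delay: {ct_mismatch - ct_match:.1f} cycles, "
      f"apparent copies: {apparent_n0:.2e} (true: {n0:.0e})")
```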
Salvi, Sergio; D'Orso, Fabio; Morelli, Giorgio
2008-06-25
Many countries have introduced mandatory labeling requirements on foods derived from genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (PCR) based upon the TaqMan probe chemistry has become the method mostly used to support these regulations; moreover, event-specific PCR is the preferred method in GMO detection because of its high specificity based on the flanking sequence of the exogenous integrant. The aim of this study was to evaluate the use of very short (eight-nucleotide long), locked nucleic acid (LNA) TaqMan probes in 5'-nuclease PCR assays for the detection and quantification of GMOs. Classic TaqMan and LNA TaqMan probes were compared for the analysis of the maize MON810 transgene. The performance of the two types of probes was tested on the maize endogenous reference gene hmga, the CaMV 35S promoter, and the hsp70/cryIA(b) construct as well as for the event-specific 5'-integration junction of MON810, using plasmids as standard reference molecules. The results of our study demonstrate that the LNA 5'-nuclease PCR assays represent a valid and reliable analytical system for the detection and quantification of transgenes. Application of very short LNA TaqMan probes to GMO quantification can simplify the design of 5'-nuclease assays.
Cadieux, Brigitte; Blanchfield, Burke; Smith, James P; Austin, John W
2005-05-01
A simple, rapid, cost-effective in vitro slot blot immunoassay was developed for the detection and quantification of botulinum neurotoxin type E (BoNT/E) in cultures. Culture supernatants of 36 strains of clostridia, including 12 strains of Clostridium botulinum type E, 12 strains of other C. botulinum neurotoxin serotypes, and 12 strains of other clostridial species were tested. Samples containing BoNT/E were detected using affinity-purified polyclonal rabbit antisera prepared against BoNT/E with subsequent detection of secondary antibodies using chemiluminescence. All strains of C. botulinum type E tested positive, while all non C. botulinum type E strains tested negative. The sensitivity of the slot blot immunoassay for detection of BoNT/E was approximately four mouse lethal doses (MLD). The intensity of chemiluminescence was directly correlated with the concentration of BoNT/E up to 128 MLD, allowing quantification of BoNT/E between 4 and 128 MLD. The slot blot immunoassay was compared to the mouse bioassay for detection of BoNT/E using cultures derived from fish samples inoculated with C. botulinum type E, and cultures derived from naturally contaminated environmental samples. A total of 120 primary enrichment cultures derived from fish samples, of which 103 were inoculated with C. botulinum type E, and 17 were uninoculated controls, were assayed. Of the 103 primary enrichment cultures derived from inoculated fish samples, all were positive by mouse bioassay, while 94 were also positive by slot blot immunoassay, resulting in a 7.5% false-negative rate. All 17 primary enrichment cultures derived from the uninoculated fish samples were negative by both mouse bioassay and slot blot immunoassay. A total of twenty-six primary enrichment cultures derived from environmental samples were tested by mouse bioassay and slot blot immunoassay. Of 13 primary enrichment cultures positive by mouse bioassay, 12 were also positive by slot blot immunoassay, resulting in a 3.8% false-negative rate. All 13 primary enrichment cultures that tested negative by mouse bioassay also tested negative by slot blot immunoassay. The slot blot immunoassay could be used routinely as a positive screen for BoNT/E in primary enrichment cultures, and could be used as a replacement for the mouse bioassay for pure cultures.
NASA Astrophysics Data System (ADS)
Zhu, Xiaoqin; Liao, Chenxi; Wang, Zhenyu; Zhuo, Shuangmu; Liu, Wenge; Chen, Jianxin
2016-10-01
Hyaline cartilage is a semitransparent tissue composed of proteoglycan and thicker type II collagen fibers, whereas fibrocartilage contains large bundles of type I collagen in addition to other territorial matrix and chondrocytes. The meniscus (fibrocartilage) is reported to have a greater capacity to regenerate and close a wound than articular cartilage (hyaline cartilage), and fibrocartilage often replaces the type II collagen-rich hyaline cartilage following trauma, leading to scar tissue composed of rigid type I collagen. Visualization and quantification of the collagen fibrillar meshwork are therefore important for understanding the role of fibril reorganization during the healing process and how different types of cartilage contribute to wound closure. In this study, second harmonic generation (SHG) microscopy was applied to image articular and meniscus cartilage, and textural analysis was developed to quantify the collagen distribution. High-resolution images were obtained from the SHG signal of collagen within fresh specimens, allowing detailed observation of tissue morphology and microstructural distribution without shrinkage or distortion. Textural analysis of the SHG images confirmed that collagen in fibrocartilage is significantly coarser than collagen in hyaline cartilage (p < 0.01). Our results show that each type of cartilage has distinct structural features, which may contribute significantly to pathology when damaged. Our findings demonstrate that SHG microscopy holds potential as a clinically relevant diagnostic tool for imaging degenerative tissues or assessing wound repair following cartilage injury.
CUHK Papers in Linguistics, Number 3.
ERIC Educational Resources Information Center
Yip, Virginia, Ed.
1991-01-01
Papers in this volume include the following: "Constraints on Dative Acquisition by Chinese ESL Learners" (Hua Dong Fan); "The Learnability of Locality Conditions on Quantification" (Thomas Lee); "Do Learning Environments Make a Difference? A Study on the Acquisition of the English Interrogatives by Three Types of Cantonese…
Mendes, César S; Bartos, Imre; Akay, Turgay; Márka, Szabolcs; Mann, Richard S
2013-01-01
Coordinated walking in vertebrates and multi-legged invertebrates such as Drosophila melanogaster requires a complex neural network coupled to sensory feedback. An understanding of this network will benefit from systems such as Drosophila that have the ability to genetically manipulate neural activities. However, the fly's small size makes it challenging to analyze walking in this system. In order to overcome this limitation, we developed an optical method coupled with high-speed imaging that allows the tracking and quantification of gait parameters in freely walking flies with high temporal and spatial resolution. Using this method, we present a comprehensive description of many locomotion parameters, such as gait, tarsal positioning, and intersegmental and left-right coordination for wild type fruit flies. Surprisingly, we find that inactivation of sensory neurons in the fly's legs, to block proprioceptive feedback, led to deficient step precision, but interleg coordination and the ability to execute a tripod gait were unaffected. DOI: http://dx.doi.org/10.7554/eLife.00231.001 PMID:23326642
Peng, Dan; Bi, Yanlan; Ren, Xiaona; Yang, Guolong; Sun, Shangde; Wang, Xuede
2015-12-01
This study was performed to develop a hierarchical approach for the detection and quantification of sesame oil adulteration with vegetable oils using gas chromatography (GC). First, a model was constructed to discriminate authentic sesame oils from adulterated sesame oils using a support vector machine (SVM) algorithm. Second, another SVM-based model was developed to identify the type of adulterant in the mixed oil. Finally, prediction models for sesame oil were built for each kind of oil using the partial least squares method. To validate this approach, 746 samples were prepared by mixing authentic sesame oils with five types of vegetable oil. The prediction results show that the detection limit for authentication is as low as a 5% mixing ratio and that the root-mean-square errors of prediction range from 1.19% to 4.29%, indicating that this approach is a valuable tool for detecting and quantifying the adulteration of sesame oil. Copyright © 2015 Elsevier Ltd. All rights reserved.
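A minimal sketch of this three-stage pipeline (authenticity SVM, adulterant-type SVM, then one PLS regression per adulterant oil), using scikit-learn with placeholder data; the random feature matrices, class labels and mixing ratios stand in for the study's GC profiles and are purely hypothetical:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.random((200, 12))                  # GC peak-area features (placeholder)
is_adulterated = rng.integers(0, 2, 200)   # stage 1: authentic (0) vs adulterated (1)
adulterant = rng.integers(0, 5, 200)       # stage 2: which of five vegetable oils
level = rng.random(200)                    # stage 3: adulteration ratio

stage1 = SVC(kernel="rbf").fit(X, is_adulterated)
stage2 = SVC(kernel="rbf").fit(X[is_adulterated == 1], adulterant[is_adulterated == 1])
stage3 = {k: PLSRegression(n_components=3).fit(X[adulterant == k], level[adulterant == k])
          for k in range(5)}               # one PLS model per adulterant oil

x_new = rng.random((1, 12))
if stage1.predict(x_new)[0] == 1:
    oil = int(stage2.predict(x_new)[0])
    print(f"adulterant class {oil}, estimated level {stage3[oil].predict(x_new)[0, 0]:.2f}")
else:
    print("classified as authentic sesame oil")
```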
Bergmeister, Konstantin D; Gröger, Marion; Aman, Martin; Willensdorfer, Anna; Manzano-Szalai, Krisztina; Salminger, Stefan; Aszmann, Oskar C
2016-08-01
Skeletal muscle consists of different fiber types which adapt to exercise, aging, disease, or trauma. Here we present a protocol for fast staining, automatic acquisition, and quantification of fiber populations with ImageJ. Biceps and lumbrical muscles were harvested from Sprague-Dawley rats. Quadruple immunohistochemical staining was performed on single sections using antibodies against myosin heavy chains and secondary fluorescent antibodies. Slides were scanned automatically with a slide scanner. Manual and automatic analyses were performed and compared statistically. The protocol provided rapid and reliable staining for automated image acquisition. Analyses between manual and automatic data indicated Pearson correlation coefficients for biceps of 0.645-0.841 and 0.564-0.673 for lumbrical muscles. Relative fiber populations were accurate to a degree of ± 4%. This protocol provides a reliable tool for quantification of muscle fiber populations. Using freely available software, it decreases the required time to analyze whole muscle sections. Muscle Nerve 54: 292-299, 2016. © 2016 Wiley Periodicals, Inc.
From Quantification to Visualization: A Taxonomy of Uncertainty Visualization Approaches
Potter, Kristin; Rosen, Paul; Johnson, Chris R.
2014-01-01
Quantifying uncertainty is an increasingly important topic across many domains. The uncertainties present in data come with many diverse representations having originated from a wide variety of disciplines. Communicating these uncertainties is a task often left to visualization without clear connection between the quantification and visualization. In this paper, we first identify frequently occurring types of uncertainty. Second, we connect those uncertainty representations to ones commonly used in visualization. We then look at various approaches to visualizing this uncertainty by partitioning the work based on the dimensionality of the data and the dimensionality of the uncertainty. We also discuss noteworthy exceptions to our taxonomy along with future research directions for the uncertainty visualization community. PMID:25663949
Whole farm quantification of GHG emissions within smallholder farms in developing countries
NASA Astrophysics Data System (ADS)
Seebauer, Matthias
2014-03-01
The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After a cluster analysis to identify different farm typologies, GHG quantification was carried out using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation both in the magnitude of the estimated GHG emissions per hectare between different smallholder farm typologies and in the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals; the mitigation benefits range between 4 and 6.5 tCO2 ha-1 yr-1, with significantly different benefits depending on the typology of the crop-livestock system, its agricultural practices, and the adoption rate of improved practices. However, the inherent uncertainty related to the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between the different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.
Martins, Ayrton F; Frank, Carla da S; Altissimo, Joseline; de Oliveira, Júlia A; da Silva, Daiane S; Reichert, Jaqueline F; Souza, Darliana M
2017-08-24
Statins are amongst the most widely prescribed agents for treating hypercholesterolaemia and preventing vascular diseases. In this study, a rapid and effective liquid chromatography method, assisted by diode array detection, was designed and validated for the simultaneous quantification of atorvastatin (ATO) and simvastatin (SIM) in hospital effluent samples. The solid phase extraction (SPE) of the analytes was optimized with respect to sorbent material and pH, and the dispersive liquid-liquid microextraction (DLLME) with respect to pH, ionic strength, and the type and volume of the extraction and disperser solvents. The performance of both extraction procedures was evaluated in terms of linearity, quantification limits, accuracy (recovery %), precision and matrix effects for each analyte. The methods proved to be linear in the concentration range considered; the quantification limits were 0.45 µg L-1 for ATO and 0.75 µg L-1 for SIM; the matrix effect was almost absent in both methods; the average recoveries remained between 81.5 and 90.0%; and the RSD values were <20%. The validated methods were applied to the quantification of the statins in real samples of hospital effluent; the concentrations ranged from 18.8 µg L-1 to 35.3 µg L-1 for ATO, and from 30.3 µg L-1 to 38.5 µg L-1 for SIM. Since the calculated risk quotients reached values of up to 192, the occurrence of ATO and SIM in hospital effluent poses a potentially serious risk to human health and the aquatic ecosystem.
It is the Theory Which Decides What We Can Observe (Einstein)
NASA Astrophysics Data System (ADS)
Filk, Thomas
In this chapter I will give examples of three types of contextuality: theory as context, a theory of measurement as context, and environmental and internal conditions as context. In particular, I will argue that depending on which theory of measurements we attribute to Bohmian mechanics, this theory may be called a classical theory or a quantum theory. Furthermore, I will show that for neural networks one can define in a natural way two different theories of measurements, which can be compared with scanner-type measurements on the one hand and psychological experiments on the other hand. The latter theory of measurements for neural networks leads to non-commutativity and even quantum-like contextuality. It is shown that very simple neural networks can violate Bell-type inequalities.
Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun
2012-01-01
How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I, grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The result of performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process.
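Quantification Theory Type I (the QTTI model referred to above) is, computationally, a multiple linear regression on dummy-coded categorical design variables: each product-form category receives a partial score, and the sum of scores predicts the semantic-differential image rating. A minimal sketch with entirely hypothetical PDA form elements and ratings, not the study's data:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

data = pd.DataFrame({
    "body":    ["round", "square", "round", "slim", "square", "slim"],
    "buttons": ["flat", "raised", "raised", "flat", "flat", "raised"],
    "display": ["large", "small", "large", "large", "small", "small"],
    "image_score": [6.2, 4.1, 5.8, 6.9, 3.7, 5.0],   # e.g. a rated product image
})
# Dummy-code the categorical form elements (one reference level dropped each)
X = pd.get_dummies(data[["body", "buttons", "display"]], drop_first=True)
model = LinearRegression().fit(X, data["image_score"])

# Category scores (partial regression coefficients) indicate how strongly each
# form element contributes to the perceived product image.
print(dict(zip(X.columns, model.coef_.round(2))))
```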
NASA Astrophysics Data System (ADS)
Kardanpour, Z.; Jacobsen, O. S.; Esbensen, K. H.
2015-06-01
This study contributes to the development of a heterogeneity characterisation facility for "next generation" sampling, aimed at more realistic and controllable pesticide variability in laboratory pots used in experimental environmental contaminant assessment. The role of soil heterogeneity in the quantification of a set of exemplar parameters (organic matter, loss on ignition (LOI), biomass, soil microbiology, and MCPA sorption and mineralization) is described, including a brief background on how heterogeneity affects sampling and monitoring procedures in environmental pollutant studies. The Theory of Sampling (TOS) and variographic analysis were applied to develop a fit-for-purpose heterogeneity characterization approach. All parameters were assessed in a large-scale profile (1-100 m) versus small-scale (0.1-1 m) replication sampling pattern. Variographic analysis of the experimental analytical results indicates that it is essential to sample at locations less than 2.5 m apart to benefit from spatial auto-correlation and thereby avoid unnecessary, inflated compositional variation in experimental pots; this range is an inherent characteristic of the soil heterogeneity and will differ among soil types. The study has significant carry-over potential for related research areas, e.g. soil science, contamination studies, environmental monitoring and environmental chemistry.
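The variographic analysis mentioned above rests on the experimental (semi)variogram of a one-dimensional sampling transect: the lag at which the variogram levels off indicates the range of spatial autocorrelation, i.e. how closely material must be sampled. A minimal sketch with hypothetical transect measurements (not the study's data):

```python
import numpy as np

def semivariogram(values, max_lag):
    """Experimental semivariogram of a 1-D sampling transect.

    v(j) = 1 / (2*(N - j)) * sum_i (x[i+j] - x[i])**2  for lag j = 1..max_lag.
    """
    x = np.asarray(values, dtype=float)
    n = len(x)
    return {j: ((x[j:] - x[:-j]) ** 2).sum() / (2 * (n - j))
            for j in range(1, max_lag + 1)}

# Hypothetical measurements taken at regular intervals along a soil profile
profile = np.array([4.1, 4.3, 4.0, 4.6, 5.0, 5.2, 5.1, 6.0, 6.3, 6.1, 6.8, 7.0])
print(semivariogram(profile, max_lag=5))
```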
Semi-quantitative assessment of pulmonary perfusion in children using dynamic contrast-enhanced MRI
NASA Astrophysics Data System (ADS)
Fetita, Catalin; Thong, William E.; Ou, Phalla
2013-03-01
This paper addresses the semi-quantitative assessment of pulmonary perfusion from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in a study population mainly composed of children with pulmonary malformations. The proposed automatic analysis approach is based on the indicator-dilution theory introduced in 1954. First, a robust method is developed to segment the pulmonary artery and the lungs from anatomical MRI data, exploiting 2D and 3D mathematical morphology operators. Second, the time-dependent contrast signal of the lung regions is deconvolved by the arterial input function to assess the local hemodynamic parameters, i.e., mean transit time, pulmonary blood volume and pulmonary blood flow. The discrete deconvolution is implemented here with a truncated singular value decomposition (tSVD) method. Parametric images for the entire lungs are generated as additional elements for diagnosis and quantitative follow-up. The preliminary results attest to the feasibility of perfusion quantification in pulmonary DCE-MRI and open an interesting alternative to scintigraphy for this type of evaluation, to be considered at least as a preliminary step in diagnosis owing to the wide availability of the technique and its non-invasive nature.
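The deconvolution step described here can be sketched in a few lines: the tissue curve is modelled as the arterial input function convolved with a scaled residue function, the convolution matrix is inverted via truncated SVD, and flow and mean transit time follow from the recovered impulse response. This is a generic illustration with arbitrary curves and an arbitrary truncation threshold, not the authors' implementation:

```python
import numpy as np

def tsvd_deconvolve(aif, tissue, dt, rel_threshold=0.2):
    """Deconvolve a tissue enhancement curve by the arterial input function
    with truncated SVD, following C_tissue(t) = (AIF * F.R)(t).
    Returns the impulse response F*R(t); singular values below
    rel_threshold * max(sv) are discarded (a tunable regularisation level)."""
    n = len(aif)
    # lower-triangular (causal) convolution matrix built from the AIF
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ tissue))

# Hypothetical curves (arbitrary units), sampled every 1 s
t = np.arange(40.0)
aif = np.exp(-0.5 * (t - 8) ** 2 / 4)                        # arterial bolus
tissue = np.convolve(aif, 0.2 * np.exp(-t / 6))[:40]          # synthetic tissue curve
resp = tsvd_deconvolve(aif, tissue, dt=1.0)
flow = resp.max()                                             # impulse-response peak
mtt = resp.sum() / resp.max()                                 # area over height (s)
print(flow, mtt)
```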
Gettings, Mark E.; Bultman, Mark W.
1993-01-01
An application of possibility theory from fuzzy logic to the quantification of favorableness for quartz-carbonate vein deposits in the southern Santa Rita Mountains of southeastern Arizona is described. Three necessary but probably not sufficient conditions for the formation of these deposits were defined as the occurrence of carbonate-bearing rocks within hypabyssal depths, significant fracturing of the rocks, and proximity to a felsic intrusive. The quality of data available to evaluate these conditions is variable over the study area. The possibility of each condition was represented as a fuzzy set enumerated over the area. The intersection of the sets measures the degree of simultaneous occurrence of the necessary factors and provides a measure of the possibility of deposit occurrence. Using fuzzy set techniques, the effect of one or more fuzzy sets relative to the others in the intersection can be controlled, and logical combinations of the sets can be used to impose a time-sequential constraint on the necessary conditions. Other necessary conditions, and supplementary conditions such as variable data quality or intensity of exploration, can be included in the analysis by their proper representation as fuzzy sets.
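The fuzzy-intersection idea can be illustrated in a few lines: each necessary condition is a membership map over the study area with values in [0, 1], and the possibility of deposit occurrence is their pointwise minimum. The small grids below are hypothetical, not the Santa Rita data:

```python
import numpy as np

carbonate_host = np.array([[0.9, 0.7, 0.1],
                           [0.8, 0.6, 0.2],
                           [0.3, 0.4, 0.0]])   # carbonate-bearing rocks at depth
fracturing     = np.array([[0.5, 0.9, 0.8],
                           [0.6, 0.7, 0.4],
                           [0.2, 0.9, 0.6]])   # degree of fracturing
near_intrusive = np.array([[0.4, 0.8, 0.9],
                           [0.3, 0.7, 0.8],
                           [0.1, 0.5, 0.7]])   # proximity to a felsic intrusive

# Fuzzy intersection (minimum operator): all conditions must hold simultaneously
possibility = np.minimum.reduce([carbonate_host, fracturing, near_intrusive])
print(possibility)   # highest values mark the most favourable ground
```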
The physical hydrogeology of ore deposits
Ingebritsen, Steven E.; Appold, M.S.
2012-01-01
Hydrothermal ore deposits represent a convergence of fluid flow, thermal energy, and solute flux that is hydrogeologically unusual. From the hydrogeologic perspective, hydrothermal ore deposition represents a complex coupled-flow problem—sufficiently complex that physically rigorous description of the coupled thermal (T), hydraulic (H), mechanical (M), and chemical (C) processes (THMC modeling) continues to challenge our computational ability. Though research into these coupled behaviors has found only a limited subset to be quantitatively tractable, it has yielded valuable insights into the workings of hydrothermal systems in a wide range of geologic environments including sedimentary, metamorphic, and magmatic. Examples of these insights include the quantification of likely driving mechanisms, rates and paths of fluid flow, ore-mineral precipitation mechanisms, longevity of hydrothermal systems, mechanisms by which hydrothermal fluids acquire their temperature and composition, and the controlling influence of permeability and other rock properties on hydrothermal fluid behavior. In this communication we review some of the fundamental theory needed to characterize the physical hydrogeology of hydrothermal systems and discuss how this theory has been applied in studies of Mississippi Valley-type, tabular uranium, porphyry, epithermal, and mid-ocean ridge ore-forming systems. A key limitation in the computational state-of-the-art is the inability to describe fluid flow and transport fully in the many ore systems that show evidence of repeated shear or tensional failure with associated dynamic variations in permeability. However, we discuss global-scale compilations that suggest some numerical constraints on both mean and dynamically enhanced crustal permeability. Principles of physical hydrogeology can be powerful tools for investigating hydrothermal ore formation and are becoming increasingly accessible with ongoing advances in modeling software.
Angiogenesis and Therapeutic Approaches to NF1 Tumors
2007-04-01
A corneal neovascularization model was developed. In this model, the avascularity of the cornea greatly facilitates the quantification of neovasculature...wild-type corneas in the avascular area. However, in the NV zone, the number of macrophages was 4.6-fold greater in Nf1+/- corneas than in wild-type corneas...GEM tumor classification because of low cellularity and no necrosis. They exceed that classification, however, due to their low to moderate proliferation
Quantification of error associated with stormwater and wastewater flow measurement devices
A novel flow testbed has been designed to evaluate the performance of flumes as flow measurement devices. The newly constructed testbed produces both steady and unsteady flows ranging from 10 to 1500 gpm. Two types of flumes (Parshall and trapezoidal) are evaluated under differen...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Semrau, P.
The purpose of this study was to analyze selected cognitive theories in the areas of artificial intelligence (A.I.) and psychology to determine the role of emotions in the cognitive or intellectual processes. Understanding the relationship of emotions to processes of intelligence has implications for constructing theories of aesthetic response and A.I. systems in art. Psychological theories were examined that demonstrated the changing nature of the research in emotion related to cognition. The basic techniques in A.I. were reviewed and the A.I. research was analyzed to determine the process of cognition and the role of emotion. The A.I. research emphasized the digital, quantifiable character of the computer and associated cognitive models and programs. In conclusion, the cognitive-emotive research in psychology and the cognitive research in A.I. emphasized quantification methods over analog and qualitative characteristics required for a holistic explanation of cognition. Further A.I. research needs to examine the qualitative aspects of values, attitudes, and beliefs on influencing the creative thinking processes. Inclusion of research related to qualitative problem solving in art provides a more comprehensive base of study for examining the area of intelligence in computers.
Assessing nonlinear structures in real exchange rates using recurrence plot strategies
NASA Astrophysics Data System (ADS)
Belaire-Franch, Jorge; Contreras, Dulce; Tordera-Lledó, Lorena
2002-11-01
Purchasing power parity (PPP) is an important theory at the basis of a large number of economic models. However, the implication derived from the theory that real exchange rates must follow stationary processes is not conclusively supported by empirical studies. In a recent paper, Serletis and Gogas [Appl. Finance Econ. 10 (2000) 615] show evidence of deterministic chaos in several OECD exchange rates. As a consequence, PPP rejections could be spurious. In this work, we follow a two-stage testing procedure to test for nonlinearities and chaos in real exchange rates, using a new set of techniques designed by Webber and Zbilut [J. Appl. Physiol. 76 (1994) 965], called recurrence quantification analysis (RQA). Our conclusions differ slightly from Serletis and Gogas [Appl. Finance Econ. 10 (2000) 615], but they are also supportive of chaos for some exchange rates.
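Recurrence quantification analysis starts from a recurrence plot: after time-delay embedding, pairs of points closer than a threshold are marked as recurrent, and statistics such as the recurrence rate (and, in full RQA, determinism from diagonal-line structures) summarize the plot. A minimal sketch with synthetic series rather than exchange-rate data; the embedding dimension, delay and threshold are illustrative:

```python
import numpy as np

def recurrence_rate(x, dim=3, delay=1, eps=0.1):
    """Recurrence rate of a scalar series after time-delay embedding.

    Marks pairs (i, j) with ||X_i - X_j|| < eps and returns the fraction of
    marked off-diagonal points; determinism and other RQA measures would
    additionally analyse the diagonal line structures of this plot.
    """
    n = len(x) - (dim - 1) * delay
    X = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    R = dists < eps
    np.fill_diagonal(R, False)           # exclude trivial self-recurrences
    return R.sum() / (n * (n - 1))

rng = np.random.default_rng(0)
noise = rng.standard_normal(500)         # stochastic series
sine = np.sin(0.2 * np.arange(500))      # deterministic series
print(recurrence_rate(noise, eps=0.5), recurrence_rate(sine, eps=0.5))
```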
Cooperative strings and glassy interfaces
Salez, Thomas; Salez, Justin; Dalnoki-Veress, Kari; Raphaël, Elie; Forrest, James A.
2015-01-01
We introduce a minimal theory of glass formation based on the ideas of molecular crowding and resultant string-like cooperative rearrangement, and address the effects of free interfaces. In the bulk case, we obtain a scaling expression for the number of particles taking part in cooperative strings, and we recover the Adam–Gibbs description of glassy dynamics. Then, by including thermal dilatation, the Vogel–Fulcher–Tammann relation is derived. Moreover, the random and string-like characters of the cooperative rearrangement allow us to predict a temperature-dependent expression for the cooperative length ξ of bulk relaxation. Finally, we explore the influence of sample boundaries when the system size becomes comparable to ξ. The theory is in agreement with measurements of the glass-transition temperature of thin polymer films, and allows quantification of the temperature-dependent thickness hm of the interfacial mobile layer. PMID:26100908
Music in film and animation: experimental semiotics applied to visual, sound and musical structures
NASA Astrophysics Data System (ADS)
Kendall, Roger A.
2010-02-01
The relationship of music to film has only recently received the attention of experimental psychologists and quantificational musicologists. This paper outlines theory, semiotical analysis, and experimental results using relations among variables of temporally organized visuals and music. 1. A comparison and contrast is developed among the ideas in semiotics and experimental research, including historical and recent developments. 2. Musicological Exploration: The resulting multidimensional structures of associative meanings, iconic meanings, and embodied meanings are applied to the analysis and interpretation of a range of film with music. 3. Experimental Verification: A series of experiments testing the perceptual fit of musical and visual patterns layered together in animations determined goodness of fit between all pattern combinations, results of which confirmed aspects of the theory. However, exceptions were found when the complexity of the stratified stimuli resulted in cognitive overload.
Li, N.; Yadav, S. K.; Liu, X. -Y.; ...
2015-11-05
Using the in situ indentation of TiN in a high-resolution transmission electron microscope, the nucleation of full as well as partial dislocations has been observed from {001} and {111} surfaces, respectively. The critical elastic strains associated with the nucleation of the dislocations were analyzed from the recorded atomic displacements, and the nucleation stresses corresponding to the measured critical strains were computed using density functional theory. The resolved shear stress was estimated to be 13.8 GPa for the partial dislocation 1/6 <110> {111} and 6.7 GPa for the full dislocation ½ <110> {110}. Moreover, such an approach of quantifying nucleation stressesmore » for defects via in situ high-resolution experiment coupled with density functional theory calculation may be applied to other unit processes.« less
Assessing and managing freshwater ecosystems vulnerable to global change
Angeler, David G.; Allen, Craig R.; Birge, Hannah E.; Drakare, Stina; McKie, Brendan G.; Johnson, Richard K.
2014-01-01
Freshwater ecosystems are important for global biodiversity and provide essential ecosystem services. There is consensus in the scientific literature that freshwater ecosystems are vulnerable to the impacts of environmental change, which may trigger irreversible regime shifts upon which biodiversity and ecosystem services may be lost. There are profound uncertainties regarding the management and assessment of the vulnerability of freshwater ecosystems to environmental change. Quantitative approaches are needed to reduce this uncertainty. We describe available statistical and modeling approaches along with case studies that demonstrate how resilience theory can be applied to aid decision-making in natural resources management. We highlight especially how long-term monitoring efforts combined with ecological theory can provide a novel nexus between ecological impact assessment and management, and the quantification of systemic vulnerability and thus the resilience of ecosystems to environmental change.
Elbeik, Tarek; Charlebois, Edwin; Nassos, Patricia; Kahn, James; Hecht, Frederick M.; Yajko, David; Ng, Valerie; Hadley, Keith
2000-01-01
Quantification of human immunodeficiency virus type 1 (HIV-1) RNA as a measure of viral load has greatly improved the monitoring of therapies for infected individuals. With the significant reductions in viral load now observed in individuals treated with highly active anti-retroviral therapy (HAART), viral load assays have been adapted to achieve greater sensitivity. Two commercially available ultrasensitive assays, the Bayer Quantiplex HIV-1 bDNA version 3.0 (bDNA 3.0) assay and the Roche Amplicor HIV-1 Monitor Ultrasensitive version 1.5 (Amplicor 1.5) assay, are now being used to monitor HIV-1-infected individuals. Both of these ultrasensitive assays have a reported lower limit of 50 HIV-1 RNA copies/ml and were developed from corresponding older generation assays with lower limits of 400 to 500 copies/ml. However, the comparability of viral load data generated by these ultrasensitive assays and the relative costs of labor, disposables, and biohazardous wastes were not determined in most cases. In this study, we used matched clinical plasma samples to compare the quantification of the newer bDNA 3.0 assay with that of the older bDNA 2.0 assay and to compare the quantification and costs of the bDNA 3.0 assay and the Amplicor 1.5 assay. We found that quantification by the bDNA 3.0 assay was approximately twofold higher than that by the bDNA 2.0 assay and was highly correlated to that by the Amplicor 1.5 assay. Moreover, cost analysis based on labor, disposables, and biohazardous wastes showed significant savings with the bDNA 3.0 assay as compared to the costs of the Amplicor 1.5 assay. PMID:10699005
Herath, H M D R; Shaw, P N; Cabot, P; Hewavitharana, A K
2010-06-15
The high-performance liquid chromatography (HPLC) column is capable of enrichment/pre-concentration of trace impurities in the mobile phase during the column equilibration, prior to sample injection and elution. These impurities elute during gradient elution and result in significant chromatographic peaks. Three types of purified water were tested for their impurity levels, and hence their performances as mobile phase, in HPLC followed by total ion current (TIC) mode of MS. Two types of HPLC-grade water produced 3-4 significant peaks in solvent blanks while LC/MS-grade water produced no peaks (although peaks were produced by LC/MS-grade water also after a few days of standing). None of the three waters produced peaks in HPLC followed by UV-Vis detection. These peaks, if co-eluted with analyte, are capable of suppressing or enhancing the analyte signal in a MS detector. As it is not common practice to run solvent blanks in TIC mode, when quantification is commonly carried out using single ion monitoring (SIM) or single or multiple reaction monitoring (SRM or MRM), the effect of co-eluting impurities on the analyte signal and hence on the accuracy of the results is often unknown to the analyst. Running solvent blanks in TIC mode, regardless of the MS mode used for quantification, is essential in order to detect this problem and to take subsequent precautions. Copyright (c) 2010 John Wiley & Sons, Ltd.
Laursen, Kristoffer; Adamsen, Christina E; Laursen, Jens; Olsen, Karsten; Møller, Jens K S
2008-03-01
Zinc-protoporphyrin (Zn-pp), which has been identified as the major pigment in certain dry-cured meat products, was extracted with acetone/water (75%) and isolated from the following meat products: Parma ham, Iberian ham and dry-cured ham with added nitrite. The quantification of Zn-pp by electron absorption, fluorescence and X-ray fluorescence (XRF) spectroscopy was compared (concentration range used [Zn-pp]=0.8-9.7μM). All three hams were found to contain Zn-pp, and the results show no significant difference among the content of Zn-pp quantified by fluorescence, absorbance and X-ray fluorescence spectroscopy for Parma ham and Iberian ham. All three methods can be used for quantification of Zn-pp in acetone/water extracts of different ham types if the content is higher than 1.0ppm. For dry-cured ham with added nitrite, XRF was not applicable due to the low content of Zn-pp (<0.1ppm). In addition, XRF spectroscopy provides further information regarding other trace elements and can therefore be advantageous in this aspect. This study also focused on XRF determination of Fe in the extracts and as no detectable Fe was found in the three types of ham extracts investigated (limit of detection; Fe⩽1.8ppm), it allows the conclusion that iron containing pigments, e.g., heme, do not contribute to the noticeable red colour observed in some of the extracts.
The persistent cosmic web and its filamentary structure - I. Theory and implementation
NASA Astrophysics Data System (ADS)
Sousbie, T.
2011-06-01
We present DisPerSE, a novel approach to the coherent multiscale identification of all types of astrophysical structures, in particular the filaments, in the large-scale distribution of the matter in the Universe. This method and the corresponding piece of software allows for a genuinely scale-free and parameter-free identification of the voids, walls, filaments, clusters and their configuration within the cosmic web, directly from the discrete distribution of particles in N-body simulations or galaxies in sparse observational catalogues. To achieve that goal, the method works directly over the Delaunay tessellation of the discrete sample and uses the Delaunay tessellation field estimator density computed at each tracer particle; no further sampling, smoothing or processing of the density field is required. The idea is based on recent advances in distinct subdomains of the computational topology, namely the discrete Morse theory which allows for a rigorous application of topological principles to astrophysical data sets, and the theory of persistence, which allows us to consistently account for the intrinsic uncertainty and Poisson noise within data sets. Practically, the user can define a given persistence level in terms of robustness with respect to noise (defined as a 'number of σ') and the algorithm returns the structures with the corresponding significance as sets of critical points, lines, surfaces and volumes corresponding to the clusters, filaments, walls and voids - filaments, connected at cluster nodes, crawling along the edges of walls bounding the voids. From a geometrical point of view, the method is also interesting as it allows for a robust quantification of the topological properties of a discrete distribution in terms of Betti numbers or Euler characteristics, without having to resort to smoothing or having to define a particular scale. In this paper, we introduce the necessary mathematical background and describe the method and implementation, while we address the application to 3D simulated and observed data sets in the companion paper (Sousbie, Pichon & Kawahara, Paper II).
NASA Astrophysics Data System (ADS)
Pisek, J.
2017-12-01
The clumping index (CI) is a measure of foliage aggregation relative to a random distribution of leaves in space. CI is an important factor for the correct quantification of true leaf area index (LAI). Global and regional scale CI maps have been generated from various multi-angle sensors based on an empirical relationship with the normalized difference between hotspot and darkspot (NDHD) index (Chen et al., 2005). Ryu et al. (2011) suggested that accurate calculation of radiative transfer in a canopy, important for controlling gross primary productivity (GPP) and evapotranspiration (ET) (Baldocchi and Harley, 1995), should be possible by integrating CI with incoming solar irradiance and LAI from MODIS land and atmosphere products. It should be noted that the MODIS LAI/FPAR product uses internal non-empirical, stochastic equations for the parameterization of foliage clumping. This raises the question of whether integration of the MODIS LAI product with empirically based CI maps introduces any inconsistencies. Here, the consistency is examined independently through the `recollision probability theory' or `p-theory' (Knyazikhin et al., 1998) along with raw LAI-2000/2200 Plant Canopy Analyzer (PCA) data from > 30 sites, surveyed across a range of vegetation types. The theory predicts that the amount of radiation scattered by a canopy should depend only on the wavelength and the spectrally invariant canopy structural parameter p. The parameter p is linked to foliage clumping (Stenberg et al., 2016). Results indicate that integration of the MODIS LAI product with empirically based CI maps is feasible. Importantly, for the first time it is shown that it is possible to obtain p values for any location solely from Earth Observation data. This is very relevant for future applications of the photon recollision probability concept for global and local monitoring of vegetation using Earth Observation data.
Stillhart, Cordula; Kuentz, Martin
2012-02-05
Self-emulsifying drug delivery systems (SEDDS) are complex mixtures in which drug quantification can become a challenging task. Thus, a general need exists for novel analytical methods and a particular interest lies in techniques with the potential for process monitoring. This article compares Raman spectroscopy with high-resolution ultrasonic resonator technology (URT) for drug quantification in SEDDS. The model drugs fenofibrate, indomethacin, and probucol were quantitatively assayed in different self-emulsifying formulations. We measured ultrasound velocity and attenuation in the bulk formulation containing drug at different concentrations. The formulations were also studied by Raman spectroscopy. We used both, an in-line immersion probe for the bulk formulation and a multi-fiber sensor for measuring through hard-gelatin capsules that were filled with SEDDS. Each method was assessed by calculating the relative standard error of prediction (RSEP) as well as the limit of quantification (LOQ) and the mean recovery. Raman spectroscopy led to excellent calibration models for the bulk formulation as well as the capsules. The RSEP depended on the SEDDS type with values of 1.5-3.8%, while LOQ was between 0.04 and 0.35% (w/w) for drug quantification in the bulk. Similarly, the analysis of the capsules led to RSEP of 1.9-6.5% and LOQ of 0.01-0.41% (w/w). On the other hand, ultrasound attenuation resulted in RSEP of 2.3-4.4% and LOQ of 0.1-0.6% (w/w). Moreover, ultrasound velocity provided an interesting analytical response in cases where the drug strongly affected the density or compressibility of the SEDDS. We conclude that ultrasonic resonator technology and Raman spectroscopy constitute suitable methods for drug quantification in SEDDS, which is promising for their use as process analytical technologies. Copyright © 2011 Elsevier B.V. All rights reserved.
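For orientation, the two figures of merit quoted above are commonly computed as follows (one common convention; the definitions used in the study may differ in detail): the relative standard error of prediction compares predicted and reference concentrations over a validation set, and the limit of quantification follows the usual 10*sigma/slope rule. A sketch with hypothetical numbers:

```python
import numpy as np

def rsep_percent(y_ref, y_pred):
    """Relative standard error of prediction (one common definition):
    100 * sqrt( sum((y_pred - y_ref)^2) / sum(y_ref^2) )."""
    y_ref, y_pred = np.asarray(y_ref, float), np.asarray(y_pred, float)
    return 100.0 * np.sqrt(((y_pred - y_ref) ** 2).sum() / (y_ref ** 2).sum())

def loq(calibration_slope, blank_sd):
    """Limit of quantification by the usual 10*sigma/slope convention."""
    return 10.0 * blank_sd / calibration_slope

# Hypothetical reference vs. predicted drug loadings (% w/w)
print(rsep_percent([1.0, 2.0, 4.0, 8.0], [1.05, 1.9, 4.2, 7.8]))
print(loq(calibration_slope=250.0, blank_sd=1.0))   # arbitrary signal units
```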
USDA-ARS's Scientific Manuscript database
Quantification of microbial fate and transport in streams has become one of the most important topics in studying the biogeochemical properties and behavior of stream ecosystems. Using a "smart" tracer such as resazurin (Raz) allows assessment of sediment-water interactions and associated biological activity ...
The Novice-Expert Continuum in Astronomy Knowledge
ERIC Educational Resources Information Center
Bryce, T. G. K.; Blown, E. J.
2012-01-01
The nature of expertise in astronomy was investigated across a broad spectrum of ages and experience in China and New Zealand. Five hypotheses (capable of quantification and statistical analysis) were used to probe types of expertise identified by previous researchers: (a) domain-specific knowledge-skill in the use of scientific vocabulary and…
Application of geotechnical data to resource planning in southeast Alaska.
W.L. Schroeder; D.N. Swanston
1987-01-01
Recent quantification of engineering properties and index values of dominant soil types in the Alexander Archipelago, southeast Alaska, has revealed consistent diagnostic characteristics useful for evaluating landslide risk and subgrade material stability before timber harvesting and low-volume road construction. Shear strength data are summarized and grouped by Soil...
Rapid cell-based assay for detection and quantification of active staphylococcal enterotoxin type D
USDA-ARS's Scientific Manuscript database
Food poisoning by Staphylococcus aureus is a result of ingestion of Staphylococcal enterotoxins (SEs) produced by this bacterium and is a major source of foodborne illness. Staphylococcal enterotoxin D (SED) is one of the predominant enterotoxins recovered in Staphylococcal food poisoning incidences...
Group field theory with noncommutative metric variables.
Baratin, Aristide; Oriti, Daniele
2010-11-26
We introduce a dual formulation of group field theories as a type of noncommutative field theories, making their simplicial geometry manifest. For Ooguri-type models, the Feynman amplitudes are simplicial path integrals for BF theories. We give a new definition of the Barrett-Crane model for gravity by imposing the simplicity constraints directly at the level of the group field theory action.
Toward a Theory of Psychological Type Congruence for Advertisers.
ERIC Educational Resources Information Center
McBride, Michael H.; And Others
Focusing on the impact of advertisers' persuasive selling messages on consumers, this paper discusses topics relating to the theory of psychological type congruence. Based on an examination of persuasion theory and relevant psychological concepts, including recent cognitive stability and personality and needs theory and the older concept of…
Subnuclear foci quantification using high-throughput 3D image cytometry
NASA Astrophysics Data System (ADS)
Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.
2015-07-01
Ionising radiation causes various types of DNA damage, including double-strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification; hence they are limited to counting a low number of foci per cell (5 foci per nucleus), as the quantification process is extremely labour-intensive. Therefore, we have developed high-throughput instrumentation and a computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged in 3D with submicron resolution using an in-house-developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended-maxima-transform-based algorithm. Our results suggest that, while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
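A foci count of this kind can be sketched with an extended-maxima-style transform on a 3-D nuclear image stack using scikit-image; the smoothing sigma, the h value and the synthetic placeholder image below are illustrative only and are not the authors' pipeline or parameters:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.morphology import h_maxima
from skimage.measure import label

def count_foci(nucleus_stack, sigma=1.0, h=15):
    """Count bright foci in a 3-D (z, y, x) intensity stack of one nucleus."""
    smoothed = gaussian_filter(nucleus_stack.astype(float), sigma=sigma)
    maxima = h_maxima(smoothed, h)   # regional maxima at least h above their surroundings
    return label(maxima).max()       # number of connected maxima = foci count

rng = np.random.default_rng(0)
stack = rng.poisson(20, size=(16, 64, 64)).astype(float)   # placeholder noise image
stack[8, 20, 20] += 600   # two synthetic "foci" so the example returns a nonzero count
stack[8, 40, 45] += 600
print(count_foci(stack))  # -> 2 with these illustrative settings
```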
NASA Astrophysics Data System (ADS)
Wellenreuther, G.; Fittschen, U. E. A.; Achard, M. E. S.; Faust, A.; Kreplin, X.; Meyer-Klaucke, W.
2008-12-01
Total reflection X-ray fluorescence (TXRF) is a very promising method for the direct, quick and reliable multi-elemental quantification of trace elements in protein samples. With the introduction of an internal standard consisting of two reference elements, scandium and gallium, a wide range of proteins can be analyzed, regardless of their salt content, buffer composition, additives and amino acid composition. This strategy also enables quantification of matrix effects. Two potential issues associated with drying have been considered in this study: (1) Formation of heterogeneous residues of varying thickness and/or density; and (2) separation of the internal standard and protein during drying (which has to be prevented to allow accurate quantification). These issues were investigated by microbeam X-ray fluorescence (μXRF) with special emphasis on (I) the influence of sample support and (II) the protein / buffer system used. In the first part, a model protein was studied on well established sample supports used in TXRF, PIXE and XRF (Mylar, siliconized quartz, Plexiglas and silicon). In the second part we imaged proteins of different molecular weight, oligomerization state, bound metals and solubility. A partial separation of protein and internal standard was only observed with untreated silicon, suggesting it may not be an adequate support material. Siliconized quartz proved to be the least prone to heterogeneous drying of the sample and yielded the most reliable results.
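For context, TXRF quantification against an internal standard reduces to a ratio of net fluorescence counts weighted by the instrument's relative elemental sensitivities; the numbers below are hypothetical and only illustrate the arithmetic:

```python
def txrf_concentration(net_counts_x, net_counts_is, conc_is, sens_x, sens_is):
    """Standard TXRF internal-standard quantification:
    C_x = C_IS * (N_x / N_IS) * (S_IS / S_x),
    with N the net fluorescence counts and S the relative elemental
    sensitivities (instrument-specific calibration values)."""
    return conc_is * (net_counts_x / net_counts_is) * (sens_is / sens_x)

# Hypothetical numbers: Ga internal standard at 2 mg/L, Zn as the analyte
print(txrf_concentration(net_counts_x=5.4e4, net_counts_is=3.0e4,
                         conc_is=2.0, sens_x=1.1, sens_is=0.9))  # mg/L
```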
Watzinger, Franz; Hörth, Elfriede; Lion, Thomas
2001-01-01
Despite the recent introduction of real-time PCR methods, competitive PCR techniques continue to play an important role in nucleic acid quantification because of the significantly lower cost of equipment and consumables. Here we describe a shifted restriction-site competitive PCR (SRS-cPCR) assay based on a modified type of competitor. The competitor fragments are designed to contain a recognition site for a restriction endonuclease that is also present in the target sequence to be quantified, but in a different position. Upon completion of the PCR, the amplicons are digested in the same tube with a single restriction enzyme, without the need to purify PCR products. The generated competitor- and target-specific restriction fragments display different sizes, and can be readily separated by electrophoresis and quantified by image analysis. Suboptimal digestion affects competitor- and target-derived amplicons to the same extent, thus eliminating the problem of incorrect quantification as a result of incomplete digestion of PCR products. We have established optimized conditions for a panel of 20 common restriction endonucleases permitting efficient digestion in PCR buffer. It is possible, therefore, to find a suitable restriction site for competitive PCR in virtually any sequence of interest. The assay presented is inexpensive, widely applicable, and permits reliable and accurate quantification of nucleic acid targets. PMID:11376164
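The quantification step common to competitive PCR assays of this kind can be illustrated briefly: a fixed amount of target is co-amplified with a dilution series of competitor, the log ratio of the two product band intensities is regressed against the log competitor input, and the target amount is read off at the equivalence point (ratio = 1). A generic sketch with hypothetical densitometry values, not data from this study:

```python
import numpy as np

def equivalence_point(competitor_input, target_band, competitor_band):
    """Estimate target copy number from a competitive PCR titration.

    Regresses log10(target/competitor band intensity) against
    log10(competitor input) and returns the competitor input at which
    the two products are equal (log-ratio = 0)."""
    x = np.log10(np.asarray(competitor_input, float))
    y = np.log10(np.asarray(target_band, float) / np.asarray(competitor_band, float))
    slope, intercept = np.polyfit(x, y, 1)
    return 10 ** (-intercept / slope)

# Hypothetical band intensities for a 10-fold competitor dilution series
print(equivalence_point([1e3, 1e4, 1e5, 1e6],
                        [950, 700, 300, 60],
                        [80, 350, 800, 1500]))   # ~2e4 target copies
```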
Babu, Dinesh; Muriana, Peter M.
2014-01-01
Aflatoxins are considered unavoidable natural mycotoxins encountered in foods, animal feeds, and feed grains. In this study, we demonstrate the application of our recently developed real-time immunoquantitative PCR (RT iq-PCR) assay for sensitive detection and quantification of aflatoxins in poultry feed, two types of dairy feed (1 and 2), horse feed, whole kernel corn feed grains, and retail yellow ground corn meal. Upon testing methanol/water (60:40) extractions of the above samples using competitive direct enzyme-linked immunosorbent assay, the aflatoxin content was found to be <20 μg/kg. The RT iq-PCR assay exhibited a high antigen hook effect in samples containing aflatoxin levels higher than the quantification limits (0.1–10 μg/kg), which was addressed by comparing the quantification results of undiluted and diluted extracts. In testing the reliability of the immuno-PCR assay, samples were spiked with 200 μg/kg of aflatoxin B1, but the recovery of spiked aflatoxin was found to be poor. Considering the significance of determining trace levels of aflatoxins and their serious implications for animal and human health, the RT iq-PCR method described in this study can be useful for quantifying low natural aflatoxin levels in complex matrices of food or animal feed samples without the requirement of extra sample cleanup. PMID:25474493
Babu, Dinesh; Muriana, Peter M
2014-12-02
Aflatoxins are considered unavoidable natural mycotoxins encountered in foods, animal feeds, and feed grains. In this study, we demonstrate the application of our recently developed real-time immunoquantitative PCR (RT iq-PCR) assay for sensitive detection and quantification of aflatoxins in poultry feed, two types of dairy feed (1 and 2), horse feed, whole kernel corn feed grains, and retail yellow ground corn meal. Upon testing methanol/water (60:40) extractions of the above samples using competitive direct enzyme-linked immunosorbent assay, the aflatoxin content was found to be <20 μg/kg. The RT iq-PCR assay exhibited a high antigen hook effect in samples containing aflatoxin levels higher than the quantification limits (0.1-10 μg/kg), which was addressed by comparing the quantification results of undiluted and diluted extracts. In testing the reliability of the immuno-PCR assay, samples were spiked with 200 μg/kg of aflatoxin B1, but the recovery of spiked aflatoxin was found to be poor. Considering the significance of determining trace levels of aflatoxins and their serious implications for animal and human health, the RT iq-PCR method described in this study can be useful for quantifying low natural aflatoxin levels in complex matrices of food or animal feed samples without the requirement of extra sample cleanup.
Does theory influence the effectiveness of health behavior interventions? Meta-analysis.
Prestwich, Andrew; Sniehotta, Falko F; Whittington, Craig; Dombrowski, Stephan U; Rogers, Lizzie; Michie, Susan
2014-05-01
To systematically investigate the extent and type of theory use in physical activity and dietary interventions, as well as associations of extent and type of theory use with intervention effectiveness. An in-depth analysis of studies included in two systematic reviews of physical activity and healthy eating interventions (k = 190). Extent and type of theory use was assessed using the Theory Coding Scheme (TCS) and intervention effectiveness was calculated using Hedges's g. Metaregressions assessed the relationships between these measures. Fifty-six percent of interventions reported a theory base. Of these, 90% did not report links between all of their behavior change techniques (BCTs) and specific theoretical constructs, and 91% did not report links between all the specified constructs and BCTs. The associations between a composite score or specific items on the TCS and intervention effectiveness were inconsistent. Interventions based on Social Cognitive Theory or the Transtheoretical Model were similarly effective and no more effective than interventions not reporting a theory base. The coding of theory in these studies suggested that theory was not often used extensively in the development of interventions. Moreover, the relationships of the type of theory used and the extent of theory use with effectiveness were generally weak. The findings suggest that attempts to apply the two theories commonly used in this review more extensively are unlikely to increase intervention effectiveness. PsycINFO Database Record (c) 2014 APA, all rights reserved.
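For reference, the effect size used above, Hedges's g, is a standardized mean difference computed with the pooled standard deviation and a small-sample correction factor. The sketch below shows that calculation; the group means, SDs, and sample sizes are invented for illustration.

```python
# Minimal sketch of Hedges's g for a two-group (intervention vs. control) comparison:
# standardized mean difference with the pooled SD and the usual small-sample
# correction J = 1 - 3/(4*df - 1).  Numbers below are invented for illustration.
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    # Pooled standard deviation across the two groups.
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd          # Cohen's d
    df = n_t + n_c - 2
    j = 1.0 - 3.0 / (4.0 * df - 1.0)           # small-sample bias correction
    return j * d

# Example: physical activity (minutes/week) after an intervention.
print(round(hedges_g(mean_t=152, sd_t=60, n_t=85, mean_c=130, sd_c=65, n_c=90), 3))
```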
Quantification of differential gene expression by multiplexed targeted resequencing of cDNA
Arts, Peer; van der Raadt, Jori; van Gestel, Sebastianus H.C.; Steehouwer, Marloes; Shendure, Jay; Hoischen, Alexander; Albers, Cornelis A.
2017-01-01
Whole-transcriptome or RNA sequencing (RNA-Seq) is a powerful and versatile tool for functional analysis of different types of RNA molecules, but sample, reagent, and sequencing costs can be prohibitive for hypothesis-driven studies where the aim is to quantify differential expression of a limited number of genes. Here we present an approach for quantification of differential mRNA expression by targeted resequencing of complementary DNA using single-molecule molecular inversion probes (cDNA-smMIPs) that enable highly multiplexed resequencing of cDNA target regions of ∼100 nucleotides and counting of individual molecules. We show that accurate estimates of differential expression can be obtained from molecule counts for hundreds of smMIPs per reaction and that smMIPs are also suitable for quantification of relative gene expression and allele-specific expression. Compared with low-coverage RNA-Seq and a hybridization-based targeted RNA-Seq method, cDNA-smMIPs are a cost-effective high-throughput tool for hypothesis-driven expression analysis in large numbers of genes (10 to 500) and samples (hundreds to thousands). PMID:28474677
Integrative analysis with ChIP-seq advances the limits of transcript quantification from RNA-seq
Liu, Peng; Sanalkumar, Rajendran; Bresnick, Emery H.; Keleş, Sündüz; Dewey, Colin N.
2016-01-01
RNA-seq is currently the technology of choice for global measurement of transcript abundances in cells. Despite its successes, isoform-level quantification remains difficult because short RNA-seq reads are often compatible with multiple alternatively spliced isoforms. Existing methods rely heavily on uniquely mapping reads, which are not available for numerous isoforms that lack regions of unique sequence. To improve quantification accuracy in such difficult cases, we developed a novel computational method, prior-enhanced RSEM (pRSEM), which uses a complementary data type in addition to RNA-seq data. We found that ChIP-seq data of RNA polymerase II and histone modifications were particularly informative in this approach. In qRT-PCR validations, pRSEM was shown to be superior to competing methods in estimating relative isoform abundances within or across conditions. Data-driven simulations suggested that pRSEM has a greatly decreased false-positive rate at the expense of a small increase in false-negative rate. In aggregate, our study demonstrates that pRSEM transforms existing capacity to precisely estimate transcript abundances, especially at the isoform level. PMID:27405803
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2014-01-01
This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. Just as CFD calculations that include error bounds but omit uncertainty modeling are of only limited value, calculations that include uncertainty modeling but omit error bounds are also of only limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data; sparse tensorization methods [2] utilizing node-nested hierarchies; and sampling methods [4] for high-dimensional random variable spaces.
A surrogate accelerated multicanonical Monte Carlo method for uncertainty quantification
NASA Astrophysics Data System (ADS)
Wu, Keyi; Li, Jinglai
2016-09-01
In this work we consider a class of uncertainty quantification problems where the system performance or reliability is characterized by a scalar parameter y. The performance parameter y is random due to the presence of various sources of uncertainty in the system, and our goal is to estimate the probability density function (PDF) of y. We propose to use the multicanonical Monte Carlo (MMC) method, a special type of adaptive importance sampling algorithm, to compute the PDF of interest. Moreover, we develop an adaptive algorithm to construct local Gaussian process surrogates to further accelerate the MMC iterations. With numerical examples we demonstrate that the proposed method can achieve several orders of magnitude of speedup over standard Monte Carlo methods.
Pain as a fact and heuristic: how pain neuroimaging illuminates moral dimensions of law.
Pustilnik, Amanda C
2012-05-01
In legal domains ranging from tort to torture, pain and its degree do important definitional work by delimiting boundaries of lawfulness and of entitlements. Yet, for all the work done by pain as a term in legal texts and practice, it has a confounding lack of external verifiability. Now, neuroimaging is rendering pain and myriad other subjective states at least partly ascertainable. This emerging ability to ascertain and quantify subjective states is prompting a "hedonic" or a "subjectivist" turn in legal scholarship, which has sparked a vigorous debate as to whether the quantification of subjective states might affect legal theory and practice. Subjectivists contend that much values-talk in law has been a necessary but poor substitute for quantitative determinations of subjective states--determinations that will be possible in the law's "experiential future." This Article argues the converse: that pain discourse in law frequently is a heuristic for values. Drawing on interviews and laboratory visits with neuroimaging researchers, this Article shows current and in-principle limitations of pain quantification through neuroimaging. It then presents case studies on torture-murder, torture, the death penalty, and abortion to show the largely heuristic role of pain discourse in law. Introducing the theory of "embodied morality," the Article describes how moral conceptions of rights and duties are informed by human physicality and constrained by the limits of empathic identification. Pain neuroimaging helps reveal this dual factual and heuristic nature of pain in the law, and thus itself points to the translational work required for neuroimaging to influence, much less transform, legal practice and doctrine.
Intrusive Method for Uncertainty Quantification in a Multiphase Flow Solver
NASA Astrophysics Data System (ADS)
Turnquist, Brian; Owkes, Mark
2016-11-01
Uncertainty quantification (UQ) is a necessary, interesting, and often neglected aspect of fluid flow simulations. To determine the significance of uncertain initial and boundary conditions, a multiphase flow solver is being created that extends a single-phase, intrusive polynomial chaos scheme to multiphase flows. Reliably estimating the impact of input uncertainty on design criteria can help identify and minimize unwanted variability in critical areas, and has the potential to help advance knowledge in atomizing jets, jet engines, pharmaceuticals, and food processing. Use of an intrusive polynomial chaos method has been shown to significantly reduce computational cost over non-intrusive collocation methods such as Monte Carlo. This method requires transforming the model equations into a weak form through substitution of stochastic (random) variables. Ultimately, the model deploys a stochastic Navier-Stokes equation, a stochastic conservative level set approach including reinitialization, as well as stochastic normals and curvature. By implementing these approaches together in one framework, basic problems may be investigated that shed light on model expansion, uncertainty theory, and fluid flow in general. NSF Grant Number 1511325.
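To illustrate what "intrusive" means here, the sketch below applies a Galerkin polynomial chaos projection to a toy scalar ODE with one uncertain coefficient rather than to the multiphase Navier-Stokes system: the stochastic solution is expanded in Hermite polynomials, and the projection turns a single random equation into a small coupled deterministic system. The decay rate, expansion order, and time step are arbitrary choices for the sketch, and the Monte Carlo run is included only as a cross-check.

```python
# Intrusive (Galerkin) polynomial chaos on a toy problem, du/dt = -k*u with
# k = k0 + k1*xi, xi ~ N(0,1).  This illustrates only the intrusive idea on a
# scalar ODE; it is not the multiphase solver, and all parameter values are arbitrary.
import math
import numpy as np

P = 4                      # highest Hermite polynomial degree retained
k0, k1 = 1.0, 0.3          # mean and stochastic amplitude of the decay rate
dt, t_end = 1e-3, 1.0

def hermite_triple(i, j, k):
    """E[He_i He_j He_k] for probabilists' Hermite polynomials (standard normal weight)."""
    s = i + j + k
    if s % 2 or max(i, j, k) > s // 2:
        return 0.0
    s //= 2
    return (math.factorial(i) * math.factorial(j) * math.factorial(k)
            / (math.factorial(s - i) * math.factorial(s - j) * math.factorial(s - k)))

# Galerkin system: norm[l] * du_l/dt = - sum_{i,j} k_i u_j E[He_i He_j He_l]
k_modes = np.zeros(P + 1)
k_modes[0], k_modes[1] = k0, k1
norms = np.array([math.factorial(n) for n in range(P + 1)], dtype=float)
A = np.zeros((P + 1, P + 1))
for l in range(P + 1):
    for j in range(P + 1):
        A[l, j] = -sum(k_modes[i] * hermite_triple(i, j, l) for i in (0, 1)) / norms[l]

u = np.zeros(P + 1)
u[0] = 1.0                                  # deterministic initial condition u(0) = 1
for _ in range(int(t_end / dt)):            # forward Euler in time
    u = u + dt * A @ u

mean = u[0]
variance = np.sum(norms[1:] * u[1:] ** 2)   # Var = sum_{n>=1} n! * u_n^2
print(f"PC : mean={mean:.4f}  var={variance:.6f}")

# Cross-check against plain Monte Carlo sampling of the exact solution u = exp(-k t).
xi = np.random.default_rng(1).standard_normal(200_000)
samples = np.exp(-(k0 + k1 * xi) * t_end)
print(f"MC : mean={samples.mean():.4f}  var={samples.var():.6f}")
```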
NASA Astrophysics Data System (ADS)
Giovanis, D. G.; Shields, M. D.
2018-07-01
This paper addresses uncertainty quantification (UQ) for problems where scalar (or low-dimensional vector) response quantities are insufficient and, instead, full-field (very high-dimensional) responses are of interest. To do so, an adaptive stochastic simulation-based methodology is introduced that refines the probability space based on Grassmann manifold variations. The proposed method has a multi-element character discretizing the probability space into simplex elements using a Delaunay triangulation. For every simplex, the high-dimensional solutions corresponding to its vertices (sample points) are projected onto the Grassmann manifold. The pairwise distances between these points are calculated using appropriately defined metrics and the elements with large total distance are sub-sampled and refined. As a result, regions of the probability space that produce significant changes in the full-field solution are accurately resolved. An added benefit is that an approximation of the solution within each element can be obtained by interpolation on the Grassmann manifold. The method is applied to study the probability of shear band formation in a bulk metallic glass using the shear transformation zone theory.
The logical primitives of thought: Empirical foundations for compositional cognitive models.
Piantadosi, Steven T; Tenenbaum, Joshua B; Goodman, Noah D
2016-07-01
The notion of a compositional language of thought (LOT) has been central in computational accounts of cognition from earliest attempts (Boole, 1854; Fodor, 1975) to the present day (Feldman, 2000; Penn, Holyoak, & Povinelli, 2008; Fodor, 2008; Kemp, 2012; Goodman, Tenenbaum, & Gerstenberg, 2015). Recent modeling work shows how statistical inferences over compositionally structured hypothesis spaces might explain learning and development across a variety of domains. However, the primitive components of such representations are typically assumed a priori by modelers and theoreticians rather than determined empirically. We show how different sets of LOT primitives, embedded in a psychologically realistic approximate Bayesian inference framework, systematically predict distinct learning curves in rule-based concept learning experiments. We use this feature of LOT models to design a set of large-scale concept learning experiments that can determine the most likely primitives for psychological concepts involving Boolean connectives and quantification. Subjects' inferences are most consistent with a rich (nonminimal) set of Boolean operations, including first-order, but not second-order, quantification. Our results more generally show how specific LOT theories can be distinguished empirically. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Final Technical Report: Mathematical Foundations for Uncertainty Quantification in Materials Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plechac, Petr; Vlachos, Dionisios G.
We developed path-wise information theory-based and goal-oriented sensitivity analysis and parameter identification methods for complex high-dimensional dynamics, and in particular for non-equilibrium extended molecular systems. The combination of these novel methodologies provided the first methods in the literature that are capable of handling UQ questions for stochastic complex systems with some or all of the following features: (a) multi-scale stochastic models such as (bio)chemical reaction networks, with a very large number of parameters, (b) spatially distributed systems such as Kinetic Monte Carlo or Langevin Dynamics, (c) non-equilibrium processes typically associated with coupled physico-chemical mechanisms, driven boundary conditions, hybrid micro-macro systems, etc. A particular computational challenge arises in simulations of multi-scale reaction networks and molecular systems. Mathematical techniques were applied to in silico prediction of novel materials with emphasis on the effect of microstructure on model uncertainty quantification (UQ). We outline acceleration methods to make calculations of real chemistry feasible, followed by two complementary tasks on structure optimization and microstructure-induced UQ.
Dhandapani, Sivashanmugam; Srinivasan, Anirudh
2016-01-01
Triple spinal dysraphism is extremely rare. There are published reports of multiple discrete neural tube defects with intervening normal segments that are explained by the multisite closure theory of primary neurulation, having an association with Chiari malformation Type II consistent with the unified theory of McLone. The authors report on a 1-year-old child with contiguous myelomeningocele and lipomyelomeningocele centered on Type I split cord malformation with Chiari malformation Type II and hydrocephalus. This composite anomaly is probably due to select abnormalities of the neurenteric canal during gastrulation, with a contiguous cascading impact on both dysjunction of the neural tube and closure of the neuropore, resulting in a small posterior fossa, probably bringing the unified theory of McLone closer to the unified theory of Pang.
Kulasiri, Don
2011-01-01
We discuss the quantification of molecular fluctuations in biochemical reaction systems within the context of intracellular processes associated with gene expression. We take the molecular reactions pertaining to circadian rhythms to develop models of molecular fluctuations in this chapter. There are a significant number of studies on stochastic fluctuations in intracellular genetic regulatory networks based on single cell-level experiments. In order to understand the fluctuations associated with gene expression in circadian rhythm networks, it is important to model the interactions of transcriptional factors with the E-boxes in the promoter regions of some of the genes. The pertinent aspects of a near-equilibrium theory that would integrate the thermodynamical and particle dynamic characteristics of intracellular molecular fluctuations are discussed, and the theory is extended by using the theory of stochastic differential equations. We then model the fluctuations associated with the promoter regions using general mathematical settings. We implemented the ubiquitous Gillespie algorithm, which is used to simulate stochasticity in biochemical networks, for each of the motifs. Both the theory and the Gillespie simulations gave the same results in terms of the time evolution of means and variances of molecular numbers. Because biochemical reactions occur far from equilibrium (hence the use of the Gillespie algorithm), these results suggest that the near-equilibrium theory should be a good approximation for some of the biochemical reactions. © 2011 Elsevier Inc. All rights reserved.
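As a concrete reference for the simulation approach mentioned above, the sketch below implements the Gillespie stochastic simulation algorithm for a toy birth-death gene-expression motif; the production and degradation rates are arbitrary, and the motif is far simpler than the circadian network discussed in the chapter.

```python
# Minimal Gillespie stochastic simulation algorithm (SSA) for a birth-death motif:
# production at constant rate k_prod, first-order degradation at rate k_deg * n.
# Rates and the motif are arbitrary choices used only to illustrate the algorithm.
import numpy as np

def gillespie_birth_death(k_prod=10.0, k_deg=0.1, n0=0, t_end=200.0, seed=0):
    rng = np.random.default_rng(seed)
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_end:
        a_prod = k_prod                          # propensity of production
        a_deg = k_deg * n                        # propensity of degradation
        a_total = a_prod + a_deg
        t += rng.exponential(1.0 / a_total)      # time to the next reaction
        if rng.random() < a_prod / a_total:      # which reaction fires
            n += 1
        else:
            n -= 1
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

t, n = gillespie_birth_death()
# For this motif the stationary distribution is Poisson(k_prod / k_deg),
# so late-time mean and variance should both be close to 100.
print("late-time mean:", n[t > 100].mean(), " variance:", n[t > 100].var())
```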
Ordering states with various coherence measures
NASA Astrophysics Data System (ADS)
Yang, Long-Mei; Chen, Bin; Fei, Shao-Ming; Wang, Zhi-Xi
2018-04-01
Quantum coherence is one of the most significant concepts in quantum physics. Ordering states with various coherence measures is an intriguing task in the quantification theory of coherence. In this paper, we study this problem by use of four important coherence measures: the l_1 norm of coherence, the relative entropy of coherence, the geometric measure of coherence, and the modified trace distance measure of coherence. We show that each pair of these measures gives a different ordering of qudit states when d ≥ 3. However, for single-qubit states, the l_1 norm of coherence and the geometric coherence provide the same ordering. We also show that the relative entropy of coherence and the geometric coherence give a different ordering for single-qubit states. We thereby partially answer the open question proposed in Liu et al. (Quantum Inf Process 15:4189, 2016) of whether all coherence measures give a different ordering of states.
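Two of the four measures compared above have simple closed forms that can be evaluated directly from a density matrix: the l_1 norm of coherence (the sum of the absolute values of the off-diagonal elements) and the relative entropy of coherence (the entropy of the dephased state minus the entropy of the state). A minimal sketch, using an arbitrary single-qubit state, is given below.

```python
# Sketch of two coherence measures evaluated for an arbitrary density matrix:
# the l1 norm of coherence and the relative entropy of coherence S(rho_diag) - S(rho).
# The example state below is an arbitrary single-qubit state chosen for illustration.
import numpy as np

def l1_coherence(rho):
    off_diag = rho - np.diag(np.diag(rho))
    return np.abs(off_diag).sum()

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]               # drop numerical zeros
    return -np.sum(evals * np.log2(evals))

def relative_entropy_coherence(rho):
    rho_diag = np.diag(np.diag(rho))
    return von_neumann_entropy(rho_diag) - von_neumann_entropy(rho)

# A single-qubit state with some coherence in the computational basis.
rho = np.array([[0.7, 0.25 + 0.1j],
                [0.25 - 0.1j, 0.3]])
print("C_l1      =", l1_coherence(rho))
print("C_rel.ent =", relative_entropy_coherence(rho))
```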
NASA Technical Reports Server (NTRS)
Dekorvin, Andre
1992-01-01
The Dempster-Shafer theory of evidence is applied to a multiattribute decision making problem whereby the decision maker (DM) must compromise with available alternatives, none of which exactly satisfies his ideal. The decision mechanism is constrained by the uncertainty inherent in the determination of the relative importance of each attribute element and the classification of existing alternatives. The classification of alternatives is addressed through expert evaluation of the degree to which each element is contained in each available alternative. The relative importance of each attribute element is determined through pairwise comparisons of the elements by the decision maker and implementation of a ratio scale quantification method. Then the 'belief' and 'plausibility' that an alternative will satisfy the decision maker's ideal are calculated and combined to rank order the available alternatives. Application to the problem of selecting computer software is given.
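The belief and plausibility calculation described above, together with Dempster's rule for combining evidence from independent sources, can be sketched in a few lines. The frame of alternatives {A, B, C} and all mass assignments below are invented purely to illustrate the mechanics, not taken from the software-selection example.

```python
# Sketch of Dempster-Shafer belief and plausibility from a basic probability
# assignment (mass function) over subsets of alternatives, plus Dempster's rule
# of combination.  The frame {A, B, C} and all mass values are invented.
def belief(mass, hypothesis):
    """Sum of masses of all focal sets contained in the hypothesis."""
    return sum(m for s, m in mass.items() if s and s <= hypothesis)

def plausibility(mass, hypothesis):
    """Sum of masses of all focal sets intersecting the hypothesis."""
    return sum(m for s, m in mass.items() if s & hypothesis)

def combine(m1, m2):
    """Dempster's rule of combination with conflict renormalization."""
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two sources of evidence about which alternative (A, B, or C) satisfies the ideal.
m1 = {frozenset('A'): 0.5, frozenset('AB'): 0.3, frozenset('ABC'): 0.2}
m2 = {frozenset('A'): 0.4, frozenset('BC'): 0.4, frozenset('ABC'): 0.2}
m = combine(m1, m2)
for alt in ('A', 'B', 'C'):
    h = frozenset(alt)
    print(alt, "belief:", round(belief(m, h), 3), "plausibility:", round(plausibility(m, h), 3))
```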
The time-dependence of exchange-induced relaxation during modulated radio frequency pulses.
Sorce, Dennis J; Michaeli, Shalom; Garwood, Michael
2006-03-01
The problem of the relaxation of identical spins 1/2 induced by chemical exchange between spins with different chemical shifts in the presence of time-dependent RF irradiation (in the first rotating frame) is considered for the fast exchange regime. The solution for the time evolution under the chemical exchange Hamiltonian in the tilted doubly rotating frame (TDRF) is presented. A detailed derivation is given for the case of a two-site chemical exchange system with complete randomization between jumps of the exchanging spins. The derived theory can be applied to describe the modulation of the chemical exchange relaxation rate constants when using a train of adiabatic pulses, such as the hyperbolic secant pulse. The theory presented is valid for quantification of the exchange-induced time-dependent rotating frame longitudinal T1rho,ex and transverse T2rho,ex relaxations in the fast chemical exchange regime.
Sánchez-García, L; Bolea, E; Laborda, F; Cubel, C; Ferrer, P; Gianolio, D; da Silva, I; Castillo, J R
2016-03-18
Facing the lack of studies on the characterization and quantification of cerium oxide nanoparticles (CeO2 NPs), whose consumption and release are greatly increasing, this work proposes a method for their sizing and quantification by Flow Field-flow Fractionation (FFFF) coupled to Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). Two modalities of FFFF (Asymmetric Flow- and Hollow Fiber-Flow Field Flow Fractionation, AF4 and HF5, respectively) are compared, and their advantages and limitations discussed. Experimental conditions (carrier composition, pH, ionic strength, crossflow and carrier flow rates) are studied in detail in terms of NP separation, recovery, and repeatability. Size characterization of CeO2 NPs was addressed by different approaches. In the absence of feasible size standards of CeO2 NPs, suspensions of Ag, Au, and SiO2 NPs of known size were investigated. Ag and Au NPs failed to show a comparable behavior to that of the CeO2 NPs, whereas the use of SiO2 NPs provided size estimations in agreement with those predicted by the theory. The latter approach was thus used for characterizing the size of CeO2 NPs in a commercial suspension. Results were in adequate concordance with those achieved by transmission electron microscopy, X-ray diffraction and dynamic light scattering. The quantification of CeO2 NPs in the commercial suspension by AF4-ICP-MS required the use of CeO2 NP standards, since the use of ionic cerium resulted in low recoveries (99 ± 9% vs. 73 ± 7%, respectively). A limit of detection of 0.9 μg L⁻¹ CeO2, corresponding to a number concentration of 1.8 × 10¹² L⁻¹ for NPs of 5 nm, was achieved for an injection volume of 100 μL. Copyright © 2016 Elsevier B.V. All rights reserved.
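The conversion above between a mass-based detection limit and a particle-number concentration follows from dividing the mass concentration by the mass of a single particle. The sketch below reproduces that arithmetic under the assumptions of spherical particles and a bulk CeO2 density of about 7.2 g/cm³ (the density value is an assumption, not a figure from the paper); the result lands close to the 1.8 × 10¹² L⁻¹ quoted above.

```python
# Back-of-the-envelope check of the mass-to-number conversion: a mass concentration
# is divided by the mass of one spherical particle.  The CeO2 bulk density used here
# (~7.2 g/cm^3) is an assumption for illustration.
import math

def number_concentration(mass_conc_ug_per_L, diameter_nm, density_g_per_cm3):
    radius_cm = (diameter_nm * 1e-7) / 2.0                   # nm -> cm
    particle_volume_cm3 = (4.0 / 3.0) * math.pi * radius_cm**3
    particle_mass_g = particle_volume_cm3 * density_g_per_cm3
    return (mass_conc_ug_per_L * 1e-6) / particle_mass_g     # particles per litre

# LOD of 0.9 ug/L for 5 nm CeO2 particles.
print(f"{number_concentration(0.9, 5.0, 7.2):.2e} particles per litre")
```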
1999-12-11
[Fragmented record: table-of-contents and table excerpts from a report on learning style preferences and psychological type theory, referencing Kolb's experiential learning, the NASSP Learning Styles Profile subscales (cognitive styles, perceptual responses), Jung's theory of psychological types, and Isabel Briggs Myers' contribution to Jung's work.]
Timoshenko-Type Theory in the Stability Analysis of Corrugated Cylindrical Shells
NASA Astrophysics Data System (ADS)
Semenyuk, N. P.; Neskhodovskaya, N. A.
2002-06-01
A technique is proposed for stability analysis of longitudinally corrugated shells under axial compression. The technique employs the equations of the Timoshenko-type nonlinear theory of shells. The geometrical parameters of shells are specified on a discrete set of points and are approximated by segments of Fourier series. Infinite systems of homogeneous algebraic equations are derived from a variational equation written in displacements to determine the critical loads and buckling modes. Specific types of corrugated isotropic metal and fiberglass shells are considered. The calculated results are compared with those obtained within the framework of the classical theory of shells. It is shown that the Timoshenko-type theory significantly extends the possibility of exactly allowing for the geometrical parameters and material properties of corrugated shells compared with the Kirchhoff-Love theory.
NASA Astrophysics Data System (ADS)
Elhag, Mohamed; Boteva, Silvena
2017-12-01
Quantification of geomorphometric features is the keystone concern of the current study. The quantification was based on a statistical approach in terms of multivariate analysis of local topographic features. The implemented algorithm utilizes the Digital Elevation Model (DEM) to categorize and extract the geomorphometric features embedded in the topographic dataset. The morphological settings were evaluated at the central pixel of a predefined 3x3 convolution kernel, considering the surrounding pixels under the eight-direction pour point model (D8) of azimuth viewpoints. An unsupervised classification algorithm, the Iterative Self-Organizing Data Analysis Technique (ISODATA), was applied to the ASTER GDEM within the boundary of the designated study area to distinguish 10 morphometric classes. The morphometric classes showed variation in their spatial distribution across the study area. The adopted methodology successfully captures the spatial distribution of the geomorphometric features under investigation. The results allow the delineated geomorphometric elements to be superimposed on remote sensing imagery for further analysis. A robust relationship between different land cover types and the geomorphological elements was established in the context of the study area. The dominance and relative association of different land cover types with their corresponding geomorphological elements were demonstrated.
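The kernel-level step described above, deciding where the central pixel of a 3x3 window drains under the D8 pour point model, can be sketched as follows; the cell size and the toy elevation window are arbitrary, and this is only the per-pixel rule, not the full ISODATA classification pipeline.

```python
# Sketch of the D8 (eight-direction) pour-point rule on a 3x3 DEM window: flow goes to
# the neighbour with the steepest downward drop, with diagonal neighbours weighted by
# the longer distance.  Cell size and the toy window are arbitrary.
import math
import numpy as np

# Offsets of the eight neighbours, (d_row, d_col).
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1),
              ( 0, -1),          ( 0, 1),
              ( 1, -1), ( 1, 0), ( 1, 1)]

def d8_direction(window, cell_size=30.0):
    """Return the (d_row, d_col) of the steepest-descent neighbour, or None for a pit."""
    centre = window[1, 1]
    best, best_slope = None, 0.0
    for dr, dc in NEIGHBOURS:
        distance = cell_size * (math.sqrt(2.0) if dr and dc else 1.0)
        slope = (centre - window[1 + dr, 1 + dc]) / distance
        if slope > best_slope:
            best, best_slope = (dr, dc), slope
    return best

window = np.array([[102.0, 101.0, 100.5],
                   [101.5,  99.0,  98.0],
                   [101.0,  97.5,  96.0]])
print("flow direction (d_row, d_col):", d8_direction(window))
```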
Analysis of Metal Contents in Portland Type V and MTA-Based Cements
Dorileo, Maura Cristiane Gonçales Orçati; Bandeca, Matheus Coelho; Pedro, Fábio Luis Miranda; Volpato, Luiz Evaristo Ricci; Guedes, Orlando Aguirre; Villa, Ricardo Dalla; Tonetto, Mateus Rodrigues; Borges, Alvaro Henrique
2014-01-01
The aim of this study was to determine, by Atomic Absorption Spectrometry (AAS), the concentration levels of 11 metals in Type V gray and structural white PC, ProRoot MTA, and MTA Bio. Samples, containing one gram of each tested cement, were prepared and transferred to a 100 mL Teflon tube with a mixture of 7.0 mL of nitric acid and 21 mL of hydrochloric acid. After the reaction, the mixture was filtered and then brought to a volume of 50 mL with distilled water. For each metal, specific patterns were determined from universal standards. Arsenic quantification was performed with a hydride generator. The analysis was performed five times and the data were statistically analyzed at the 5% level of significance. Only cadmium presented concentration levels below the quantification limit of the instrument. The AAS analysis showed increased levels of calcium, nickel, and zinc in structural white PC. Type V PC presented the greatest concentration levels of arsenic, chromium, copper, iron, lead, and manganese (P < 0.05). Bismuth was found in all cements, and the lowest concentration levels were observed in Portland cements, while the highest were observed in ProRoot MTA. Both PC and MTA-based cements showed evidence of metal inclusion. PMID:25436238
Vennat, B; Pourrat, H; Pouget, M P; Gross, D; Pourrat, A
1988-10-01
The tannins in leaf, bark, and stem extracts of HAMAMELIS VIRGINIANA were analyzed. Four proanthocyanidins were isolated by HPLC. One was a procyanidin polymer containing only one type of flavanol unit; the other three were polymers of procyanidin and prodelphinidin containing two types of flavanol units. A method of assay of hamamelitannin showed the bark extract to be 31 times richer in hamamelitannin than the leaf extract and 87 times richer than the stem extract.
43 CFR 11.72 - Quantification phase-baseline services determination.
Code of Federal Regulations, 2012 CFR
2012-10-01
... for changes that have occurred as a result of causes other than the discharge or release. In addition... predictable that changes as a result of the discharge or release are likely to be detectable. (3) If... control area. Other factors, including climate, depth of ground water, vegetation type and area covered...
43 CFR 11.72 - Quantification phase-baseline services determination.
Code of Federal Regulations, 2011 CFR
2011-10-01
... for changes that have occurred as a result of causes other than the discharge or release. In addition... predictable that changes as a result of the discharge or release are likely to be detectable. (3) If... control area. Other factors, including climate, depth of ground water, vegetation type and area covered...
43 CFR 11.72 - Quantification phase-baseline services determination.
Code of Federal Regulations, 2014 CFR
2014-10-01
... for changes that have occurred as a result of causes other than the discharge or release. In addition... predictable that changes as a result of the discharge or release are likely to be detectable. (3) If... control area. Other factors, including climate, depth of ground water, vegetation type and area covered...
43 CFR 11.72 - Quantification phase-baseline services determination.
Code of Federal Regulations, 2013 CFR
2013-10-01
... for changes that have occurred as a result of causes other than the discharge or release. In addition... predictable that changes as a result of the discharge or release are likely to be detectable. (3) If... control area. Other factors, including climate, depth of ground water, vegetation type and area covered...
Ryan Wagner; Robert J. Moon; Arvind Raman
2016-01-01
Quantification of the mechanical properties of cellulose nanomaterials is key to the development of new cellulose nanomaterial based products. Using contact resonance atomic force microscopy we measured and mapped the transverse elastic modulus of three types of cellulosic nanoparticles: tunicate cellulose nanocrystals, wood cellulose nanocrystals, and wood cellulose...
Molecular detection methods such as PCR have been extensively used to type Cryptosporidium oocysts detected in the environment. More recently, studies have developed quantitative real-time PCR assays for detection and quantification of microbial contaminants in water as well as ...
USDA-ARS?s Scientific Manuscript database
Abundance was assessed by utilizing a panel of cross-reactive monoclonal antibodies (mAbs) tested in this study. Characterization of multichannel autofluorescence of eosinophils permitted cell-type specific gating of granulocytes for quantification of LDMs on neutrophils and eosinophils by indirect,...
QUANTIFICATION OF 2,4-D ON SOLID-PHASE EXPOSURE SAMPLING MEDIA BY LC/MS/MS
Three types of solid phase chemical exposure sampling media: cellulose, polyurethane foam (PUF) and XAD-2, were analyzed for 2,4-D and the amine salts of 2,4-D. Individual samples were extracted into acidified methanol and the extracts were analyzed via LC/MS/MS using electrospra...
Troubling Theory in Case Study Research
ERIC Educational Resources Information Center
Hammersley, Martyn
2012-01-01
The article begins by examining the variety of meanings that can be given to the word "theory", the different attitudes that may be taken towards theories of these various types and some of the problems associated with them. The second half of the article focuses on one of these types, explanatory theory, and the question of what is required if…
A Facile Stable-Isotope Dilution Method for Determination of Sphingosine Phosphate Lyase Activity
Suh, Jung H.; Eltanawy, Abeer; Rangan, Apoorva; Saba, Julie D.
2015-01-01
A new technique for quantifying sphingosine phosphate lyase activity in biological samples is described. In this procedure, 2-hydrazinoquinoline is used to convert (2E)-hexadecenal into the corresponding hydrazone derivative to improve ionization efficiency and selectivity of detection. Combined utilization of liquid chromatographic separation and multiple reaction monitoring-mass spectrometry allows for simultaneous quantification of the substrate S1P and product (2E)-hexadecenal. Incorporation of (2E)-d5-hexadecenal as an internal standard improves detection accuracy and precision. A simple one-step derivatization procedure eliminates the need for further extractions. Limits of quantification for (2E)-hexadecenal and sphingosine-1-phosphate are 100 and 50 fmol, respectively. The assay displays a wide dynamic detection range useful for detection of low basal sphingosine phosphate lyase activity in wild type cells, SPL-overexpressing cell lines, and wild type mouse tissues. Compared to current methods, the capacity for simultaneous detection of sphingosine-1-phosphate and (2E)-hexadecenal greatly improves the accuracy of results and shows excellent sensitivity and specificity for sphingosine phosphate lyase activity detection. PMID:26408264
Ju, Yong Han; Sohn, So Young
2011-01-01
Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on one-dimensional injury variables, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), one at a time in relation to various traffic accident factors. Such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between the categorical injury phenomena, such as the injury scale, injury position, and injury type, and the various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, air bag deployment, and higher speed are associated with severe rather than minor injuries to the thigh, ankle, and leg, in terms of dislocation, abrasion, or laceration. Copyright © 2010 Elsevier Ltd. All rights reserved.
Simon, Márta; van Alst, Nikki; Vollertsen, Jes
2018-05-17
This paper presents a method for microplastic (MP) mass quantification using a Focal Plane Array-based Fourier Transform Infrared imaging technique. It discusses the issue that particle number is not a conserved base quantity and hence less suited than mass to compare independent studies on MP in the environment. It concludes that MP mass should be included when quantifying MP pollution in the environment, supplementing the conventional approach of reporting particle numbers. Applying mass as the unit of MP measurement, the paper presents data showing that Danish wastewater treatment plants discharge around 3 t/year of MP in the size range 10-500 μm. This value corresponds to an annual per capita emission from these plants of 0.56 g MP/(capita year). The distribution of polymer types by mass and particle number differed because the size of MP particles of the different material types varied. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
The Impact of Repeated Freeze-Thaw Cycles on the Quality of Biomolecules in Four Different Tissues.
Ji, Xiaoli; Wang, Min; Li, Lingling; Chen, Fang; Zhang, Yanyang; Li, Qian; Zhou, Junmei
2017-10-01
High-quality biosamples are valuable resources for biomedical research. However, some tissues are stored without being sectioned into small aliquots and have to undergo repeated freeze-thaw cycles throughout prolonged experimentation. Little is known regarding the effects of repeated freeze-thaw cycles on the quality of biomolecules in tissues. The aim of this study was to evaluate the impact of repeated freeze-thaw (at room temperature or on ice) cycles on biomolecules and gene expression in four different types of tissues. Each fresh tissue was sectioned into seven aliquots and snap-frozen before undergoing repeated freeze-thaw cycles at room temperature or on ice. Biomolecules were extracted and analyzed. Both relative and absolute quantification were used to detect the changes in gene expression. The results indicated that the impact of repeated freeze-thaw cycles on RNA integrity varied by tissue type. Gene expression, including the housekeeping gene, was affected in RNA-degraded samples according to absolute quantification rather than relative quantification. Furthermore, our results suggest that thawing on ice could protect RNA integrity compared with thawing at room temperature. No obvious degradation of protein or DNA was observed with repeated freeze-thaw cycles either at room temperature or on ice. This research provides ample evidence for the necessity of sectioning fresh tissues into small aliquots before snap-freezing, thus avoiding degradation of RNA and alteration of gene expression resulting from repeated freeze-thaw cycles. For frozen tissue samples that were already in storage and had to be used repeatedly during their lifecycle, thawing on ice or sectioning at ultralow temperature is recommended.
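The distinction drawn above between relative and absolute quantification can be made concrete with a small sketch: relative quantification by the 2^-ΔΔCt method normalizes the target to a housekeeping gene, so a shared degradation of both transcripts can go unnoticed, whereas absolute quantification against a standard curve reveals the loss. All Ct values and curve parameters below are invented.

```python
# Sketch contrasting the two qPCR read-outs mentioned above.  Relative quantification
# (2^-ddCt) normalizes the target to a housekeeping gene; absolute quantification reads
# copy numbers off a standard curve.  All Ct values and curve parameters are invented.
def relative_fold_change(ct_target_s, ct_ref_s, ct_target_c, ct_ref_c):
    """2^-ddCt fold change of sample vs. control, normalized to a reference gene."""
    ddct = (ct_target_s - ct_ref_s) - (ct_target_c - ct_ref_c)
    return 2.0 ** (-ddct)

def absolute_copies(ct, slope, intercept):
    """Copy number from a standard curve Ct = slope * log10(copies) + intercept."""
    return 10.0 ** ((ct - intercept) / slope)

# Control aliquot vs. an aliquot after several freeze-thaw cycles: both the target
# and the housekeeping gene shift by ~1 Ct, so the relative measure barely moves.
print("relative fold change:",
      round(relative_fold_change(ct_target_s=25.1, ct_ref_s=20.1,
                                 ct_target_c=24.0, ct_ref_c=19.0), 2))
# A typical standard curve: slope ~ -3.32 (100% efficiency), intercept ~ 38 Ct at 1 copy.
before = absolute_copies(24.0, slope=-3.32, intercept=38.0)
after = absolute_copies(25.1, slope=-3.32, intercept=38.0)
print(f"absolute copies before: {before:.0f}, after: {after:.0f}")
```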
Kim, Jin Young; Cheong, Jae Chul; Kim, Min Kyoung; Lee, Jae Il; In, Moon Kyo
2008-06-01
A gas chromatography-mass spectrometric (GC-MS) method was developed and validated for the simultaneous detection and quantification of four amphetamine-type stimulants (amphetamine (AP), methamphetamine (MA), 3,4-methylenedioxyamphetamine (MDA) and 3,4-methylenedioxymethamphetamine (MDMA)) and two cannabinoids (Δ9-tetrahydrocannabinol (Δ9-THC) and 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THCCOOH)) in fingernails. Fingernail clippings (30 mg) were washed with distilled water and methanol, and then incubated in 1.0 M sodium hydroxide at 95 °C for 30 min. The compounds of interest were isolated by liquid-liquid extraction followed by derivatization with N-methyl-N-trimethylsilyltrifluoroacetamide (MSTFA) at 70 °C for 15 min. The derivatized compounds were analyzed by GC-MS in the selective ion monitoring (SIM) mode. The linear ranges were 0.1-15.0 ng/mg for AP, 0.2-15.0 ng/mg for MDA, Δ9-THC and THCCOOH, and 0.2-30.0 ng/mg for MA and MDMA, with good correlation coefficients (r² > 0.9991). The intra-day, inter-day, and inter-person precisions were within 10.6%, 6.3%, and 5.3%, respectively. The intra-day, inter-day and inter-person accuracies were between -6.1 and 5.0%, -6.2 and 5.7%, and -6.4 and 5.6%, respectively. The limits of detection (LODs) and quantification (LOQs) for each compound were lower than 0.056 and 0.2 ng/mg, respectively. The recoveries were in the range of 74.0-94.8%. Positive GC-MS results were obtained from specimens of nine suspected MA or cannabis abusers. The concentration ranges of MA, AP, and THCCOOH were 0.10-1.41, 0.12-2.64, and 0.20 ng/mg, respectively. Based on these results, the method proved to be effective for the simultaneous qualification and quantification of amphetamine-type stimulants and cannabinoids in fingernails.
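Detection and quantification limits like those reported above are often derived from the calibration data themselves, for example with the ICH-style rules LOD = 3.3 σ/S and LOQ = 10 σ/S, where S is the calibration slope and σ the residual standard deviation. Whether the authors used this or a signal-to-noise criterion is not stated; the sketch below, with invented calibration points, shows only the generic calculation.

```python
# One common way detection and quantification limits are derived in method validation:
# LOD = 3.3 * sigma / S and LOQ = 10 * sigma / S (ICH-style approach), where S is the
# calibration slope and sigma the residual SD of the calibration line.
# The calibration data below are invented; this is not the study's actual data set.
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 15.0])                 # ng/mg
response = np.array([0.021, 0.104, 0.199, 1.010, 1.985, 3.020])   # peak-area ratio

slope, intercept = np.polyfit(conc, response, 1)
residuals = response - (slope * conc + intercept)
sigma = residuals.std(ddof=2)            # ddof=2: two fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope={slope:.4f}  LOD={lod:.3f} ng/mg  LOQ={loq:.3f} ng/mg")
```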
Pereira, Lara; Pujol, Marta; Garcia-Mas, Jordi; Phillips, Michael A
2017-07-01
Ethylene is a gaseous plant hormone involved in defense, adaptations to environmental stress and fruit ripening. Its relevance to the latter makes its detection highly useful for physiologists interested in the onset of ripening. Produced as a sharp peak during the respiratory burst, ethylene is biologically active at tens of nl L⁻¹. Reliable quantification at such concentrations generally requires specialized instrumentation. Here we present a rapid, high-sensitivity method for detecting ethylene in attached fruit using a conventional gas chromatography-mass spectrometry (GC-MS) system and in situ headspace collection chambers. We apply this method to melon (Cucumis melo L.), a unique species consisting of climacteric and non-climacteric varieties, with a high variation in the climacteric phenotype among climacteric types. Using a population of recombinant inbred lines (RILs) derived from highly climacteric ('Védrantais', cantalupensis type) and non-climacteric ('Piel de Sapo', inodorus type) parental lines, we observed a significant variation for the intensity, onset and duration of the ethylene burst during fruit ripening. Our method does not require concentration, sampling times over 1 h or fruit harvest. We achieved a limit of detection of 0.41 ± 0.04 nl L⁻¹ and a limit of quantification of 1.37 ± 0.13 nl L⁻¹ with an analysis time per sample of 2.6 min. Validation of the analytical method indicated that linearity (>98%), precision (coefficient of variation ≤2%) and sensitivity compared favorably with dedicated optical sensors. This study adds to evidence of the characteristic climacteric ethylene burst as a complex trait whose intensity in our RIL population lies along a continuum in addition to two extremes. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.
Differential diagnosis of neurodegenerative diseases using structural MRI data
Koikkalainen, Juha; Rhodius-Meester, Hanneke; Tolonen, Antti; Barkhof, Frederik; Tijms, Betty; Lemstra, Afina W.; Tong, Tong; Guerrero, Ricardo; Schuh, Andreas; Ledig, Christian; Rueckert, Daniel; Soininen, Hilkka; Remes, Anne M.; Waldemar, Gunhild; Hasselbalch, Steen; Mecocci, Patrizia; van der Flier, Wiesje; Lötjönen, Jyrki
2016-01-01
Different neurodegenerative diseases can cause memory disorders and other cognitive impairments. The early detection and the stratification of patients according to the underlying disease are essential for an efficient approach to this healthcare challenge. This emphasizes the importance of differential diagnostics. Most studies compare patients and controls, or Alzheimer's disease with one other type of dementia. Such a bilateral comparison does not resemble clinical practice, where a clinician is faced with a number of different possible types of dementia. Here we studied which features in structural magnetic resonance imaging (MRI) scans could best distinguish four types of dementia, Alzheimer's disease, frontotemporal dementia, vascular dementia, and dementia with Lewy bodies, and control subjects. We extracted an extensive set of features quantifying volumetric and morphometric characteristics from T1 images, and vascular characteristics from FLAIR images. Classification was performed using a multi-class classifier based on Disease State Index methodology. The classifier provided continuous probability indices for each disease to support clinical decision making. A dataset of 504 individuals was used for evaluation. The cross-validated classification accuracy was 70.6% and balanced accuracy was 69.1% for the five disease groups using only automatically determined MRI features. Vascular dementia patients could be detected with high sensitivity (96%) using features from FLAIR images. Controls (sensitivity 82%) and Alzheimer's disease patients (sensitivity 74%) could be accurately classified using T1-based features, whereas the most difficult group was dementia with Lewy bodies (sensitivity 32%). These results were notably better than the classification accuracies obtained with visual MRI ratings (accuracy 44.6%, balanced accuracy 51.6%). Different quantification methods provided complementary information, and consequently, the best results were obtained by utilizing several quantification methods. The results prove that automatic quantification methods and computerized decision support methods are feasible for clinical practice and provide comprehensive information that may help clinicians in making the diagnosis. PMID:27104138
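The headline figures above, accuracy and balanced accuracy, are easy to compute from a confusion matrix: balanced accuracy is the mean of the per-class sensitivities, which matters when the diagnostic groups have very different sizes. The confusion matrix below is invented, loosely echoing the sensitivities reported above, and is used only to illustrate the calculation.

```python
# Sketch of the evaluation metrics quoted above.  Balanced accuracy is the mean of the
# per-class sensitivities (recalls), which can differ from plain accuracy when the
# diagnostic groups have unequal sizes.  The confusion matrix below is invented.
import numpy as np

labels = ["Control", "AD", "FTD", "VaD", "DLB"]
# Rows: true class, columns: predicted class (counts of subjects).
confusion = np.array([
    [82, 10,  3,  2,  3],    # Control
    [12, 74,  6,  4,  4],    # AD
    [ 6,  8, 40,  3,  3],    # FTD
    [ 1,  1,  1, 48,  0],    # VaD
    [10, 15,  5,  2, 16],    # DLB
])

accuracy = np.trace(confusion) / confusion.sum()
per_class_sensitivity = np.diag(confusion) / confusion.sum(axis=1)
balanced_accuracy = per_class_sensitivity.mean()

for name, sens in zip(labels, per_class_sensitivity):
    print(f"{name:8s} sensitivity: {sens:.2f}")
print(f"accuracy: {accuracy:.3f}  balanced accuracy: {balanced_accuracy:.3f}")
```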
Composite structural materials
NASA Technical Reports Server (NTRS)
Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.
1983-01-01
Progress and plans are reported for investigations of: (1) the mechanical properties of high performance carbon fibers; (2) fatigue in composite materials; (3) moisture and temperature effects on the mechanical properties of graphite-epoxy laminates; (4) the theory of inhomogeneous swelling in epoxy resin; (5) numerical studies of the micromechanics of composite fracture; (6) free edge failures of composite laminates; (7) analysis of unbalanced laminates; (8) compact lug design; (9) quantification of Saint-Venant's principles for a general prismatic member; (10) variation of resin properties through the thickness of cured samples; and (11) the wing fuselage ensemble of the RP-1 and RP-2 sailplanes.
Towards a formal definition of static and dynamic electronic correlations.
Benavides-Riveros, Carlos L; Lathiotakis, Nektarios N; Marques, Miguel A L
2017-05-24
Some of the most spectacular failures of density-functional and Hartree-Fock theories are related to an incorrect description of the so-called static electron correlation. Motivated by recent progress in the N-representability problem of the one-body density matrix for pure states, we propose a method to quantify the static contribution to the electronic correlation. By studying several molecular systems we show that our proposal correlates well with our intuition of static and dynamic electron correlation. Our results bring out the paramount importance of the occupancy of the highest occupied natural spin-orbital in such quantification.
Optic fiber pulse-diagnosis sensor of traditional Chinese medicine
NASA Astrophysics Data System (ADS)
Ni, J. S.; Jin, W.; Zhao, B. N.; Zhang, X. L.; Wang, C.; Li, S. J.; Zhang, F. X.; Peng, G. D.
2013-09-01
The wrist pulse is a signal from which much of a patient's physiological and pathological status is deduced according to traditional Chinese medicine theories. This paper presents a new optic fiber wrist-pulse sensor based on a group of FBGs. The sensitivity of the optic fiber wrist-pulse measurement system reaches 0.05% FS, the range reaches 50 kPa, and the frequency response extends from 0 Hz to 5 kHz. A group of typical pulse signals is presented to compare the status of different patients. The sensor will greatly improve the quantification of pulse diagnosis.
New imaging technology: measurement of myocardial perfusion by contrast echocardiography
NASA Technical Reports Server (NTRS)
Rubin, D. N.; Thomas, J. D.
2000-01-01
Myocardial perfusion imaging has long been a goal for the non-invasive echocardiographic assessment of the heart. However, many factors at play in perfusion imaging have made this goal elusive. Harmonic imaging and triggered imaging with newer contrast agents have made myocardial perfusion imaging potentially practical in the very near future. The application of indicator dilution theory to the coronary circulation and bubble contrast agents is fraught with complexities and sources of error. Therefore, quantification of myocardial perfusion by non-invasive echocardiographic imaging requires further investigation in order to make this technique clinically viable.
Information-Theoretic Benchmarking of Land Surface Models
NASA Astrophysics Data System (ADS)
Nearing, Grey; Mocko, David; Kumar, Sujay; Peters-Lidard, Christa; Xia, Youlong
2016-04-01
Benchmarking is a type of model evaluation that compares model performance against a baseline metric that is derived, typically, from a different existing model. Statistical benchmarking was used to qualitatively show that land surface models do not fully utilize information in boundary conditions [1] several years before Gong et al. [2] discovered the particular type of benchmark that makes it possible to *quantify* the amount of information lost by an incorrect or imperfect model structure. This theoretical development laid the foundation for a formal theory of model benchmarking [3]. We here extend that theory to separate uncertainty contributions from the three major components of dynamical systems models [4]: model structures, model parameters, and the boundary conditions that describe the time-dependent details of each prediction scenario. The key to this new development is the use of large-sample [5] data sets that span multiple soil types, climates, and biomes, which allows us to segregate uncertainty due to parameters from the two other sources. The benefit of this approach for uncertainty quantification and segregation is that it does not rely on Bayesian priors (although it is strictly coherent with Bayes' theorem and with probability theory), and therefore the partitioning of uncertainty into different components is *not* dependent on any a priori assumptions. We apply this methodology to assess the information use efficiency of the four land surface models that comprise the North American Land Data Assimilation System (Noah, Mosaic, SAC-SMA, and VIC). Specifically, we looked at the ability of these models to estimate soil moisture and latent heat fluxes. We found that in the case of soil moisture, about 25% of net information loss was from boundary conditions, around 45% was from model parameters, and 30-40% was from the model structures. In the case of latent heat flux, boundary conditions contributed about 50% of net uncertainty, and model structures contributed about 40%. There was relatively little difference between the different models. 1. G. Abramowitz, R. Leuning, M. Clark, A. Pitman, Evaluating the performance of land surface models. Journal of Climate 21, (2008). 2. W. Gong, H. V. Gupta, D. Yang, K. Sricharan, A. O. Hero, Estimating Epistemic & Aleatory Uncertainties During Hydrologic Modeling: An Information Theoretic Approach. Water Resources Research 49, 2253-2273 (2013). 3. G. S. Nearing, H. V. Gupta, The quantity and quality of information in hydrologic models. Water Resources Research 51, 524-538 (2015). 4. H. V. Gupta, G. S. Nearing, Using models and data to learn: A systems theoretic perspective on the future of hydrological science. Water Resources Research 50(6), 5351-5359 (2014). 5. H. V. Gupta et al., Large-sample hydrology: a need to balance depth with breadth. Hydrology and Earth System Sciences Discussions 10, 9147-9189 (2013).
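A heavily simplified sketch of the benchmarking idea, not the paper's actual estimators, is given below: discretize the variables, estimate how much information the boundary conditions carry about the observations, estimate how much of it the model prediction retains, and report the difference as information lost by the model. The synthetic data, the binning, and this particular decomposition are all illustrative assumptions.

```python
# Heavily simplified sketch of information-theoretic benchmarking: compare the
# information the forcing carries about the observations with the information a
# model prediction retains.  Synthetic data and plug-in histogram estimators only;
# these are not the estimators or the uncertainty partitioning used in the paper.
import numpy as np

def mutual_information(x, y, bins=20):
    """Plug-in mutual information (in bits) from a 2D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
forcing = rng.normal(size=50_000)                            # boundary condition (e.g. rainfall)
obs = 0.8 * forcing + 0.3 * rng.normal(size=forcing.size)    # "true" soil moisture
model = 0.5 * forcing + 0.6 * rng.normal(size=forcing.size)  # an imperfect model prediction

info_available = mutual_information(obs, forcing)
info_captured = mutual_information(obs, model)
print(f"information in forcing about obs : {info_available:.2f} bits")
print(f"information retained by the model: {info_captured:.2f} bits")
print(f"information lost by the model    : {info_available - info_captured:.2f} bits")
```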
Levels of theory and types of theoretical explanation in theoretical physics
NASA Astrophysics Data System (ADS)
Flores, Francisco J.
In Newtonian physics, there is a clear distinction between a 'framework theory', a collection of general physical principles and definitions of physical terms, and theories that describe specific causal interactions such as gravitation, i.e., 'interaction theories'. I argue that this distinction between levels of theory can also be found in the context of Special Relativity and that recognizing it is essential for a philosophical account of how laws are explained in this theory. As a case study, I consider the history of derivations of mass-energy equivalence which shows, I argue, that there are two distinct types of theoretical explanations (i.e., explanations of laws) in physics. One type is best characterized by the 'top-down' account of scientific explanation, while the other is more accurately described by the 'bottom-up' account. What is significant, I argue, is that the type of explanation a law receives depends on whether it is part of the framework theory or part of an interaction theory. The former only receive 'top-down' explanations while the latter can also receive 'bottom-up' explanations. Thus, I argue that current debates regarding 'top-down' vs 'bottom-up' views of scientific explanation can be clarified by recognizing the distinction between two levels of physical theory.
Bianchi Type VI1 Viscous Fluid Cosmological Model in Wesson's Theory of Gravitation
NASA Astrophysics Data System (ADS)
Khadekar, G. S.; Avachar, G. R.
2007-03-01
Field equations of a scale invariant theory of gravitation proposed by Wesson [1, 2] are obtained in the presence of viscous fluid with the aid of Bianchi type VIh space-time with the time dependent gauge function (Dirac gauge). It is found that Bianchi type VIh (h = 1) space-time with viscous fluid is feasible in this theory, whereas Bianchi type VIh (h = -1, 0) space-times are not feasible in this theory, even in the presence of viscosity. For the feasible case, by assuming a relation connecting viscosity and metric coefficient, we have obtained a nonsingular-radiating model. We have discussed some physical and kinematical properties of the models.
Chen, Jianzhong; Green, Kari B; Nichols, Kelly K
2015-01-01
A series of different types of wax esters (represented by RCOOR′) were systematically studied by using electrospray ionization (ESI) collision-induced dissociation tandem mass spectrometry (MS/MS) along with pseudo MS3 (in-source dissociation combined with MS/MS) on a quadrupole time-of-flight (Q-TOF) mass spectrometer. The tandem mass spectra patterns resulting from dissociation of ammonium/proton adducts of these wax esters were influenced by the wax ester type and the collision energy applied. The product ions [RCOOH2]+, [RCO]+ and [RCO – H2O]+ that have been reported previously were detected; however, different primary product ions were demonstrated for the three wax ester types including: 1) [RCOOH2]+ for saturated wax esters, 2) [RCOOH2]+, [RCO]+ and [RCO – H2O]+ for unsaturated wax esters containing only one double bond in the fatty acid moiety or with one additional double bond in the fatty alcohol moiety, and 3) [RCOOH2]+ and [RCO]+ for unsaturated wax esters containing a double bond in the fatty alcohol moiety alone. Other fragments included [R′]+ and several series of product ions for all types of wax esters. Interestingly, unusual product ions were detected, such as neutral molecule (including water, methanol and ammonia) adducts of [RCOOH2]+ ions for all types of wax esters and [R′ – 2H]+ ions for unsaturated fatty acyl-containing wax esters. The patterns of tandem mass spectra for different types of wax esters will inform future identification and quantification approaches of wax esters in biological samples as supported by a preliminary study of quantification of isomeric wax esters in human meibomian gland secretions. PMID:26178197
Chen, Jianzhong; Green, Kari B; Nichols, Kelly K
2015-08-01
A series of different types of wax esters (represented by RCOOR') were systematically studied by using electrospray ionization (ESI) collision-induced dissociation tandem mass spectrometry (MS/MS) along with pseudo MS(3) (in-source dissociation combined with MS/MS) on a quadrupole time-of-flight (Q-TOF) mass spectrometer. The tandem mass spectra patterns resulting from dissociation of ammonium/proton adducts of these wax esters were influenced by the wax ester type and the collision energy applied. The product ions [RCOOH2](+), [RCO](+) and [RCO-H2O](+) that have been reported previously were detected; however, different primary product ions were demonstrated for the three wax ester types including: (1) [RCOOH2](+) for saturated wax esters, (2) [RCOOH2](+), [RCO](+) and [RCO-H2O](+) for unsaturated wax esters containing only one double bond in the fatty acid moiety or with one additional double bond in the fatty alcohol moiety, and (3) [RCOOH2](+) and [RCO](+) for unsaturated wax esters containing a double bond in the fatty alcohol moiety alone. Other fragments included [R'](+) and several series of product ions for all types of wax esters. Interestingly, unusual product ions were detected, such as neutral molecule (including water, methanol and ammonia) adducts of [RCOOH2](+) ions for all types of wax esters and [R'-2H](+) ions for unsaturated fatty acyl-containing wax esters. The patterns of tandem mass spectra for different types of wax esters will inform future identification and quantification approaches of wax esters in biological samples as supported by a preliminary study of quantification of isomeric wax esters in human meibomian gland secretions.
Tong, C Y; Hollingsworth, R C; Williams, H; Irving, W L; Gilmore, I T
1998-07-01
The Amplicor HCV Monitor test and the Quantiplex HCV RNA 2.0 (bDNA) assay are two commercially available assays for the quantification of hepatitis C virus (HCV) RNA in clinical samples. A direct comparison of the two assays was carried out using sera frozen previously from patients known to be chronically infected with HCV. Overall, 61 samples from 51 patients were tested simultaneously by the two methods: 67% (28/42) of the patients were infected by HCV genotype/serotype 1, 10% (4/42) with type 2, and 24% (10/42) with type 3. When the absolute value from each assay was examined, the Quantiplex assay gave a consistently higher reading and the mean logarithmic difference between the two assays was 1.4 (1.0 in type 1, 2.0 in type 2, and 2.2 in type 3). When analyzed according to genotype, strong correlation was observed between the two assays for type 1 (r = 0.83, 95% CI 0.63-0.93, P < 0.01), but not for nontype 1 samples. Despite the difference in absolute level reported by the two assays, there was a consistent trend of change in HCV RNA concentration by both assays in patients whose consecutive samples were analyzed and the differences between the two assays in consecutive samples were within 0.4 log of each other. The results suggested that with samples containing genotype 1, the Amplicor assay was more sensitive than the Quantiplex assay by about one log. However, the sensitivities of the two assays with nontype 1 samples were much closer probably due to the failure of the Amplicor assay to quantify nontype 1 genotypes effectively.
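For the kind of paired-assay comparison described above, the key quantities are the mean logarithmic difference and the per-genotype correlation. The sketch below illustrates that arithmetic on hypothetical log10 viral-load pairs; it is not the study's analysis code and all values are invented.
```python
# A minimal sketch of a paired viral-load assay comparison; the log10 values
# below are hypothetical, not data from the study.
import numpy as np
from scipy import stats

amplicor = np.array([4.2, 3.8, 5.1, 4.6, 5.4])    # log10 HCV RNA by assay 1
quantiplex = np.array([5.5, 5.0, 6.3, 5.9, 6.8])  # log10 HCV RNA by assay 2

log_diff = quantiplex - amplicor
print("mean log difference:", log_diff.mean())

r, p = stats.pearsonr(amplicor, quantiplex)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```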
Confined phase in the real time formalism and the fate of the world behind the horizon
NASA Astrophysics Data System (ADS)
Furuuchi, Kazuyuki
2006-02-01
In the real-time formulation of finite-temperature field theories, one introduces an additional set of fields (type-2 fields) associated with each field in the original theory (type-1 fields). In [J. M. Maldacena, J. High Energy Phys. 04 (2003) 021, 10.1088/1126-6708/2003/04/021], in the context of the anti-de Sitter/conformal field theory (AdS/CFT) correspondence, Maldacena interpreted type-2 fields as living on a boundary behind the black hole horizon. However, below the Hawking-Page transition temperature the thermodynamically preferred configuration is thermal AdS without a black hole, and hence there is no horizon and no boundary behind it. This means that when the dual gauge theory is in the confined phase, the type-2 fields cannot be associated with degrees of freedom behind a black hole horizon. I argue that in this case the role of the type-2 fields is to make up the bulk type-2 fields of classical closed string field theory on AdS at finite temperature in the real-time formalism.
Riss, Patrick J; Hong, Young T; Williamson, David; Caprioli, Daniele; Sitnikov, Sergey; Ferrari, Valentina; Sawiak, Steve J; Baron, Jean-Claude; Dalley, Jeffrey W; Fryer, Tim D; Aigbirhio, Franklin I
2011-01-01
The 5-hydroxytryptamine type 2a (5-HT2A) selective radiotracer [18F]altanserin has been subjected to a quantitative micro-positron emission tomography study in Lister Hooded rats. Metabolite-corrected plasma input modeling was compared with reference tissue modeling using the cerebellum as reference tissue. [18F]altanserin showed sufficient brain uptake in a distribution pattern consistent with the known distribution of 5-HT2A receptors. Full binding saturation and displacement was documented, and no significant uptake of radioactive metabolites was detected in the brain. Blood input as well as reference tissue models were equally appropriate to describe the radiotracer kinetics. [18F]altanserin is suitable for quantification of 5-HT2A receptor availability in rats. PMID:21750562
Yargholi, Elahe'; Nasrabadi, Ali Motie
2015-01-01
The purpose of this study was to apply RQA (recurrence quantification analysis) to hypnotic electroencephalograph (EEG) signals recorded after hypnotic induction while subjects performed standard tasks of the Waterloo-Stanford Group Scale (WSGS) of hypnotic susceptibility. Recurrence quantifiers were then used to analyse the influence of hypnotic depth on the EEGs. Applying this method, the capability of the tasks to distinguish subjects of different hypnotizability levels was determined. In addition, medium hypnotizable subjects showed the greatest disposition to be induced by the hypnotist. Similarities between the dynamics governing the brain during tasks of the same type were also observed. The present study offers two notable innovations: investigating the EEGs of hypnotized subjects as they performed mental tasks of the WSGS, and applying RQA to hypnotic EEGs.
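Recurrence quantification starts from a thresholded distance matrix of the delay-embedded signal. The following sketch is a toy example, not the study's pipeline; the embedding dimension, delay, threshold and the synthetic "EEG" segment are all assumptions. It computes a recurrence matrix and the recurrence rate.
```python
# A minimal recurrence quantification sketch on a synthetic 1-D signal.
import numpy as np

def recurrence_matrix(x, dim=3, tau=2, eps=0.2):
    # Time-delay embedding of the scalar series x.
    n = len(x) - (dim - 1) * tau
    emb = np.array([x[i:i + n] for i in range(0, dim * tau, tau)]).T
    # Pairwise Euclidean distances, thresholded at eps.
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (d < eps).astype(int)

rng = np.random.default_rng(0)
eeg = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.1 * rng.standard_normal(500)
R = recurrence_matrix(eeg)
print("recurrence rate:", R.mean())
```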
Ramírez, Juan Carlos; Cura, Carolina Inés; Moreira, Otacilio da Cruz; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Guedes, Paulo Marcos da Matta; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Galvão, Lúcia Maria da Cunha; da Câmara, Antonia Cláudia Jácome; Espinoza, Bertha; de Noya, Belkisyole Alarcón; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G.
2015-01-01
An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. PMID:26320872
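Parasite loads in qPCR assays of this kind are typically read off a standard curve relating the quantification cycle (Ct) to the log of a calibrator concentration. The sketch below illustrates that calculation with hypothetical calibrator values; it is not the consortium's validated pipeline.
```python
# A minimal qPCR standard-curve quantification sketch; Ct values are hypothetical.
import numpy as np

# Ten-fold dilution series of a calibrator, in parasite equivalents/mL.
conc = np.array([1e4, 1e3, 1e2, 1e1, 1e0])
ct = np.array([18.1, 21.5, 24.9, 28.4, 31.8])   # hypothetical Ct values

# Linear fit of Ct against log10(concentration): Ct = slope*log10(c) + intercept.
slope, intercept = np.polyfit(np.log10(conc), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1            # amplification efficiency

def quantify(ct_sample):
    return 10 ** ((ct_sample - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.2%}")
print("estimated load:", quantify(26.0), "parasite equivalents/mL")
```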
Remily-Wood, Elizabeth R; Benson, Kaaron; Baz, Rachid C; Chen, Y Ann; Hussein, Mohamad; Hartley-Brown, Monique A; Sprung, Robert W; Perez, Brianna; Liu, Richard Z; Yoder, Sean J; Teer, Jamie K; Eschrich, Steven A; Koomen, John M
2014-10-01
Quantitative MS assays for Igs are compared with existing clinical methods in samples from patients with plasma cell dyscrasias, for example, multiple myeloma (MM). Using LC-MS/MS data, Ig constant region peptides, and transitions were selected for LC-MRM MS. Quantitative assays were used to assess Igs in serum from 83 patients. RNA sequencing and peptide-based LC-MRM are used to define peptides for quantification of the disease-specific Ig. LC-MRM assays quantify serum levels of Igs and their isoforms (IgG1-4, IgA1-2, IgM, IgD, and IgE, as well as kappa (κ) and lambda (λ) light chains). LC-MRM quantification has been applied to single samples from a patient cohort and a longitudinal study of an IgE patient undergoing treatment, to enable comparison with existing clinical methods. Proof-of-concept data for defining and monitoring variable region peptides are provided using the H929 MM cell line and two MM patients. LC-MRM assays targeting constant region peptides determine the type and isoform of the involved Ig and quantify its expression; the LC-MRM approach has improved sensitivity compared with the current clinical method, but slightly higher inter-assay variability. Detection of variable region peptides is a promising way to improve Ig quantification, which could produce a dramatic increase in sensitivity over existing methods, and could further complement current clinical techniques. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Subaihi, Abdu; Muhamadali, Howbeer; Mutter, Shaun T; Blanch, Ewan; Ellis, David I; Goodacre, Royston
2017-03-27
In this study, surface-enhanced Raman scattering (SERS) combined with the isotopic labelling (IL) principle has been used for the quantification of codeine spiked into both water and human plasma. Multivariate statistical approaches were employed for the analysis of these SERS spectral data, particularly partial least squares regression (PLSR), which was used to generate models using the full SERS spectral data for quantification of codeine with, and without, an internal isotopically labelled standard. The PLSR models provided accurate codeine quantification in water and human plasma with high prediction accuracy (Q2). In addition, the employment of codeine-d6 as the internal standard further improved the accuracy of the model, increasing the Q2 from 0.89 to 0.94 and decreasing the root-mean-square error of prediction (RMSEP) from 11.36 to 8.44. Using the peak area at 1281 cm-1, assigned to C-N stretching, C-H wagging and ring breathing, the limit of detection was calculated in both water and human plasma to be 0.7 μM (209.55 ng mL-1) and 1.39 μM (416.12 ng mL-1), respectively. Due to a lack of definitive codeine vibrational assignments, density functional theory (DFT) calculations have also been used to assign the spectral bands to their corresponding vibrational modes, which were in excellent agreement with our experimental Raman and SERS findings. Thus, we have successfully demonstrated the application of SERS with isotope labelling for the absolute quantification of codeine in human plasma for the first time, with a high degree of accuracy and reproducibility. The use of the IL principle, which employs an isotopolog (that is to say, a molecule that differs only by the substitution of atoms with isotopes), improves quantification and reproducibility because the competition of codeine and codeine-d6 for the metal surface used for SERS is equal, and this offsets any difference in the number of particles under analysis or any fluctuations in laser fluence. It is our belief that this may open up exciting new opportunities for testing SERS in real-world samples and applications, which would be an area of potential future studies.
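As a rough illustration of the chemometric step, the sketch below fits a PLS regression to simulated spectra and reports Q2 and RMSEP on held-out samples; the data generation and component count are arbitrary assumptions, not the paper's model or data.
```python
# A minimal PLS regression sketch for spectral quantification on simulated data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(1)
conc = rng.uniform(0, 50, 60)                         # analyte concentration, a.u.
spectra = np.outer(conc, rng.random(600)) + rng.normal(0, 0.5, (60, 600))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, conc, random_state=1)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()

print("Q2   :", r2_score(y_te, y_hat))
print("RMSEP:", mean_squared_error(y_te, y_hat) ** 0.5)
```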
Diagnosis of dementia--automatic quantification of brain structures.
Engedal, Knut; Brækhus, Anne; Andreassen, Ole A; Nakstad, Per Hj
2012-08-21
The aim of the present study was to examine the usefulness of a fully automatic quantification of brain structures by means of magnetic resonance imaging (MRI) for diagnosing dementia of the Alzheimer's type (DAT). MRI scans of the brains of 122 patients, referred to a memory clinic, were analysed using Neuroquant® software, which quantifies the volume of various brain structures. Clinical diagnoses were made by two doctors without knowledge of the MRI results. We performed Receiver Operating Characteristic analyses and calculated the area under the curve (AUC). A value of 1 means that all ill patients have been diagnosed as diseased and no patient has been falsely diagnosed as diseased. The mean age of the patients was 67.2 years (SD 10.5 years), 60% were men, 63 had DAT, 24 had another type of dementia, 25 had mild cognitive impairment (MCI) and ten had subjective cognitive impairment (SCI). In the comparison between DAT patients and patients with SCI or MCI, seven of eleven volumes yielded an AUC significantly larger than 0.5. Positive and negative likelihood ratios were less than 5 and more than 0.2, respectively, for the best limit values of the volumes. Apart from the cerebellum (AUC 0.67), none of the brain structures was significantly different from AUC 0.5 in patients with dementia conditions other than dementia of the Alzheimer's type. MRI scans with Neuroquant analyses cannot be used alone to distinguish between persons with dementia of the Alzheimer's type and persons without dementia.
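The diagnostic evaluation above rests on ROC analysis of single volumetric measures. A minimal sketch with synthetic volumes follows; the group sizes loosely echo the study but every number is invented, and the volume distributions are assumptions for illustration only.
```python
# A minimal ROC/AUC sketch for one volumetric measure on synthetic groups.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(2)
# 1 = dementia of the Alzheimer's type, 0 = SCI/MCI (synthetic labels).
y = np.r_[np.ones(63), np.zeros(35)].astype(int)
# Hypothetical hippocampal volumes: smaller on average in the DAT group.
vol = np.r_[rng.normal(2.8, 0.4, 63), rng.normal(3.4, 0.4, 35)]

# Smaller volume indicates disease, so score with the negated volume.
auc = roc_auc_score(y, -vol)
fpr, tpr, thresholds = roc_curve(y, -vol)
print(f"AUC = {auc:.2f}")
```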
Leung, Ross Ka-Kit; Dong, Zhi Qiang; Sa, Fei; Chong, Cheong Meng; Lei, Si Wan; Tsui, Stephen Kwok-Wing; Lee, Simon Ming-Yuen
2014-02-01
Minor variants have significant implications in quasispecies evolution, early cancer detection and non-invasive fetal genotyping but their accurate detection by next-generation sequencing (NGS) is hampered by sequencing errors. We generated sequencing data from mixtures at predetermined ratios in order to provide insight into sequencing errors and variations that can arise for which simulation cannot be performed. The information also enables better parameterization in depth of coverage, read quality and heterogeneity, library preparation techniques, technical repeatability for mathematical modeling, theory development and simulation experimental design. We devised minor variant authentication rules that achieved 100% accuracy in both testing and validation experiments. The rules are free from tedious inspection of alignment accuracy, sequencing read quality or errors introduced by homopolymers. The authentication processes only require minor variants to: (1) have minimum depth of coverage larger than 30; (2) be reported by (a) four or more variant callers, or (b) DiBayes or LoFreq, plus SNVer (or BWA when no results are returned by SNVer), and with the interassay coefficient of variation (CV) no larger than 0.1. Quantification accuracy undermined by sequencing errors could neither be overcome by ultra-deep sequencing, nor recruiting more variant callers to reach a consensus, such that consistent underestimation and overestimation (i.e. low CV) were observed. To accommodate stochastic error and adjust the observed ratio within a specified accuracy, we presented a proof of concept for the use of a double calibration curve for quantification, which provides an important reference towards potential industrial-scale fabrication of calibrants for NGS.
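One reading of the authentication rules quoted above can be written down directly. In the sketch below the record fields, and the interpretation that the CV condition attaches to rule (2), are assumptions of this illustration, not the authors' code.
```python
# A minimal sketch of the minor-variant authentication rules described above.
def authentic(variant):
    """variant: dict with 'depth' (int), 'callers' (set of caller names), 'cv' (float)."""
    if variant["depth"] <= 30:                     # rule (1): depth must exceed 30
        return False
    callers = variant["callers"]
    rule_2a = len(callers) >= 4                    # reported by four or more callers
    rule_2b = (("DiBayes" in callers or "LoFreq" in callers)
               and ("SNVer" in callers or "BWA" in callers)
               and variant["cv"] <= 0.1)           # inter-assay CV no larger than 0.1
    return rule_2a or rule_2b

example = {"depth": 120, "callers": {"LoFreq", "SNVer"}, "cv": 0.06}
print(authentic(example))  # True under rule (2b) in this reading
```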
Integrative analysis with ChIP-seq advances the limits of transcript quantification from RNA-seq.
Liu, Peng; Sanalkumar, Rajendran; Bresnick, Emery H; Keleş, Sündüz; Dewey, Colin N
2016-08-01
RNA-seq is currently the technology of choice for global measurement of transcript abundances in cells. Despite its successes, isoform-level quantification remains difficult because short RNA-seq reads are often compatible with multiple alternatively spliced isoforms. Existing methods rely heavily on uniquely mapping reads, which are not available for numerous isoforms that lack regions of unique sequence. To improve quantification accuracy in such difficult cases, we developed a novel computational method, prior-enhanced RSEM (pRSEM), which uses a complementary data type in addition to RNA-seq data. We found that ChIP-seq data of RNA polymerase II and histone modifications were particularly informative in this approach. In qRT-PCR validations, pRSEM was shown to be superior to competing methods in estimating relative isoform abundances within or across conditions. Data-driven simulations suggested that pRSEM has a greatly decreased false-positive rate at the expense of a small increase in false-negative rate. In aggregate, our study demonstrates that pRSEM transforms existing capacity to precisely estimate transcript abundances, especially at the isoform level. © 2016 Liu et al.; Published by Cold Spring Harbor Laboratory Press.
Virus detection and quantification using electrical parameters
NASA Astrophysics Data System (ADS)
Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.
2014-10-01
Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
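A literal reading of the stated counting rule is a ratio of changes relative to the mock sample. The sketch below encodes that reading with placeholder numbers; the exact form of the formula and all values are assumptions for illustration, not the study's calibration.
```python
# A minimal arithmetic sketch of one reading of the electrical counting rule.
def virus_count(n_virus, n_mock, vd_virus, vd_mock):
    """Absolute ratio of the change in dopant concentration to the change in Debye volume."""
    return abs((n_virus - n_mock) / (vd_virus - vd_mock))

# Hypothetical extracted parameters (dopant concentration and Debye volume, arbitrary units).
print(f"estimated count: {virus_count(2.4e3, 1.1e3, 6.0e-2, 4.5e-2):.3g}")
```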
Dráberová, Eduarda; Stegurová, Lucie; Sulimenko, Vadym; Hájková, Zuzana; Dráber, Petr; Dráber, Pavel
2013-09-30
Microtubules formed by αβ-tubulin dimers represent cellular structures that are indispensable for the maintenance of cell morphology and for cell motility generation. Microtubules in intact cells are in highly regulated equilibrium with cellular pools of soluble tubulin dimers. Sensitive, reproducible and rapid assays are necessary to monitor tubulin changes in cytosolic pools after treatment with anti-mitotic drugs, during the cell cycle or activation and differentiation events. Here we describe new assays for α-tubulin quantification. The assays are based on sandwich ELISA, and the signal is amplified with biotinyl-tyramide or immuno-PCR. The matching monoclonal antibody pair recognizes phylogenetically highly conserved epitopes localized outside the C-terminal isotype-defining region. This makes it possible to detect α-tubulin isotypes in different cell types of various species. Biotinyl-tyramide amplification and immuno-PCR amplification enable detection of tubulin at concentrations of 2.5 ng/ml and 0.086 ng/ml, respectively. Immuno-PCR detection shows enhanced sensitivity and a wider dynamic range when compared to ELISA with biotinyl-tyramide detection. Our results on taxol-treated and activated bone marrow-derived mast cells demonstrate that the assays allow sensitive quantification of tubulin in complex biological fluids. © 2013.
Vezzi, Vanessa; Onaran, H Ongun; Molinari, Paola; Guerrini, Remo; Balboni, Gianfranco; Calò, Girolamo; Costa, Tommaso
2013-08-16
Using a cell-free bioluminescence resonance energy transfer strategy we compared the levels of spontaneous and ligand-induced receptor-G protein coupling in δ (DOP) and μ (MOP) opioid receptors. In this assay GDP can suppress spontaneous coupling, thus allowing its quantification. The level of constitutive activity was 4-5 times greater at the DOP than at the MOP receptor. A series of opioid analogues with a common peptidomimetic scaffold displayed remarkable inversions of efficacy in the two receptors. Agonists that enhanced coupling above the low intrinsic level of the MOP receptor were inverse agonists in reducing the greater level of constitutive coupling of the DOP receptor. Yet the intrinsic activities of such ligands are identical when scaled over the GDP base line of both receptors. This pattern is in conflict with the predictions of the ternary complex model and the "two state" extensions. According to this theory, the order of spontaneous and ligand-induced coupling cannot be reversed if a shift of the equilibrium between active and inactive forms raises constitutive activation in one receptor type. We propose that constitutive activation results from a lessened intrinsic barrier that restrains spontaneous coupling. Any ligand, regardless of its efficacy, must enhance this constraint to stabilize the ligand-bound complexed form.
Real-time flood forecasts & risk assessment using a possibility-theory based fuzzy neural network
NASA Astrophysics Data System (ADS)
Khan, U. T.
2016-12-01
Globally, floods are one of the most devastating natural disasters, and improved flood forecasting methods are essential for better flood protection in urban areas. Given the availability of high-resolution real-time datasets for flood variables (e.g. streamflow and precipitation) in many urban areas, data-driven models have been effectively used to predict peak flow rates in rivers; however, the selection of input parameters for these types of models is often subjective. Additionally, the inherent uncertainty associated with data-driven models, along with errors in extreme event observations, means that uncertainty quantification is essential. Addressing these concerns will enable improved flood forecasting methods and provide more accurate flood risk assessments. In this research, a new type of data-driven model, a quasi-real-time updating fuzzy neural network, is developed to predict peak flow rates in urban riverine watersheds. A possibility-to-probability transformation is first used to convert observed data into fuzzy numbers. A possibility theory based training regime is then used to construct the fuzzy parameters and the outputs. A new entropy-based optimisation criterion is used to train the network. Two existing methods to select the optimum input parameters are modified to account for fuzzy number inputs, and compared. These methods are: Entropy-Wavelet-based Artificial Neural Network (EWANN) and Combined Neural Pathway Strength Analysis (CNPSA). Finally, an automated algorithm designed to select the optimum structure of the neural network is implemented. The overall impact of each component of training this network is to replace traditional ad hoc network configuration methods with one based on objective criteria. Ten years of data from the Bow River in Calgary, Canada (including two major floods in 2005 and 2013) are used to calibrate and test the network. The EWANN method selected lagged peak flow as a candidate input, whereas the CNPSA method selected lagged precipitation and lagged mean daily flow as candidate inputs. Model performance metrics show that the CNPSA method had higher performance (with an efficiency of 0.76). Model output was used to assess the risk of extreme peak flows for a given day using an inverse possibility-to-probability transformation.
The Scientific Status of Learning Styles Theories
ERIC Educational Resources Information Center
Willingham, Daniel T.; Hughes, Elizabeth M.; Dobolyi, David G.
2015-01-01
Theories of learning styles suggest that individuals think and learn best in different ways. These are not differences of ability but rather preferences for processing certain types of information or for processing information in certain types of way. If accurate, learning styles theories could have important implications for instruction because…
Jachiet, Pierre-Alain; Colson, Philippe; Lopez, Philippe; Bapteste, Eric
2014-01-01
Complex nongradual evolutionary processes such as gene remodeling are difficult to model, to visualize, and to investigate systematically. Despite these challenges, the creation of composite (or mosaic) genes by combination of genetic segments from unrelated gene families was established as an important adaptive phenomenon in eukaryotic genomes. In contrast, almost no general studies have been conducted to quantify composite genes in viruses. Although viral genome mosaicism has been well-described, the extent of gene mosaicism and its rules of emergence remain largely unexplored. Applying methods from graph theory to inclusive similarity networks, and using data from more than 3,000 complete viral genomes, we provide the first demonstration that composite genes in viruses 1) are functionally biased, 2) are involved in key aspects of the arms race between cells and viruses, and 3) can be classified into two distinct types in all viral classes. Beyond the quantification of the widespread recombination of genes among different viruses of the same class, we also report a striking sharing of genetic information between viruses of different classes and with different nucleic acid types. This latter discovery provides novel evidence for the existence of a large and complex mobilome network, which appears partly bound by the sharing of genetic information and by the formation of composite genes between mobile entities with different genetic material. Considering that there are around 10^31 viruses on the planet, gene remodeling appears to be a hugely significant way of generating and moving novel sequences between different kinds of organisms on Earth. PMID:25104113
Lin, Shu; Wein, Samuel; Gonzales-Cope, Michelle; Otte, Gabriel L.; Yuan, Zuo-Fei; Afjehi-Sadat, Leila; Maile, Tobias; Berger, Shelley L.; Rush, John; Lill, Jennie R.; Arnott, David; Garcia, Benjamin A.
2014-01-01
To facilitate accurate histone variant and post-translational modification (PTM) quantification via mass spectrometry, we present a library of 93 synthetic peptides using Protein-Aqua™ technology. The library contains 55 peptides representing different modified forms from histone H3 peptides, 23 peptides representing H4 peptides, 5 peptides representing canonical H2A peptides, 8 peptides representing H2A.Z peptides, and peptides for both macroH2A and H2A.X. The PTMs on these peptides include lysine mono- (me1), di- (me2), and tri-methylation (me3); lysine acetylation; arginine me1; serine/threonine phosphorylation; and N-terminal acetylation. The library was subjected to chemical derivatization with propionic anhydride, a widely employed protocol for histone peptide quantification. Subsequently, the detection efficiencies were quantified using mass spectrometry extracted ion chromatograms. The library yields a wide spectrum of detection efficiencies, with more than 1700-fold difference between the peptides with the lowest and highest efficiencies. In this paper, we describe the impact of different modifications on peptide detection efficiencies and provide a resource to correct for detection biases among the 93 histone peptides. In brief, there is no correlation between detection efficiency and molecular weight, hydrophobicity, basicity, or modification type. The same types of modifications may have very different effects on detection efficiencies depending on their positions within a peptide. We also observed antagonistic effects between modifications. In a study of mouse trophoblast stem cells, we utilized the detection efficiencies of the peptide library to correct for histone PTM/variant quantification. For most histone peptides examined, the corrected data did not change the biological conclusions but did alter the relative abundance of these peptides. For a low-abundant histone H2A variant, macroH2A, the corrected data led to a different conclusion than the uncorrected data. The peptide library and detection efficiencies presented here may serve as a resource to facilitate studies in the epigenetics and proteomics fields. PMID:25000943
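Correcting histone peptide quantification with library-derived detection efficiencies amounts to dividing each observed signal by its efficiency before computing relative abundances. A minimal sketch follows; the peptide names, efficiencies and peak areas are invented placeholders, not values from the library described above.
```python
# A minimal detection-efficiency correction sketch for peptide quantification.
raw_area = {"H3K9me2 peptide": 4.2e6, "H3K9ac peptide": 9.8e6}      # observed areas
efficiency = {"H3K9me2 peptide": 0.15, "H3K9ac peptide": 0.85}       # library-derived factors

# Divide each observed area by its detection efficiency, then renormalize.
corrected = {p: raw_area[p] / efficiency[p] for p in raw_area}
total = sum(corrected.values())
relative = {p: corrected[p] / total for p in corrected}
print(relative)
```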
Yoshinaga, Kazuaki; Obi, Junji; Nagai, Toshiharu; Iioka, Hiroyuki; Yoshida, Akihiko; Beppu, Fumiaki; Gotoh, Naohiro
2017-03-01
In the present study, the resolution parameters and correction factors (CFs) of triacylglycerol (TAG) standards were estimated by gas chromatography-flame ionization detector (GC-FID) to achieve the precise quantification of the TAG composition in edible fats and oils. Forty seven TAG standards comprising capric acid, lauric acid, myristic acid, pentadecanoic acid, palmitic acid, palmitoleic acid, stearic acid, oleic acid, linoleic acid, and/or linolenic acid were analyzed, and the CFs of these TAGs were obtained against tripentadecanoyl glycerol as the internal standard. The capillary column was Ultra ALLOY + -65 (30 m × 0.25 mm i.d., 0.10 μm thickness) and the column temperature was programmed to rise from 250°C to 360°C at 4°C/min and then hold for 25 min. The limit of detection (LOD) and limit of quantification (LOQ) values of the TAG standards were > 0.10 mg and > 0.32 mg per 100 mg fat and oil, respectively, except for LnLnLn, and the LOD and LOQ values of LnLnLn were 0.55 mg and 1.84 mg per 100 mg fat and oil, respectively. The CFs of TAG standards decreased with increasing total acyl carbon number and degree of desaturation of TAG molecules. Also, there were no remarkable differences in the CFs between TAG positional isomers such as 1-palmitoyl-2-oleoyl-3-stearoyl-rac-glycerol, 1-stearoyl-2-palmitoyl-3-oleoyl-rac-glycerol, and 1-palmitoyl-2-stearoyl-3-oleoyl-rac-glycerol, which cannot be separated by GC-FID. Furthermore, this method was able to predict the CFs of heterogeneous (AAB- and ABC-type) TAGs from the CFs of homogenous (AAA-, BBB-, and CCC-type) TAGs. In addition, the TAG composition in cocoa butter, palm oil, and canola oil was determined using CFs, and the results were found to be in good agreement with those reported in the literature. Therefore, the GC-FID method using CFs can be successfully used for the quantification of TAG molecular species in natural fats and oils.
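Correction-factor (CF) based quantification against an internal standard, and the prediction of mixed-acyl CFs from homogeneous ones, can be sketched as below; the acyl-averaged prediction rule and all numerical values are illustrative assumptions, not values or formulas taken from the paper.
```python
# A minimal sketch of CF-based GC-FID quantification against an internal standard.
def tag_amount(area_tag, cf_tag, area_is, amount_is_mg):
    """Amount of a TAG from its peak area, its CF and the internal standard amount."""
    return (area_tag / area_is) * cf_tag * amount_is_mg

# One simple approximation (an assumption of this sketch): predict the CF of a
# mixed-acyl TAG as the acyl-weighted mean of the homogeneous-TAG CFs.
def predicted_cf(cf_by_fa, composition):
    """composition: e.g. {'P': 1, 'O': 2} for a POO-type TAG."""
    n = sum(composition.values())
    return sum(cf_by_fa[fa] * k for fa, k in composition.items()) / n

cf_homog = {"P": 1.05, "O": 0.98, "S": 1.02}   # hypothetical CFs of PPP, OOO, SSS
print(predicted_cf(cf_homog, {"P": 1, "O": 2}))
print(tag_amount(area_tag=3.1e5, cf_tag=1.00, area_is=2.8e5, amount_is_mg=1.0))
```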
Super-Lie n-algebra extensions, higher WZW models and super-p-branes with tensor multiplet fields
NASA Astrophysics Data System (ADS)
Fiorenza, Domenico; Sati, Hisham; Schreiber, Urs
2015-12-01
We formalize higher-dimensional and higher gauge WZW-type sigma-model local prequantum field theory, and discuss its rationalized/perturbative description in (super-)Lie n-algebra homotopy theory (the true home of the "FDA"-language used in the supergravity literature). We show generally how the intersection laws for such higher WZW-type σ-model branes (open brane ending on background brane) are encoded precisely in (super-)L∞-extension theory and how the resulting "extended (super-)space-times" formalize spacetimes containing σ-model brane condensates. As an application we prove in Lie n-algebra homotopy theory that the complete super-p-brane spectrum of superstring/M-theory is realized this way, including the pure σ-model branes (the "old brane scan") but also the branes with tensor multiplet worldvolume fields, notably the D-branes and the M5-brane. For instance the degree-0 piece of the higher symmetry algebra of 11-dimensional (11D) spacetime with an M2-brane condensate turns out to be the "M-theory super-Lie algebra". We also observe that in this formulation there is a simple formal proof of the fact that type IIA spacetime with a D0-brane condensate is the 11D sugra/M-theory spacetime, and of (prequantum) S-duality for type IIB string theory. Finally we give the non-perturbative description of all this by higher WZW-type σ-models on higher super-orbispaces with higher WZW terms in stacky differential cohomology.
Uncertainty quantification for optical model parameters
Lovell, A. E.; Nunes, F. M.; Sarich, J.; ...
2017-02-21
Although uncertainty quantification has been making its way into nuclear theory, these methods have yet to be explored in the context of reaction theory. For example, it is well known that different parameterizations of the optical potential can result in different cross sections, but these differences have not been systematically studied and quantified. The purpose of our work is to investigate the uncertainties in nuclear reactions that result from fitting a given model to elastic-scattering data, as well as to study how these uncertainties propagate to the inelastic and transfer channels. We use statistical methods to determine a best fit and create corresponding 95% confidence bands. A simple model of the process is fit to elastic-scattering data and used to predict either inelastic or transfer cross sections. In this initial work, we assume that our model is correct, and the only uncertainties come from the variation of the fit parameters. Here, we study a number of reactions involving neutron and deuteron projectiles with energies in the range of 5–25 MeV/u, on targets with mass A=12–208. We investigate the correlations between the parameters in the fit. The case of deuterons on 12C is discussed in detail: the elastic-scattering fit and the prediction of 12C(d,p)13C transfer angular distributions, using both uncorrelated and correlated χ2 minimization functions. The general features for all cases are compiled in a systematic manner to identify trends. This work shows that, in many cases, the correlated χ2 functions (in comparison to the uncorrelated χ2 functions) provide a more natural parameterization of the process. These correlated functions do, however, produce broader confidence bands. Further optimization may require improvement in the models themselves and/or more information included in the fit.
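The distinction between uncorrelated and correlated χ2 functions mentioned above is simply whether the residuals are weighted by independent variances or by a full covariance matrix. A toy sketch follows; the synthetic data, the exponential "model" and the toy covariance are assumptions for illustration, not an optical-potential fit.
```python
# A minimal sketch contrasting uncorrelated and correlated chi-square functions.
import numpy as np

def model(theta, x):
    a, b = theta
    return a * np.exp(-b * x)

x = np.linspace(0.5, 5.0, 10)
y_obs = model((10.0, 0.8), x) + np.random.default_rng(3).normal(0, 0.3, x.size)
sigma = np.full(x.size, 0.3)

def chi2_uncorrelated(theta):
    r = y_obs - model(theta, x)
    return np.sum((r / sigma) ** 2)

# Toy covariance with short-range correlations between neighbouring points.
cov = np.diag(sigma**2) + 0.04 * np.exp(-np.abs(np.subtract.outer(x, x)))
cov_inv = np.linalg.inv(cov)

def chi2_correlated(theta):
    r = y_obs - model(theta, x)
    return r @ cov_inv @ r

print(chi2_uncorrelated((10.0, 0.8)), chi2_correlated((10.0, 0.8)))
```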
Colloquium: Non-Markovian dynamics in open quantum systems
NASA Astrophysics Data System (ADS)
Breuer, Heinz-Peter; Laine, Elsi-Mari; Piilo, Jyrki; Vacchini, Bassano
2016-04-01
The dynamical behavior of open quantum systems plays a key role in many applications of quantum mechanics, examples ranging from fundamental problems, such as the environment-induced decay of quantum coherence and relaxation in many-body systems, to applications in condensed matter theory, quantum transport, quantum chemistry, and quantum information. In close analogy to a classical Markovian stochastic process, the interaction of an open quantum system with a noisy environment is often modeled phenomenologically by means of a dynamical semigroup with a corresponding time-independent generator in Lindblad form, which describes a memoryless dynamics of the open system typically leading to an irreversible loss of characteristic quantum features. However, in many applications open systems exhibit pronounced memory effects and a revival of genuine quantum properties such as quantum coherence, correlations, and entanglement. Here recent theoretical results on the rich non-Markovian quantum dynamics of open systems are discussed, paying particular attention to the rigorous mathematical definition, to the physical interpretation and classification, as well as to the quantification of quantum memory effects. The general theory is illustrated by a series of physical examples. The analysis reveals that memory effects of the open system dynamics reflect characteristic features of the environment which opens a new perspective for applications, namely, to exploit a small open system as a quantum probe signifying nontrivial features of the environment it is interacting with. This Colloquium further explores the various physical sources of non-Markovian quantum dynamics, such as structured environmental spectral densities, nonlocal correlations between environmental degrees of freedom, and correlations in the initial system-environment state, in addition to developing schemes for their local detection. Recent experiments addressing the detection, quantification, and control of non-Markovian quantum dynamics are also briefly discussed.
Chang, Sun Ju; Im, Eun-Ok
2014-01-01
The purpose of the study was to develop a situation-specific theory for explaining health-related quality of life (QOL) among older South Korean adults with type 2 diabetes. To develop a situation-specific theory, three sources were considered: (a) the conceptual model of health promotion and QOL for people with chronic and disabling conditions (an existing theory related to the QOL in patients with chronic diseases); (b) a literature review using multiple databases including Cumulative Index for Nursing and Allied Health Literature (CINAHL), PubMed, PsycINFO, and two Korean databases; and (c) findings from our structural equation modeling study on health-related QOL in older South Korean adults with type 2 diabetes. The proposed situation-specific theory is constructed with six major concepts including barriers, resources, perceptual factors, psychosocial factors, health-promoting behaviors, and health-related QOL. The theory also provides the interrelationships among concepts. Health care providers and nurses could incorporate the proposed situation-specific theory into development of diabetes education programs for improving health-related QOL in older South Korean adults with type 2 diabetes.
Block-localized wavefunction (BLW) method at the density functional theory (DFT) level.
Mo, Yirong; Song, Lingchun; Lin, Yuchun
2007-08-30
The block-localized wavefunction (BLW) approach is an ab initio valence bond (VB) method incorporating the efficiency of molecular orbital (MO) theory. It can generate the wavefunction for a resonance structure or diabatic state self-consistently by partitioning the overall electrons and primitive orbitals into several subgroups and expanding each block-localized molecular orbital in only one subspace. Although block-localized molecular orbitals in the same subspace are constrained to be orthogonal (a feature of MO theory), orbitals between different subspaces are generally nonorthogonal (a feature of VB theory). The BLW method is particularly useful in the quantification of the electron delocalization (resonance) effect within a molecule and the charge-transfer effect between molecules. In this paper, we extend the BLW method to the density functional theory (DFT) level and implement the BLW-DFT method to the quantum mechanical software GAMESS. Test applications to the pi conjugation in the planar allyl radical and ions with the basis sets of 6-31G(d), 6-31+G(d), 6-311+G(d,p), and cc-pVTZ show that the basis set dependency is insignificant. In addition, the BLW-DFT method can also be used to elucidate the nature of intermolecular interactions. Examples of pi-cation interactions and solute-solvent interactions will be presented and discussed. By expressing each diabatic state with one BLW, the BLW method can be further used to study chemical reactions and electron-transfer processes whose potential energy surfaces are typically described by two or more diabatic states.
Remote sensing of land use and water quality relationships - Wisconsin shore, Lake Michigan
NASA Technical Reports Server (NTRS)
Haugen, R. K.; Marlar, T. L.
1976-01-01
This investigation assessed the utility of remote sensing techniques in the study of land use-water quality relationships in an east central Wisconsin test area. The following types of aerial imagery were evaluated: high altitude (60,000 ft) color, color infrared, multispectral black and white, and thermal; low altitude (less than 5000 ft) color infrared, multispectral black and white, thermal, and passive microwave. A non-imaging hand-held four-band radiometer was evaluated for utility in providing data on suspended sediment concentrations. Land use analysis includes the development of mapping and quantification methods to obtain baseline data for comparison to water quality variables. Suspended sediment loads in streams, determined from water samples, were related to land use differences and soil types in three major watersheds. A multiple correlation coefficient R of 0.85 was obtained for the relationship between the 0.6-0.7 micrometer incident and reflected radiation data from the hand-held radiometer and concurrent ground measurements of suspended solids in streams. Applications of the methods and baseline data developed in this investigation include: mapping and quantification of land use; input to watershed runoff models; estimation of effects of land use changes on stream sedimentation; and remote sensing of suspended sediment content of streams. High altitude color infrared imagery was found to be the most acceptable remote sensing technique for the mapping and measurement of land use types.
Visual Hemispheric Specialization: A Computational Theory. Technical Report #7.
ERIC Educational Resources Information Center
Kosslyn, Stephen M.
Visual recognition, navigation, tracking, and imagery are posited to involve some of the same types of representations and processes. The first part of this paper develops a theory of some of the shared types of representations and processing modules. The theory is developed in light of neurophysiological and neuroanatomical data from non-human…
Cognitive Type Theory & Learning Style, A Teacher's Guide.
ERIC Educational Resources Information Center
Mamchur, Carolyn
This guide provides a practical explanation of cognitive type theory and learning style that will help teachers meet students' needs and discover their own strengths as teachers and colleagues. The introduction provides an overview of the book from the perspective of a high school classroom teacher. Part One introduces the theory of psychological…
Current trends in nursing theories.
Im, Eun-Ok; Chang, Sun Ju
2012-06-01
To explore current trends in nursing theories through an integrated literature review. The literature related to nursing theories during the past 10 years was searched through multiple databases and reviewed to determine themes reflecting current trends in nursing theories. The trends can be categorized into six themes: (a) foci on specifics; (b) coexistence of various types of theories; (c) close links to research; (d) international collaborative works; (e) integration into practice; and (f) selective evolution. We need to continue our efforts to link research and practice to theories, to identify the specifics of our theories, to develop diverse types of theories, and to conduct international collaborative work. Our paper has implications for future theoretical development in diverse clinical areas of nursing research and practice. © 2012 Sigma Theta Tau International.
IIB duals of D = 3 N = 4 circular quivers
NASA Astrophysics Data System (ADS)
Assel, Benjamin; Bachas, Costas; Estes, John; Gomis, Jaume
2012-12-01
We construct the type-IIB AdS4 ⋉ K supergravity solutions which are dual to the three-dimensional N = 4 superconformal field theories that arise as infrared fixed points of circular-quiver gauge theories. These superconformal field theories are labeled by a triple (ρ, ρ̂, L) subject to constraints, where ρ and ρ̂ are two partitions of a number N, and L is a positive integer. We show that in the limit of large L the localized five-branes in our solutions are effectively smeared, and these type-IIB solutions are dual to the near-horizon geometry of M-theory M2-branes at a C^4/(Z_k × Z_k̂) orbifold singularity. Our IIB solutions resolve the singularity into localized five-brane throats, without breaking the conformal symmetry. The constraints satisfied by the triple (ρ, ρ̂, L), together with the enhanced non-abelian flavour symmetries of the superconformal field theories, are precisely reproduced by the type-IIB supergravity solutions. As a bonus, we uncover a novel type of "orbifold equivalence" between different quantum field theories and provide quantitative evidence for this equivalence.
Dual-Process Theories of Reasoning: Contemporary Issues and Developmental Applications
ERIC Educational Resources Information Center
Evans, Jonathan St. B. T.
2011-01-01
In this paper, I discuss the current state of theorising about dual processes in adult performance on reasoning and decision-making tasks, in which Type 1 intuitive processing is distinguished from Type 2 reflective thinking. I show that there are many types of theory, some of which distinguish modes rather than types of thinking, and that…
Talarico, Sarah; Safaeian, Mahboobeh; Gonzalez, Paula; Hildesheim, Allan; Herrero, Rolando; Porras, Carolina; Cortes, Bernal; Larson, Ann; Fang, Ferric C; Salama, Nina R
2016-08-01
Epidemiologic studies of the carcinogenic stomach bacterium Helicobacter pylori have been limited by the lack of noninvasive detection and genotyping methods. We developed a new stool-based method for detection, quantification, and partial genotyping of H. pylori using droplet digital PCR (ddPCR), which allows for increased sensitivity and absolute quantification by PCR partitioning. Stool-based ddPCR assays for H. pylori 16S gene detection and cagA virulence gene typing were tested using a collection of 50 matched stool and serum samples from Costa Rican volunteers and 29 H. pylori stool antigen-tested stool samples collected at a US hospital. The stool-based H. pylori 16S ddPCR assay had a sensitivity of 84% and 100% and a specificity of 100% and 71% compared to serology and stool antigen tests, respectively. The stool-based cagA genotyping assay detected cagA in 22 (88%) of 25 stools from CagA antibody-positive individuals and four (16%) of 25 stools from CagA antibody-negative individuals from Costa Rica. All 26 of these samples had a Western-type cagA allele. Presence of serum CagA antibodies was correlated with a significantly higher load of H. pylori in the stool. The stool-based ddPCR assays are a sensitive, noninvasive method for detection, quantification, and partial genotyping of H. pylori. The quantitative nature of ddPCR-based H. pylori detection revealed significant variation in bacterial load among individuals that correlates with presence of the cagA virulence gene. These stool-based ddPCR assays will facilitate future population-based epidemiologic studies of this important human pathogen. © 2015 John Wiley & Sons Ltd.
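Absolute quantification in droplet digital PCR generally follows from Poisson statistics on the fraction of positive droplets. The sketch below shows that standard calculation with hypothetical droplet counts and an assumed droplet volume; it is a generic ddPCR illustration, not the specific stool-based assays described above.
```python
# A minimal Poisson-based ddPCR quantification sketch with hypothetical counts.
import math

positive, total = 2300, 15000          # hypothetical positive and total droplet counts
droplet_volume_ul = 0.00085            # assumed ~0.85 nL per droplet (instrument dependent)

p = positive / total                   # fraction of positive droplets
copies_per_ul = -math.log(1.0 - p) / droplet_volume_ul
print(f"{copies_per_ul:.0f} copies/uL of reaction")
```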
Implications of Measurement Assay Type in Design of HIV Experiments.
Cannon, LaMont; Jagarapu, Aditya; Vargas-Garcia, Cesar A; Piovoso, Michael J; Zurakowski, Ryan
2017-12-01
Time series measurements of circular viral episome (2-LTR) concentrations enable indirect quantification of persistent low-level Human Immunodeficiency Virus (HIV) replication in patients on Integrase-Inhibitor intensified Combined Antiretroviral Therapy (cART). In order to determine the magnitude of these low-level infection events, blood has to be drawn from a patient at a frequency and volume that is strictly regulated by the Institutional Review Board (IRB). Once the blood is drawn, the 2-LTR concentration is determined by quantifying the amount of HIV DNA present in the sample via a PCR (Polymerase Chain Reaction) assay. Real-time quantitative Polymerase Chain Reaction (qPCR) is a widely used method of performing PCR; however, a newer droplet digital Polymerase Chain Reaction (ddPCR) method has been shown to provide more accurate quantification of DNA. Using a validated model of HIV viral replication, this paper demonstrates the importance of considering DNA quantification assay type when optimizing experiment design conditions. Experiments are optimized using a Genetic Algorithm (GA) to locate a family of suboptimal sample schedules which yield the highest fitness. Fitness is defined as the expected information gained in the experiment, measured by the Kullback-Leibler Divergence (KLD) between the prior and posterior distributions of the model parameters. We compare the information content of the optimized schedules to uniform schedules as well as two clinical schedules implemented by researchers at UCSF and the University of Melbourne. This work shows that there is a significantly greater gain in information in experiments using a ddPCR assay vs. a qPCR assay and that certain experiment design considerations should be made when using either assay.
Quantification of multiple gene expression in individual cells.
Peixoto, António; Monteiro, Marta; Rocha, Benedita; Veiga-Fernandes, Henrique
2004-10-01
Quantitative gene expression analysis aims to define the gene expression patterns determining cell behavior. So far, these assessments can only be performed at the population level. Therefore, they determine the average gene expression within a population, overlooking possible cell-to-cell heterogeneity that could lead to different cell behaviors/cell fates. Understanding individual cell behavior requires multiple gene expression analyses of single cells, and may be fundamental for the understanding of all types of biological events and/or differentiation processes. We here describe a new reverse transcription-polymerase chain reaction (RT-PCR) approach allowing the simultaneous quantification of the expression of 20 genes in the same single cell. This method has broad application in different species and any type of gene combination. RT efficiency is evaluated. Uniform and maximized amplification conditions for all genes are provided. Abundance relationships are maintained, allowing the precise quantification of the absolute number of mRNA molecules per cell, ranging from 2 to 1.28 × 10^9 for each individual gene. We evaluated the impact of this approach on functional genetic read-outs by studying an apparently homogeneous population (monoclonal T cells recovered 4 d after antigen stimulation), using either this method or conventional real-time RT-PCR. Single-cell studies revealed considerable cell-to-cell variation: All T cells did not express all individual genes. Gene coexpression patterns were very heterogeneous. mRNA copy numbers varied between different transcripts and in different cells. As a consequence, this single-cell assay introduces new and fundamental information regarding functional genomic read-outs. By comparison, we also show that conventional quantitative assays determining population averages supply insufficient information, and may even be highly misleading.
Bertolucci, Suzan Kelly; Pereira, Ana Bárbara; Pinto, José Eduardo; de Aquino Ribeiro, José Antônio; de Oliveira, Alaíde Braga; Braga, Fernão Castro
2009-02-01
Mikania glomerata and Mikania laevigata (Asteraceae) are medicinal plants popularly named 'guaco' in Brazil. The leaves of both species are used to treat respiratory diseases, with coumarin (CO) and kaurane-type diterpenes being regarded as the bioactive constituents. A new and simple RP-HPLC method was developed and validated for the simultaneous quantification of CO, O-coumaric (OC), benzoylgrandifloric (BA), cinnamoylgrandifloric (CA) and kaurenoic (KA) acids in the species. Optimal separation was achieved with an alternating gradient elution of methanol and acetonitrile and detection was carried out by DAD at three different wavelengths: 210 nm for CO, OC, KA; 230 nm for BA; and 270 nm for CA. The extracts showed good stability during 42 hours under normal laboratory conditions (temperature of 23 +/- 2 degrees C). The standard curves were linear over the range 0.5 - 5.0 microg (CO), 0.25 - 4.0 microg (OC), 1.0 - 8.0 microg (BA), 0.5 - 3.0 microg (CA) and 0.8 - 12.0 microg (KA), with R(2) > 0.999 for all compounds. The method showed good precision for intra-day (RSD < 4.6 %) and inter-day assays (RSD < 4.4 %). The recovery was between 99.9 and 105.3 %, except for CO and OC in M. glomerata (73.2 - 91.6 % and 86.3 - 117.4 %, respectively). The limits of quantification and detection were in the range of 0.025 - 0.800 microg and 0.007 - 0.240 microg. The method was tested for new and old columns, temperature variation (26 and 28 degrees C) and by different operators in the same laboratory. The method was successfully applied to samples of both species.
Kinetic quantification of plyometric exercise intensity.
Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J
2011-12-01
Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011-Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects for exercise type and all outcome variables (p ≤ 0.05) and for the interaction between gender and peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program.
NASA Astrophysics Data System (ADS)
Navascues, M. A.; Sebastian, M. V.
Fractal interpolants of Barnsley are defined for any continuous function defined on a real compact interval. The uniform distance between the function and its approximant is bounded in terms of the vertical scale factors. As a general result, the density of the affine fractal interpolation functions of Barnsley in the space of continuous functions in a compact interval is proved. A method of data fitting by means of fractal interpolation functions is proposed. The procedure is applied to the quantification of cognitive brain processes. In particular, the increase in the complexity of the electroencephalographic signal produced by the execution of a test of visual attention is studied. The experiment was performed on two types of children: a healthy control group and a set of children diagnosed with an attention deficit disorder.
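Barnsley's affine fractal interpolation functions are generated by an iterated function system whose maps are fixed by the data nodes and the chosen vertical scale factors. A compact sketch follows; the nodes, scale factors and iteration depth are arbitrary choices for illustration, not values from the EEG study.
```python
# A minimal sketch of Barnsley's affine fractal interpolation function (FIF).
import numpy as np

def fif(points, d, n_iter=6):
    """points: interpolation nodes [(x0,y0),...,(xN,yN)]; d: vertical scale factors."""
    pts = np.array(points, dtype=float)
    x0, xN = pts[0, 0], pts[-1, 0]
    y0, yN = pts[0, 1], pts[-1, 1]
    graph = pts.copy()
    for _ in range(n_iter):
        images = []
        for n in range(1, len(pts)):
            xa, ya = pts[n - 1]
            xb, yb = pts[n]
            # Affine map w_n(x, y) = (a*x + e, c*x + d_n*y + f) sending the endpoints
            # (x0, y0) -> (x_{n-1}, y_{n-1}) and (xN, yN) -> (x_n, y_n).
            a = (xb - xa) / (xN - x0)
            e = (xN * xa - x0 * xb) / (xN - x0)
            c = (yb - ya - d[n - 1] * (yN - y0)) / (xN - x0)
            f = (xN * ya - x0 * yb - d[n - 1] * (xN * y0 - x0 * yN)) / (xN - x0)
            wx = a * graph[:, 0] + e
            wy = c * graph[:, 0] + d[n - 1] * graph[:, 1] + f
            images.append(np.column_stack([wx, wy]))
        graph = np.vstack(images)
    return graph

nodes = [(0.0, 0.0), (0.4, 0.5), (0.7, 0.2), (1.0, 0.8)]
attractor = fif(nodes, d=[0.3, 0.3, 0.3])
print(attractor.shape)
```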
Zhang, Xiao-Hua; Wu, Hai-Long; Wang, Jian-Yao; Tu, De-Zhu; Kang, Chao; Zhao, Juan; Chen, Yao; Miu, Xiao-Xia; Yu, Ru-Qin
2013-05-01
This paper describes the use of second-order calibration for the development of an HPLC-DAD method to quantify nine polyphenols in five kinds of honey samples. The sample treatment procedure was simplified effectively relative to traditional approaches. Baseline drift was also overcome by regarding the drift as additional factor(s), alongside the analytes of interest, in the mathematical model. The contents of polyphenols obtained by the alternating trilinear decomposition (ATLD) method have been successfully used to distinguish different types of honey. This method shows good linearity (r>0.99), rapidity (t<7.60 min) and accuracy, making it a promising routine strategy for identification and quantification of polyphenols in complex matrices. Copyright © 2012 Elsevier Ltd. All rights reserved.
Salazar, Oscar; Valverde, Aranzazu; Genilloud, Olga
2006-01-01
Real-time PCR (RT-PCR) technology was used for the specific detection and quantification of members of the family Geodermatophilaceae in stone samples. Differences in the nucleotide sequences of the 16S rRNA gene region were used to design a pair of family-specific primers, which were used to detect and quantify, by RT-PCR, DNA from members of this family in stone samples of different geographical origins in Spain. These primers were later applied to identify, by specific PCR amplification, new members of the family Geodermatophilaceae isolated from the same stone samples. The diversity and taxonomic position of the wild-type strains identified by ribosomal sequence analysis suggest the presence of a new lineage within the genus Blastococcus. PMID:16391063
Aeras: A next generation global atmosphere model
Spotz, William F.; Smith, Thomas M.; Demeshko, Irina P.; ...
2015-06-01
Sandia National Laboratories is developing a new global atmosphere model named Aeras that is performance portable and supports the quantification of uncertainties. These next-generation capabilities are enabled by building Aeras on top of Albany, a code base that supports the rapid development of scientific application codes while leveraging Sandia's foundational mathematics and computer science packages in Trilinos and Dakota. Embedded uncertainty quantification (UQ) is an original design capability of Albany, and performance portability is a recent upgrade. Other required features, such as shell-type elements, spectral elements, efficient explicit and semi-implicit time-stepping, transient sensitivity analysis, and concurrent ensembles, were not components of Albany as the project began, and have been (or are being) added by the Aeras team. We present early UQ and performance portability results for the shallow water equations.
Li, Yao; Dwivedi, Gaurav; Huang, Wen; Yi, Yingfei
2012-01-01
There is an evolutionary advantage in having multiple components with overlapping functionality (i.e., degeneracy) in organisms. While theoretical considerations of degeneracy have been well established in neural networks using information theory, the same concepts have not been developed for differential systems, which form the basis of many biochemical reaction network descriptions in systems biology. Here we establish mathematical definitions of degeneracy, complexity and robustness that allow for the quantification of these properties in a system. By exciting a dynamical system with noise, the mutual information associated with a selected observable output and the interacting subspaces of input components can be used to define both complexity and degeneracy. The calculation of degeneracy in a biological network is a useful metric for evaluating features such as the sensitivity of a biological network to environmental evolutionary pressure. Using a two-receptor signal transduction network, we find that redundant components will not yield high degeneracy whereas compensatory mechanisms established by pathway crosstalk will. This form of analysis permits interrogation of large-scale differential systems for non-identical, functionally equivalent features that have evolved to maintain homeostasis during disruption of individual components. PMID:22619750
NASA Astrophysics Data System (ADS)
Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef
2018-04-01
In theory, aquifer thermal energy storage (ATES) systems can recover in winter the heat stored in the aquifer during summer to increase the energy efficiency of the system. In practice, the energy efficiency is often lower than expected from simulations due to spatial heterogeneity of hydraulic properties or non-favorable hydrogeological conditions. A proper design of ATES systems should therefore consider the uncertainty of the prediction related to those parameters. We use a novel framework called Bayesian Evidential Learning (BEL) to estimate the heat storage capacity of an alluvial aquifer using a heat tracing experiment. BEL is based on two main stages: pre- and postfield data acquisition. Before data acquisition, Monte Carlo simulations and global sensitivity analysis are used to assess the information content of the data to reduce the uncertainty of the prediction. After data acquisition, prior falsification and machine learning based on the same Monte Carlo are used to directly assess uncertainty on key prediction variables from observations. The result is a full quantification of the posterior distribution of the prediction conditioned to observed data, without any explicit full model inversion. We demonstrate the methodology in field conditions and validate the framework using independent measurements.
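Bayesian Evidential Learning as described combines prior Monte Carlo simulation, global sensitivity analysis, prior falsification and a learned data-to-prediction relation. The sketch below illustrates only the last step under strong simplifying assumptions (PCA for dimension reduction and Bayesian ridge regression as the statistical learner); all function names and parameter choices are hypothetical and not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import BayesianRidge

def bel_predict(prior_data, prior_predictions, observed_data, n_comp=5):
    """Schematic BEL step: reduce the simulated data variables (one row per
    Monte Carlo realization), learn a regression from reduced data to the
    scalar prediction variable over the prior ensemble, then apply it to the
    field observation.  Returns a posterior-like mean and spread."""
    pca = PCA(n_components=n_comp).fit(prior_data)
    scores = pca.transform(prior_data)
    model = BayesianRidge().fit(scores, prior_predictions)
    obs_scores = pca.transform(np.atleast_2d(observed_data))
    mean, std = model.predict(obs_scores, return_std=True)
    return mean[0], std[0]

# Toy usage: 500 prior realizations, 50 data variables, one prediction variable.
rng = np.random.default_rng(1)
d_prior = rng.normal(size=(500, 50))
h_prior = d_prior[:, :3].sum(axis=1) + 0.1 * rng.normal(size=500)
print(bel_predict(d_prior, h_prior, observed_data=d_prior[0]))
```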
Ullrich, Thomas; Ermantraut, Eugen; Schulz, Torsten; Steinmetzer, Katrin
2012-01-01
Background: State of the art molecular diagnostic tests are based on the sensitive detection and quantification of nucleic acids. However, currently established diagnostic tests are characterized by elaborate and expensive technical solutions hindering the development of simple, affordable and compact point-of-care molecular tests. Methodology and Principal Findings: The described competitive reporter monitored amplification allows the simultaneous amplification and quantification of multiple nucleic acid targets by polymerase chain reaction. Target quantification is accomplished by real-time detection of amplified nucleic acids utilizing a capture probe array and specific reporter probes. The reporter probes are fluorescently labeled oligonucleotides that are complementary to the respective capture probes on the array and to the respective sites of the target nucleic acids in solution. Capture probes and amplified target compete for reporter probes. Increasing amplicon concentration leads to decreased fluorescence signal at the respective capture probe position on the array which is measured after each cycle of amplification. In order to observe reporter probe hybridization in real-time without any additional washing steps, we have developed a mechanical fluorescence background displacement technique. Conclusions and Significance: The system presented in this paper enables simultaneous detection and quantification of multiple targets. Moreover, the presented fluorescence background displacement technique provides a generic solution for real time monitoring of binding events of fluorescently labelled ligands to surface immobilized probes. With the model assay for the detection of human immunodeficiency virus type 1 and 2 (HIV 1/2), we have been able to observe the amplification kinetics of five targets simultaneously and accommodate two additional hybridization controls with a simple instrument set-up. The ability to accommodate multiple controls and targets into a single assay and to perform the assay on simple and robust instrumentation is a prerequisite for the development of novel molecular point of care tests. PMID:22539973
Influences of Altered River Geomorphology on Channel-Floodplain Mass and Momentum Transfer
NASA Astrophysics Data System (ADS)
Byrne, C. F.; Stone, M. C.
2017-12-01
River management strategies, including both river engineering and restoration, have altered river geomorphology and associated lateral channel-floodplain connectivity throughout the world. This altered connectivity is known to drive changes in ecologic and geomorphic processes during floods; however, quantification of altered connectivity is difficult due to the highly dynamic spatial and temporal nature of flood wave conditions. The objective of this research was to quantify the physical processes of lateral mass and momentum transfer at the channel-floodplain interface. The objective was achieved with the implementation of novel scripting and high-resolution, two-dimensional hydrodynamic modeling techniques under unsteady flow conditions. The process-based analysis focused on three geomorphic feature types within the Middle Rio Grande, New Mexico, USA: (1) historical floodplain surfaces, (2) inset floodplain surfaces formed as a result of channel training and hydrologic alteration, and (3) mechanically restored floodplain surfaces. Results suggest that inset floodplain feature types are not only subject to greater mass and momentum transfer magnitudes, but those connections are also more heterogeneous in nature compared with historical feature types. While restored floodplain feature types exhibit transfer magnitudes and heterogeneity comparable to inset feature types, the surfaces are not of great enough spatial extent to substantially influence total channel-floodplain mass and momentum transfer. Mass and momentum transfer also displayed differing characteristic changes as a result of increased flood magnitude, indicating that linked hydrodynamic processes can be altered differently as a result of geomorphic and hydrologic change. The results display the potential of high-resolution modeling strategies in capturing the spatial and temporal complexities of river processes. In addition, the results have implications for other fields of river science including biogeochemical exchange at the channel-floodplain interface and quantification of processes associated with environmental flow and river restoration strategies.
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.
2017-12-01
Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical species. Numerous geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. As a result, these types of model analyses are typically extremely challenging. Here, we demonstrate a new contaminant source identification approach that performs decomposition of the observation mixtures based on Nonnegative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. We also demonstrate how NMFk can be extended to perform uncertainty quantification and experimental design related to real-world site characterization. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios). The NMFk algorithm has been extensively tested on synthetic datasets; NMFk analyses have been actively performed on real-world data collected at the Los Alamos National Laboratory (LANL) groundwater sites related to Chromium and RDX contamination.
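NMFk itself is custom LANL software; purely as a hypothetical illustration of the underlying idea (nonnegative matrix factorization with random restarts, plus clustering of the recovered signatures to help select the number of sources), a scikit-learn based sketch could look like the following. The semi-supervised clustering and uncertainty quantification steps of the actual algorithm are not reproduced, and all names and defaults are assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def estimate_sources(X, k_range=range(2, 6), n_restarts=10, seed=0):
    """Rough NMFk-style workflow: factor the nonnegative mixture matrix X
    (observations x geochemical species) with several random restarts per
    candidate k, cluster the pooled source signatures, and score each k by
    the silhouette width of those clusters (more reproducible = better)."""
    rng = np.random.default_rng(seed)
    scores = {}
    for k in k_range:
        pooled = []
        for _ in range(n_restarts):
            model = NMF(n_components=k, init="random",
                        random_state=int(rng.integers(1_000_000)), max_iter=2000)
            W = model.fit_transform(X)          # mixing ratios per observation
            H = model.components_               # candidate source signatures
            pooled.append(H / (H.sum(axis=1, keepdims=True) + 1e-12))
        pooled = np.vstack(pooled)
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pooled)
        scores[k] = silhouette_score(pooled, labels)
    best_k = max(scores, key=scores.get)
    return best_k, scores
```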
A comparison of three methods for measuring local urban tree canopy cover
Kristen L. King; Dexter H. Locke
2013-01-01
Measurements of urban tree canopy cover are crucial for managing urban forests and required for the quantification of the benefits provided by trees. These types of data are increasingly used to secure funding and justify large-scale planting programs in urban areas. Comparisons of tree canopy measurement methods have been conducted before, but a rapidly evolving set...
Using airborne laser altimetry to determine fuel models for estimating fire behavior
Carl A. Seielstad; Lloyd P. Queen
2003-01-01
Airborne laser altimetry provides an unprecedented view of the forest floor in timber fuel types and is a promising new tool for fuels assessments. It can be used to resolve two fuel models under closed canopies and may be effective for estimating coarse woody debris loads. A simple metric - obstacle density - provides the necessary quantification of fuel bed roughness...
Matthew B. Russell; Anthony W. D' Amato; Bethany K. Schulz; Christopher W. Woodall; Grant M. Domke; John B. Bradford
2014-01-01
The contribution of understorey vegetation (UVEG) to forest ecosystem biomass and carbon (C) across diverse forest types has, to date, eluded quantification at regional and national scales. Efforts to quantify UVEG C have been limited to field-intensive studies or broad-scale modelling approaches lacking field measurements. Although large-scale inventories of UVEG C are...
Artimovich, Elena; Jackson, Russell K; Kilander, Michaela B C; Lin, Yu-Chih; Nestor, Michael W
2017-10-16
Intracellular calcium is an important ion involved in the regulation and modulation of many neuronal functions. From regulating cell cycle and proliferation to initiating signaling cascades and regulating presynaptic neurotransmitter release, the concentration and timing of calcium activity governs the function and fate of neurons. Changes in calcium transients can be used in high-throughput screening applications as a basic measure of neuronal maturity, especially in developing or immature neuronal cultures derived from stem cells. Here we describe PeakCaller, a novel MATLAB script and graphical user interface for the quantification of intracellular calcium transients in neuronal cultures. PeakCaller allows the user to set peak parameters and smoothing algorithms to best fit their data set. Using human induced pluripotent stem cell derived neurons and dissociated mouse cortical neurons combined with the calcium indicator Fluo-4, we demonstrate that PeakCaller reduces type I and type II error in automated peak calling when compared to the oft-used PeakFinder algorithm under both basal and pharmacologically induced conditions. This new analysis script will allow for automation of calcium measurements and is a powerful software tool for researchers interested in high-throughput measurements of intracellular calcium.
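PeakCaller is a MATLAB script with a graphical interface; as a rough stand-in that conveys the same kind of user-tunable transient calling, a minimal SciPy sketch is given below. The smoothing choice, prominence threshold and synthetic trace are assumptions for illustration, not PeakCaller's algorithm.

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def call_transients(trace, fs=10.0, smooth_win=11, prominence=0.2):
    """Minimal calcium-transient caller (illustrative only): smooth a dF/F
    trace, then report peak times, amplitudes and prominences.  `fs` is the
    sampling rate in Hz; thresholds are user-tunable, mirroring the idea of
    letting the analyst set peak parameters per data set."""
    smooth = savgol_filter(trace, smooth_win, polyorder=3)
    peaks, props = find_peaks(smooth, prominence=prominence)
    return {"times_s": peaks / fs,
            "amplitudes": smooth[peaks],
            "prominences": props["prominences"]}

# Example on a synthetic trace with two decaying transients plus noise.
t = np.arange(0, 30, 0.1)
trace = 0.05 * np.random.randn(t.size)
for t0 in (8.0, 19.0):
    trace += np.exp(-(t - t0) / 2.0) * (t >= t0)
events = call_transients(trace, fs=10.0)
```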
Chi, Liandi; Chen, Lingxiao; Zhang, Jiwen; Zhao, Jing; Li, Shaoping; Zheng, Ying
2018-07-15
Inulin-type fructooligosaccharides (FOS) purified from Morinda Officinalis, with degrees of polymerization (DP) from 3 to 9, have been approved in China as an oral prescription drug for mild and moderate depressive episodes, while the stability and oral absorption of this FOS mixture are largely unknown. As the main active component and quality control marker of the above FOS, DP5 was selected as the representative FOS in this study. A desalting method based on ion exchange resin was developed to treat the biological samples, followed by separation and quantification by high-performance liquid chromatography with a charged aerosol detector. Results showed that DP5 was hydrolyzed stepwise in simulated gastric fluid and by gut microbiota, while remaining stable in intestinal fluid. DP5 has poor permeability across the Caco-2 monolayer, with a Papp of 5.22 × 10-7 cm/s, and very poor oral absorption, with a bioavailability of (0.50 ± 0.12)% in rats. In conclusion, FOS in Morinda Officinalis demonstrated poor chemical stability in simulated gastric fluid and human gut microbiota, and low oral absorption in rats. Copyright © 2018 Elsevier B.V. All rights reserved.
Lopez-Gazpio, Josu; Garcia-Arrona, Rosa; Millán, Esmeralda
2015-04-01
In this work, a simple and reliable micellar electrokinetic chromatography method for the separation and quantification of 14 preservatives, including isothiazolinones, and two benzophenone-type UV filters in household, cosmetic and personal care products was developed. The selected priority compounds are widely used as ingredients in many personal care products, and are included in the European Regulation concerning cosmetic products. The electrophoretic separation parameters were optimized by means of a modified chromatographic response function in combination with an experimental design, namely a central composite design. After optimization of experimental conditions, the BGE selected for the separation of the targets consisted of 60 mM SDS, 18 mM sodium tetraborate, pH 9.4 and 10% v/v methanol. The MEKC method was validated in terms of linearity, limits of detection and quantification, repeatability, intermediate precision, and accuracy, providing appropriate values (i.e. R(2) ≥ 0.992, repeatability RSD values <9%, and accuracy 90-115%). Applicability of the validated method was successfully assessed by quantifying preservatives and UV filters in commercial consumer products. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting
2017-01-01
Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. PMID:28235782
Tryptophan and kynurenine determination in human hair by liquid chromatography.
Dario, Michelli F; Freire, Thamires Batello; Pinto, Claudinéia Aparecida Sales de Oliveira; Prado, María Segunda Aurora; Baby, André R; Velasco, Maria Valéria R
2017-10-15
Tryptophan, an amino acid found in the proteinaceous structure of hair, is used as a marker of hair photodegradation. Also, protein loss caused by several chemical/physical treatments can be inferred by tryptophan quantification. Kynurenine is a photo-oxidation product of tryptophan, expected to be detected when hair is exposed mainly to the UVB (290-320 nm) radiation range. Tryptophan from hair is usually quantified directly as a solid or after alkaline hydrolysis, spectrofluorimetrically. However, these types of measurement are not sufficiently specific and are subject to several interfering substances. Thus, this work aimed to propose a quantification method for both tryptophan and kynurenine in hair samples, after an alkaline hydrolysis process, by using high-performance liquid chromatography (HPLC) with fluorimetric and UV detection. The tryptophan and kynurenine quantification method was developed and validated. Black, white, bleached and dyed (blond and auburn) hair tresses were used in this study. Tryptophan and kynurenine were separated within ∼9 min by HPLC. Both black and white virgin hair samples presented similar concentrations of tryptophan, while bleaching, as well as the dyeing process, caused a reduction in the tryptophan content. Unexpectedly, UV/vis radiation did not significantly promote the conversion of tryptophan into its photo-oxidation product, and consequently kynurenine was not detected. Thus, this work presents an acceptable method for quantification of tryptophan and its photooxidation metabolite kynurenine in hair samples. Also, the results indicated that bleaching and dyeing processes promoted protein/amino acid loss, but tryptophan is not extensively degraded in human hair by solar radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics
Poeschl, Yvonne; Plötner, Romina
2017-01-01
Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. PMID:28931626
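PaCeQuant extracts 27 shape features within ImageJ; the Python sketch below computes only a few comparable global descriptors from a labeled segmentation using scikit-image, as a hypothetical illustration rather than a re-implementation of the tool.

```python
import numpy as np
from skimage.measure import label, regionprops

def cell_shape_features(mask):
    """Compute a few global shape descriptors per segmented cell.  `mask` is
    a binary or labeled segmentation image; solidity and circularity drop as
    interlocking lobes make the cell contour more complex."""
    labeled = label(mask) if mask.max() <= 1 else mask
    feats = []
    for r in regionprops(labeled):
        circularity = 4.0 * np.pi * r.area / (r.perimeter ** 2 + 1e-12)
        feats.append({
            "label": r.label,
            "area": r.area,
            "perimeter": r.perimeter,
            "solidity": r.solidity,       # area / convex hull area
            "circularity": circularity,   # 1 for a perfect disc
        })
    return feats
```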
Gibby, Jacob T; Njeru, Dennis K; Cvetko, Steve T; Heiny, Eric L; Creer, Andrew R; Gibby, Wendell A
We correlate and evaluate the accuracy of accepted anthropometric methods of percent body fat (%BF) quantification, namely hydrostatic weighing (HW) and air displacement plethysmography (ADP), against 2 automatic adipose tissue quantification methods using computed tomography (CT). Twenty volunteer subjects (14 men, 6 women) received head-to-toe CT scans. Hydrostatic weighing and ADP were obtained from 17 and 12 subjects, respectively. The CT data underwent conversion using 2 separate algorithms, namely, the Schneider method and the Beam method, to convert Hounsfield units to their respective tissue densities. The overall mass and %BF of both methods were compared with HW and ADP. When comparing ADP to CT data using the Schneider method and Beam method, correlations were r = 0.9806 and 0.9804, respectively. Paired t tests indicated there were no statistically significant biases. Additionally, observed average differences in %BF between ADP and the Schneider method and the Beam method were 0.38% and 0.77%, respectively. The %BF measured from ADP, the Schneider method, and the Beam method all had significantly higher mean differences when compared with HW (3.05%, 2.32%, and 1.94%, respectively). We have shown that total body mass correlates remarkably well with both the Schneider method and Beam method of mass quantification. Furthermore, %BF calculated with the Schneider method and Beam method CT algorithms correlates remarkably well with ADP. The application of these CT algorithms has utility in further research to accurately stratify risk factors with periorgan, visceral, and subcutaneous types of adipose tissue, and has the potential for significant clinical application.
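The Schneider and Beam methods convert Hounsfield units to tissue densities before computing %BF; the sketch below is a much cruder stand-in that merely counts voxels inside a conventional adipose HU window. The thresholds and the volume-fraction definition of %BF are assumptions for illustration only.

```python
import numpy as np

def percent_fat_from_ct(hu_volume, voxel_volume_ml,
                        fat_hu=(-190, -30), body_hu_min=-500):
    """Very simplified CT body-composition estimate: voxels inside an assumed
    adipose HU window count as fat, everything above a crude air cutoff counts
    as body.  Returns total fat volume (mL) and fat fraction by volume (%)."""
    hu = np.asarray(hu_volume, dtype=float)
    body = hu > body_hu_min
    fat = (hu >= fat_hu[0]) & (hu <= fat_hu[1])
    fat_volume_ml = fat.sum() * voxel_volume_ml
    percent_fat_by_volume = 100.0 * fat.sum() / max(int(body.sum()), 1)
    return fat_volume_ml, percent_fat_by_volume
```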
Vadas, P A; Good, L W; Moore, P A; Widman, N
2009-01-01
Nonpoint-source pollution of fresh waters by P is a concern because it contributes to accelerated eutrophication. Given the state of the science concerning agricultural P transport, a simple tool to quantify annual, field-scale P loss is a realistic goal. We developed new methods to predict annual dissolved P loss in runoff from surface-applied manures and fertilizers and validated the methods with data from 21 published field studies. We incorporated these manure and fertilizer P runoff loss methods into an annual, field-scale P loss quantification tool that estimates dissolved and particulate P loss in runoff from soil, manure, fertilizer, and eroded sediment. We validated the P loss tool using independent data from 28 studies that monitored P loss in runoff from a variety of agricultural land uses for at least 1 yr. Results demonstrated (i) that our new methods to estimate P loss from surface manure and fertilizer are an improvement over methods used in existing Indexes, and (ii) that it was possible to reliably quantify annual dissolved, sediment, and total P loss in runoff using relatively simple methods and readily available inputs. Thus, a P loss quantification tool that does not require greater degrees of complexity or input data than existing P Indexes could accurately predict P loss across a variety of management and fertilization practices, soil types, climates, and geographic locations. However, estimates of runoff and erosion are still needed that are accurate to a level appropriate for the intended use of the quantification tool.
Cryar, Adam; Pritchard, Caroline; Burkitt, William; Walker, Michael; O'Connor, Gavin; Burns, Duncan Thorburn; Quaglia, Milena
2013-01-01
Current routine food allergen quantification methods, which are based on immunochemistry, offer high sensitivity but can suffer from issues of specificity and significant variability of results. MS approaches have been developed, but currently lack metrological traceability. A feasibility study on the application of metrologically traceable MS-based reference procedures was undertaken. A proof of concept involving proteolytic digestion and isotope dilution MS for quantification of protein allergens in a food matrix was undertaken using lysozyme in wine as a model system. A concentration of lysozyme in wine of 0.95 ± 0.03 µg/g was calculated based on the concentrations of two peptides, confirming that this type of analysis is viable at allergenically meaningful concentrations. The challenges associated with this promising method were explored; these included peptide stability, chemical modification, enzymatic digestion, and sample cleanup. The method is suitable for the production of allergen in food certified reference materials, which together with the achieved understanding of the effects of sample preparation and of the matrix on the final results, will assist in addressing the bias of the techniques routinely used and improve measurement confidence. Confirmation of the feasibility of MS methods for absolute quantification of an allergenic protein in a food matrix with results traceable to the International System of Units is a step towards meaningful comparison of results for allergen proteins among laboratories. This approach will also underpin risk assessment and risk management of allergens in the food industry, and regulatory compliance of the use of thresholds or action levels when adopted.
Towards deconstruction of the Type D (2,0) theory
NASA Astrophysics Data System (ADS)
Bourget, Antoine; Pini, Alessandro; Rodriguez-Gomez, Diego
2017-12-01
We propose a four-dimensional supersymmetric theory that deconstructs, in a particular limit, the six-dimensional (2, 0) theory of type D_k. This 4d theory is defined by a necklace quiver with alternating gauge nodes O(2k) and Sp(k). We test this proposal by comparing the 6d half-BPS index to the Higgs branch Hilbert series of the 4d theory. In the process, we overcome several technical difficulties, such as Hilbert series calculations for non-complete intersections, and the choice of O versus SO gauge groups. Consistently, the result matches the Coulomb branch formula for the mirror theory upon reduction to 3d.
Chaos of radiative heat-loss-induced flame front instability.
Kinugawa, Hikaru; Ueda, Kazuhiro; Gotoda, Hiroshi
2016-03-01
We are intensively studying the chaos via the period-doubling bifurcation cascade in radiative heat-loss-induced flame front instability by analytical methods based on dynamical systems theory and complex networks. Significant changes in flame front dynamics in the chaotic region, which cannot be seen in the bifurcation diagrams, were successfully extracted from recurrence quantification analysis and nonlinear forecasting and from the network entropy. The temporal dynamics of the fuel concentration in the well-developed chaotic region is much more complicated than that of the flame front temperature. It exhibits self-affinity as a result of the scale-free structure in the constructed visibility graph.
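Recurrence quantification analysis of a scalar time series, as used in the study above, can be sketched as follows: delay-embed the signal, threshold the pairwise distance matrix, and compute the recurrence rate and determinism. The embedding dimension, delay, radius and minimum line length below are placeholder values, not those of the study.

```python
import numpy as np

def recurrence_quantification(x, dim=3, delay=1, radius=0.1, lmin=2):
    """Basic RQA: delay-embed x, build the recurrence matrix, and return the
    recurrence rate and determinism (fraction of recurrent points lying on
    diagonal lines of length >= lmin, main diagonal excluded)."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    R = dist <= radius * dist.max()
    recurrence_rate = R.mean()
    det_points = total_points = 0
    for k in range(1, n):
        diag = np.diagonal(R, offset=k)
        # run-length encode the boolean diagonal to find line lengths
        runs = np.diff(np.flatnonzero(np.diff(np.r_[0, diag, 0])))[::2]
        total_points += diag.sum()
        det_points += runs[runs >= lmin].sum()
    determinism = det_points / max(total_points, 1)
    return recurrence_rate, determinism

# Example: a noisy sine is far more deterministic than white noise.
t = np.linspace(0, 20 * np.pi, 600)
print(recurrence_quantification(np.sin(t) + 0.05 * np.random.randn(t.size)))
print(recurrence_quantification(np.random.randn(600)))
```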
GPS free navigation inspired by insects through monocular camera and inertial sensors
NASA Astrophysics Data System (ADS)
Liu, Yi; Liu, J. G.; Cao, H.; Huang, Y.
2015-12-01
Navigation without GPS or other knowledge of the environment has been studied for many decades. Advances in technology have made sensors compact and subtle enough to be easily integrated into micro and hand-held devices. Recently, researchers found that bees and fruit flies have an effective and efficient navigation mechanism based on optical flow information, processed only with their miniature brains. We present a navigation system inspired by the study of insects, using a calibrated camera and other inertial sensors. The system utilizes SLAM theory and can work in many GPS-denied environments. Simulation and experimental results are presented for validation and quantification.
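As a hedged illustration of the optical-flow cue that such insect-inspired navigation relies on, the sketch below computes dense Farneback flow between consecutive grayscale frames with OpenCV and summarizes it with a median motion vector. Fusing this cue with inertial measurements in a SLAM framework, as the described system does, is not shown; parameter values are generic defaults.

```python
import cv2
import numpy as np

def frame_to_frame_flow(prev_gray, curr_gray):
    """Dense optical flow between consecutive 8-bit grayscale frames, used
    here as a crude insect-style visual odometry cue.  The median flow vector
    is a rough proxy for lateral image motion."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    median_motion = np.median(flow.reshape(-1, 2), axis=0)   # (dx, dy) in pixels
    return flow, median_motion

# Synthetic usage: a bright square shifted 3 pixels to the right.
prev = np.zeros((64, 64), np.uint8); prev[20:30, 20:30] = 255
curr = np.roll(prev, 3, axis=1)
_, motion = frame_to_frame_flow(prev, curr)
```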
Inhibitory Competition between Shape Properties in Figure-Ground Perception
ERIC Educational Resources Information Center
Peterson, Mary A.; Skow, Emily
2008-01-01
Theories of figure-ground perception entail inhibitory competition between either low-level units (edge or feature units) or high-level shape properties. Extant computational models instantiate the 1st type of theory. The authors investigated a prediction of the 2nd type of theory: that shape properties suggested on the ground side of an edge are…
Psychological Type and the Matching of Cognitive Styles.
ERIC Educational Resources Information Center
Bargar, Robert R.; Hoover, Randy L.
1984-01-01
Carl Jung's theory of psychological type is explored and related to education in this article. A model of the interaction between teacher, student, subject matter, and instructional alternatives is examined and the educational implications are discussed. This theory is used to illustrate how psychological-type influences teaching and learning…
A short course on measure and probability theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre
2004-02-01
This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. WIENER'S polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.
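As a toy illustration of the polynomial chaos idea the seminars build towards, the sketch below expands a function of a standard normal variable in probabilists' Hermite polynomials by least squares on Monte Carlo samples and reads off the mean and variance from the coefficients. Production UQ codes use Galerkin projection or quadrature; the order, sample size and function names here are assumptions.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def pce_coefficients(f, order=6, n_samples=20000, seed=0):
    """Expand f(X), X ~ N(0,1), in probabilists' Hermite polynomials He_k by
    least squares on Monte Carlo samples; returns the coefficients c_k of
    sum_k c_k He_k(x)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_samples)
    return He.hermefit(x, f(x), order)

# Mean and variance of exp(X) recovered approximately from the truncated
# expansion: E[f] ~ c_0 and Var[f] ~ sum_{k>=1} c_k^2 * k!  (E[He_k^2] = k!).
c = pce_coefficients(np.exp)
norms = np.array([math.factorial(k) for k in range(len(c))])
print("mean ~", c[0], "variance ~", float(np.sum(c[1:] ** 2 * norms[1:])))
```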
Universality of Schmidt decomposition and particle identity
NASA Astrophysics Data System (ADS)
Sciara, Stefania; Lo Franco, Rosario; Compagno, Giuseppe
2017-03-01
Schmidt decomposition is a widely employed tool of quantum theory which plays a key role for distinguishable particles in scenarios such as entanglement characterization, theory of measurement and state purification. Yet, its formulation for identical particles remains controversial, jeopardizing its application to analyze general many-body quantum systems. Here we prove, using a newly developed approach, a universal Schmidt decomposition which allows faithful quantification of the physical entanglement due to the identity of particles. We find that it is affected by single-particle measurement localization and state overlap. We study paradigmatic two-particle systems where identical qubits and qutrits are located in the same place or in separated places. For the case of two qutrits in the same place, we show that their entanglement behavior, whose physical interpretation is given, differs from that obtained before by different methods. Our results are generalizable to multiparticle systems and open the way for further developments in quantum information processing exploiting particle identity as a resource.
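For distinguishable subsystems, the standard Schmidt decomposition amounts to a singular value decomposition of the reshaped state vector, as in the sketch below; the identical-particle generalization proved in the paper requires additional structure (symmetrization and single-particle measurement localization) that is not reproduced here.

```python
import numpy as np

def schmidt_decomposition(psi, dim_a, dim_b):
    """Schmidt decomposition of a bipartite pure state of *distinguishable*
    subsystems: reshape the state vector into a dim_a x dim_b matrix and take
    its SVD.  Returns the Schmidt coefficients and the entanglement entropy."""
    psi = np.asarray(psi, dtype=complex).reshape(dim_a, dim_b)
    _, s, _ = np.linalg.svd(psi)
    coeffs = s[s > 1e-12]
    entropy = -np.sum(coeffs**2 * np.log2(coeffs**2))
    return coeffs, entropy

# Bell state (|00> + |11>)/sqrt(2): two equal Schmidt coefficients, 1 ebit.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(schmidt_decomposition(bell, 2, 2))
```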
Extracting Cell Stiffness from Real-Time Deformability Cytometry: Theory and Experiment.
Mietke, Alexander; Otto, Oliver; Girardo, Salvatore; Rosendahl, Philipp; Taubenberger, Anna; Golfier, Stefan; Ulbricht, Elke; Aland, Sebastian; Guck, Jochen; Fischer-Friedrich, Elisabeth
2015-11-17
Cell stiffness is a sensitive indicator of physiological and pathological changes in cells, with many potential applications in biology and medicine. A new method, real-time deformability cytometry, probes cell stiffness at high throughput by exposing cells to a shear flow in a microfluidic channel, allowing for mechanical phenotyping based on single-cell deformability. However, observed deformations of cells in the channel not only are determined by cell stiffness, but also depend on cell size relative to channel size. Here, we disentangle mutual contributions of cell size and cell stiffness to cell deformation by a theoretical analysis in terms of hydrodynamics and linear elasticity theory. Performing real-time deformability cytometry experiments on both model spheres of known elasticity and biological cells, we demonstrate that our analytical model not only predicts deformed shapes inside the channel but also allows for quantification of cell mechanical parameters. Thereby, fast and quantitative mechanical sampling of large cell populations becomes feasible. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Extracting Cell Stiffness from Real-Time Deformability Cytometry: Theory and Experiment
Mietke, Alexander; Otto, Oliver; Girardo, Salvatore; Rosendahl, Philipp; Taubenberger, Anna; Golfier, Stefan; Ulbricht, Elke; Aland, Sebastian; Guck, Jochen; Fischer-Friedrich, Elisabeth
2015-01-01
Cell stiffness is a sensitive indicator of physiological and pathological changes in cells, with many potential applications in biology and medicine. A new method, real-time deformability cytometry, probes cell stiffness at high throughput by exposing cells to a shear flow in a microfluidic channel, allowing for mechanical phenotyping based on single-cell deformability. However, observed deformations of cells in the channel not only are determined by cell stiffness, but also depend on cell size relative to channel size. Here, we disentangle mutual contributions of cell size and cell stiffness to cell deformation by a theoretical analysis in terms of hydrodynamics and linear elasticity theory. Performing real-time deformability cytometry experiments on both model spheres of known elasticity and biological cells, we demonstrate that our analytical model not only predicts deformed shapes inside the channel but also allows for quantification of cell mechanical parameters. Thereby, fast and quantitative mechanical sampling of large cell populations becomes feasible. PMID:26588562
Universality of Schmidt decomposition and particle identity
Sciara, Stefania; Lo Franco, Rosario; Compagno, Giuseppe
2017-01-01
Schmidt decomposition is a widely employed tool of quantum theory which plays a key role for distinguishable particles in scenarios such as entanglement characterization, theory of measurement and state purification. Yet, its formulation for identical particles remains controversial, jeopardizing its application to analyze general many-body quantum systems. Here we prove, using a newly developed approach, a universal Schmidt decomposition which allows faithful quantification of the physical entanglement due to the identity of particles. We find that it is affected by single-particle measurement localization and state overlap. We study paradigmatic two-particle systems where identical qubits and qutrits are located in the same place or in separated places. For the case of two qutrits in the same place, we show that their entanglement behavior, whose physical interpretation is given, differs from that obtained before by different methods. Our results are generalizable to multiparticle systems and open the way for further developments in quantum information processing exploiting particle identity as a resource. PMID:28333163
Universality of Schmidt decomposition and particle identity.
Sciara, Stefania; Lo Franco, Rosario; Compagno, Giuseppe
2017-03-23
Schmidt decomposition is a widely employed tool of quantum theory which plays a key role for distinguishable particles in scenarios such as entanglement characterization, theory of measurement and state purification. Yet, its formulation for identical particles remains controversial, jeopardizing its application to analyze general many-body quantum systems. Here we prove, using a newly developed approach, a universal Schmidt decomposition which allows faithful quantification of the physical entanglement due to the identity of particles. We find that it is affected by single-particle measurement localization and state overlap. We study paradigmatic two-particle systems where identical qubits and qutrits are located in the same place or in separated places. For the case of two qutrits in the same place, we show that their entanglement behavior, whose physical interpretation is given, differs from that obtained before by different methods. Our results are generalizable to multiparticle systems and open the way for further developments in quantum information processing exploiting particle identity as a resource.
Chiral EFT based nuclear forces: achievements and challenges
NASA Astrophysics Data System (ADS)
Machleidt, R.; Sammarruca, F.
2016-08-01
During the past two decades, chiral effective field theory has become a popular tool to derive nuclear forces from first principles. Two-nucleon interactions have been worked out up to sixth order of chiral perturbation theory and three-nucleon forces up to fifth order. Applications of some of these forces have been conducted in nuclear few- and many-body systems—with a certain degree of success. But in spite of these achievements, we are still faced with great challenges. Among them is the issue of a proper uncertainty quantification of predictions obtained when applying these forces in ab initio calculations of nuclear structure and reactions. A related problem is the order by order convergence of the chiral expansion. We start this review with a pedagogical introduction and then present the current status of the field of chiral nuclear forces. This is followed by a discussion of representative examples for the application of chiral two- and three-body forces in the nuclear many-body system including convergence issues.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moschella, Ugo; INFN, Sezione di Milano
In this paper we will give a short introduction to the geometry of the de Sitter universe and describe an approach to de Sitter Quantum Field Theory based on the complexification of the de Sitter manifold. We will also present some recent results about the stability of de Sitterian 'particles' that have been obtained in a continuing collaboration with Jacques Bros and Henri Epstein. We will show that for particles with masses above a critical mass there is no such thing as particle stability, so that decays forbidden in flat space-time do occur in the de Sitter universe. The lifetime of such a particle also turns out to be independent of its velocity when that lifetime is comparable with the de Sitter radius. Particles with lower mass are even stranger. The masses of their decay products must obey quantification rules, and their lifetime is zero. These results have been obtained at the first order of the perturbative expansion of an interacting quantum field theory.
Size-dependent forced PEG partitioning into channels: VDAC, OmpC, and α-hemolysin
Aksoyoglu, M. Alphan; Podgornik, Rudolf; Bezrukov, Sergey M.; ...
2016-07-27
Nonideal polymer mixtures of PEGs of different molecular weights partition differently into nanosize protein channels. Here, we assess the validity of the recently proposed theoretical approach of forced partitioning for three structurally different beta-barrel channels: voltage-dependent anion channel from outer mitochondrial membrane VDAC, bacterial porin OmpC (outer membrane protein C), and bacterial channel-forming toxin alpha-hemolysin. Our interpretation is based on the idea that relatively less-penetrating polymers push the more easily penetrating ones into nanosize channels in excess of their bath concentration. Comparison of the theory with experiments is excellent for VDAC. Polymer partitioning data for the other two channels are consistent with theory if additional assumptions regarding the energy penalty of pore penetration are included. In conclusion, the obtained results demonstrate that the general concept of "polymers pushing polymers" is helpful in understanding and quantification of concrete examples of size-dependent forced partitioning of polymers into protein nanopores.
Size-dependent forced PEG partitioning into channels: VDAC, OmpC, and α-hemolysin
Aksoyoglu, M. Alphan; Podgornik, Rudolf; Bezrukov, Sergey M.; Gurnev, Philip A.; Muthukumar, Murugappan; Parsegian, V. Adrian
2016-01-01
Nonideal polymer mixtures of PEGs of different molecular weights partition differently into nanosize protein channels. Here, we assess the validity of the recently proposed theoretical approach of forced partitioning for three structurally different β-barrel channels: voltage-dependent anion channel from outer mitochondrial membrane VDAC, bacterial porin OmpC (outer membrane protein C), and bacterial channel-forming toxin α-hemolysin. Our interpretation is based on the idea that relatively less-penetrating polymers push the more easily penetrating ones into nanosize channels in excess of their bath concentration. Comparison of the theory with experiments is excellent for VDAC. Polymer partitioning data for the other two channels are consistent with theory if additional assumptions regarding the energy penalty of pore penetration are included. The obtained results demonstrate that the general concept of “polymers pushing polymers” is helpful in understanding and quantification of concrete examples of size-dependent forced partitioning of polymers into protein nanopores. PMID:27466408
Frobenius-norm-based measures of quantum coherence and asymmetry
Yao, Yao; Dong, G. H.; Xiao, Xing; Sun, C. P.
2016-01-01
We formulate the Frobenius-norm-based measures for quantum coherence and asymmetry respectively. In contrast to the resource theory of coherence and asymmetry, we construct a natural measure of quantum coherence inspired from optical coherence theory while the group theoretical approach is employed to quantify the asymmetry of quantum states. Besides their simple structures and explicit physical meanings, we observe that these quantities are intimately related to the purity (or linear entropy) of the corresponding quantum states. Remarkably, we demonstrate that the proposed coherence quantifier is not only a measure of mixedness, but also an intrinsic (basis-independent) quantification of quantum coherence contained in quantum states, which can also be viewed as a normalized version of Brukner-Zeilinger invariant information. In our context, the asymmetry of N-qubit quantum systems is considered under local independent and collective transformations. Intriguingly, it is illustrated that the collective effect has a significant impact on the asymmetry measure, and quantum correlation between subsystems plays a non-negligible role in this circumstance. PMID:27558009
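A common Frobenius-norm (Hilbert-Schmidt) coherence quantifier is simply the norm of the off-diagonal part of the density matrix in a fixed basis; the sketch below computes that quantity. This basis-dependent stand-in is only meant to show the norm-based construction, whereas the measure proposed in the paper is basis-independent and tied to purity.

```python
import numpy as np

def frobenius_coherence(rho):
    """Frobenius (Hilbert-Schmidt) norm of the off-diagonal part of a density
    matrix in the chosen basis: zero for incoherent (diagonal) states."""
    rho = np.asarray(rho, dtype=complex)
    off_diag = rho - np.diag(np.diag(rho))
    return np.linalg.norm(off_diag)          # Frobenius norm by default

plus = np.array([[0.5, 0.5], [0.5, 0.5]])    # |+><+| : maximal off-diagonals
mixed = np.eye(2) / 2                        # maximally mixed : zero coherence
print(frobenius_coherence(plus), frobenius_coherence(mixed))
```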
Quantification and characterization of grouped type I myofibers in human aging.
Kelly, Neil A; Hammond, Kelley G; Stec, Michael J; Bickel, C Scott; Windham, Samuel T; Tuggle, S Craig; Bamman, Marcas M
2018-01-01
Myofiber type grouping is a histological hallmark of age-related motor unit remodeling. Despite the accepted concept that denervation-reinnervation events lead to myofiber type grouping, the completeness of those conversions remains unknown. Type I myofiber grouping was assessed in vastus lateralis biopsies from Young (26 ± 4 years; n = 27) and Older (66 ± 4 years; n = 91) adults. Grouped and ungrouped type I myofibers were evaluated for phenotypic differences. Higher type I grouping in Older versus Young was driven by more myofibers per group (i.e., larger group size) (P < 0.05). In Older only, grouped type I myofibers displayed larger cross-sectional area, more myonuclei, lower capillary supply, and more sarco(endo)plasmic reticulum calcium ATPase I (SERCA I) expression (P < 0.05) than ungrouped type I myofibers. Grouped type I myofibers retain type II characteristics suggesting that conversion during denervation-reinnervation events is either progressive or incomplete. Muscle Nerve 57: E52-E59, 2018. © 2017 Wiley Periodicals, Inc.
R^4 couplings in M- and type II theories on Calabi-Yau spaces
NASA Astrophysics Data System (ADS)
Antoniadis, I.; Feffara, S.; Minasian, R.; Narain, K. S.
1997-02-01
We discuss several implications of R^4 couplings in M-theory when compactified on Calabi-Yau (CY) manifolds. In particular, these couplings can be predicted by supersymmetry from the mixed gauge-gravitational Chern-Simons couplings in five dimensions and are related to the one-loop holomorphic anomaly in four-dimensional N = 2 theories. We find a new contribution to the Einstein term in five dimensions proportional to the Euler number of the internal CY threefold, which corresponds to a one-loop correction of the hypermultiplet geometry. This correction is reproduced by a direct computation in type II string theories. Finally, we discuss a universal non-perturbative correction to the type IIB hyper-metric.
Wartime Construction Project Outcomes as a Function of Contract Type
2016-07-01
contract types has been well established. The theory of contractual incentives promulgated by Sherer (1964) established expected contractor...behaviors using a max- imization problem. The theory focuses on expected contractor behaviors in incentive contracts (cf. Federal Acquisition Regulation...Scherer, F. M. (1964). The theory of contractual incentives for cost reduction. Quarterly Journal of Economics, 78, 257–280. Tawazuh Commercial and
Stochastic Growth Theory of Type 3 Solar Radio Emission
NASA Technical Reports Server (NTRS)
Robinson, P. A.; Cairns, I. H.
1993-01-01
The recently developed stochastic growth theory of type 3 radio sources is extended to predict their electromagnetic volume emissivities and brightness temperatures. Predicted emissivities are consistent with spacecraft observations and independent theoretical constraints.
Gauge Theory on a Space with Linear Lie Type Fuzziness
NASA Astrophysics Data System (ADS)
Khorrami, Mohammad; Fatollahi, Amir H.; Shariati, Ahmad
2013-03-01
The U(1) gauge theory on a space with Lie type noncommutativity is constructed. The construction is based on the group of translations in Fourier space, which in contrast to space itself is commutative. In analogy with lattice gauge theory, the object playing the role of flux of field strength per plaquette, as well as the action, is constructed. It is observed that the theory, in comparison with ordinary U(1) gauge theory, has an extra gauge field component. This phenomenon is reminiscent of similar ones in the formulation of SU(N) gauge theory on spaces with canonical noncommutativity, and of the appearance of a gauge field component in the discrete direction of Connes' construction of the Standard Model.
Reference tissue quantification of DCE-MRI data without a contrast agent calibration
NASA Astrophysics Data System (ADS)
Walker-Samuel, Simon; Leach, Martin O.; Collins, David J.
2007-02-01
The quantification of dynamic contrast-enhanced (DCE) MRI data conventionally requires a conversion from signal intensity to contrast agent concentration by measuring a change in the tissue longitudinal relaxation rate, R1. In this paper, it is shown that the use of a spoiled gradient-echo acquisition sequence (optimized so that signal intensity scales linearly with contrast agent concentration) in conjunction with a reference tissue-derived vascular input function (VIF), avoids the need for the conversion to Gd-DTPA concentration. This study evaluates how to optimize such sequences and which dynamic time-series parameters are most suitable for this type of analysis. It is shown that signal difference and relative enhancement provide useful alternatives when full contrast agent quantification cannot be achieved, but that pharmacokinetic parameters derived from both contain sources of error (such as those caused by differences between reference tissue and region of interest proton density and native T1 values). It is shown in a rectal cancer study that these sources of uncertainty are smaller when using signal difference, compared with relative enhancement (15 ± 4% compared with 33 ± 4%). Both of these uncertainties are of the order of those associated with the conversion to Gd-DTPA concentration, according to literature estimates.
Colletes, T C; Garcia, P T; Campanha, R B; Abdelnur, P V; Romão, W; Coltro, W K T; Vaz, B G
2016-03-07
The analytical performance for paper spray (PS) using a new insert sample approach based on paper with paraffin barriers (PS-PB) is presented. The paraffin barrier is made using a simple, fast and cheap method based on the stamping of paraffin onto a paper surface. Typical operation conditions of paper spray such as the solvent volume applied on the paper surface, and the paper substrate type are evaluated. A paper substrate with paraffin barriers shows better performance on analysis of a range of typical analytes when compared to the conventional PS-MS using normal paper (PS-NP) and PS-MS using paper with two rounded corners (PS-RC). PS-PB was applied to detect sugars and their inhibitors in sugarcane bagasse liquors from a second generation ethanol process. Moreover, the PS-PB proved to be excellent, showing results for the quantification of glucose in hydrolysis liquors with excellent linearity (R(2) = 0.99), limits of detection (2.77 mmol L(-1)) and quantification (9.27 mmol L(-1)). The results are better than for PS-NP and PS-RC. The PS-PB was also excellent in performance when compared with the HPLC-UV method for glucose quantification on hydrolysis of liquor samples.
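Figures of merit like the linearity, LOD and LOQ reported above are conventionally derived from a least-squares calibration line; a generic sketch with synthetic numbers follows. The 3.3·s/slope and 10·s/slope conventions are assumptions about the calculation, not details taken from the paper.

```python
import numpy as np

def calibration_figures(conc, signal):
    """Least-squares calibration line with the usual LOD/LOQ estimates
    (3.3 and 10 times the residual standard deviation over the slope)."""
    conc, signal = np.asarray(conc, float), np.asarray(signal, float)
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (slope * conc + intercept)
    s_res = np.sqrt(np.sum(resid**2) / (len(conc) - 2))
    r2 = 1 - np.sum(resid**2) / np.sum((signal - signal.mean())**2)
    return {"slope": slope, "r2": r2,
            "LOD": 3.3 * s_res / slope, "LOQ": 10 * s_res / slope}

# Example with a synthetic calibration series (concentration vs. signal).
print(calibration_figures([5, 10, 20, 40, 80], [102, 198, 405, 795, 1620]))
```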
Rodrigues, É O; Morais, F F C; Morais, N A O S; Conci, L S; Neto, L V; Conci, A
2016-01-01
Fat deposits surrounding the heart are correlated with several health risk factors such as atherosclerosis, carotid stiffness, coronary artery calcification, atrial fibrillation and many others. These deposits vary independently of obesity, which reinforces the case for their direct segmentation and subsequent quantification. However, manual segmentation of these fats has not been widely deployed in clinical practice due to the required human workload and the consequent high cost of physicians' and technicians' time. In this work, we propose a unified method for the autonomous segmentation and quantification of two types of cardiac fat. The segmented fats are termed epicardial and mediastinal, and are separated from each other by the pericardium. Much effort was devoted to achieving minimal user intervention. The proposed methodology mainly comprises registration and classification algorithms to perform the desired segmentation. We compare the performance of several classification algorithms on this task, including neural networks, probabilistic models and decision tree algorithms. Experimental results of the proposed methodology have shown that the mean accuracy regarding both epicardial and mediastinal fats is 98.5% (99.5% if the features are normalized), with a mean true positive rate of 98.0%. On average, the Dice similarity index was equal to 97.6%. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures
NASA Technical Reports Server (NTRS)
Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo
2014-01-01
This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high frequency guided wave interrogation. The proposed methodologies can be considered as first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to the considered type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. Simulation data is utilized to enhance the information provided by instantaneous and local wavenumbers and mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through the dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.
Yarita, Takashi; Aoyagi, Yoshie; Otake, Takamitsu
2015-05-29
The impact of the matrix effect in GC-MS quantification of pesticides in food using the corresponding isotope-labeled internal standards was evaluated. A spike-and-recovery study of nine target pesticides was first conducted using paste samples of corn, green soybean, carrot, and pumpkin. The observed analytical values using isotope-labeled internal standards were more accurate for most target pesticides than that obtained using the external calibration method, but were still biased from the spiked concentrations when a matrix-free calibration solution was used for calibration. The respective calibration curves for each target pesticide were also prepared using matrix-free calibration solutions and matrix-matched calibration solutions with blank soybean extract. The intensity ratio of the peaks of most target pesticides to that of the corresponding isotope-labeled internal standards was influenced by the presence of the matrix in the calibration solution; therefore, the observed slope varied. The ratio was also influenced by the type of injection method (splitless or on-column). These results indicated that matrix-matching of the calibration solution is required for very accurate quantification, even if isotope-labeled internal standards were used for calibration. Copyright © 2015 Elsevier B.V. All rights reserved.
Malucelli, Emil; Procopio, Alessandra; Fratini, Michela; Gianoncelli, Alessandra; Notargiacomo, Andrea; Merolle, Lucia; Sargenti, Azzurra; Castiglioni, Sara; Cappadone, Concettina; Farruggia, Giovanna; Lombardo, Marco; Lagomarsino, Stefano; Maier, Jeanette A; Iotti, Stefano
2018-01-01
The quantification of elemental concentration in cells is usually performed by analytical assays on large populations, missing peculiar but important rare cells. The present article compares elemental quantification in single cells and in cell populations for three different cell types, using a new approach for single-cell elemental analysis performed at sub-micrometer scale that combines X-ray fluorescence microscopy and atomic force microscopy. Attention is focused on the light element Mg, exploiting the opportunity to compare single-cell quantification with cell-population analysis carried out by a highly Mg-selective fluorescent chemosensor. The results show that the single cell analysis reveals the same Mg differences found in large populations of the different cell strains studied. However, in one of the cell strains, single cell analysis reveals two cells with an exceptionally high intracellular Mg content compared with the other cells of the same strain. The single cell analysis allows mapping Mg and other light elements in whole cells at sub-micrometer scale. A detailed intensity correlation analysis on the two cells with the highest Mg content reveals that Mg subcellular localization correlates with oxygen in a different fashion with respect to the other sister cells of the same strain. Graphical abstract: single-cell or large-population analysis, that is the question!
Synthesis of robust nonlinear autopilots using differential game theory
NASA Technical Reports Server (NTRS)
Menon, P. K. A.
1991-01-01
A synthesis technique for handling unmodeled disturbances in nonlinear control law synthesis was advanced using differential game theory. Two types of modeling inaccuracies can be included in the formulation. The first is a bias-type error, while the second is the scale-factor-type error in the control variables. The disturbances were assumed to satisfy an integral inequality constraint. Additionally, it was assumed that they act in such a way as to maximize a quadratic performance index. Expressions for optimal control and worst-case disturbance were then obtained using optimal control theory.
Maleki, Farzaneh; Hosseini Nodeh, Zahra; Rahnavard, Zahra; Arab, Masoume
2016-01-01
Since type 2 diabetes is the most common chronic disease among Iranian female adolescents, we applied the theory of planned behavior to examine the effect of training on the intention to adopt nutritional behaviors that prevent type 2 diabetes among female adolescents. In this experimental study, 200 girls aged 11-14 years from 8 schools in Tehran (100 in each of the intervention and control groups) were recruited by a two-stage cluster sampling method. For the intervention group, an educational program based on the theory of planned behavior was designed and delivered in 6 workshop sessions on preventing type 2 diabetes. The data were collected before and two months after the workshops using a valid and reliable (α=0.72 and r=0.80) author-made questionnaire based on Ajzen's TPB questionnaire manual. The data were analyzed using t-tests, chi-square tests and analysis of covariance. The findings indicate that the two groups were homogeneous with regard to demographic characteristics before the education, although the mean score of the theory components (attitudes, subjective norms, perceived behavioral control, and intention) was higher in the control group. All of the theory components increased significantly after the education in the intervention group (p=0.000). Training based on the theory of planned behavior enhances the intention to adhere to nutritional behaviors that prevent type 2 diabetes among the studied female adolescents.
Xenopoulos, Alex; Fadgen, Keith; Murphy, Jim; Skilton, St. John; Prentice, Holly; Stapels, Martha
2012-01-01
Assays for identification and quantification of host-cell proteins (HCPs) in biotherapeutic proteins over 5 orders of magnitude in concentration are presented. The HCP assays consist of two types: HCP identification using comprehensive online two-dimensional liquid chromatography coupled with high resolution mass spectrometry (2D-LC/MS), followed by high-throughput HCP quantification by liquid chromatography, multiple reaction monitoring (LC-MRM). The former is described as a “discovery” assay, the latter as a “monitoring” assay. Purified biotherapeutic proteins (e.g., monoclonal antibodies) were digested with trypsin after reduction and alkylation, and the digests were fractionated using reversed-phase (RP) chromatography at high pH (pH 10) by a step gradient in the first dimension, followed by a high-resolution separation at low pH (pH 2.5) in the second dimension. As peptides eluted from the second dimension, a quadrupole time-of-flight mass spectrometer was used to detect the peptides and their fragments simultaneously by alternating the collision cell energy between a low and an elevated energy (MSE methodology). The MSE data was used to identify and quantify the proteins in the mixture using a proven label-free quantification technique (“Hi3” method). The same data set was mined to subsequently develop target peptides and transitions for monitoring the concentration of selected HCPs on a triple quadrupole mass spectrometer in a high-throughput manner (20 min LC-MRM analysis). This analytical methodology was applied to the identification and quantification of low-abundance HCPs in six samples of PTG1, a recombinant chimeric anti-phosphotyrosine monoclonal antibody (mAb). Thirty three HCPs were identified in total from the PTG1 samples among which 21 HCP isoforms were selected for MRM monitoring. The absolute quantification of three selected HCPs was undertaken on two different LC-MRM platforms after spiking isotopically labeled peptides in the samples. Finally, the MRM quantitation results were compared with TOF-based quantification based on the Hi3 peptides, and the TOF and MRM data sets correlated reasonably well. The results show that the assays provide detailed valuable information to understand the relative contributions of purification schemes to the nature and concentrations of HCP impurities in biopharmaceutical samples, and the assays can be used as generic methods for HCP analysis in the biopharmaceutical industry. PMID:22327428
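A minimal sketch of the 'Hi3' (top-three) label-free estimate mentioned above, assuming only Python's standard library; the peptide intensities and the 50 fmol spike level are hypothetical, and the real workflow operates on full search-engine output rather than hand-entered lists.

```python
from statistics import mean

def hi3_signal(peptide_intensities):
    """Mean intensity of the three most intense peptides of one protein."""
    top3 = sorted(peptide_intensities, reverse=True)[:3]
    return mean(top3)

def hi3_amount(protein_peptides, standard_peptides, standard_fmol):
    """Estimate protein amount relative to a spiked standard of known amount."""
    return standard_fmol * hi3_signal(protein_peptides) / hi3_signal(standard_peptides)

# Hypothetical intensities for one host-cell protein and a spiked standard protein
hcp = [5.2e4, 3.9e4, 3.1e4, 8.0e3]
std = [2.6e6, 2.1e6, 1.8e6, 9.5e5]
print(f"~{hi3_amount(hcp, std, standard_fmol=50.0):.2f} fmol on column")
```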
ERIC Educational Resources Information Center
Grossman, Ruth B.; Kegl, Judy
2006-01-01
American Sign Language uses the face to express vital components of grammar in addition to the more universal expressions of emotion. The study of ASL facial expressions has focused mostly on the perception and categorization of various expression types by signing and nonsigning subjects. Only a few studies of the production of ASL facial…
ERIC Educational Resources Information Center
Allen, Laura K.; Perret, Cecile; Likens, Aaron; McNamara, Danielle S.
2017-01-01
In this study, we investigated the degree to which the cognitive processes in which students engage during reading comprehension could be examined through dynamical analyses of their natural language responses to texts. High school students (n = 142) generated typed self-explanations while reading a science text. They then completed a…
Alba Argerich; Roy Haggerty; Eugènia Martí; Francesc Sabater; Jay Zarnetske
2011-01-01
Water transient storage zones are hotspots for metabolic activity in streams although the contribution of different types of transient storage zones to the whole-reach metabolic activity is difficult to quantify. In this study we present a method to measure the fraction of the transient storage that is metabolically active (MATS) in two consecutive reaches...
Zheng, Weijia; Park, Jin-A; Abd El-Aty, A M; Kim, Seong-Kwan; Cho, Sang-Hyun; Choi, Jeong-Min; Yi, Hee; Cho, Soo-Min; Ramadan, Amer; Jeong, Ji Hoon; Shim, Jae-Han; Shin, Ho-Chul
2018-01-01
Over the past few decades, honey products have been polluted by different contaminants, such as pesticides, which are widely applied in agriculture. In this work, a modified EN quick, easy, cheap, effective, rugged, and safe (QuEChERS) extraction method was developed for the simultaneous quantification of pesticide residues, including cymiazole, fipronil, coumaphos, fluvalinate, amitraz, and its metabolite 2,4-dimethylaniline (2,4-DMA), in four types of honey (acacia, wild, chestnut, and manuka) and royal jelly. Samples were buffered with 0.2 M dibasic sodium phosphate (pH 9), and subsequently, acetonitrile was employed as the extraction solvent. A combination of primary secondary amine (PSA) and C18 sorbents was used for purification prior to liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI+/MS-MS) analysis. The estimated linearity, measured at six concentration levels, presented good correlation coefficients (R²) ≥ 0.99. The recovery, calculated from three different spiking levels, was 62.06-108.79% in honey and 67.58-106.34% in royal jelly, with RSD < 12% for all the tested compounds. The matrix effect was also evaluated, and most of the analytes presented signal enhancement. The limits of quantification (LOQ) ranged between 0.001 and 0.005 mg/kg in the various samples. These are considerably lower than the maximum residue limits (MRL) set by various regulatory authorities. A total of 43 market (domestic and imported) samples were assayed for method application. Among the tested samples, three samples tested positive (i.e., detected and quantified) only for cymiazole residues. The residues in the rest of the samples were detected but not quantified. We concluded that the protocol developed in this work is simple and versatile for the routine quantification of cymiazole, 2,4-DMA, fipronil, coumaphos, amitraz, and fluvalinate in various types of honey and royal jelly. Copyright © 2017 Elsevier B.V. All rights reserved.
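A small sketch of the spike-recovery arithmetic behind figures such as the 62.06-108.79% recovery and RSD < 12% reported above, assuming NumPy; the replicate values and the 0.01 mg/kg spike level are hypothetical.

```python
import numpy as np

def recovery_and_rsd(measured, spiked):
    """Mean recovery (%) and relative standard deviation (%) for one spike level."""
    measured = np.asarray(measured, dtype=float)
    recoveries = 100.0 * measured / spiked
    rsd = 100.0 * measured.std(ddof=1) / measured.mean()
    return recoveries.mean(), rsd

# Hypothetical replicate results (mg/kg) for a pesticide spiked at 0.01 mg/kg in honey
mean_rec, rsd = recovery_and_rsd([0.0092, 0.0088, 0.0095, 0.0090, 0.0093], spiked=0.01)
print(f"recovery {mean_rec:.1f}%, RSD {rsd:.1f}%")
```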
Forment, Josep V.; Jackson, Stephen P.
2016-01-01
Protein accumulation on chromatin has traditionally been studied using immunofluorescence microscopy or biochemical cellular fractionation followed by western immunoblot analysis. As a way to improve the reproducibility of this kind of analysis, make it easier to quantify and allow a stream-lined application in high-throughput screens, we recently combined a classical immunofluorescence microscopy detection technique with flow cytometry1. In addition to the features described above, and by combining it with detection of both DNA content and DNA replication, this method allows unequivocal and direct assignment of cell-cycle distribution of protein association to chromatin without the need for cell culture synchronization. Furthermore, it is relatively quick (no more than a working day from sample collection to quantification), requires less starting material compared to standard biochemical fractionation methods and overcomes the need for flat, adherent cell types that are required for immunofluorescence microscopy. PMID:26226461
Ultrasensitive multiplex optical quantification of bacteria in large samples of biofluids
Pazos-Perez, Nicolas; Pazos, Elena; Catala, Carme; Mir-Simon, Bernat; Gómez-de Pedro, Sara; Sagales, Juan; Villanueva, Carlos; Vila, Jordi; Soriano, Alex; García de Abajo, F. Javier; Alvarez-Puebla, Ramon A.
2016-01-01
Efficient treatments in bacterial infections require the fast and accurate recognition of pathogens, with concentrations as low as one per milliliter in the case of septicemia. Detecting and quantifying bacteria in such low concentrations is challenging and typically demands cultures of large samples of blood (~1 milliliter) extending over 24–72 hours. This delay seriously compromises the health of patients. Here we demonstrate a fast microorganism optical detection system for the exhaustive identification and quantification of pathogens in volumes of biofluids with clinical relevance (~1 milliliter) in minutes. We drive each type of bacteria to accumulate antibody functionalized SERS-labelled silver nanoparticles. Particle aggregation on the bacteria membranes renders dense arrays of inter-particle gaps in which the Raman signal is exponentially amplified by several orders of magnitude relative to the dispersed particles. This enables a multiplex identification of the microorganisms through the molecule-specific spectral fingerprints. PMID:27364357
Comprehensive Design Reliability Activities for Aerospace Propulsion Systems
NASA Technical Reports Server (NTRS)
Christenson, R. L.; Whitley, M. R.; Knight, K. C.
2000-01-01
This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion system mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.
Umezawa, Keitaro; Kamiya, Mako; Urano, Yasuteru
2018-05-23
The chemical biology of reactive sulfur species, including hydropolysulfides, has been a subject undergoing intense study in recent years, but further understanding of their 'intact' function in living cells has been limited due to a lack of appropriate analytical tools. In order to overcome this limitation, we developed a new type of fluorescent probe which reversibly and selectively reacts to hydropolysulfides. The probe enables live-cell visualization and quantification of endogenous hydropolysulfides without interference from intrinsic thiol species such as glutathione. Additionally, real-time reversible monitoring of oxidative-stress-induced fluctuation of intrinsic hydropolysulfides has been achieved with a temporal resolution in the order of seconds, a result which has not yet been realized using conventional methods. These results reveal the probe's versatility as a new fluorescence imaging tool to understand the function of intracellular hydropolysulfides. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Lignin‐Derived Thioacidolysis Dimers: Reevaluation, New Products, Authentication, and Quantification
Yue, Fengxia; Regner, Matt; Sun, Runcang
2017-01-01
Abstract Lignin structural studies play an essential role both in understanding the development of plant cell walls and for valorizing lignocellulosics as renewable biomaterials. Dimeric products released by selectively cleaving β–aryl ether linkages between lignin units reflect the distribution of recalcitrant lignin units, but have been neither absolutely defined nor quantitatively determined. Here, 12 guaiacyl‐type thioacidolysis dimers were identified and quantified using newly synthesized standards. One product previously attributed to deriving from β–1‐coupled units was established as resulting from β–5 units, correcting an analytical quandary. Another longstanding dilemma, that no β–β dimers were recognized in thioacidolysis products from gymnosperms, was resolved with the discovery of two such authenticated compounds. Individual GC response factors for each standard compound allowed rigorous quantification of dimeric products released from softwood lignins, affording insight into the various interunit‐linkage distributions in lignins and thereby guiding the valorization of lignocellulosics. PMID:28125766
Residual transglutaminase in collagen - effects, detection, quantification, and removal.
Schloegl, W; Klein, A; Fürst, R; Leicht, U; Volkmer, E; Schieker, M; Jus, S; Guebitz, G M; Stachel, I; Meyer, M; Wiggenhorn, M; Friess, W
2012-02-01
In the present study, we developed an enzyme-linked immunosorbent assay (ELISA) for microbial transglutaminase (mTG) from Streptomyces mobaraensis to overcome the lack of a quantification method for mTG. We further performed a detailed follow-on analysis of insoluble porcine collagen type I enzymatically modified with mTG, primarily focusing on residuals of mTG. Repeated washing (4 ×) reduced mTG levels in the washing fluids but did not quantitatively remove mTG from the material (p < 0.000001). Substantial amounts of up to 40% of the enzyme utilized in the crosslinking mixture remained associated with the modified collagen. Binding was non-covalent, as could be demonstrated by Western blot analysis. Acidic and alkaline dialysis of mTG-treated collagen material enabled complete removal of the enzyme. Treatment with guanidinium chloride, urea, or sodium chloride was less effective in reducing the mTG content. Copyright © 2011 Elsevier B.V. All rights reserved.
Substitution effect on a hydroxylated chalcone: Conformational, topological and theoretical studies
NASA Astrophysics Data System (ADS)
Custodio, Jean M. F.; Vaz, Wesley F.; de Andrade, Fabiano M.; Camargo, Ademir J.; Oliveira, Guilherme R.; Napolitano, Hamilton B.
2017-05-01
The effect of substituents on two hydroxylated chalcones was studied in this work. The first chalcone, with a dimethylamine group (HY-DAC), and the second, with three methoxy groups (HY-TRI), were synthesized and crystallized from ethanol in the centrosymmetric space group P21/c. The geometric parameters and supramolecular arrangements of both structures, obtained from single-crystal X-ray diffraction data, were analyzed. The intermolecular interactions were investigated using Hirshfeld surfaces with their respective 2D plots for quantification of each type of contact. Additionally, the observed interactions were characterized by QTAIM analysis, and DFT calculations were applied to obtain theoretical vibrational spectra, the localization and quantification of the frontier orbitals, and the electrostatic potential map. The flatness of both structures was affected by the substituents, which led to different monoclinic crystalline packing. The calculated harmonic vibrational frequencies and HOMO-LUMO gaps confirmed the stability of the structures, while the intermolecular interactions were confirmed by the electrostatic potential map and QTAIM analysis.
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
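The loss-function step can be illustrated with generic forms of the two families named above; the sketch below (plain Python) shows a classical Taguchi quadratic loss and an inverted normal loss that saturates at a maximum loss value, with hypothetical set point, scaling constants, and process values, rather than the revised/modified variants used in the paper.

```python
import math

def taguchi_loss(y, target, k):
    """Classical Taguchi quadratic loss: grows without bound away from the target."""
    return k * (y - target) ** 2

def inverted_normal_loss(y, target, max_loss, gamma):
    """Inverted normal loss: approaches max_loss far from the target."""
    return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

# Hypothetical process variable: a reactor temperature with a 350 K set point
for temp in (350.0, 355.0, 370.0):
    print(temp,
          round(taguchi_loss(temp, 350.0, k=120.0), 1),
          round(inverted_normal_loss(temp, 350.0, max_loss=1.0e6, gamma=10.0), 1))
```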
NASA Technical Reports Server (NTRS)
Korram, S.
1977-01-01
The design of general remote sensing-aided methodologies was studied to provide estimates of several important inputs to water yield forecast models. These input parameters are snow area extent, snow water content, and evapotranspiration. The study area is the Feather River Watershed (780,000 hectares) in Northern California. The general approach involved a stepwise sequence of identification of the required information, sample design, measurement/estimation, and evaluation of results. All the relevant and available information types needed in the estimation process were defined. These include Landsat, meteorological satellite, and aircraft imagery; topographic and geologic data; ground truth data; and climatic data from ground stations. A cost-effective multistage sampling approach was employed in the quantification of all the required parameters. Physical and statistical models for both snow quantification and evapotranspiration estimation were developed. These models use information obtained from aerial and ground data through an appropriate statistical sampling design.
Villoria Sáez, Paola; del Río Merino, Mercedes; Porras-Amores, César
2012-02-01
The management planning of construction and demolition (C&D) waste currently relies on a single indicator, which does not provide sufficiently detailed information; therefore, other innovative and more precise indicators need to be determined and implemented. The aim of this research work is to improve existing C&D waste quantification tools for the construction of new residential buildings in Spain. For this purpose, several housing projects were studied to estimate the C&D waste generated during their construction. This paper determines the values of three indicators for estimating the generation of C&D waste in new residential buildings in Spain, itemized by type of waste and construction stage. The inclusion of two more accurate indicators, in addition to the global one commonly in use, provides a significant improvement in C&D waste quantification tools and management planning.
Zooming In on Plant Hormone Analysis: Tissue- and Cell-Specific Approaches.
Novák, Ondřej; Napier, Richard; Ljung, Karin
2017-04-28
Plant hormones are a group of naturally occurring, low-abundance organic compounds that influence physiological processes in plants. Our knowledge of the distribution profiles of phytohormones in plant organs, tissues, and cells is still incomplete, but advances in mass spectrometry have enabled significant progress in tissue- and cell-type-specific analyses of phytohormones over the last decade. Mass spectrometry is able to simultaneously identify and quantify hormones and their related substances. Biosensors, on the other hand, offer continuous monitoring; can visualize local distributions and real-time quantification; and, in the case of genetically encoded biosensors, are noninvasive. Thus, biosensors offer additional, complementary technologies for determining temporal and spatial changes in phytohormone concentrations. In this review, we focus on recent advances in mass spectrometry-based quantification, describe monitoring systems based on biosensors, and discuss validations of the various methods before looking ahead at future developments for both approaches.
Flavor structure in F-theory compactifications
NASA Astrophysics Data System (ADS)
Hayashi, Hirotaka; Kawano, Teruhiko; Tsuchiya, Yoichi; Watari, Taizan
2010-08-01
F-theory is one of the frameworks in string theory in which supersymmetric grand unification is accommodated and all the Yukawa couplings and Majorana masses of right-handed neutrinos are generated. Yukawa couplings of charged fermions are generated at codimension-3 singularities, and the contribution from a given singularity point is known to be approximately rank 1. Thus, the approximate rank of the Yukawa matrices in the low-energy effective theory of generic F-theory compactifications is the minimum of either the number of generations N_gen = 3 or the number of singularity points of certain types. If there were a geometry with only one E6-type point and one D6-type point over the entire 7-brane for SU(5) gauge fields, F-theory compactified on such a geometry would reproduce the approximately rank-1 Yukawa matrices of the real world. We found, however, that there is no such geometry. Thus, it is a problem how to generate hierarchical Yukawa eigenvalues in F-theory compactifications. A solution in the literature so far is to take an appropriate factorization limit. In this article, we propose an alternative solution to the hierarchical structure problem (which requires tuning some parameters) by studying how zero-mode wavefunctions depend on complex structure moduli. In this solution, the N_gen × N_gen CKM matrix is predicted to have only N_gen entries of order unity without extra tuning of parameters, and lepton flavor anarchy is predicted for the lepton mixing matrix. The hierarchy among the Yukawa eigenvalues of the down-type and charged-lepton sectors is predicted to be smaller than that of the up-type sector, and the Majorana masses of left-handed neutrinos generated through the see-saw mechanism have a small hierarchy. All of these predictions agree with what we observe in the real world. We also obtained a precise description of zero-mode wavefunctions near the E6-type singularity points, where the up-type Yukawa couplings are generated.
K-theoretic aspects of string theory dualities
NASA Astrophysics Data System (ADS)
Mendez-Diez, Stefan Milo
String theory is a physical field theory in which point particles are replaced by 1-manifolds propagating in time, called strings. The 2-manifold representing the time evolution of a string is called the string worldsheet. Strings can be either closed (meaning their worldsheets are closed surfaces) or open (meaning their worldsheets have boundary). A D-brane is a submanifold of the spacetime manifold on which string endpoints are constrained to lie. There are five different string theories that have supersymmetry, and they are all related by various dualities. This dissertation will review how D-branes are classified by K-theory. We will then explore the K-theoretic aspects of a hypothesized duality between the type I theory compactified on a 4-torus and the type IIA theory compactified on a K3 surface, by looking at a certain blow-down of the singular limit of K3. This dissertation concludes by classifying D-branes on the type II orientifold T^n/Z2 when the Z2 action is multiplication by -1 and the H-flux is trivial. We find that classifying D-branes on the singular limit of K3, T^4/Z2, by equivariant K-theory agrees with the classification of D-branes on a smooth K3 surface by ordinary K-theory.
[Triple-type theory of statistics and its application in the scientific research of biomedicine].
Hu, Liang-ping; Liu, Hui-gang
2005-07-20
The aim is to identify the crux of why so many people fail to grasp statistics and to put forward a "triple-type theory of statistics" that solves the problem in a creative way. Based on long experience in teaching and research in statistics, the triple-type theory was formulated and clarified. Examples are provided to show that the three types, i.e., the expressive type, the prototype and the standardized type, are essential for applying statistics rationally both in theory and in practice; moreover, several instances demonstrate that the three types are correlated with one another. The theory helps reveal the essence of problems when interpreting and analyzing experimental designs and statistical analyses in medical research. Investigation shows that for some questions the three types are mutually identical; for others the prototype is also the standardized type; and for still others the three types are distinct from one another. In some multifactor experimental studies no standardized type corresponding to the prototype exists at all, because researchers have committed the mistake of "incomplete control" when setting up experimental groups; this is a problem that should be solved by the concept and method of "division". Once the triple type for a given question is clarified, an appropriate experimental design and statistical method can be selected easily. The triple-type theory of statistics can help people avoid statistical mistakes, or at least dramatically decrease the rate of misuse, and improve the quality, level and speed of biomedical research in which statistics is applied. It can also help improve the quality of statistical textbooks and the effectiveness of statistics teaching, and it shows a way to advance biomedical statistics.
Minimal string theories and integrable hierarchies
NASA Astrophysics Data System (ADS)
Iyer, Ramakrishnan
Well-defined, non-perturbative formulations of the physics of string theories in specific minimal or superminimal model backgrounds can be obtained by solving matrix models in the double scaling limit. They provide us with the first examples of completely solvable string theories. Despite being relatively simple compared to higher dimensional critical string theories, they furnish non-perturbative descriptions of interesting physical phenomena such as geometrical transitions between D-branes and fluxes, tachyon condensation and holography. The physics of these theories in the minimal model backgrounds is succinctly encoded in a non-linear differential equation known as the string equation, along with an associated hierarchy of integrable partial differential equations (PDEs). The bosonic string in (2,2m-1) conformal minimal model backgrounds and the type 0A string in (2,4 m) superconformal minimal model backgrounds have the Korteweg-de Vries system, while type 0B in (2,4m) backgrounds has the Zakharov-Shabat system. The integrable PDE hierarchy governs flows between backgrounds with different m. In this thesis, we explore this interesting connection between minimal string theories and integrable hierarchies further. We uncover the remarkable role that an infinite hierarchy of non-linear differential equations plays in organizing and connecting certain minimal string theories non-perturbatively. We are able to embed the type 0A and 0B (A,A) minimal string theories into this single framework. The string theories arise as special limits of a rich system of equations underpinned by an integrable system known as the dispersive water wave hierarchy. We find that there are several other string-like limits of the system, and conjecture that some of them are type IIA and IIB (A,D) minimal string backgrounds. We explain how these and several other string-like special points arise and are connected. In some cases, the framework endows the theories with a non-perturbative definition for the first time. Notably, we discover that the Painleve IV equation plays a key role in organizing the string theory physics, joining its siblings, Painleve I and II, whose roles have previously been identified in this minimal string context. We then present evidence that the conjectured type II theories have smooth non-perturbative solutions, connecting two perturbative asymptotic regimes, in a 't Hooft limit. Our technique also demonstrates evidence for new minimal string theories that are not apparent in a perturbative analysis.
NASA Astrophysics Data System (ADS)
Tinti, Stefano; Armigliato, Alberto
2016-04-01
Seismic hazard and, more recently, tsunami hazard assessments have been undertaken in several countries of the world and globally for the whole planet with the aim of providing a scientifically sound basis to the engineers, technicians, urban and industrial planners, politicians, civil protection operators and in general to the authorities for devising rational risk mitigation strategies and corresponding adequate policies. The main point of this presentation is that the chief value of all seismic and tsunami hazard studies (including theory, concept, quantification and mapping) resides in the social and political values of the provided products, which is a standpoint entailing a number of relevant geoethical implications. The most relevant implication regards geoscientists, who are the subjects mainly involved in carrying out hazard evaluations. Viewed from the classical perspective, the main ethical obligations of geoscientists are restricted to performing hazard estimations in the best possible way from a scientific point of view, which means selecting the "best" available data, adopting sound theoretical models, making use of rigorous methods… What is outlined here is that this is an insufficient, minimalistic position, since it overlooks the basic socio-political and therefore practical value of the hazard-analysis final products. In other words, if one views hazard assessment as a production process leading from data and theories (raw data and production means) to hazard maps (products), the criterion to judge whether it is good or bad also needs to include the usability factor. Seismic and tsunami hazard reports and maps are products that should be usable, which means that they should meet user needs and requirements, and therefore they should be evaluated according to how clearly understandable to, and appropriate for, decision-making users they are. In the traditional view of a science serving society, one could represent the interaction process as a line connecting geoscientists and users, where geoscientists possess the knowledge (data, theory and models) and teach, while users get products and learn. The new geoethical perspective is that the line is replaced by a loop, where geoscientists and users interact cyclically: 1) theory and methods themselves are not determined a priori, but also result from geoscientist-user interactions, and 2) user needs can be modified ex-post in response to geoscientists' elaborations. These two-way feedback actions, opening also the path to close interdisciplinary approaches involving geo- and social sciences, are the main challenge for the present generation of geoscientists. Unfortunately, they are not properly and adequately reflected in today's university educational systems, nor in professional societies.
Scarr, Daniel; Lovblom, Leif E; Ostrovski, Ilia; Kelly, Dylan; Wu, Tong; Farooqi, Mohammed A; Halpern, Elise M; Ngo, Mylan; Ng, Eduardo; Orszag, Andrej; Bril, Vera; Perkins, Bruce A
2017-06-01
Quantification of corneal nerve fiber length (CNFL) by in vivo corneal confocal microscopy represents a promising diabetic neuropathy biomarker, but applicability is limited by resource-intensive image analysis. We aimed to evaluate, in cross-sectional analysis of non-diabetic controls and patients with type 1 and type 2 diabetes with and without neuropathy, the agreement between manual and automated analysis protocols. Sixty-eight controls, 139 type 1 diabetes, and 249 type 2 diabetes participants underwent CNFL measurement (N=456). Neuropathy status was determined by clinical and electrophysiological criteria. CNFL was determined by manual (CNFL_Manual, reference standard) and automated (CNFL_Auto) protocols, and results were compared for correlation and agreement using Spearman coefficients and the method of Bland and Altman (CNFL_Manual subtracted from CNFL_Auto). Participants demonstrated broad variability in clinical characteristics associated with neuropathy. The mean age, diabetes duration, and HbA1c were 53±18 years, 15.9±12.6 years, and 7.4±1.7%, respectively, and 218 (56%) individuals with diabetes had neuropathy. Mean CNFL_Manual was 15.1±4.9 mm/mm², and mean CNFL_Auto was 10.5±3.7 mm/mm² (CNFL_Auto underestimation bias, -4.6±2.6 mm/mm², corresponding to -29±17%). Percent bias was similar across non-diabetic controls (-33±12%), type 1 (-30±20%), and type 2 diabetes (-28±16%) subgroups (ANOVA, p=0.068), and similar in diabetes participants with and without neuropathy. Levels of CNFL_Auto and CNFL_Manual were both inversely associated with neuropathy status. Although CNFL_Auto substantially underestimated CNFL_Manual, its bias was non-differential between diverse patient groups and its relationship with neuropathy status was preserved. Determination of diagnostic thresholds specific to CNFL_Auto should be pursued in diagnostic studies of diabetic neuropathy. Copyright © 2016 Elsevier Inc. All rights reserved.
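A minimal sketch of the Bland-Altman agreement calculation used above (bias and 95% limits of agreement of CNFL_Auto minus CNFL_Manual), assuming NumPy; the paired values are hypothetical.

```python
import numpy as np

def bland_altman(manual, auto):
    """Bias and 95% limits of agreement, computed as auto minus manual."""
    manual, auto = np.asarray(manual, dtype=float), np.asarray(auto, dtype=float)
    diff = auto - manual
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired CNFL measurements (mm/mm^2) from a handful of participants
manual = [14.2, 17.8, 12.5, 19.0, 15.6]
auto   = [10.1, 12.3,  8.9, 13.5, 10.8]
bias, lo, hi = bland_altman(manual, auto)
print(f"bias {bias:.1f}, limits of agreement [{lo:.1f}, {hi:.1f}] mm/mm^2")
```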
Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.
Liu, Jason Yingjie
2014-11-01
The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Cheon, Jongpil; Grant, Michael
2012-01-01
This study proposes a new instrument to measure cognitive load types related to the user interface and demonstrates theoretical assumptions about different load types. In reconsidering established cognitive load theory, the inadequacies of the theory are criticized in terms of the adaptation of the learning efficiency score and the distinction of cognitive load…
Developmental Systems of Students' Personal Theories about Education
ERIC Educational Resources Information Center
Barger, Michael M.; Linnenbrink-Garcia, Lisa
2017-01-01
Children hold many personal theories about education: theories about themselves, knowledge, and the learning process. Personal theories help children predict what their actions will cause, and therefore relate to motivation, self-regulation, and achievement. Researchers typically examine how specific types of personal theories develop…
Nigou, J; Vercellone, A; Puzo, G
2000-06-23
Lipoarabinomannans are key molecules of the mycobacterial envelopes involved in many steps of tuberculosis immunopathogenesis. Several of the biological activities of lipoarabinomannans are mediated by their ability to bind human C-type lectins, such as the macrophage mannose receptor, the mannose-binding protein and the surfactant proteins A and D. The lipoarabinomannan mannooligosaccharide caps have been demonstrated to be involved in the binding to the lectin carbohydrate recognition domains. We report an original analytical approach, based on capillary electrophoresis monitored by laser-induced fluorescence, allowing the absolute quantification, in nanomole quantities of lipoarabinomannan, of the number of mannooligosaccharide units per lipoarabinomannan molecule. Moreover, this analytical approach was successful for the glycosidic linkage determination of the mannooligosaccharide motifs and has been applied to the comparative analysis of parietal and cellular lipoarabinomannans of Mycobacterium bovis BCG and Mycobacterium tuberculosis H37Rv, H37Ra and Erdman strains. Significant differences were observed in the amounts of the various mannooligosaccharide units between lipoarabinomannans of different strains and between parietal and cellular lipoarabinomannans of the same strain. Nevertheless, no relationship was found between the number of mannooligosaccharide caps and the virulence of the corresponding strain. The results of the present study should help us to gain more understanding of the molecular basis of lipoarabinomannan discrimination in the process of binding to C-type lectins. Copyright 2000 Academic Press.
FIB/SEM technology and Alzheimer's disease: three-dimensional analysis of human cortical synapses.
Blazquez-Llorca, Lidia; Merchán-Pérez, Ángel; Rodríguez, José-Rodrigo; Gascón, Jorge; DeFelipe, Javier
2013-01-01
The quantification and measurement of synapses is a major goal in the study of brain organization in both health and disease. Serial section electron microscopy (EM) is the ideal method since it permits the direct quantification of crucial features such as the number of synapses per unit volume or the distribution and size of synapses. However, a major limitation is that obtaining long series of ultrathin sections is extremely time-consuming and difficult. Consequently, quantitative EM studies are scarce and the most common method employed to estimate synaptic density in the human brain is indirect, by counting at the light microscopic level immunoreactive puncta using synaptic markers. The recent development of automatic EM methods in experimental animals, such as the combination of focused ion beam milling and scanning electron microscopy (FIB/SEM), is opening new avenues. Here we explored the utility of FIB/SEM to examine the cerebral cortex of Alzheimer's disease patients. We found that FIB/SEM is an excellent tool to study in detail the ultrastructure and alterations of the synaptic organization of the human brain. Using this technology, it is possible to reconstruct different types of plaques and the surrounding neuropil to find new aspects of the pathological process associated with the disease, namely: to count the exact number and types of synapses in different regions of the plaques, to study the spatial distribution of synapses, and to analyze the morphology and nature of the various types of dystrophic neurites and amyloid deposits.
Assessment of functional defecation disorders using anorectal manometry.
Seong, Moo-Kyung
2018-06-01
The aim was to evaluate the discriminating accuracy of anorectal manometry (ARM) between nonconstipated (NC) subjects and functionally constipated (FC) subjects, and between FC subjects with and without functional defecation disorder (FDD). Among female patients who visited the anorectal physiology unit, those who could be assigned to the following categories were included: the FC group with FDD (+FDD subgroup) or without FDD (-FDD subgroup), and the NC group. ARM was performed and interpreted not only with absolute pressure values, but also with pattern classification and quantification of pressure changes in the rectum and anus during attempted defecation. There were 76 subjects in the NC group and 75 in the FC group. Within the FC group, 63 subjects were in the -FDD subgroup and 12 in the +FDD subgroup. In the pattern classification of pressure changes, type 0, the 'normal' response, was only slightly more prevalent in the NC group than in the FC group. When all 'abnormal' types (types 1-5) were considered together as positive findings, the sensitivity and specificity of pattern classification in diagnosing FC among all subjects were 89.3% and 22.7%, respectively. The corresponding values for diagnosing FDD within the FC group were 91.7% and 11.1%. The manometric defecation index (MDI), as a quantification parameter, was significantly different between the -FDD and +FDD subgroups. Other conventional absolute pressures were mostly comparable between the groups. Among all ARM parameters, MDI was useful for diagnosing FDD in FC patients. Other parameters, including the pattern classification, were questionable in their ability to diagnose FDD.
Assessment of functional defecation disorders using anorectal manometry
2018-01-01
Purpose The aim was to evaluate the discriminating accuracy of anorectal manometry (ARM) between nonconstipated (NC) subjects and functionally constipated (FC) subjects, and between FC subjects with and without functional defecation disorder (FDD). Methods Among female patients who visited the anorectal physiology unit, those who could be assigned to the following categories were included: the FC group with FDD (+FDD subgroup) or without FDD (−FDD subgroup), and the NC group. ARM was performed and interpreted not only with absolute pressure values, but also with pattern classification and quantification of pressure changes in the rectum and anus during attempted defecation. Results There were 76 subjects in the NC group and 75 in the FC group. Within the FC group, 63 subjects were in the −FDD subgroup and 12 in the +FDD subgroup. In the pattern classification of pressure changes, type 0, the ‘normal’ response, was only slightly more prevalent in the NC group than in the FC group. When all ‘abnormal’ types (types 1–5) were considered together as positive findings, the sensitivity and specificity of pattern classification in diagnosing FC among all subjects were 89.3% and 22.7%, respectively. The corresponding values for diagnosing FDD within the FC group were 91.7% and 11.1%. The manometric defecation index (MDI), as a quantification parameter, was significantly different between the −FDD and +FDD subgroups. Other conventional absolute pressures were mostly comparable between the groups. Conclusion Among all ARM parameters, MDI was useful for diagnosing FDD in FC patients. Other parameters, including the pattern classification, were questionable in their ability to diagnose FDD. PMID:29854711
Quantifying circular RNA expression from RNA-seq data using model-based framework.
Li, Musheng; Xie, Xueying; Zhou, Jing; Sheng, Mengying; Yin, Xiaofeng; Ko, Eun-A; Zhou, Tong; Gu, Wanjun
2017-07-15
Circular RNAs (circRNAs) are a class of non-coding RNAs that are widely expressed in various cell lines and tissues of many organisms. Although the exact function of many circRNAs is largely unknown, the cell type- and tissue-specific expression of circRNAs implicates them in crucial functions in many biological processes. Hence, accurate quantification of circRNA expression from high-throughput RNA-seq data is becoming increasingly important. Although many model-based methods have been developed to quantify linear RNA expression from RNA-seq data, these methods are not applicable to circRNA quantification. Here, we proposed a novel strategy that transforms circular transcripts to pseudo-linear transcripts and estimates the expression values of both circular and linear transcripts using an existing model-based algorithm, Sailfish. The new strategy can accurately estimate the expression of both linear and circular transcripts from RNA-seq data. Several factors, such as gene length, amount of expression and the ratio of circular to linear transcripts, had an impact on the quantification performance for circular transcripts. In comparison to count-based tools, the new computational framework had superior performance in estimating the amount of circRNA expression from both simulated and real ribosomal RNA-depleted (rRNA-depleted) RNA-seq datasets. On the other hand, the consideration of circular transcripts in expression quantification from rRNA-depleted RNA-seq data substantially increased the accuracy of linear transcript expression estimates. Our proposed strategy was implemented in a program named Sailfish-cir. Sailfish-cir is freely available at https://github.com/zerodel/Sailfish-cir . tongz@medicine.nevada.edu or wanjun.gu@gmail.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
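A minimal Python sketch of the pseudo-linear idea: appending the first read-length-minus-one bases of a circular transcript to its end so that reads spanning the back-splice junction can align to a linear reference. The sequence and read length are hypothetical, and Sailfish-cir's actual reference construction may differ in detail.

```python
def to_pseudo_linear(circ_seq, read_len):
    """Turn a circular transcript into a pseudo-linear reference sequence.

    Reads that span the back-splice junction of the circle can then be
    aligned to the appended tail, which wraps around to the sequence start.
    """
    overhang = min(read_len - 1, len(circ_seq))
    return circ_seq + circ_seq[:overhang]

circ = "ACGTACGGTTCAGT" * 3          # hypothetical circular transcript
pseudo = to_pseudo_linear(circ, read_len=10)
print(len(circ), len(pseudo))        # pseudo reference is read_len - 1 bases longer
```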
Blanco-Elorrieta, Esti; Pylkkänen, Liina
2016-01-01
What is the neurobiological basis of our ability to create complex messages with language? Results from multiple methodologies have converged on a set of brain regions as relevant for this general process, but the computational details of these areas remain to be characterized. The left anterior temporal lobe (LATL) has been a consistent node within this network, with results suggesting that although it rather systematically shows increased activation for semantically complex structured stimuli, this effect does not extend to number phrases such as 'three books.' In the present work we used magnetoencephalography to investigate whether numbers in general are an invalid input to the combinatory operations housed in the LATL or whether the lack of LATL engagement for stimuli such as 'three books' is due to the quantificational nature of such phrases. As a relevant test case, we employed complex number terms such as 'twenty-three', where one number term is not a quantifier of the other but rather, the two terms form a type of complex concept. In a number naming paradigm, participants viewed rows of numbers and depending on task instruction, named them as complex number terms ('twenty-three'), numerical quantifications ('two threes'), adjectival modifications ('blue threes') or non-combinatory lists (e.g., 'two, three'). While quantificational phrases failed to engage the LATL as compared to non-combinatory controls, both complex number terms and adjectival modifications elicited a reliable activity increase in the LATL. Our results show that while the LATL does not participate in the enumeration of tokens within a set, exemplified by the quantificational phrases, it does support conceptual combination, including the composition of complex number concepts. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Apparent exchange rate for breast cancer characterization.
Lasič, Samo; Oredsson, Stina; Partridge, Savannah C; Saal, Lao H; Topgaard, Daniel; Nilsson, Markus; Bryskhe, Karin
2016-05-01
Although diffusion MRI has shown promise for the characterization of breast cancer, it has low specificity to malignant subtypes. Higher specificity might be achieved if the effects of cell morphology and molecular exchange across cell membranes could be disentangled. The quantification of exchange might thus allow the differentiation of different types of breast cancer cells. Based on differences in diffusion rates between the intra- and extracellular compartments, filter exchange spectroscopy/imaging (FEXSY/FEXI) provides non-invasive quantification of the apparent exchange rate (AXR) of water between the two compartments. To test the feasibility of FEXSY for the differentiation of different breast cancer cells, we performed experiments on several breast epithelial cell lines in vitro. Furthermore, we performed the first in vivo FEXI measurement of water exchange in human breast. In cell suspensions, pulsed gradient spin-echo experiments with large b values and variable pulse duration allow the characterization of the intracellular compartment, whereas FEXSY provides a quantification of AXR. These experiments are very sensitive to the physiological state of cells and can be used to establish reliable protocols for the culture and harvesting of cells. Our results suggest that different breast cancer subtypes can be distinguished on the basis of their AXR values in cell suspensions. Time-resolved measurements allow the monitoring of the physiological state of cells in suspensions over the time-scale of hours, and reveal an abrupt disintegration of the intracellular compartment. In vivo, exchange can be detected in a tumor, whereas, in normal tissue, the exchange rate is outside the range experimentally accessible for FEXI. At present, low signal-to-noise ratio and limited scan time allows the quantification of AXR only in a region of interest of relatively large tumors. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
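For orientation, the sketch below fits the standard FEXI-type recovery model ADC(t_m) = ADC_eq * (1 - sigma * exp(-AXR * t_m)) to hypothetical filtered-ADC values at several mixing times, assuming NumPy and SciPy; the data points and starting guesses are illustrative only and do not come from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def fexi_adc(t_m, adc_eq, sigma, axr):
    """Filtered ADC recovery versus mixing time in the FEXI/FEXSY model."""
    return adc_eq * (1.0 - sigma * np.exp(-axr * t_m))

# Hypothetical filtered-ADC measurements at several mixing times (s)
t_m = np.array([0.01, 0.05, 0.1, 0.2, 0.4, 0.8])
adc = np.array([0.62, 0.70, 0.78, 0.89, 0.97, 1.01])   # um^2/ms

(adc_eq, sigma, axr), _ = curve_fit(fexi_adc, t_m, adc, p0=(1.0, 0.4, 5.0))
print(f"AXR ~ {axr:.1f} 1/s, filter efficiency sigma ~ {sigma:.2f}")
```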
PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data*
Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh
2016-01-01
Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25–45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314
Chen, Yi-Ting; Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting
2017-05-01
Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
Genovese, S; Epifano, F; Carlucci, G; Marcotullio, M C; Curini, M; Locatelli, M
2010-10-10
Oxyprenylated natural products (isopentenyloxy-, geranyloxy- and the less widespread farnesyloxy-compounds and their biosynthetic derivatives) represent a family of secondary metabolites that were for years considered merely as biosynthetic intermediates of the most abundant C-prenylated derivatives. Many of the isolated oxyprenylated natural products have been shown to exert remarkable anti-cancer and anti-inflammatory effects in vitro and in vivo. 4'-Geranyloxyferulic acid [3-(4'-geranyloxy-3'-methoxyphenyl)-2-trans-propenoic acid] has been identified as a valuable chemopreventive agent against several types of cancer. After developing a high-yield and "eco-friendly" synthetic scheme for this secondary metabolite, starting from cheap and non-toxic reagents and substrates, we developed a new HPLC-DAD method for its quantification in grapefruit skin extract. A preliminary study on a C18 column showed the separation between GOFA and boropinic acid (which has the same core but an isopentenyloxy side chain), used as the internal standard. The tested columns were thermostated at 28±1 °C and the separation was achieved under gradient conditions at a flow rate of 1 mL/min with a starting mobile phase of H2O:methanol (40:60, v/v, 1% formic acid). The limit of detection (LOD, S/N=3) was 0.5 µg/mL and the limit of quantification (LOQ, S/N=10) was 1 µg/mL. Matrix-matched standard curves showed linearity up to 75 µg/mL. In the analytical range the precision (RSD%) values were
Improved LC-MS/MS method for the quantification of hepcidin-25 in clinical samples.
Abbas, Ioana M; Hoffmann, Holger; Montes-Bayón, María; Weller, Michael G
2018-06-01
Mass spectrometry-based methods play a crucial role in the quantification of the main iron metabolism regulator hepcidin by singling out the bioactive 25-residue peptide from the other naturally occurring N-truncated isoforms (hepcidin-20, -22, -24), which seem to be inactive in iron homeostasis. However, several difficulties arise in the MS analysis of hepcidin due to the "sticky" character of the peptide and the lack of suitable standards. Here, we propose the use of amino- and fluoro-silanized autosampler vials to reduce hepcidin interaction with laboratory glassware surfaces, after testing several types of vials for the preparation of stock solutions and serum samples for isotope dilution liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS). Furthermore, we have investigated two sample preparation strategies and two chromatographic separation conditions with the aim of developing an LC-MS/MS method for the sensitive and reliable quantification of hepcidin-25 in serum samples. A chromatographic separation based on the usual acidic mobile phases was compared with a novel approach involving the separation of hepcidin-25 with solvents at high pH containing 0.1% ammonia. Both methods were applied to clinical samples in an intra-laboratory comparison of two LC-MS/MS methods using the same hepcidin-25 calibrators, with good correlation of the results. Finally, we recommend an LC-MS/MS-based quantification method with a dynamic range of 0.5-40 μg/L for the assessment of hepcidin-25 in human serum that uses TFA-based mobile phases and silanized glass vials. Graphical abstract: Structure of hepcidin-25 (Protein Data Bank, PDB ID 2KEF).
PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics.
Möller, Birgit; Poeschl, Yvonne; Plötner, Romina; Bürstenbinder, Katharina
2017-11-01
Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. © 2017 American Society of Plant Biologists. All Rights Reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Su, Dian; Gaffrey, Matthew J.; Guo, Jia
2014-02-11
Protein S-glutathionylation (SSG) is an important regulatory posttranslational modification of protein cysteine (Cys) thiol redox switches, yet the role of specific cysteine residues as targets of modification is poorly understood. We report a novel quantitative mass spectrometry (MS)-based proteomic method for site-specific identification and quantification of S-glutathionylation across different conditions. Briefly, this approach consists of initial blocking of free thiols by alkylation, selective reduction of glutathionylated thiols and enrichment using thiol affinity resins, followed by on-resin tryptic digestion and isobaric labeling with iTRAQ (isobaric tags for relative and absolute quantitation) for MS-based identification and quantification. The overall approach was validated by application to RAW 264.7 mouse macrophages treated with different doses of diamide to induce glutathionylation. A total of 1071 Cys-sites from 690 proteins were identified in response to diamide treatment, with ~90% of the sites displaying >2-fold increases in SSG modification compared to controls. This approach was extended to identify potential SSG-modified Cys-sites in response to H2O2, an endogenous oxidant produced by activated macrophages and many pathophysiological stimuli. The results revealed 364 Cys-sites from 265 proteins that were sensitive to S-glutathionylation in response to H2O2 treatment. These proteins covered a range of molecular types and molecular functions, with free radical scavenging and cell death and survival among the most significantly enriched functional categories. Overall, the results demonstrate that our approach is effective for site-specific identification and quantification of S-glutathionylated proteins. The analytical strategy also provides a unique approach to determining the major pathways and cell processes most susceptible to glutathionylation at a proteome-wide scale.
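The ">2-fold increase" criterion mentioned above reduces to a simple ratio of reporter-ion intensities between treated and control channels; a hedged sketch with invented intensities and an assumed channel layout is shown below.

```python
import numpy as np

# Reporter-ion intensities per Cys site: rows = sites, columns = iTRAQ channels.
# Here two control channels and two diamide-treated channels (illustrative values).
intensities = np.array([
    [1.0e5, 1.1e5, 2.6e5, 2.4e5],   # site A
    [8.0e4, 7.5e4, 9.0e4, 8.8e4],   # site B
])
control, treated = intensities[:, :2], intensities[:, 2:]

# Fold change of SSG occupancy relative to control, averaged over replicate channels.
fold_change = treated.mean(axis=1) / control.mean(axis=1)
responsive  = fold_change > 2.0     # the ">2-fold" criterion quoted in the abstract

for fc, hit in zip(fold_change, responsive):
    print(f"fold change = {fc:.2f}, responsive = {hit}")
```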
Go, Young-Mi; Walker, Douglas I; Liang, Yongliang; Uppal, Karan; Soltow, Quinlyn A; Tran, ViLinh; Strobel, Frederick; Quyyumi, Arshed A; Ziegler, Thomas R; Pennell, Kurt D; Miller, Gary W; Jones, Dean P
2015-12-01
The exposome is the cumulative measure of environmental influences and associated biological responses throughout the lifespan, including exposures from the environment, diet, behavior, and endogenous processes. A major challenge for exposome research lies in the development of robust and affordable analytic procedures to measure the broad range of exposures and associated biologic impacts occurring over a lifetime. Biomonitoring is an established approach to evaluate internal body burden of environmental exposures, but use of biomonitoring for exposome research is often limited by the high costs associated with quantification of individual chemicals. High-resolution metabolomics (HRM) uses ultra-high resolution mass spectrometry with minimal sample preparation to support high-throughput relative quantification of thousands of environmental, dietary, and microbial chemicals. HRM also measures metabolites in most endogenous metabolic pathways, thereby providing simultaneous measurement of biologic responses to environmental exposures. The present research examined quantification strategies to enhance the usefulness of HRM data for cumulative exposome research. The results provide a simple reference standardization protocol in which individual chemical concentrations in unknown samples are estimated by comparison to a concurrently analyzed, pooled reference sample with known chemical concentrations. The approach was tested using blinded analyses of amino acids in human samples and was found to be comparable to independent laboratory results based on surrogate standardization or internal standardization. Quantification was reproducible over a 13-month period and extrapolated to thousands of chemicals. The results show that the reference standardization protocol provides an effective strategy that will enhance data collection for cumulative exposome research. In principle, the approach can be extended to other types of mass spectrometry and other analytical methods. © The Author 2015. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
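The reference standardization protocol described here amounts to single-point calibration against the pooled reference sample; a minimal sketch of that calculation, with an invented feature intensity and reference concentration, follows.

```python
def reference_standardization(intensity_sample, intensity_ref, conc_ref):
    """Estimate a chemical's concentration in an unknown sample by comparing its
    measured intensity to that of a concurrently run pooled reference sample whose
    concentration is known (single-point calibration; assumes the detector response
    is linear and identical for the sample and the reference run)."""
    return conc_ref * intensity_sample / intensity_ref

# Example: an amino-acid-like feature with a reference concentration of 25 umol/L
# (both numbers are illustrative, not taken from the study).
print(reference_standardization(intensity_sample=3.4e6,
                                intensity_ref=2.8e6,
                                conc_ref=25.0))
```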
Go, Young-Mi; Walker, Douglas I.; Liang, Yongliang; Uppal, Karan; Soltow, Quinlyn A.; Tran, ViLinh; Strobel, Frederick; Quyyumi, Arshed A.; Ziegler, Thomas R.; Pennell, Kurt D.; Miller, Gary W.; Jones, Dean P.
2015-01-01
The exposome is the cumulative measure of environmental influences and associated biological responses throughout the lifespan, including exposures from the environment, diet, behavior, and endogenous processes. A major challenge for exposome research lies in the development of robust and affordable analytic procedures to measure the broad range of exposures and associated biologic impacts occurring over a lifetime. Biomonitoring is an established approach to evaluate internal body burden of environmental exposures, but use of biomonitoring for exposome research is often limited by the high costs associated with quantification of individual chemicals. High-resolution metabolomics (HRM) uses ultra-high resolution mass spectrometry with minimal sample preparation to support high-throughput relative quantification of thousands of environmental, dietary, and microbial chemicals. HRM also measures metabolites in most endogenous metabolic pathways, thereby providing simultaneous measurement of biologic responses to environmental exposures. The present research examined quantification strategies to enhance the usefulness of HRM data for cumulative exposome research. The results provide a simple reference standardization protocol in which individual chemical concentrations in unknown samples are estimated by comparison to a concurrently analyzed, pooled reference sample with known chemical concentrations. The approach was tested using blinded analyses of amino acids in human samples and was found to be comparable to independent laboratory results based on surrogate standardization or internal standardization. Quantification was reproducible over a 13-month period and extrapolated to thousands of chemicals. The results show that the reference standardization protocol provides an effective strategy that will enhance data collection for cumulative exposome research. In principle, the approach can be extended to other types of mass spectrometry and other analytical methods. PMID:26358001
Quantification of type I error probabilities for heterogeneity LOD scores.
Abreu, Paula C; Hodge, Susan E; Greenberg, David A
2002-02-01
Locus heterogeneity is a major confounding factor in linkage analysis. When no prior knowledge of linkage exists, and one aims to detect linkage and heterogeneity simultaneously, classical distribution theory of log-likelihood ratios does not hold. Despite some theoretical work on this problem, no generally accepted practical guidelines exist. Nor has anyone rigorously examined the combined effect of testing for linkage and heterogeneity and simultaneously maximizing over two genetic models (dominant, recessive). The effect of linkage phase represents another uninvestigated issue. Using computer simulation, we investigated the type I error (P value) of the "admixture" heterogeneity LOD (HLOD) score, i.e., the LOD score maximized over both the recombination fraction θ and the admixture parameter α, and we compared this with the P values obtained when one maximizes only with respect to θ (i.e., the standard LOD score). We generated datasets of phase-known and -unknown nuclear families, sizes k = 2, 4, and 6 children, under fully penetrant autosomal dominant inheritance. We analyzed these datasets (1) assuming a single genetic model, and maximizing the HLOD over θ and α; and (2) maximizing the HLOD additionally over two dominance models (dominant vs. recessive), then subtracting a 0.3 correction. For both (1) and (2), P values increased with family size k; rose less for phase-unknown families than for phase-known ones, with the former approaching the latter as k increased; and did not exceed those given by the one-sided mixture distribution ξ = (1/2)χ₁² + (1/2)χ₂². Thus, maximizing the HLOD over θ and α appears to add considerably less than an additional degree of freedom to the associated χ₁² distribution. We conclude with practical guidelines for linkage investigators. Copyright 2002 Wiley-Liss, Inc.
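For readers who want to reproduce the mixture-distribution bound quoted above, a short sketch follows; it uses the standard conversion of a LOD score to a likelihood-ratio statistic (2 ln 10 × LOD), which is an assumption on our part rather than something stated in the abstract.

```python
from math import log
from scipy.stats import chi2

def hlod_pvalue_bound(hlod):
    """Upper bound on the asymptotic P value of an HLOD score using the one-sided
    mixture distribution (1/2)*chi2_1 + (1/2)*chi2_2 cited in the abstract.
    The LOD score is converted to a likelihood-ratio statistic via 2*ln(10)*LOD."""
    x = 2.0 * log(10.0) * hlod
    return 0.5 * chi2.sf(x, df=1) + 0.5 * chi2.sf(x, df=2)

print(hlod_pvalue_bound(3.0))   # classical "significant linkage" threshold
```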
Jachiet, Pierre-Alain; Colson, Philippe; Lopez, Philippe; Bapteste, Eric
2014-08-07
Complex nongradual evolutionary processes such as gene remodeling are difficult to model, to visualize, and to investigate systematically. Despite these challenges, the creation of composite (or mosaic) genes by combination of genetic segments from unrelated gene families has been established as an important adaptive phenomenon in eukaryotic genomes. In contrast, almost no general studies have been conducted to quantify composite genes in viruses. Although viral genome mosaicism has been well described, the extent of gene mosaicism and its rules of emergence remain largely unexplored. Applying methods from graph theory to inclusive similarity networks, and using data from more than 3,000 complete viral genomes, we provide the first demonstration that composite genes in viruses are 1) functionally biased, 2) involved in key aspects of the arms race between cells and viruses, and 3) can be classified into two distinct types of composite genes in all viral classes. Beyond the quantification of the widespread recombination of genes among different viruses of the same class, we also report a striking sharing of genetic information between viruses of different classes and with different nucleic acid types. This latter discovery provides novel evidence for the existence of a large and complex mobilome network, which appears partly bound by the sharing of genetic information and by the formation of composite genes between mobile entities with different genetic material. Considering that there are around 10^31 viruses on the planet, gene remodeling appears as a hugely significant way of generating and moving novel sequences between different kinds of organisms on Earth. © The Author(s) 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Charged black holes and the AdS/CFT correspondence
NASA Astrophysics Data System (ADS)
Tesileanu, Tiberiu
The AdS/CFT duality is an equivalence between string theory and gauge theory. The duality allows one to use calculations done in classical gravity to derive results in strongly-coupled field theories. This thesis explores several applications of the duality that have some relevance to condensed matter physics. In the first of these applications, it is shown that a large class of strongly-coupled (3 + 1)-dimensional conformal field theories undergo a superfluid phase transition in which a certain chiral primary operator develops a non-zero expectation value at low temperatures. A suggestion is made for the identity of the condensing operator in the field theory. In a different application, the conifold theory, an SU(N) × SU(N) gauge theory, is studied at nonzero chemical potential for baryon number density. In the low-temperature limit, the near-horizon geometry of the dual supergravity solution becomes a warped product AdS₂ × R³ × T¹,¹, with logarithmic warp factors. This encodes a type of emergent quantum near-criticality in the field theory. A similar construction is analyzed in the context of M theory. This construction is based on branes wrapped around topologically nontrivial cycles of the geometry. Several non-supersymmetric solutions are found, which pass a number of stability checks. Reducing one of the solutions to type IIA string theory, and T-dualizing to type IIB, yields a product of a squashed Sasaki-Einstein manifold with an extremal BTZ black hole. Possible field theory interpretations are discussed.
Deconfinement and the Hagedorn transition in string theory.
Chaudhuri, S
2001-03-05
We introduce a new definition of the thermal partition function in string theory. With this new definition, the thermal partition functions of all of the string theories obey thermal duality relations with self-dual Hagedorn temperature β_H² = 4π²α′. A β → β_H²/β transformation maps the type I theory into a new string theory (type I) with thermal Dp-branes, spatial hypersurfaces supporting a p-dimensional finite-temperature non-Abelian Higgs-gauge theory for p ≤ 9. We demonstrate a continuous phase transition in the behavior of the static heavy quark-antiquark potential for small separations r*²
Goals, intentions and mental states: challenges for theories of autism.
Hamilton, Antonia F de C
2009-08-01
The ability to understand the goals and intentions behind other people's actions is central to many social interactions. Given the profound social difficulties seen in autism, we might expect goal understanding to be impaired in these individuals. Two influential theories, the 'broken mirror' theory and the mentalising theory, can both predict this result. However, a review of the current data provides little empirical support for goal understanding difficulties; several studies demonstrate normal performance by autistic children on tasks requiring the understanding of goals or intentions. I suggest that this conclusion forces us to reject the basic broken mirror theory and to re-evaluate the breadth of the mentalising theory. More subtle theories which distinguish between different types of mirroring and different types of mentalising may be able to account for the present data, and further research is required to test and refine these theories.
Wang, Weilan; Zijlstra, Ruurd T; Gänzle, Michael G
2017-05-15
Diagnosis of enterotoxigenic E. coli (ETEC)-associated diarrhea is complicated by the diversity of E. coli virulence factors. This study developed a multiplex quantitative PCR assay based on high-resolution melting curve analysis (HRM-qPCR) to identify and quantify genes encoding five ETEC fimbriae related to diarrhea in swine, i.e., K99, F41, F18, F6 and K88. Five fimbriae expressed by ETEC were amplified in multiple HRM-qPCR reactions to allow simultaneous identification and quantification of five target genes. The assay was calibrated to allow quantification of the most abundant target gene, and validated by analysis of 30 samples obtained from piglets with diarrhea and healthy controls, and comparison to standard qPCR detection. The five amplicons, with melting temperatures (Tm) ranging from 74.7 ± 0.06 to 80.5 ± 0.15 °C, were well separated by HRM-qPCR. The area of amplicons under the melting peak correlated linearly with the proportion of the template in the calibration mixture if the proportion exceeded 4.8% (K88) or <1% (all other amplicons). The suitability of the method was evaluated using 30 samples from weaned pigs aged 6-7 weeks; 14 of these animals suffered from diarrhea as a consequence of poor sanitary conditions. Genes encoding fimbriae and enterotoxins were quantified by HRM-qPCR and/or qPCR. The multiplex HRM-qPCR allowed accurate analysis when the total gene copy number of targets was more than 1 × 10⁵/g wet feces, and the HRM curves were able to simultaneously distinguish fimbriae genes in the fecal samples. The relative quantification of the most abundant F18, based on melting peak area, was highly correlated (P < 0.001; r² = 0.956) with the individual qPCR result, but the correlation for less abundant fimbriae was much lower. The multiplex HRM assay identifies ETEC virulence factors specifically and efficiently. It correctly indicated the predominant fimbriae type, additionally provides information on the presence/absence of other fimbriae types, and could find broad applications in pathogen diagnosis.
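A toy sketch of area-based relative quantification of the five fimbriae amplicons; the peak areas below are invented, and the real assay applies a calibration to the most abundant target rather than a simple normalization.

```python
import numpy as np

def relative_abundance(peak_areas):
    """Relative proportion of each fimbriae amplicon from its melting-peak area,
    as a simplified stand-in for the calibrated area-based quantification
    described in the abstract."""
    areas = np.asarray(peak_areas, dtype=float)
    return areas / areas.sum()

targets = ["K99", "F41", "F18", "F6", "K88"]
areas   = [0.8, 0.3, 6.2, 0.1, 1.4]      # illustrative melting-peak areas
for name, frac in zip(targets, relative_abundance(areas)):
    print(f"{name}: {frac:.1%}")
```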
MAVEN-SA: Model-Based Automated Visualization for Enhanced Situation Awareness
2005-11-01
methods. But historically, as arts evolve, these how-to methods become systematized and codified (e.g., the development and refinement of color theory ... schema (as necessary) 3. Draw inferences from new knowledge to support decision making process ... Visual language theory suggests that humans process ... informed by theories of learning. Over the years, many types of software have been developed to support student learning. The various types of
ERIC Educational Resources Information Center
Soffree-Cady, Flore
To provide a writing pedagogy grounded in theory, a teaching method was developed which sequenced certain types of assignments. The classification of types and the organizational structure of the sequences were based on a teaching model that draws upon theories from various disciplines. Although the teaching activities are not new in themselves,…
Music Participation: Theory, Research, and Policy.
ERIC Educational Resources Information Center
Gates, J. Terry
1991-01-01
Bases a music participation theory on findings in music education, ethnomusicology, and sociology of leisure. Posits six types of music participants: professionals, apprentices, amateurs, hobbyists, recreationists, and dabblers. Distinguishes each type by theoretical variations in cost-benefit relationships as perceived by participants. Discusses…
(2,2) and (0,4) supersymmetric boundary conditions in 3d N=4 theories and type IIB branes
NASA Astrophysics Data System (ADS)
Chung, Hee-Joong; Okazaki, Tadashi
2017-10-01
The half-BPS boundary conditions preserving N=(2,2) and N=(0,4) supersymmetry in 3d N=4 supersymmetric gauge theories are examined. The BPS equations admit decomposition of the bulk supermultiplets into specific boundary supermultiplets of preserved supersymmetry. Nahm-like equations arise in the vector multiplet BPS boundary condition preserving N=(0,4) supersymmetry, and Robin-type boundary conditions appear for the hypermultiplet coupled to the vector multiplet when N=(2,2) supersymmetry is preserved. The half-BPS boundary conditions are realized in the brane configurations of type IIB string theory.
Spin-orbit coupling control of anisotropy, ground state and frustration in 5d² Sr₂MgOsO₆
Morrow, Ryan; Taylor, Alice E.; Singh, D. J.; ...
2016-08-30
The influence of spin-orbit coupling (SOC) on the physical properties of the 5d² system Sr₂MgOsO₆ is probed via a combination of magnetometry, specific heat measurements, elastic and inelastic neutron scattering, and density functional theory calculations. Although a significant degree of frustration is expected, we find that Sr₂MgOsO₆ orders in a type I antiferromagnetic structure at the remarkably high temperature of 108 K. The measurements presented allow for the first accurate quantification of the size of the magnetic moment in a 5d² system, of 0.60(2) μB, a significantly reduced moment from the expected value for such a system. Furthermore, significant anisotropy is identified via a spin excitation gap, and we confirm by first principles calculations that SOC not only provides the magnetocrystalline anisotropy, but also plays a crucial role in determining both the ground state magnetic order and the moment size in this compound. In conclusion, through comparison to Sr₂ScOsO₆, it is demonstrated that SOC-induced anisotropy has the ability to relieve frustration in 5d² systems relative to their 5d³ counterparts, providing an explanation of the high TN found in Sr₂MgOsO₆.
Spin-orbit coupling control of anisotropy, ground state and frustration in 5d2 Sr2MgOsO6
Morrow, Ryan; Taylor, Alice E.; Singh, D. J.; Xiong, Jie; Rodan, Steven; Wolter, A. U. B.; Wurmehl, Sabine; Büchner, Bernd; Stone, M. B.; Kolesnikov, A. I.; Aczel, Adam A.; Christianson, A. D.; Woodward, Patrick M.
2016-01-01
The influence of spin-orbit coupling (SOC) on the physical properties of the 5d2 system Sr2MgOsO6 is probed via a combination of magnetometry, specific heat measurements, elastic and inelastic neutron scattering, and density functional theory calculations. Although a significant degree of frustration is expected, we find that Sr2MgOsO6 orders in a type I antiferromagnetic structure at the remarkably high temperature of 108 K. The measurements presented allow for the first accurate quantification of the size of the magnetic moment in a 5d2 system of 0.60(2) μB –a significantly reduced moment from the expected value for such a system. Furthermore, significant anisotropy is identified via a spin excitation gap, and we confirm by first principles calculations that SOC not only provides the magnetocrystalline anisotropy, but also plays a crucial role in determining both the ground state magnetic order and the size of the local moment in this compound. Through comparison to Sr2ScOsO6, it is demonstrated that SOC-induced anisotropy has the ability to relieve frustration in 5d2 systems relative to their 5d3 counterparts, providing an explanation of the high TN found in Sr2MgOsO6. PMID:27571715
Spin-orbit coupling control of anisotropy, ground state and frustration in 5d(2) Sr2MgOsO6.
Morrow, Ryan; Taylor, Alice E; Singh, D J; Xiong, Jie; Rodan, Steven; Wolter, A U B; Wurmehl, Sabine; Büchner, Bernd; Stone, M B; Kolesnikov, A I; Aczel, Adam A; Christianson, A D; Woodward, Patrick M
2016-08-30
The influence of spin-orbit coupling (SOC) on the physical properties of the 5d(2) system Sr2MgOsO6 is probed via a combination of magnetometry, specific heat measurements, elastic and inelastic neutron scattering, and density functional theory calculations. Although a significant degree of frustration is expected, we find that Sr2MgOsO6 orders in a type I antiferromagnetic structure at the remarkably high temperature of 108 K. The measurements presented allow for the first accurate quantification of the size of the magnetic moment in a 5d(2) system of 0.60(2) μB -a significantly reduced moment from the expected value for such a system. Furthermore, significant anisotropy is identified via a spin excitation gap, and we confirm by first principles calculations that SOC not only provides the magnetocrystalline anisotropy, but also plays a crucial role in determining both the ground state magnetic order and the size of the local moment in this compound. Through comparison to Sr2ScOsO6, it is demonstrated that SOC-induced anisotropy has the ability to relieve frustration in 5d(2) systems relative to their 5d(3) counterparts, providing an explanation of the high TN found in Sr2MgOsO6.
The effect of vision elimination during quiet stance tasks with different feet positions.
Sarabon, Nejc; Rosker, Jernej; Loefler, Stefan; Kern, Helmut
2013-09-01
Literature confirms the effects of vision and stance on body sway and indicates possible interactions between the two. However, no attempts have been made to systematically compare the effect of vision on the different types of stance which are frequently used in clinical and research practice. The biomechanical changes that occur after changing shape and size of the support surface suggest possible sensory re-weighting might take place. The purpose of this study was to assess the effect of vision on body sway in relation to different stance configurations and width. Thirty-eight volunteers performed four quiet stance configurations (parallel, semi-tandem, tandem and single leg), repeating them with open and closed eyes. Traditional parameters, recurrence quantification analysis and sample entropy were analyzed from the CoP trajectory signal. Traditional and recurrence quantification analysis parameters were affected by vision removal and stance type. Exceptions were frequency of oscillation, entropy and trapping time. The most prominent effect of vision elimination on traditional parameters was observed for narrower stances. A significant interaction effect between vision removal and stance type was present for most of the parameters observed (p<0.05). The interaction effect between medio-lateral and antero-posterior traditional parameters differed in linearity between stances. The results confirm the effect of vision removal on the body sway. However, for the medio-lateral traditional parameters, the effects did not increase linearly with the change in width and stance type. This suggests that removal of vision could be more effectively compensated by other sensory systems in semi-tandem stance, tandem and single legged stance. Copyright © 2013 Elsevier B.V. All rights reserved.
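Sample entropy, one of the nonlinear CoP measures used above, can be computed in a few lines; the sketch below is a generic implementation (m = 2, r = 0.2·SD) applied to a surrogate random-walk trace, not the authors' analysis pipeline.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a 1-D signal (e.g., a CoP coordinate time series).
    Counts pairs of templates of length m and m+1 whose Chebyshev distance is
    below r; SampEn = -ln(A/B). Self-matches are excluded by construction."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist < r)
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
cop_ml = np.cumsum(rng.normal(size=2000)) * 0.01   # surrogate medio-lateral CoP trace
print(sample_entropy(cop_ml))
```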
Effects of the local structure dependence of evaporation fields on field evaporation behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yao, Lan; Marquis, Emmanuelle A., E-mail: emarq@umich.edu; Withrow, Travis
2015-12-14
Accurate three-dimensional reconstructions of atomic positions and full quantification of the information contained in atom probe microscopy data rely on understanding the physical processes taking place during field evaporation of atoms from needle-shaped specimens. However, the modeling framework for atom probe microscopy has only limited quantitative justification. Building on the continuum field models previously developed, we introduce a more physical approach with the selection of evaporation events based on density functional theory calculations. This model reproduces key features observed experimentally in terms of sequence of evaporation, evaporation maps, and depth resolution, and provides insights into the physical limit for spatial resolution.
C. E. Naficy; T. T. Veblen; P. F. Hessburg
2015-01-01
Within the last decade, mixed-severity fire regimes (MSFRs) have gained increasing attention in both the scientific and management communities (Arno and others 2000, Baker and others 2007, Hessburg and others 2007, Perry and others 2011, Halofsky and others 2011, Stine and others 2014). The growing influence of the MSFR model derives from several factors including: (1...
NASA Astrophysics Data System (ADS)
Fu, Yi; Yu, Guoqiang; Levine, Douglas A.; Wang, Niya; Shih, Ie-Ming; Zhang, Zhen; Clarke, Robert; Wang, Yue
2015-09-01
Most published copy number datasets on solid tumors were obtained from specimens comprised of mixed cell populations, for which the varying tumor-stroma proportions are unknown or unreported. The inability to correct for signal mixing represents a major limitation on the use of these datasets for subsequent analyses, such as discerning deletion types or detecting driver aberrations. We describe the BACOM2.0 method with enhanced accuracy and functionality to normalize copy number signals, detect deletion types, estimate tumor purity, quantify true copy numbers, and calculate average-ploidy value. While BACOM has been validated and used with promising results, subsequent BACOM analysis of the TCGA ovarian cancer dataset found that the estimated average tumor purity was lower than expected. In this report, we first show that this lowered estimate of tumor purity is the combined result of imprecise signal normalization and parameter estimation. Then, we describe effective allele-specific absolute normalization and quantification methods that can enhance BACOM applications in many biological contexts while in the presence of various confounders. Finally, we discuss the advantages of BACOM in relation to alternative approaches. Here we detail this revised computational approach, BACOM2.0, and validate its performance in real and simulated datasets.
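The core mixing model behind BACOM-style purity estimation can be illustrated with a deliberately simplified calculation; the sketch below ignores normalization, ploidy and noise, all of which BACOM2.0 handles explicitly, and uses invented copy-number values.

```python
def purity_from_deletion(observed_copy_number, deletion_type="hemizygous"):
    """Simplified illustration of the copy-number mixing model underlying
    BACOM-style purity estimation: the measured copy number in a region deleted
    in the tumor is a purity-weighted average of the tumor signal and the
    2-copy normal (stromal) signal."""
    if deletion_type == "hemizygous":     # tumor cells carry 1 copy
        return 2.0 - observed_copy_number
    elif deletion_type == "homozygous":   # tumor cells carry 0 copies
        return 1.0 - observed_copy_number / 2.0
    raise ValueError("deletion_type must be 'hemizygous' or 'homozygous'")

print(purity_from_deletion(1.35, "hemizygous"))   # -> purity ~ 0.65
print(purity_from_deletion(0.90, "homozygous"))   # -> purity ~ 0.55
```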
Functional Classification of Natural Resources for Valuing Natural Resources in Korea
NASA Astrophysics Data System (ADS)
Choi, H.; Lee, W.; Kwak, H.
2013-12-01
The ecosystem services concept emphasizes not only regulating services, but also supporting, provisioning, and cultural/social services, according to the Millennium Ecosystem Assessment (MA). While the spatial mapping and quantification of ecosystem services are becoming increasingly recognized as important for natural resources conservation, ecosystem services quantification is rarely considered in the Republic of Korea (ROK) due to methodological challenges. This study matches appropriate indicators, data and mapping approaches for describing the respective states, quantification and valuation of ecosystems. The results were analyzed with statistical and GIS-based techniques. We classified ecosystem services functions based on the literature, interviews and a modified approach compared to the MA and The Economics of Ecosystems and Biodiversity (TEEB). For quantifying values, we subdivided land cover types using ecological features and normalized numerical information on provisioning services, regulating services and cultural services. The resulting hotspots of ecosystem services are related to landscape features and land cover types in the ROK. The mapping results show hotspots where high levels of ecosystem services are concentrated, around the Baekdudaegan protected area (Gangwon, Gyeongbuk, Chungbuk and Jeonam Provinces). In addition, the results of our study show that ecosystem services functions - especially fostering of water resources, erosion control, air quality and pollution control in terrestrial ecosystems - can contribute to planning management policy for ecosystem-based management at the regional scale.
Stadler, Julia; Eder, Johanna; Pratscher, Barbara; Brandt, Sabine; Schneller, Doris; Müllegger, Robert; Vogl, Claus; Trautinger, Franz; Brem, Gottfried; Burgstaller, Joerg P.
2015-01-01
Cell-free circulating tumor DNA in the plasma of cancer patients has become a common point of interest as indicator of therapy options and treatment response in clinical cancer research. Especially patient- and tumor-specific single nucleotide variants that accurately distinguish tumor DNA from wild type DNA are promising targets. The reliable detection and quantification of these single-base DNA variants is technically challenging. Currently, a variety of techniques is applied, with no apparent “gold standard”. Here we present a novel qPCR protocol that meets the conditions of extreme sensitivity and specificity that are required for detection and quantification of tumor DNA. By consecutive application of two polymerases, one of them designed for extreme base-specificity, the method reaches unprecedented sensitivity and specificity. Three qPCR assays were tested with spike-in experiments, specific for point mutations BRAF V600E, PTEN T167A and NRAS Q61L of melanoma cell lines. It was possible to detect down to one copy of tumor DNA per reaction (Poisson distribution), at a background of up to 200 000 wild type DNAs. To prove its clinical applicability, the method was successfully tested on a small cohort of BRAF V600E positive melanoma patients. PMID:26562020
NDE and SHM Simulation for CFRP Composites
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Parker, F. Raymond
2014-01-01
Ultrasound-based nondestructive evaluation (NDE) is a common technique for damage detection in composite materials. There is a need for advanced NDE that goes beyond damage detection to damage quantification and characterization in order to enable data-driven prognostics. The damage types that exist in carbon fiber-reinforced polymer (CFRP) composites include microcracking and delaminations, and can be initiated and grown via impact forces (due to ground vehicles, tool drops, bird strikes, etc.), fatigue, and extreme environmental changes. X-ray microfocus computed tomography data, among other methods, have shown that these damage types often result in voids/discontinuities of a complex volumetric shape. The specific damage geometry and location within ply layers affect damage growth. Realistic three-dimensional NDE and structural health monitoring (SHM) simulations can aid in the development and optimization of damage quantification and characterization techniques. This paper is an overview of ongoing work towards realistic NDE and SHM simulation tools for composites, and also discusses NASA's need for such simulation tools in aeronautics and spaceflight. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with realistic three-dimensional damage in CFRP composites. The custom code uses the elastodynamic finite integration technique and is parallelized to run efficiently on computing clusters or multicore machines.
Quantification of collagen contraction in three-dimensional cell culture.
Kopanska, Katarzyna S; Bussonnier, Matthias; Geraldo, Sara; Simon, Anthony; Vignjevic, Danijela; Betz, Timo
2015-01-01
Many different cell types, including fibroblasts, smooth muscle cells, endothelial cells, and cancer cells, exert traction forces on the fibrous components of the extracellular matrix. This can be observed as matrix contraction, both macro- and microscopically, in three-dimensional (3D) tissue models such as collagen type I gels. The quantification of local contraction at the micron scale, including its directionality and speed, in correlation with other parameters such as cell invasion and local protein or gene expression, can provide useful information to study wound healing, organism development, and cancer metastasis. In this article, we present a set of tools to quantify the flow dynamics of collagen contraction induced by cells migrating out of a multicellular cancer spheroid into a 3D collagen matrix. We adapted a pseudo-speckle technique that can be applied to bright-field and fluorescent microscopy time series. The image analysis presented here is based on in-house software developed in the Matlab (Mathworks) programming environment. The analysis program is freely available from GitHub following the link: http://dx.doi.org/10.5281/zenodo.10116. This tool provides an automated technique to measure collagen contraction that can be utilized in different 3D cellular systems. Copyright © 2015 Elsevier Inc. All rights reserved.
Islam, Johirul; Zaman, Kamaruz; Chakrabarti, Srijita; Sharma Bora, Nilutpal; Mandal, Santa; Pratim Pathak, Manash; Srinivas Raju, Pakalapati; Chattopadhyay, Pronobesh
2017-07-01
A simple, accurate and sensitive reversed-phase high-performance liquid chromatographic (RP-HPLC) method has been developed for the estimation of ethyl 2-aminobenzoate (EAB) in a matrix-type monolithic polymeric device and validated as per the International Conference on Harmonization guidelines. The analysis was performed isocratically on a ZORBAX Eclipse Plus C18 analytical column (250 × 4.4 mm, 5 μm) with a diode array detector (DAD), using acetonitrile and water (75:25 v/v) as the mobile phase at a constant flow rate of 1.0 mL/min. Determination of EAB was not subject to interference from the excipients. Inter- and intra-day relative standard deviations were not higher than 2%. Mean recovery was between 98.7 and 101.3%. The calibration curve was linear in the concentration range of 0.5-10 µg/mL. Limits of detection and quantification were 0.19 and 0.60 µg/mL, respectively. Thus, the present report puts forward a novel method for the estimation of EAB, an emerging insect repellent, using the RP-HPLC technique. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Ghanbarian, Behzad; Berg, Carl F.
2017-09-01
Accurate quantification of the formation resistivity factor F (also called formation factor) provides useful insight into connectivity and pore space topology in fully saturated porous media. In particular, the formation factor has been extensively used to estimate permeability in reservoir rocks. One of the widely applied models to estimate F is Archie's law (F = ϕ^(-m), in which ϕ is the total porosity and m is the cementation exponent), which is known to be valid in rocks with negligible clay content, such as clean sandstones. In this study we compare formation factors determined by percolation and effective-medium theories as well as Archie's law with numerical simulations of electrical resistivity on digital rock models. These digital models represent Bentheimer and Fontainebleau sandstones and are derived either by reconstruction or directly from micro-tomographic images. Results show that the universal quadratic power law from percolation theory accurately estimates the calculated formation factor values in network models over the entire range of porosity. However, it crosses over to the linear scaling from the effective-medium approximation at the porosity of 0.75 in grid models. We also show that the effect of critical porosity, disregarded in Archie's law, is nontrivial, and that the Archie model inaccurately estimates the formation factor in low-porosity homogeneous sandstones.
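A small sketch contrasting the two scalings discussed above; the cementation exponent, critical porosity and prefactor are illustrative choices, not fitted values from the study.

```python
import numpy as np

def archie(phi, m=2.0):
    """Archie's law: F = phi**(-m)."""
    return phi ** (-m)

def percolation(phi, phi_c=0.02, prefactor=1.0):
    """Universal quadratic scaling from percolation theory, F ~ (phi - phi_c)**(-2),
    with an unspecified prefactor; phi_c is the critical (percolation) porosity."""
    return prefactor * (phi - phi_c) ** (-2.0)

porosity = np.linspace(0.05, 0.30, 6)
for p in porosity:
    print(f"phi={p:.2f}  Archie F={archie(p):8.1f}  percolation F={percolation(p):8.1f}")
```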
Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.
Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana
2018-01-01
The standard-curve based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimations at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines) and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.
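Absolute quantification in ddPCR rests on a Poisson correction of the positive-droplet fraction; a minimal sketch follows, with the droplet volume entered as an assumed typical value rather than an instrument-specific constant.

```python
from math import log

def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
    """Absolute target concentration (copies/uL) from droplet digital PCR counts.
    The mean copies per droplet follow Poisson statistics: lambda = -ln(1 - p),
    where p is the fraction of positive droplets. The droplet volume is an
    assumed value; use the volume appropriate for the instrument at hand."""
    p = positive / total
    lam = -log(1.0 - p)                      # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # copies per microliter

print(ddpcr_concentration(positive=4500, total=18000))
```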
Constraining f(R) theories with cosmography
NASA Astrophysics Data System (ADS)
Anabella Teppa Pannia, Florencia; Esteban Perez Bergliaffa, Santiago
2013-08-01
A method to set constraints on the parameters of extended theories of gravitation is presented. It is based on the comparison of two series expansions of any observable that depends on H(z). The first expansion is of the cosmographical type, while the second uses the dependence of H with z furnished by a given type of extended theory. When applied to f(R) theories together with the redshift drift, the method yields limits on the parameters of two examples (the theory of Hu and Sawicki [1], and the exponential gravity introduced by Linder [2]) that are compatible with or more stringent than the existing ones, as well as a limit for a previously unconstrained parameter.
Proof of Concept in Disrupted Tactical Networking
2017-09-01
because of the risk of detection. In this study, we design projectile-based mesh networking prototypes as one potential type of short-living network ... reader with a background in systems theory. This study is designed using systems theory and uses systems theory as a lens through which to observe
On the theory of the type III burst exciter
NASA Technical Reports Server (NTRS)
Smith, R. A.; Goldstein, M. L.; Papadopoulos, K.
1976-01-01
In situ satellite observations of type III burst exciters at 1 AU show that the beam does not evolve into a plateau in velocity space, contrary to the prediction of quasilinear theory. The observations can be explained by a theory that includes mode coupling effects due to excitation of the parametric oscillating two-stream instability and its saturation by anomalous resistivity. The time evolution of the beam velocity distribution is included in the analysis.
Fractonic line excitations: An inroad from three-dimensional elasticity theory
NASA Astrophysics Data System (ADS)
Pai, Shriya; Pretko, Michael
2018-06-01
We demonstrate the existence of a fundamentally new type of excitation, fractonic lines, which are linelike excitations with the restricted mobility properties of fractons. These excitations, described using an amalgamation of higher-form gauge theories with symmetric tensor gauge theories, see direct physical realization as the topological lattice defects of ordinary three-dimensional quantum crystals. Starting with the more familiar elasticity theory, we show how this theory maps onto a rank-4 tensor gauge theory, with phonons corresponding to gapless gauge modes and disclination defects corresponding to linelike charges. We derive flux conservation laws which lock these linelike excitations in place, analogous to the higher-moment charge conservation laws of fracton theories. This way of encoding mobility restrictions of lattice defects could shed light on melting transitions in three dimensions. This new type of extended object may also be a useful tool in the search for improved quantum error-correcting codes in three dimensions.
ERIC Educational Resources Information Center
Wanzer, Melissa B.; Frymier, Ann B.; Irwin, Jeffrey
2010-01-01
This paper proposes the Instructional Humor Processing Theory (IHPT), a theory that incorporates elements of incongruity-resolution theory, disposition theory, and the elaboration likelihood model (ELM) of persuasion. IHPT is proposed and offered as an explanation for why some types of instructor-generated humor result in increased student…
Quantization of higher abelian gauge theory in generalized differential cohomology
NASA Astrophysics Data System (ADS)
Szabo, R.
We review and elaborate on some aspects of the quantization of certain classes of higher abelian gauge theories using techniques of generalized differential cohomology. Particular emphasis is placed on the examples of generalized Maxwell theory and Cheeger-Simons cohomology, and of Ramond-Ramond fields in Type II superstring theory and differential K-theory.
Breaking Ground: A Study of Gestalt Therapy Theory and Holland's Theory of Vocational Choice.
ERIC Educational Resources Information Center
Hartung, Paul J.
In both Gestalt therapy and Holland's theory of vocational choice, person-environment interaction receives considerable emphasis. Gestalt therapy theory suggests that people make contact (that is, meet needs) through a characteristic style of interacting with the environment. Holland identifies six personality types in his theory and asserts that…
NASA Astrophysics Data System (ADS)
Srinivas, N.; Malik, R. P.
2017-11-01
We derive the off-shell nilpotent symmetries of the two (1 + 1)-dimensional (2D) non-Abelian 1-form gauge theory by using the theoretical techniques of the geometrical superfield approach to Becchi-Rouet-Stora-Tyutin (BRST) formalism. For this purpose, we exploit the augmented version of superfield approach (AVSA) and derive theoretically useful nilpotent (anti-)BRST, (anti-)co-BRST symmetries and Curci-Ferrari (CF)-type restrictions for the self-interacting 2D non-Abelian 1-form gauge theory (where there is no interaction with matter fields). The derivation of the (anti-)co-BRST symmetries and all possible CF-type restrictions are completely novel results within the framework of AVSA to BRST formalism where the ordinary 2D non-Abelian theory is generalized onto an appropriately chosen (2, 2)-dimensional supermanifold. The latter is parametrized by the superspace coordinates Z^M = (x^μ, θ, θ̄) where x^μ (with μ = 0, 1) are the bosonic coordinates and a pair of Grassmannian variables (θ, θ̄) obey the relationships: θ² = θ̄² = 0, θθ̄ + θ̄θ = 0. The topological nature of our 2D theory allows the existence of a tower of CF-type restrictions.
Bayesian Methods for Effective Field Theories
NASA Astrophysics Data System (ADS)
Wesolowski, Sarah
Microscopic predictions of the properties of atomic nuclei have reached a high level of precision in the past decade. This progress mandates improved uncertainty quantification (UQ) for a robust comparison of experiment with theory. With the uncertainty from many-body methods under control, calculations are now sensitive to the input inter-nucleon interactions. These interactions include parameters that must be fit to experiment, inducing both uncertainty from the fit and from missing physics in the operator structure of the Hamiltonian. Furthermore, the implementation of the inter-nucleon interactions is not unique, which presents the additional problem of assessing results using different interactions. Effective field theories (EFTs) take advantage of a separation of high- and low-energy scales in the problem to form a power-counting scheme that allows the organization of terms in the Hamiltonian based on their expected contribution to observable predictions. This scheme gives a natural framework for quantification of uncertainty due to missing physics. The free parameters of the EFT, called the low-energy constants (LECs), must be fit to data, but in a properly constructed EFT these constants will be natural-sized, i.e., of order unity. The constraints provided by the EFT, namely the size of the systematic uncertainty from truncation of the theory and the natural size of the LECs, are assumed information even before a calculation is performed or a fit is done. Bayesian statistical methods provide a framework for treating uncertainties that naturally incorporates prior information as well as putting stochastic and systematic uncertainties on an equal footing. For EFT UQ Bayesian methods allow the relevant EFT properties to be incorporated quantitatively as prior probability distribution functions (pdfs). Following the logic of probability theory, observable quantities and underlying physical parameters such as the EFT breakdown scale may be expressed as pdfs that incorporate the prior pdfs. Problems of model selection, such as distinguishing between competing EFT implementations, are also natural in a Bayesian framework. In this thesis we focus on two complementary topics for EFT UQ using Bayesian methods--quantifying EFT truncation uncertainty and parameter estimation for LECs. Using the order-by-order calculations and underlying EFT constraints as prior information, we show how to estimate EFT truncation uncertainties. We then apply the result to calculating truncation uncertainties on predictions of nucleon-nucleon scattering in chiral effective field theory. We apply model-checking diagnostics to our calculations to ensure that the statistical model of truncation uncertainty produces consistent results. A framework for EFT parameter estimation based on EFT convergence properties and naturalness is developed which includes a series of diagnostics to ensure the extraction of the maximum amount of available information from data to estimate LECs with minimal bias. We develop this framework using model EFTs and apply it to the problem of extrapolating lattice quantum chromodynamics results for the nucleon mass. We then apply aspects of the parameter estimation framework to perform case studies in chiral EFT parameter estimation, investigating a possible operator redundancy at fourth order in the chiral expansion and the appropriate inclusion of truncation uncertainty in estimating LECs.
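As a caricature of the truncation-uncertainty idea described above (the thesis develops a full Bayesian treatment), one can estimate the size of the first omitted term of the EFT expansion from the observed order-by-order coefficients; the sketch below does exactly that with invented numbers and a chosen expansion parameter Q.

```python
import numpy as np

def truncation_uncertainty(partial_sums, Q, x_ref=1.0):
    """Naive first-omitted-term estimate of EFT truncation uncertainty.
    Extracts the dimensionless expansion coefficients c_n from successive
    order-by-order predictions, takes their RMS size as a stand-in for the
    'natural size' prior, and assigns the next-order term as the uncertainty.
    This is a simplified frequentist caricature of the Bayesian treatment."""
    sums = np.asarray(partial_sums, dtype=float)
    corrections = np.diff(sums, prepend=0.0)           # contribution of each order
    orders = np.arange(len(sums))
    coeffs = corrections / (x_ref * Q ** orders)       # c_n = correction / (x_ref * Q**n)
    c_bar = np.sqrt(np.mean(coeffs ** 2))              # RMS "natural size"
    return c_bar * x_ref * Q ** len(sums)              # size of the first omitted term

# Illustrative order-by-order predictions of some observable, with Q = 0.3.
print(truncation_uncertainty([1.00, 1.24, 1.17, 1.19], Q=0.3))
```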
Durtschi, Jacob D; Stevenson, Jeffery; Hymas, Weston; Voelkerding, Karl V
2007-02-01
Real-time PCR data analysis for quantification has been the subject of many studies aimed at the identification of new and improved quantification methods. Several analysis methods have been proposed as superior alternatives to the common variations of the threshold crossing method. Notably, sigmoidal and exponential curve fit methods have been proposed. However, these studies have primarily analyzed real-time PCR with intercalating dyes such as SYBR Green. Clinical real-time PCR assays, in contrast, often employ fluorescent probes whose real-time amplification fluorescence curves differ from those of intercalating dyes. In the current study, we compared four analysis methods related to recent literature: two versions of the threshold crossing method, a second derivative maximum method, and a sigmoidal curve fit method. These methods were applied to a clinically relevant real-time human herpes virus type 6 (HHV6) PCR assay that used a minor groove binding (MGB) Eclipse hybridization probe as well as an Epstein-Barr virus (EBV) PCR assay that used an MGB Pleiades hybridization probe. We found that the crossing threshold method yielded more precise results when analyzing the HHV6 assay, which was characterized by lower signal/noise and less developed amplification curve plateaus. In contrast, the EBV assay, characterized by greater signal/noise and amplification curves with plateau regions similar to those observed with intercalating dyes, gave results with statistically similar precision by all four analysis methods.
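A hedged sketch of two of the four analysis approaches compared above: a fixed-threshold quantification cycle and a second-derivative-maximum cycle derived from a four-parameter logistic fit. The data are simulated and the threshold value is arbitrary; this is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, f0, fmax, x0, b):
    """Four-parameter logistic model of an amplification curve."""
    return f0 + fmax / (1.0 + np.exp(-(x - x0) / b))

# Illustrative amplification data: fluorescence vs. cycle number.
cycles = np.arange(1, 41, dtype=float)
rng = np.random.default_rng(1)
fluor = sigmoid(cycles, 0.05, 1.0, 24.0, 1.6) + rng.normal(0, 0.01, cycles.size)

popt, _ = curve_fit(sigmoid, cycles, fluor, p0=[0.0, 1.0, 20.0, 2.0])

# Second-derivative-maximum quantification cycle from the fitted (noise-free) model.
fine = np.linspace(1, 40, 4000)
fit = sigmoid(fine, *popt)
second_deriv = np.gradient(np.gradient(fit, fine), fine)
cq_sdm = fine[np.argmax(second_deriv)]

# Simple threshold-crossing Cq at a fixed fluorescence threshold (illustrative value).
threshold = 0.2
cq_ct = fine[np.argmax(fit >= threshold)]

print(f"second-derivative-maximum Cq = {cq_sdm:.2f}, threshold Cq = {cq_ct:.2f}")
```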
Quantification of (1→4)-β-d-Galactans in Compression Wood Using an Immuno-Dot Assay
Chavan, Ramesh R.; Fahey, Leona M.; Harris, Philip J.
2015-01-01
Compression wood is a type of reaction wood formed on the underside of softwood stems when they are tilted from the vertical and on the underside of branches. Its quantification is still a matter of some scientific debate. We developed a new technique that has the potential to do this based on the higher proportions of (1→4)-β-d-galactans that occur in tracheid cell walls of compression wood. Wood was milled, partially delignified, and the non-cellulosic polysaccharides, including the (1→4)-β-d-galactans, extracted with 6 M sodium hydroxide. After neutralizing, the solution was serially diluted, and the (1→4)-β-d-galactans determined by an immuno-dot assay using the monoclonal antibody LM5, which specifically recognizes this polysaccharide. Spots were quantified using a dilution series of a commercially available (1→4)-β-d-galactan from lupin seeds. Using this method, compression and opposite woods from radiata pine (Pinus radiata) were easily distinguished based on the amounts of (1→4)-β-d-galactans extracted. The non-cellulosic polysaccharides in the milled wood samples were also hydrolysed using 2 M trifluoroacetic acid followed by the separation and quantification of the released neutral monosaccharides by high performance anion exchange chromatography. This confirmed that the compression woods contained higher proportions of galactose-containing polysaccharides than the opposite woods. PMID:27135316
Detection and quantification of extracellular microRNAs in murine biofluids
2014-01-01
Background: MicroRNAs (miRNAs) are short RNA molecules which regulate gene expression in eukaryotic cells, and are abundant and stable in biofluids such as blood serum and plasma. As such, there has been heightened interest in the utility of extracellular miRNAs as minimally invasive biomarkers for diagnosis and monitoring of a wide range of human pathologies. However, quantification of extracellular miRNAs is subject to a number of specific challenges, including the relatively low RNA content of biofluids, the possibility of contamination with serum proteins (including RNases and PCR inhibitors), hemolysis, platelet contamination/activation, a lack of well-established reference miRNAs and the biochemical properties of miRNAs themselves. Protocols for the detection and quantification of miRNAs in biofluids are therefore of high interest. Results: The following protocol was validated by quantifying miRNA abundance in C57 (wild-type) and dystrophin-deficient (mdx) mice. Important differences in miRNA abundance were observed depending on whether blood was taken from the jugular or tail vein. Furthermore, efficiency of miRNA recovery was reduced when sample volumes greater than 50 μl were used. Conclusions: Here we describe robust and novel procedures to harvest murine serum/plasma, extract biofluid RNA, amplify specific miRNAs by RT-qPCR and analyze the resulting data, enabling the determination of relative and absolute miRNA abundance in extracellular biofluids with high accuracy, specificity and sensitivity. PMID:24629058
Inference and quantification of peptidoforms in large sample cohorts by SWATH-MS
Röst, Hannes L; Ludwig, Christina; Buil, Alfonso; Bensimon, Ariel; Soste, Martin; Spector, Tim D; Dermitzakis, Emmanouil T; Collins, Ben C; Malmström, Lars; Aebersold, Ruedi
2017-01-01
The consistent detection and quantification of protein post-translational modifications (PTMs) across sample cohorts is an essential prerequisite for the functional analysis of biological processes. Data-independent acquisition (DIA), a bottom-up mass spectrometry-based proteomic strategy exemplified by SWATH-MS, provides complete precursor and fragment ion information of a sample and thus, in principle, the information to identify peptidoforms, the modified variants of a peptide. However, due to the convoluted structure of DIA data sets, the confident and systematic identification and quantification of peptidoforms has remained challenging. Here we present IPF (Inference of PeptidoForms), a fully automated algorithm that uses spectral libraries to query, validate and quantify peptidoforms in DIA data sets. The method was developed on data acquired by SWATH-MS and benchmarked using a synthetic phosphopeptide reference data set and phosphopeptide-enriched samples. The data indicate that IPF reduced false site-localization by more than 7-fold in comparison to previous approaches, while recovering 85.4% of the true signals. IPF was applied to detect and quantify peptidoforms carrying ten different types of PTMs in DIA data acquired from more than 200 samples of undepleted blood plasma of a human twin cohort. The data apportioned, for the first time, the contribution of heritable, environmental and longitudinal effects to the observed quantitative variability of specific modifications in blood plasma of a human population. PMID:28604659
Dignity realization of patients with stroke in hospital care: A grounded theory.
Rannikko, Sunna; Stolt, Minna; Suhonen, Riitta; Leino-Kilpi, Helena
2017-01-01
Dignity is seen as an important but complex concept in the healthcare context. In this context, the discussion of dignity includes concepts of other ethical principles such as autonomy and privacy. Patients consider dignity to cover individuality, patient's feelings, communication, and the behavior of healthcare personnel. However, there is a lack of knowledge concerning the realization of patients' dignity in hospital care and the focus of the study is therefore on the realization of dignity of the vulnerable group of patients with stroke. The aim of the study was to create a theoretical construct to describe the dignity realization of patients with stroke in hospital care. Research design and participants: Patients with stroke (n = 16) were interviewed in 2015 using a semi-structured interview containing open questions concerning dignity. The data were analyzed using constant comparison of Grounded Theory. Ethical considerations: Ethical approval for the research was obtained from the Ethics Committee of the University. The permission for the research was given by the hospital. Informed consent was obtained from participants. The "Theory of Dignity Realization of Patients with Stroke in Hospital Care" consists of a core category including generic elements of the new situation and dignity realization types. The core category was identified as "Dignity in a new situation" and the generic elements as health history, life history, individuality and stroke. Dignity of patients with stroke is realized through specific types of realization: person-related dignity type, control-related dignity type, independence-related dignity type, social-related dignity type, and care-related dignity type. The theory has similar elements with the previous literature but the whole construct is new. The theory reveals possible special characteristics in dignity realization of patients with stroke. For healthcare personnel, the theory provides a frame for a better understanding and recognition of how dignity of patients with stroke is realized.
ERIC Educational Resources Information Center
1997
This document contains three papers from a symposium on management development. "LMX (Leader-Member Exchange) Theory, Personality Type, and Management Development" (Janet Z. Burns) reports the results of a study on the similarities and differences in personality type (as outlined in the theories of Carl Jung and Isabel Myers) and its relationship…
Type II string theory on Calabi-Yau manifolds with torsion and non-Abelian discrete gauge symmetries
Braun, Volker; Cvetič, Mirjam; Donagi, Ron; ...
2017-07-26
Here, we provide the first explicit example of Type IIB string theory compactification on a globally defined Calabi-Yau threefold with torsion which results in a four-dimensional effective theory with a non-Abelian discrete gauge symmetry. Our example is based on a particular Calabi-Yau manifold, the quotient of a product of three elliptic curves by a fixed point free action of Z2 × Z2. Its cohomology contains torsion classes in various degrees. The main technical novelty is in determining the multiplicative structure of (the torsion part of) the cohomology ring, and in particular showing that the cup product of second cohomology torsion elements goes non-trivially to the fourth cohomology. This specifies a non-Abelian, Heisenberg-type discrete symmetry group of the four-dimensional theory.
NASA Astrophysics Data System (ADS)
Trost, Wiebke; Frühholz, Sascha
2015-06-01
The quartet theory of human emotions proposed by Koelsch and colleagues [1] identifies four different affect systems involved in the processing of particular types of emotions. Moreover, the theory integrates both basic emotions and more complex emotion concepts, which also include aesthetic emotions such as musical emotions. The authors identify a particular brain system for each kind of emotion type, also by contrasting them to brain structures that are generally involved in emotion processing irrespective of the type of emotion. A brain system that has received less attention in emotion theories, but which represents the one system of the quartet that induces attachment-related emotions, is the hippocampus.
Pre-Analytical Conditions in Non-Invasive Prenatal Testing of Cell-Free Fetal RHD
Rieneck, Klaus; Krog, Grethe Risum; Nielsen, Leif Kofoed; Tabor, Ann; Dziegiel, Morten Hanefeld
2013-01-01
Background: Non-invasive prenatal testing of cell-free fetal DNA (cffDNA) in maternal plasma can predict the fetal RhD type in D negative pregnant women. In Denmark, routine antenatal screening for the fetal RhD gene (RHD) directs the administration of antenatal anti-D prophylaxis only to women who carry an RhD positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based on data from routine antenatal RHD screening. Methods: Blood samples were drawn at gestational age 25 weeks. DNA extracted from 1 mL of plasma was analyzed for fetal RHD using a duplex method for exon 7/10. We investigated the effect of blood sample transportation time (n = 110) and ambient outdoor temperatures (n = 1539) on the levels of cffDNA and total DNA. We compared two different quantification methods, the delta Ct method and a universal standard curve. PCR pipetting was compared on two systems (n = 104). Results: The cffDNA level was unaffected by blood sample transportation for up to 9 days and by ambient outdoor temperatures ranging from -10°C to 28°C during transport. The universal standard curve was applicable for cffDNA quantification. Identical levels of cffDNA were observed using the two automated PCR pipetting systems. We detected a mean of 100 fetal DNA copies/mL at a median gestational age of 25 weeks (range 10–39, n = 1317). Conclusion: The setup for real-time PCR-based, non-invasive prenatal testing of cffDNA in the Capital Region of Denmark is very robust. Our findings regarding the transportation of blood samples demonstrate the high stability of cffDNA. The applicability of a universal standard curve facilitates easy cffDNA quantification. PMID:24204719
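The delta Ct and standard-curve strategies compared above both reduce to simple real-time PCR arithmetic. Below is a minimal sketch of quantification against a standard curve, assuming the conventional relation Ct = slope · log10(copies) + intercept; the slope, intercept and volume parameters are hypothetical placeholders, not values from the study.

```python
import numpy as np

def copies_per_ml(ct, slope, intercept, elution_volume_ul=50.0,
                  plasma_volume_ml=1.0, template_ul_per_reaction=5.0):
    """Convert a Ct value to DNA copies per mL of plasma via a standard curve.

    The standard curve is assumed to follow Ct = slope * log10(copies) + intercept,
    as obtained from a serial dilution of a reference material.
    """
    copies_per_reaction = 10 ** ((ct - intercept) / slope)
    # Scale from the reaction input back to the original plasma volume.
    reactions_per_elution = elution_volume_ul / template_ul_per_reaction
    return copies_per_reaction * reactions_per_elution / plasma_volume_ml

# Illustrative values only (hypothetical slope/intercept, not from the paper).
print(round(copies_per_ml(ct=36.5, slope=-3.32, intercept=41.0), 1))
```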
[Quantification of pulmonary emphysema in multislice-CT using different software tools].
Heussel, C P; Achenbach, T; Buschsieweke, C; Kuhnigk, J; Weinheimer, O; Hammer, G; Düber, C; Kauczor, H-U
2006-10-01
Thin-section MSCT datasets of the lung, comprising approximately 300 images, are difficult to evaluate manually. A computer-assisted pre-diagnosis can help with reporting. Furthermore, post-processing techniques, for instance for quantification of emphysema on the basis of three-dimensional anatomical information, might be improved and the workflow might be further automated. The results of 4 programs (Pulmo, Volume, YACTA and PulmoFUNC) for the quantitative analysis of emphysema (lung and emphysema volume, mean lung density and emphysema index) of 30 consecutive thin-section MSCT datasets with different emphysema severity levels were compared. The classification result of the YACTA program for different types of emphysema was also analyzed. Pulmo and Volume have median operating times of 105 and 59 minutes, respectively, owing to the need for extensive manual correction of the lung segmentation. The largely automated programs PulmoFUNC and YACTA have median runtimes of 26 and 16 minutes, respectively. The evaluation with Pulmo and Volume of 2 different datasets resulted in implausible values. PulmoFUNC crashed with 2 other datasets in a reproducible manner. Only with YACTA could all image datasets be evaluated. The lung volume, emphysema volume, emphysema index and mean lung density determined by YACTA and PulmoFUNC are significantly larger than the corresponding values of Volume and Pulmo (differences: Volume: 119 cm³/65 cm³/1%/17 HU, Pulmo: 60 cm³/96 cm³/1%/37 HU). Classification of the emphysema type agreed with that of the radiologist in 26 panlobular, 22 paraseptal and 15 centrilobular emphysema cases. The substantial time required hinders the use of quantitative emphysema analysis in clinical routine. The results of YACTA and PulmoFUNC are affected by the dedicated exclusion of the tracheobronchial system. These fully automatic tools enable not only fast quantification without manual interaction, but also reproducible measurement without user dependence.
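Emphysema indices of the kind reported by these tools are conventionally derived by thresholding segmented lung voxels in Hounsfield units. Below is a minimal sketch under that assumption, using a pre-computed binary lung mask and the common -950 HU cutoff; the abstract does not state which threshold the evaluated programs use.

```python
import numpy as np

def emphysema_metrics(hu_volume, lung_mask, voxel_volume_mm3, threshold_hu=-950):
    """Compute lung volume, emphysema volume, emphysema index and mean lung density.

    hu_volume : 3D array of CT values in Hounsfield units
    lung_mask : boolean 3D array marking segmented lung voxels
    """
    lung_hu = hu_volume[lung_mask]
    lung_volume_cm3 = lung_hu.size * voxel_volume_mm3 / 1000.0
    emphysema_voxels = np.count_nonzero(lung_hu < threshold_hu)
    emphysema_volume_cm3 = emphysema_voxels * voxel_volume_mm3 / 1000.0
    emphysema_index_pct = 100.0 * emphysema_voxels / lung_hu.size
    mean_lung_density_hu = float(lung_hu.mean())
    return lung_volume_cm3, emphysema_volume_cm3, emphysema_index_pct, mean_lung_density_hu
```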
Pre-analytical conditions in non-invasive prenatal testing of cell-free fetal RHD.
Clausen, Frederik Banch; Jakobsen, Tanja Roien; Rieneck, Klaus; Krog, Grethe Risum; Nielsen, Leif Kofoed; Tabor, Ann; Dziegiel, Morten Hanefeld
2013-01-01
Non-invasive prenatal testing of cell-free fetal DNA (cffDNA) in maternal plasma can predict the fetal RhD type in D negative pregnant women. In Denmark, routine antenatal screening for the fetal RhD gene (RHD) directs the administration of antenatal anti-D prophylaxis only to women who carry an RhD positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based on data from routine antenatal RHD screening. Blood samples were drawn at gestational age 25 weeks. DNA extracted from 1 mL of plasma was analyzed for fetal RHD using a duplex method for exon 7/10. We investigated the effect of blood sample transportation time (n = 110) and ambient outdoor temperatures (n = 1539) on the levels of cffDNA and total DNA. We compared two different quantification methods, the delta Ct method and a universal standard curve. PCR pipetting was compared on two systems (n = 104). The cffDNA level was unaffected by blood sample transportation for up to 9 days and by ambient outdoor temperatures ranging from -10 °C to 28 °C during transport. The universal standard curve was applicable for cffDNA quantification. Identical levels of cffDNA were observed using the two automated PCR pipetting systems. We detected a mean of 100 fetal DNA copies/mL at a median gestational age of 25 weeks (range 10-39, n = 1317). The setup for real-time PCR-based, non-invasive prenatal testing of cffDNA in the Capital Region of Denmark is very robust. Our findings regarding the transportation of blood samples demonstrate the high stability of cffDNA. The applicability of a universal standard curve facilitates easy cffDNA quantification.
Cusick, Kathleen D; Fitzgerald, Lisa A; Pirlo, Russell K; Cockrell, Allison L; Petersen, Emily R; Biffinger, Justin C
2014-01-01
Neurospora crassa has served as a model organism for studying circadian pathways and more recently has gained attention in the biofuel industry due to its enhanced capacity for cellulase production. However, in order to optimize N. crassa for biotechnological applications, metabolic pathways during growth under different environmental conditions must be addressed. Reverse-transcription quantitative PCR (RT-qPCR) is a technique that provides a high-throughput platform from which to measure the expression of a large set of genes over time. The selection of a suitable reference gene is critical for gene expression studies using relative quantification, as this strategy is based on normalization of target gene expression to a reference gene whose expression is stable under the experimental conditions. This study evaluated twelve candidate reference genes for use with N. crassa when grown in continuous culture bioreactors under different light and temperature conditions. Based on combined stability values from the NormFinder and BestKeeper software packages, the following are the most appropriate reference genes under conditions of: (1) light/dark cycling: btl, asl, and vma1; (2) all-dark growth: btl, tbp, vma1, and vma2; (3) temperature flux: btl, vma1, act, and asl; (4) all conditions combined: vma1, vma2, tbp, and btl. Since N. crassa exists as different cell types (uni- or multi-nucleated), expression changes in a subset of the candidate genes were further assessed using absolute quantification. A strong negative correlation was found to exist between ratio and threshold cycle (CT) values, demonstrating that CT changes serve as a reliable reflection of transcript, and not gene copy number, fluctuations. The results of this study identified genes that are appropriate for use as reference genes in RT-qPCR studies with N. crassa and demonstrated that even with the presence of different cell types, relative quantification is an acceptable method for measuring gene expression changes during growth in bioreactors.
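Relative quantification against a stable reference gene, as evaluated here, is typically computed with the 2^-ΔΔCt approach. Below is a minimal sketch assuming equal amplification efficiencies for target and reference; the Ct values and the choice of vma1 as reference are purely illustrative.

```python
def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_control, ct_ref_control):
    """Fold change of a target gene via the 2^-ddCt method,
    normalized to a reference gene (e.g. vma1) and a control condition."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return 2.0 ** (-dd_ct)

# Hypothetical Ct values: target induced ~4-fold under light/dark cycling.
print(relative_expression(24.0, 20.0, 26.0, 20.0))  # -> 4.0
```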
NASA Technical Reports Server (NTRS)
Passman, Stephen L.
1989-01-01
Generally, two types of theory are used to describe the field equations for suspensions. The so-called postulated equations are based on the kinetic theory of mixtures, which logically should give reasonable equations for solutions. The basis for the use of such theory for suspensions is tenuous, though it at least gives a logical path for mathematical arguments. It has the disadvantage that it leads to a system of equations which is underdetermined, in a sense that can be made precise. On the other hand, the so-called averaging theory starts with a determined system, but the very process of averaging renders the resulting system underdetermined. A third type of theory is proposed in which the kinetic theory of gases is used to motivate continuum equations for the suspended particles. This entails an interpretation of the stress in the particles that is different from the usual one. Classical theory is used to describe the motion of the suspending medium. The result is a determined system for a dilute suspension. Extension of the theory to more concentrated systems is discussed.
NASA Astrophysics Data System (ADS)
Konapala, Goutam; Mishra, Ashok
2017-12-01
The quantification of spatio-temporal hydroclimatic extreme events is essential for water resources planning, disaster mitigation, and building a climate-resilient society. However, quantification of these extreme events has always been a great challenge, which is further compounded by climate variability and change. Recently, complex network theory has been applied in the earth science community to investigate spatial connections among hydrologic fluxes (e.g., rainfall and streamflow) in the water cycle. However, there are limited applications of complex network theory for investigating hydroclimatic extreme events. This article attempts to provide an overview of complex networks and extreme events, the event synchronization method, the construction of networks, their statistical significance and the associated network evaluation metrics. For illustration, we apply the complex network approach to study the spatio-temporal evolution of droughts in the Continental USA (CONUS). A different drought threshold defines a different drought event and carries different socio-economic implications. Therefore, it is of interest to explore the role of thresholds in the spatio-temporal evolution of drought through network analysis. In this study, the long-term (1900-2016) Palmer drought severity index (PDSI) was selected for spatio-temporal drought analysis using three network-based metrics (i.e., strength, direction and distance). The results indicate that drought events propagate differently at different thresholds associated with the initiation of drought events. The direction metric indicated that onsets of mild drought events usually propagate in a more spatially clustered and uniform manner than onsets of moderate droughts. The distance metric shows that drought events propagate over longer distances in the western part of the CONUS than in the eastern part. We believe that the network-aided metrics utilized in this study can be an important tool in advancing our knowledge of drought propagation as well as other hydroclimatic extreme events. Although drought propagation is investigated here using the network approach, process-based (physics-based) approaches are essential to further understand the dynamics of hydroclimatic extreme events.
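The event synchronization method mentioned above scores how often extreme events at two locations co-occur within a short time lag, and the resulting pairwise scores form the links of the network. Below is a minimal sketch of the commonly used symmetric form, assuming a fixed maximum lag rather than the adaptive lag used in some variants; the event times are invented for illustration.

```python
import numpy as np

def event_synchronization(times_a, times_b, max_lag=2):
    """Symmetric event-synchronization strength between two event-time series.

    times_a, times_b : sorted sequences of event occurrence times (e.g. months
                       in which drought onset was detected at two grid cells)
    max_lag          : maximum temporal separation counted as synchronous
    """
    def directed_count(src, dst):
        count = 0.0
        for t in src:
            lags = t - np.asarray(dst)
            count += np.sum((lags > 0) & (lags <= max_lag))
            count += 0.5 * np.sum(lags == 0)
        return count

    norm = np.sqrt(len(times_a) * len(times_b))
    if norm == 0:
        return 0.0
    return (directed_count(times_a, times_b) + directed_count(times_b, times_a)) / norm

# Two hypothetical drought-onset series (time steps); strongly synchronized.
print(round(event_synchronization([3, 10, 25, 40], [4, 11, 26, 41]), 2))  # -> 1.0
```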
Theory of Type 3 and Type 2 Solar Radio Emissions
NASA Technical Reports Server (NTRS)
Robinson, P. A.; Cairns, I. H.
2000-01-01
The main features of some current theories of type III and type II bursts are outlined. Among the most common solar radio bursts, type III bursts are produced at frequencies of 10 kHz to a few GHz when electron beams are ejected from solar active regions, entering the corona and solar wind at typical speeds of 0.1c. These beams provide energy to generate Langmuir waves via a streaming instability. In the current stochastic-growth theory, Langmuir waves grow in clumps associated with random low-frequency density fluctuations, leading to the observed spiky waves. Nonlinear wave-wave interactions then lead to secondary emission of observable radio waves near the fundamental and harmonic of the plasma frequency. Subsequent scattering processes modify the dynamic radio spectra, while back-reaction of Langmuir waves on the beam causes it to fluctuate about a state of marginal stability. Theories based on these ideas can account for the observed properties of type III bursts, including the in situ waves and the dynamic spectra of the radiation. Type II bursts are associated with shock waves propagating through the corona and interplanetary space and radiating from roughly 30 kHz to 1 GHz. Their basic emission mechanisms are believed to be similar to those of type III events and radiation from Earth's foreshock. However, several sub-classes of type II bursts may exist with different source regions and detailed characteristics. Theoretical models for type II bursts are briefly reviewed, focusing on a model with emission from a foreshock region upstream of the shock for which observational evidence has just been reported.
den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M
2017-03-01
Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates is often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometric methods is extremely high, absolute quantification of GSH-conjugates is critically dependent on the availability of authentic references. Although 1H NMR is currently the method of choice for quantification of metabolites formed biosynthetically, its intrinsically low sensitivity can be a limiting factor in the quantification of GSH-conjugates, which generally are formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC derivative, quantification of the glutamic acid OPA/NAC derivative appeared most suitable for quantification of GSH-conjugates. The novel method was used to quantify the concentrations of GSH-conjugates of diclofenac, clozapine and acetaminophen, and quantification was consistent with 1H NMR, but with a more than 100-fold lower detection limit for absolute quantification. Copyright © 2017. Published by Elsevier B.V.
Nonlinear spin susceptibility in topological insulators
NASA Astrophysics Data System (ADS)
Shiranzaei, Mahroo; Fransson, Jonas; Cheraghchi, Hosein; Parhizgar, Fariborz
2018-05-01
We revise the theory of the indirect exchange interaction between magnetic impurities beyond the linear response theory to establish the effect of impurity resonances in the surface states of a three-dimensional topological insulator. The interaction is composed of isotropic Heisenberg, anisotropic Ising, and Dzyaloshinskii-Moriya types of couplings. We find that all three contributions are finite at the Dirac point, which is in stark contrast to the linear response theory which predicts a vanishing Dzyaloshinskii-Moriya-type contribution. We show that the spin-independent component of the impurity scattering can generate large values of the Dzyaloshinskii-Moriya-type coupling in comparison with the Heisenberg and Ising types of couplings, while these latter contributions drastically reduce in magnitude and undergo sign changes. As a result, both collinear and noncollinear configurations are allowed magnetic configurations of the impurities.
Information theory in systems biology. Part I: Gene regulatory and metabolic networks.
Mousavian, Zaynab; Kavousi, Kaveh; Masoudi-Nejad, Ali
2016-03-01
"A Mathematical Theory of Communication", was published in 1948 by Claude Shannon to establish a framework that is now known as information theory. In recent decades, information theory has gained much attention in the area of systems biology. The aim of this paper is to provide a systematic review of those contributions that have applied information theory in inferring or understanding of biological systems. Based on the type of system components and the interactions between them, we classify the biological systems into 4 main classes: gene regulatory, metabolic, protein-protein interaction and signaling networks. In the first part of this review, we attempt to introduce most of the existing studies on two types of biological networks, including gene regulatory and metabolic networks, which are founded on the concepts of information theory. Copyright © 2015 Elsevier Ltd. All rights reserved.
2015-12-02
simplification of the equations but at the expense of introducing modeling errors. We have shown that the Wick solutions have accuracy comparable to...the system of equations for the coefficients of formal power series solutions. Moreover, the structure of this propagator is seemingly universal, i.e...the problem of computing the numerical solution to kinetic partial differential equations involving many phase variables. These types of equations
Wang, X; Hopkins, C
2016-10-01
Advanced Statistical Energy Analysis (ASEA) is used to predict vibration transmission across coupled beams which support multiple wave types up to high frequencies where Timoshenko theory is valid. Bending-longitudinal and bending-torsional models are considered for an L-junction and rectangular beam frame. Comparisons are made with measurements, Finite Element Methods (FEM) and Statistical Energy Analysis (SEA). When beams support at least two local modes for each wave type in a frequency band and the modal overlap factor is at least 0.1, measurements and FEM have relatively smooth curves. Agreement between measurements, FEM, and ASEA demonstrates that ASEA is able to predict high propagation losses which are not accounted for with SEA. These propagation losses tend to become more important at high frequencies with relatively high internal loss factors and can occur when there is more than one wave type. At such high frequencies, Timoshenko theory, rather than Euler-Bernoulli theory, is often required. Timoshenko theory is incorporated in ASEA and SEA using wave theory transmission coefficients derived assuming Euler-Bernoulli theory, but using Timoshenko group velocity when calculating coupling loss factors. The changeover between theories is appropriate above the frequency where there is a 26% difference between Euler-Bernoulli and Timoshenko group velocities.
Ahmavaara, Anni; Houston, Diane M
2007-09-01
Dweck has emphasized the role of pupils' implicit theories about intellectual ability in explaining variations in their engagement, persistence and achievement. She has also highlighted the role of confidence in one's intelligence as a factor influencing educational attainment. The aim of this paper is to develop a model of achievement aspiration in adolescence and to compare young people who are educated at a selective grammar school with those who attend a non-selective 'secondary modern' school. The sample consisted of 856 English secondary school pupils in years 7 and 10 from two selective and two non-selective secondary schools. Questionnaires were completed in schools. The findings are consistent with the model, showing that achievement aspiration is predicted directly by gender, school type and type of intelligence theory. Importantly, school type also affects aspirations indirectly, with effects being mediated by confidence in one's own intelligence and perceived academic performance. Intelligence theory also affects aspirations indirectly with effects being mediated by perceived academic performance, confidence and self-esteem. Additionally, intelligence theory has a stronger effect on aspirations in the selective schools than in the non-selective schools. The findings provide substantial support for Dweck's self-theory, showing that implicit theories are related to aspirations. However, the way in which theory of intelligence relates to age and gender suggests there may be important cross-cultural or contextual differences not addressed by Dweck's theory. Further research should also investigate the causal paths between aspirations, implicit theories of intelligence and the impact of school selection.
A Profile Approach to Self-Determination Theory Motivations at Work
ERIC Educational Resources Information Center
Moran, Christina M.; Diefendorff, James M.; Kim, Tae-Yeol; Liu, Zhi-Qiang
2012-01-01
Self-determination theory (SDT) posits the existence of distinct types of motivation (i.e., external, introjected, identified, integrated, and intrinsic). Research on these different types of motivation has typically adopted a variable-centered approach that seeks to understand how each motivation in isolation relates to employee outcomes. We…
Influence of Adolescent Social Cliques on Vocational Identity.
ERIC Educational Resources Information Center
Johnson, John A.; Cheek, Jonathan M.
While Holland's (1973) theory of personality types and vocational identity is widely used, the theory does not specify the developmental antecedents of the six personality types. To examine the relationship between membership in adolescent social cliques and vocational identity in early adulthood, four groups of college students (N=192)…
ERIC Educational Resources Information Center
Guajardo, Nicole R.; Turley-Ames, Kandi Jo
2004-01-01
Two studies examined associations between theory of mind performance and counterfactual thinking using both antecedent and consequent counterfactual tasks. Moreover, the studies examined children's abilities to generate different types of counterfactual statements in terms of direction and structure. Participants were 3-, 4-, and 5-year-old…
Marriage Counseling Using Differing Personality Types as a Resource.
ERIC Educational Resources Information Center
Emanuel, Joseph; Bernhardt, Greg
Carl Jung's theory of type states that much seemingly chance variation in human behavior results, not from chance, but from basic differences in human functioning. This theory is divided into two major components: fundamental human attitudes (extroversion, introversion) and basic mental processes (sensation, intuition, thinking, feeling).…
Topping, David; Wright, Scott A.; Griffiths, Ronald; Dean, David
2016-01-01
We have developed a physically based method for using two acoustic frequencies to measure suspended-silt-and-clay concentration, suspended-sand concentration, and suspended-sand median grain size in river cross sections at 15-minute intervals over decadal timescales. The method is strongly grounded in the extensive scientific literature on the scattering of sound by suspensions of small particles. In particular, the method takes advantage of the specific theoretical relations among acoustic frequency, acoustic attenuation, acoustic backscatter, suspended-sediment concentration, and suspended-sediment grain-size distribution. We briefly describe the theory and methods, demonstrate the application of the method, and compute biases and errors in the method at 14 stations in the Colorado River and Rio Grande basins, where large numbers of suspended-sediment samples have been collected concurrently with acoustical measurements over many years. Quantification of errors in sediment-transport measurements made using this method is essential if the measurements are to be used effectively, e.g., to evaluate uncertainty in long-term sediment loads and budgets
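At the level of the governing relations, the two-frequency idea is that sediment-induced attenuation at the lower frequency scales with silt-and-clay concentration, while attenuation-corrected backscatter at the higher frequency scales with the logarithm of sand concentration. The sketch below is a heavily simplified illustration of that inversion logic only; the coefficients are hypothetical placeholders that would in practice be calibrated against concurrent physical samples, and they are not the relations or values used by the authors.

```python
def invert_two_frequency(alpha_sediment_db_per_m, backscatter_db,
                         k_attenuation=0.05, b0=-30.0, b1=0.10):
    """Illustrative two-frequency inversion for suspended sediment.

    alpha_sediment_db_per_m : sediment-induced attenuation at the lower frequency
    backscatter_db          : attenuation-corrected backscatter at the higher frequency
    k_attenuation, b0, b1   : hypothetical calibration coefficients
    """
    silt_clay_mg_per_l = alpha_sediment_db_per_m / k_attenuation
    sand_mg_per_l = 10 ** (b0 + b1 * backscatter_db)
    return silt_clay_mg_per_l, sand_mg_per_l

# Hypothetical measurements at one 15-minute interval.
print(invert_two_frequency(alpha_sediment_db_per_m=2.5, backscatter_db=320.0))
```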
Global patterns of tropical forest fragmentation
NASA Astrophysics Data System (ADS)
Taubert, Franziska; Fischer, Rico; Groeneveld, Jürgen; Lehmann, Sebastian; Müller, Michael S.; Rödig, Edna; Wiegand, Thorsten; Huth, Andreas
2018-02-01
Remote sensing enables the quantification of tropical deforestation with high spatial resolution. This in-depth mapping has led to substantial advances in the analysis of continent-wide fragmentation of tropical forests. Here we identified approximately 130 million forest fragments in three continents that show surprisingly similar power-law size and perimeter distributions as well as fractal dimensions. Power-law distributions have been observed in many natural phenomena such as wildfires, landslides and earthquakes. The principles of percolation theory provide one explanation for the observed patterns, and suggest that forest fragmentation is close to the critical point of percolation; simulation modelling also supports this hypothesis. The observed patterns emerge not only from random deforestation, which can be described by percolation theory, but also from a wide range of deforestation and forest-recovery regimes. Our models predict that additional forest loss will result in a large increase in the total number of forest fragments—at maximum by a factor of 33 over 50 years—as well as a decrease in their size, and that these consequences could be partly mitigated by reforestation and forest protection.
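Power-law size distributions of the kind reported for forest fragments are usually summarized by their exponent, for which the continuous maximum-likelihood estimator is straightforward. Below is a minimal sketch with synthetic fragment areas standing in for the mapped data; the lower cutoff x_min is an assumption of the example.

```python
import numpy as np

def power_law_exponent(sizes, x_min):
    """Maximum-likelihood exponent of a continuous power law p(x) ~ x^-alpha for x >= x_min."""
    sizes = np.asarray(sizes, dtype=float)
    tail = sizes[sizes >= x_min]
    return 1.0 + tail.size / np.sum(np.log(tail / x_min))

# Synthetic fragment areas drawn from a power law with alpha = 2.0 (inverse-CDF sampling).
rng = np.random.default_rng(1)
u = rng.uniform(size=100_000)
areas = 1.0 * (1.0 - u) ** (-1.0 / (2.0 - 1.0))
print(round(power_law_exponent(areas, x_min=1.0), 2))  # close to 2.0
```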
Stochastic Evolution of Augmented Born-Infeld Equations
NASA Astrophysics Data System (ADS)
Holm, Darryl D.
2018-06-01
This paper compares the results of applying a recently developed method of stochastic uncertainty quantification designed for fluid dynamics to the Born-Infeld model of nonlinear electromagnetism. The similarities in the results are striking. Namely, the introduction of Stratonovich cylindrical noise into each of their Hamiltonian formulations introduces stochastic Lie transport into their dynamics in the same form for both theories. Moreover, the resulting stochastic partial differential equations retain their unperturbed form, except for an additional term representing induced Lie transport by the set of divergence-free vector fields associated with the spatial correlations of the cylindrical noise. The explanation for this remarkable similarity lies in the method of construction of the Hamiltonian for the Stratonovich stochastic contribution to the motion in both cases, which is done via pairing spatial correlation eigenvectors for cylindrical noise with the momentum map for the deterministic motion. This momentum map is responsible for the well-known analogy between hydrodynamics and electromagnetism. The momentum map for the Maxwell and Born-Infeld theories of electromagnetism treated here is the 1-form density known as the Poynting vector. Two appendices treat the Hamiltonian structures underlying these results.
Focused Belief Measures for Uncertainty Quantification in High Performance Semantic Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Weaver, Jesse R.
In web-scale semantic data analytics there is a great need for methods which aggregate uncertainty claims, on the one hand respecting the information provided as accurately as possible, while on the other still being tractable. Traditional statistical methods are more robust, but only represent distributional, additive uncertainty. Generalized information theory methods, including fuzzy systems and Dempster-Shafer (DS) evidence theory, represent multiple forms of uncertainty, but are computationally and methodologically difficult. We require methods which provide an effective balance between the complete representation of the full complexity of uncertainty claims in their interaction, while satisfying the needs of both computational complexity and human cognition. Here we build on Jøsang's subjective logic to posit methods in focused belief measures (FBMs), where a full DS structure is focused to a single event. The resulting ternary logical structure is posited to be able to capture the minimal amount of generalized complexity needed at a maximum of computational efficiency. We demonstrate the efficacy of this approach in a web ingest experiment over the 2012 Billion Triple dataset from the Semantic Web Challenge.
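Focusing a Dempster-Shafer structure onto a single event yields a ternary (belief, disbelief, uncertainty) opinion in the sense of subjective logic. Below is a minimal sketch for a binary frame, combining two sources with Dempster's rule before focusing; the mass assignments are invented for illustration and this is not the authors' FBM aggregation operator.

```python
def combine_and_focus(m1, m2):
    """Combine two mass functions over the frame {A, notA, Theta} with Dempster's rule,
    then focus on event A as a (belief, disbelief, uncertainty) opinion."""
    keys = ("A", "notA", "Theta")
    intersect = {"A": {"A": "A", "notA": None, "Theta": "A"},
                 "notA": {"A": None, "notA": "notA", "Theta": "notA"},
                 "Theta": {"A": "A", "notA": "notA", "Theta": "Theta"}}
    combined = {k: 0.0 for k in keys}
    conflict = 0.0
    for k1 in keys:
        for k2 in keys:
            target = intersect[k1][k2]
            mass = m1[k1] * m2[k2]
            if target is None:
                conflict += mass
            else:
                combined[target] += mass
    combined = {k: v / (1.0 - conflict) for k, v in combined.items()}
    # Focused belief measure on A: (belief, disbelief, uncertainty).
    return combined["A"], combined["notA"], combined["Theta"]

source1 = {"A": 0.6, "notA": 0.1, "Theta": 0.3}
source2 = {"A": 0.5, "notA": 0.2, "Theta": 0.3}
print(combine_and_focus(source1, source2))
```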
Verification of Small Hole Theory for Application to Wire Chaffing Resulting in Shield Faults
NASA Technical Reports Server (NTRS)
Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.
2011-01-01
Our work is focused upon developing methods for wire chafe fault detection through the use of reflectometry to assess shield integrity. When shielded electrical aircraft wiring first begins to chafe typically the resulting evidence is small hole(s) in the shielding. We are focused upon developing algorithms and the signal processing necessary to first detect these small holes prior to incurring damage to the inner conductors. Our approach has been to develop a first principles physics model combined with probabilistic inference, and to verify this model with laboratory experiments as well as through simulation. Previously we have presented the electromagnetic small-hole theory and how it might be applied to coaxial cable. In this presentation, we present our efforts to verify this theoretical approach with high-fidelity electromagnetic simulations (COMSOL). Laboratory observations are used to parameterize the computationally efficient theoretical model with probabilistic inference resulting in quantification of hole size and location. Our efforts in characterizing faults in coaxial cable are subsequently leading to fault detection in shielded twisted pair as well as analysis of intermittent faulty connectors using similar techniques.
Thermodynamics, Life, the Universe and Everything
NASA Astrophysics Data System (ADS)
Neswald, Elizabeth
2015-01-01
The laws of thermodynamics were developed in the first half of the nineteenth century to describe processes governing the working of steam engines. The mechanical equivalent of heat, which quantified the relationship between heat and motion, enabled the quantification and comparison of all energy transformation processes. The energy laws and the mechanical equivalent of heat quickly moved out of the narrower field of physics to form the basis of a cosmic narrative that began with stellar evolution and continued to universal heat death. Newer physiological theories turned to the energy laws to explain life processes, energy and entropy were integrated into theories of biological evolution and degeneration, and economists and cultural theorists turned to thermodynamics to explore both the limits of natural resources and economic expansion and the contradictions of industrial modernity. This paper discusses the career of thermodynamics as an explanatory model and cultural commonplace in the late nineteenth and early twentieth centuries, and the different scientific, religious, and social perspectives that could be expressed through this model. Connected through the entropy law intimately to irreversible processes and time, thermodynamics provided an arena to debate which way the world was going.
Measuring the shapes of macromolecules – and why it matters
Li, Jie; Mach, Paul; Koehl, Patrice
2013-01-01
The molecular basis of life rests on the activity of biological macromolecules, mostly nucleic acids and proteins. A perhaps surprising finding that crystallized over the last handful of decades is that geometric reasoning plays a major role in our attempt to understand these activities. In this paper, we address this connection between geometry and biology, focusing on methods for measuring and characterizing the shapes of macromolecules. We briefly review existing numerical and analytical approaches that solve these problems. We cover in more details our own work in this field, focusing on the alpha shape theory as it provides a unifying mathematical framework that enable the analytical calculations of the surface area and volume of a macromolecule represented as a union of balls, the detection of pockets and cavities in the molecule, and the quantification of contacts between the atomic balls. We have shown that each of these quantities can be related to physical properties of the molecule under study and ultimately provides insight on its activity. We conclude with a brief description of new challenges for the alpha shape theory in modern structural biology. PMID:24688748
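The alpha shape framework gives analytical surface areas and volumes for a union of balls; as a simple point of reference, the same volume can also be approximated by Monte Carlo sampling over a bounding box. The sketch below is that naive estimator, not the analytical alpha-shape computation described in the paper; the atom coordinates and radii are hypothetical.

```python
import numpy as np

def union_of_balls_volume(centers, radii, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the volume of a union of balls (e.g. a molecule's atoms)."""
    centers = np.asarray(centers, dtype=float)
    radii = np.asarray(radii, dtype=float)
    lo = (centers - radii[:, None]).min(axis=0)
    hi = (centers + radii[:, None]).max(axis=0)
    rng = np.random.default_rng(seed)
    points = rng.uniform(lo, hi, size=(n_samples, 3))
    # A sample point is inside the union if it lies inside any ball.
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    inside = (d2 <= radii[None, :] ** 2).any(axis=1)
    box_volume = np.prod(hi - lo)
    return box_volume * inside.mean()

# Two overlapping "atoms" of radius 1.7 Å (hypothetical coordinates, in Å).
print(round(union_of_balls_volume([[0, 0, 0], [1.5, 0, 0]], [1.7, 1.7]), 1))
```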
Integrated phenotypes: understanding trait covariation in plants and animals
Armbruster, W. Scott; Pélabon, Christophe; Bolstad, Geir H.; Hansen, Thomas F.
2014-01-01
Integration and modularity refer to the patterns and processes of trait interaction and independence. Both terms have complex histories with respect to both conceptualization and quantification, resulting in a plethora of integration indices in use. We review briefly the divergent definitions, uses and measures of integration and modularity and make conceptual links to allometry. We also discuss how integration and modularity might evolve. Although integration is generally thought to be generated and maintained by correlational selection, theoretical considerations suggest the relationship is not straightforward. We caution here against uncontrolled comparisons of indices across studies. In the absence of controls for trait number, dimensionality, homology, development and function, it is difficult, or even impossible, to compare integration indices across organisms or traits. We suggest that care be invested in relating measurement to underlying theory or hypotheses, and that summative, theory-free descriptors of integration generally be avoided. The papers that follow in this Theme Issue illustrate the diversity of approaches to studying integration and modularity, highlighting strengths and pitfalls that await researchers investigating integration in plants and animals. PMID:25002693
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen
2015-11-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
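Recurrence quantification analysis of the kind bundled in pyunicorn starts from a thresholded distance matrix of the time series. Rather than assuming pyunicorn's exact API, the sketch below computes two standard recurrence measures, recurrence rate and determinism, directly with NumPy; the threshold and minimum line length are illustrative, and the main diagonal is kept for simplicity.

```python
import numpy as np

def recurrence_measures(x, threshold=0.2, min_line=2):
    """Recurrence rate and determinism of a scalar time series.

    A recurrence matrix R[i, j] marks pairs of states closer than `threshold`;
    determinism is the fraction of recurrent points lying on diagonal lines
    of length >= min_line (the main diagonal is included here for simplicity).
    """
    x = np.asarray(x, dtype=float)
    R = (np.abs(x[:, None] - x[None, :]) < threshold).astype(int)
    n = len(x)
    recurrence_rate = R.sum() / n**2

    diag_points_on_lines = 0
    for k in range(-(n - 1), n):
        diag = np.diagonal(R, offset=k)
        run = 0  # count runs of consecutive recurrent points on this diagonal
        for v in diag:
            if v:
                run += 1
            else:
                if run >= min_line:
                    diag_points_on_lines += run
                run = 0
        if run >= min_line:
            diag_points_on_lines += run
    determinism = diag_points_on_lines / max(R.sum(), 1)
    return recurrence_rate, determinism

t = np.linspace(0, 8 * np.pi, 400)
print(recurrence_measures(np.sin(t)))  # periodic signal: high determinism
```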
Mueller, Jenna L.; Harmany, Zachary T.; Mito, Jeffrey K.; Kennedy, Stephanie A.; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G.; Willett, Rebecca M.; Brown, J. Quincy; Ramanujam, Nimmi
2013-01-01
Purpose: To develop a robust tool for quantitative in situ pathology that allows visualization of heterogeneous tissue morphology and segmentation and quantification of image features. Materials and Methods: Tissue excised from a genetically engineered mouse model of sarcoma was imaged using a subcellular resolution microendoscope after topical application of a fluorescent anatomical contrast agent: acriflavine. An algorithm based on sparse component analysis (SCA) and the circle transform (CT) was developed for image segmentation and quantification of distinct tissue types. The accuracy of our approach was quantified through simulations of tumor and muscle images. Specifically, tumor, muscle, and tumor+muscle tissue images were simulated because these tissue types were most commonly observed in sarcoma margins. Simulations were based on tissue characteristics observed in pathology slides. The potential clinical utility of our approach was evaluated by imaging excised margins and the tumor bed in a cohort of mice after surgical resection of sarcoma. Results: Simulation experiments revealed that SCA+CT achieved the lowest errors for larger nuclear sizes and for higher contrast ratios (nuclei intensity/background intensity). For imaging of tumor margins, SCA+CT effectively isolated nuclei from tumor, muscle, adipose, and tumor+muscle tissue types. Differences in density were correctly identified with SCA+CT in a cohort of ex vivo and in vivo images, thus illustrating the diagnostic potential of our approach. Conclusion: The combination of a subcellular-resolution microendoscope, acriflavine staining, and SCA+CT can be used to accurately isolate nuclei and quantify their density in anatomical images of heterogeneous tissue. PMID:23824589
Mueller, Jenna L; Harmany, Zachary T; Mito, Jeffrey K; Kennedy, Stephanie A; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G; Willett, Rebecca M; Brown, J Quincy; Ramanujam, Nimmi
2013-01-01
To develop a robust tool for quantitative in situ pathology that allows visualization of heterogeneous tissue morphology and segmentation and quantification of image features. Tissue excised from a genetically engineered mouse model of sarcoma was imaged using a subcellular resolution microendoscope after topical application of a fluorescent anatomical contrast agent: acriflavine. An algorithm based on sparse component analysis (SCA) and the circle transform (CT) was developed for image segmentation and quantification of distinct tissue types. The accuracy of our approach was quantified through simulations of tumor and muscle images. Specifically, tumor, muscle, and tumor+muscle tissue images were simulated because these tissue types were most commonly observed in sarcoma margins. Simulations were based on tissue characteristics observed in pathology slides. The potential clinical utility of our approach was evaluated by imaging excised margins and the tumor bed in a cohort of mice after surgical resection of sarcoma. Simulation experiments revealed that SCA+CT achieved the lowest errors for larger nuclear sizes and for higher contrast ratios (nuclei intensity/background intensity). For imaging of tumor margins, SCA+CT effectively isolated nuclei from tumor, muscle, adipose, and tumor+muscle tissue types. Differences in density were correctly identified with SCA+CT in a cohort of ex vivo and in vivo images, thus illustrating the diagnostic potential of our approach. The combination of a subcellular-resolution microendoscope, acriflavine staining, and SCA+CT can be used to accurately isolate nuclei and quantify their density in anatomical images of heterogeneous tissue.
Caprioara-Buda, M; Meyer, W; Jeynov, B; Corbisier, P; Trapmann, S; Emons, H
2012-07-01
The reliable quantification of genetically modified organisms (GMOs) by real-time PCR requires, besides thoroughly validated quantitative detection methods, sustainable calibration systems. The latter establishes the anchor points for the measured value and the measurement unit, respectively. In this paper, the suitability of two types of DNA calibrants, i.e. plasmid DNA and genomic DNA extracted from plant leaves, for the certification of the GMO content in reference materials as copy number ratio between two targeted DNA sequences was investigated. The PCR efficiencies and coefficients of determination of the calibration curves as well as the measured copy number ratios for three powder certified reference materials (CRMs), namely ERM-BF415e (NK603 maize), ERM-BF425c (356043 soya), and ERM-BF427c (98140 maize), originally certified for their mass fraction of GMO, were compared for both types of calibrants. In all three systems investigated, the PCR efficiencies of plasmid DNA were slightly closer to the PCR efficiencies observed for the genomic DNA extracted from seed powders rather than those of the genomic DNA extracted from leaves. Although the mean DNA copy number ratios for each CRM overlapped within their uncertainties, the DNA copy number ratios were significantly different using the two types of calibrants. Based on these observations, both plasmid and leaf genomic DNA calibrants would be technically suitable as anchor points for the calibration of the real-time PCR methods applied in this study. However, the most suitable approach to establish a sustainable traceability chain is to fix a reference system based on plasmid DNA.
NASA Astrophysics Data System (ADS)
Hervik, S.; Málek, T.; Pravda, V.; Pravdová, A.
2015-12-01
We study type II universal metrics of the Lorentzian signature. These metrics simultaneously solve vacuum field equations of all theories of gravitation with the Lagrangian being a polynomial curvature invariant constructed from the metric, the Riemann tensor and its covariant derivatives of an arbitrary order. We provide examples of type II universal metrics for all composite number dimensions. On the other hand, we have no examples for prime number dimensions and we prove the non-existence of type II universal spacetimes in five dimensions. We also present type II vacuum solutions of selected classes of gravitational theories, such as Lovelock, quadratic and L(Riemann) gravities.
Hamiltonian structure of Dubrovin's equation of associativity in 2-d topological field theory
NASA Astrophysics Data System (ADS)
Galvão, C. A. P.; Nutku, Y.
1996-12-01
A third order Monge-Ampère type equation of associativity that Dubrovin has obtained in 2-d topological field theory is formulated in terms of a variational principle subject to second class constraints. Using Dirac's theory of constraints this degenerate Lagrangian system is cast into Hamiltonian form and the Hamiltonian operator is obtained from the Dirac bracket. There is a new type of Kac-Moody algebra that corresponds to this Hamiltonian operator. In particular, it is not a W-algebra.
A Thermodynamic Theory of Solid Viscoelasticity. Part II:; Nonlinear Thermo-viscoelasticity
NASA Technical Reports Server (NTRS)
Freed, Alan D.; Leonov, Arkady I.; Gray, Hugh R. (Technical Monitor)
2002-01-01
This paper, second in the series of three papers, develops a general, nonlinear, non-isothermal, compressible theory for finite rubber viscoelasticity and specifies it in a form convenient for solving problems important to the rubber, tire, automobile, and air-space industries, among others. Based on the quasi-linear approach of non-equilibrium thermodynamics, a general nonlinear theory of differential type has been developed for arbitrary non-isothermal deformations of viscoelastic solids. In this theory, the constitutive equations were presented as the sum of a rubber elastic (equilibrium) and a liquid type viscoelastic (non-equilibrium) terms. These equations have then been simplified using several modeling and simplicity arguments.
Yan, Xiaowen; Yang, Limin; Wang, Qiuquan
2013-07-01
Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
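The element-tagging logic described above reduces, at its core, to fixed stoichiometry: the molar protein concentration equals the measured molar concentration of the monitored element divided by the number of atoms of that element per protein molecule. Below is a minimal worked sketch with hypothetical numbers (sulfur monitoring, with 17 S atoms per molecule assumed for illustration).

```python
def protein_concentration_nM(element_ng_per_ml, element_molar_mass_g_per_mol,
                             atoms_per_protein):
    """Protein concentration (nM) from an ICPMS element measurement,
    assuming a known, fixed element-to-protein stoichiometry."""
    element_nM = element_ng_per_ml / element_molar_mass_g_per_mol * 1e3  # ng/mL -> nmol/L
    return element_nM / atoms_per_protein

# Hypothetical: 5.5 ng/mL sulfur measured, 17 S atoms per protein molecule.
print(round(protein_concentration_nM(5.5, 32.06, 17), 1))  # ~10.1 nM
```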
Career Development Theory and Its Application. Career Knowledge Series
ERIC Educational Resources Information Center
National Career Development Association, 2015
2015-01-01
Covers career development theory, models, and techniques and how to apply them; understand the steps in the career development process and why career choice and development theory is important as well as limitations. Presents the assumptions that underlie four different types of theories; trait and factor, learning, developmental, and transition…
Integrating Multiple Intelligences in EFL/ESL Classrooms
ERIC Educational Resources Information Center
Bas, Gokhan
2008-01-01
This article deals with the integration of the theory of Multiple Intelligences into EFL/ESL classrooms. In this study, after the theory of multiple intelligences is briefly presented, its integration into English classrooms is discussed. Intelligence types in MI theory are discussed, along with some possible ways of applying these intelligence types…
Lignin-Derived Thioacidolysis Dimers: Reevaluation, New Products, Authentication, and Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yue, Fengxia; Lu, Fachuang; Regner, Matt
2017-01-26
Lignin structural studies play an essential role both in understanding the development of plant cell walls and for valorizing lignocellulosics as renewable biomaterials. Dimeric products released by selectively cleaving β–aryl ether linkages between lignin units reflect the distribution of recalcitrant lignin units, but have been neither absolutely defined nor quantitatively determined. In this work, 12 guaiacyl-type thioacidolysis dimers were identified and quantified using newly synthesized standards. One product previously attributed to deriving from β–1-coupled units was established as resulting from β–5 units, correcting an analytical quandary. Another longstanding dilemma, that no β–β dimers were recognized in thioacidolysis products from gymnosperms, was resolved with the discovery of two such authenticated compounds. Finally, individual GC response factors for each standard compound allowed rigorous quantification of dimeric products released from softwood lignins, affording insight into the various interunit-linkage distributions in lignins and thereby guiding the valorization of lignocellulosics.
NASA Astrophysics Data System (ADS)
Rivaton, A.; Cambon, S.; Gardette, J.-L.
2005-01-01
This paper is devoted to the identification and quantification of the main chemical changes resulting from the radiochemical ageing under oxygen atmosphere of ethylene-propylene-diene monomer (EPDM) and ethylene-propylene rubber (EPR) films containing the same molar ratio of ethylene/propylene. IR and UV-Vis analysis showed that radiooxidation produces a complex mixture of different products and provokes the consumption of the diene double bond. The radiochemical yields of formation of ketones, carboxylic acids, hydroperoxides and alcohols were determined by combining IR analysis with derivatisation reactions and chemical titration. For both of these types of -OH groups, the contributions of secondary and tertiary structures were separated. Esters and γ-lactones were formed in low concentration. The distribution of oxidation products in irradiated films was determined by micro-FTIR spectroscopy. Crosslinking was evaluated by gel fraction methods. In complement, the gas phase composition was analysed by mass spectrometry.
Umezawa, Keitaro; Yoshida, Masafumi; Kamiya, Mako; Yamasoba, Tatsuya; Urano, Yasuteru
2017-03-01
Alterations in glutathione (GSH) homeostasis are associated with a variety of diseases and cellular functions, and therefore, real-time live-cell imaging and quantification of GSH dynamics are important for understanding pathophysiological processes. However, existing fluorescent probes are unsuitable for these purposes due to their irreversible fluorogenic mechanisms or slow reaction rates. In this work, we have successfully overcome these problems by establishing a design strategy inspired by Mayr's work on nucleophilic reaction kinetics. The synthesized probes exhibit concentration-dependent, reversible and rapid absorption/fluorescence changes (t1/2 = 620 ms at [GSH] = 1 mM), as well as appropriate Kd values (1-10 mM: within the range of intracellular GSH concentrations). We also developed FRET-based ratiometric probes, and demonstrated that they are useful for quantifying GSH concentration in various cell types and also for real-time live-cell imaging of GSH dynamics with temporal resolution of seconds.
NASA Astrophysics Data System (ADS)
Umezawa, Keitaro; Yoshida, Masafumi; Kamiya, Mako; Yamasoba, Tatsuya; Urano, Yasuteru
2017-03-01
Alterations in glutathione (GSH) homeostasis are associated with a variety of diseases and cellular functions, and therefore, real-time live-cell imaging and quantification of GSH dynamics are important for understanding pathophysiological processes. However, existing fluorescent probes are unsuitable for these purposes due to their irreversible fluorogenic mechanisms or slow reaction rates. In this work, we have successfully overcome these problems by establishing a design strategy inspired by Mayr's work on nucleophilic reaction kinetics. The synthesized probes exhibit concentration-dependent, reversible and rapid absorption/fluorescence changes (t1/2 = 620 ms at [GSH] = 1 mM), as well as appropriate Kd values (1-10 mM: within the range of intracellular GSH concentrations). We also developed FRET-based ratiometric probes, and demonstrated that they are useful for quantifying GSH concentration in various cell types and also for real-time live-cell imaging of GSH dynamics with temporal resolution of seconds.
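For reversible ratiometric probes of this kind, the GSH concentration is commonly recovered from the measured emission ratio using the probe's Kd and the ratios of the free and fully bound forms. Below is a minimal sketch of that standard 1:1 binding relation, omitting any channel-scaling correction; the calibration values are hypothetical and not those of the probes reported here.

```python
def gsh_concentration_mM(ratio, ratio_free, ratio_bound, kd_mM):
    """Estimate [GSH] from a FRET ratio for a reversible 1:1 probe.

    ratio_free  : ratio with no GSH bound
    ratio_bound : ratio at saturating GSH
    kd_mM       : probe dissociation constant in mM
    """
    bound_fraction = (ratio - ratio_free) / (ratio_bound - ratio_free)
    return kd_mM * bound_fraction / (1.0 - bound_fraction)

# Hypothetical calibration: Kd = 3 mM, measured ratio halfway to saturation.
print(gsh_concentration_mM(ratio=1.5, ratio_free=1.0, ratio_bound=2.0, kd_mM=3.0))  # 3.0 mM
```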
Current trends in quantitative proteomics - an update.
Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H
2017-05-01
Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.
Yue, Fengxia; Lu, Fachuang; Regner, Matt; Sun, Runcang; Ralph, John
2017-03-09
Lignin structural studies play an essential role both in understanding the development of plant cell walls and for valorizing lignocellulosics as renewable biomaterials. Dimeric products released by selectively cleaving β-aryl ether linkages between lignin units reflect the distribution of recalcitrant lignin units, but have been neither absolutely defined nor quantitatively determined. Here, 12 guaiacyl-type thioacidolysis dimers were identified and quantified using newly synthesized standards. One product previously attributed to deriving from β-1-coupled units was established as resulting from β-5 units, correcting an analytical quandary. Another longstanding dilemma, that no β-β dimers were recognized in thioacidolysis products from gymnosperms, was resolved with the discovery of two such authenticated compounds. Individual GC response factors for each standard compound allowed rigorous quantification of dimeric products released from softwood lignins, affording insight into the various interunit-linkage distributions in lignins and thereby guiding the valorization of lignocellulosics. © 2015 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
NASA Astrophysics Data System (ADS)
Rabiei, Masoud; Sheldon, Jeremy; Palmer, Carl
2012-04-01
The applicability of the Electro-Mechanical Impedance (EMI) approach to damage detection, localization and quantification in a mobile bridge structure is investigated in this paper. The developments focus on assessing the health of Armored Vehicle Launched Bridges (AVLBs). Specifically, two key AVLB failure mechanisms to be monitored were identified: fatigue crack growth and damaged (loose) rivets or bolts. It was shown through experiment that bolt damage (defined here as different torque levels applied to bolts) can be detected, quantified and located using a network of lead zirconate titanate (PZT) transducers distributed on the structure. It was also shown that cracks of various sizes can be detected and quantified using the EMI approach. The experiments were performed on smaller laboratory specimens as well as on full-size, bridge-like components that were built as part of this research. The effects of various parameters, such as transducer type and size, on the performance of the proposed health assessment approach were also investigated.
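The abstract does not state which damage metric was used; a root-mean-square-deviation (RMSD) index between baseline and current impedance signatures is a common choice in EMI-based structural health monitoring, so the sketch below is a generic illustration rather than the authors' implementation, and the frequency band and signatures are synthetic.

```python
import numpy as np

def rmsd_damage_index(z_baseline, z_current):
    """RMSD between the real parts of two impedance signatures measured on the
    same PZT transducer over the same frequency sweep; larger values suggest
    more change (e.g., crack growth or bolt loosening) near that transducer."""
    zb = np.real(np.asarray(z_baseline))
    zc = np.real(np.asarray(z_current))
    return np.sqrt(np.sum((zc - zb) ** 2) / np.sum(zb ** 2))

# Hypothetical signatures: a healthy baseline and a slightly shifted "damaged" sweep.
freq = np.linspace(30e3, 90e3, 400)              # Hz, illustrative band
baseline = 100 + 10 * np.sin(freq / 4e3)
damaged = 100 + 10 * np.sin(freq / 4e3 + 0.1) + 1.5
print(f"RMSD index: {rmsd_damage_index(baseline, damaged):.3f}")
```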
Magnetic fingerprints of rolling cells for quantitative flow cytometry in whole blood
NASA Astrophysics Data System (ADS)
Reisbeck, Mathias; Helou, Michael Johannes; Richter, Lukas; Kappes, Barbara; Friedrich, Oliver; Hayden, Oliver
2016-09-01
Over the past 50 years, flow cytometry has had a profound impact on preclinical and clinical applications requiring single cell function information for counting, sub-typing and quantification of epitope expression. At the same time, the workflow complexity and high costs of such optical systems still limit flow cytometry applications to specialized laboratories. Here, we present a quantitative magnetic flow cytometer that incorporates in situ magnetophoretic cell focusing for highly accurate and reproducible rolling of the cellular targets over giant magnetoresistance sensing elements. Time-of-flight analysis is used to unveil quantitative single cell information contained in its magnetic fingerprint. Furthermore, we used erythrocytes as a biological model to validate our methodology with respect to precise analysis of the hydrodynamic cell diameter, quantification of binding capacity of immunomagnetic labels, and discrimination of cell morphology. The extracted time-of-flight information should enable point-of-care quantitative flow cytometry in whole blood for clinical applications, such as immunology and primary hemostasis.
Schwaighofer, Andreas; Kuligowski, Julia; Quintás, Guillermo; Mayer, Helmut K; Lendl, Bernhard
2018-06-30
Analysis of proteins in bovine milk is usually tackled by time-consuming analytical approaches involving wet-chemical, multi-step sample clean-up procedures. The use of external cavity-quantum cascade laser (EC-QCL) based IR spectroscopy was evaluated as an alternative screening tool for direct and simultaneous quantification of individual proteins (i.e. casein and β-lactoglobulin) and total protein content in commercial bovine milk samples. Mid-IR spectra of protein standard mixtures were used for building partial least squares (PLS) regression models. A sample set comprising different milk types (pasteurized; differently processed extended shelf life, ESL; ultra-high temperature, UHT) was analysed and results were compared to reference methods. Concentration values of the QCL-IR spectroscopy approach obtained within several minutes are in good agreement with reference methods involving multiple sample preparation steps. The potential application as a fast screening method for estimating the heat load applied to liquid milk is demonstrated. Copyright © 2018 Elsevier Ltd. All rights reserved.
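A minimal sketch of the partial least squares regression step described in the abstract, using scikit-learn; the spectra, casein concentrations, and the number of latent variables are synthetic placeholders rather than the study's calibration data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Synthetic stand-in for mid-IR spectra of protein standard mixtures:
# 40 calibration samples x 300 wavenumber points; casein concentration in g/L.
casein = rng.uniform(20, 35, size=40)
spectra = np.outer(casein, np.linspace(0.5, 1.5, 300)) + rng.normal(0, 0.5, (40, 300))

pls = PLSRegression(n_components=5)          # number of latent variables is illustrative
predicted = cross_val_predict(pls, spectra, casein, cv=5).ravel()
rmsecv = np.sqrt(np.mean((predicted - casein) ** 2))
print(f"RMSECV for casein: {rmsecv:.2f} g/L")
```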
Automating Access Control Logics in Simple Type Theory with LEO-II
NASA Astrophysics Data System (ADS)
Benzmüller, Christoph
Garg and Abadi recently proved that prominent access control logics can be translated in a sound and complete way into modal logic S4. We have previously outlined how normal multimodal logics, including monomodal logics K and S4, can be embedded in simple type theory, and we have demonstrated that the higher-order theorem prover LEO-II can automate reasoning in and about them. In this paper we combine these results and describe a sound (and complete) embedding of different access control logics in simple type theory. Employing this framework we show that the off-the-shelf theorem prover LEO-II can be applied to automate reasoning in and about prominent access control logics.
On the Variability of Wilson Currents by Storm Type and Phase
NASA Technical Reports Server (NTRS)
Deierling, Wiebke; Kalb, Christina; Mach, Douglas; Liu, Chuntao; Peterson, Michael; Blakeslee, Richard
2014-01-01
Storm total conduction currents from electrified clouds are thought to play a major role in maintaining the potential difference between the earth's surface and the upper atmosphere within the Global Electric Circuit (GEC). However, it is not entirely known how the contributions of these currents vary by cloud type and by phase of the cloud's life cycle. Estimates of storm total conduction currents were obtained from data collected over two decades during multiple field campaigns involving the NASA ER-2 aircraft. In this study, the variability of these currents by cloud type and life cycle is investigated. We also compared radar-derived microphysical storm properties with total storm currents to investigate whether these storm properties can be used to describe the current variability of different electrified clouds. The ultimate goal is to help improve modeling of the GEC via quantification and improved parameterization of the conduction-current contribution of different cloud types.
Small bending and stretching of sandwich-type shells
NASA Technical Reports Server (NTRS)
Reissner, Eric
1950-01-01
A theory has been developed for small bending and stretching of sandwich-type shells. This theory is an extension of the known theory of homogeneous thin elastic shells. It was found that two effects are important in the present problem, which are not normally of importance in the theory of curved shells: (1) the effect of transverse shear deformation and (2) the effect of transverse normal stress deformation. The first of these two effects has been known to be of importance in the theory of plates and beams. The second effect was found to occur in a manner which is typical for shells and has no counterpart in flat-plate theory. The general results of this report have been applied to the solution of problems concerning flat plates, circular rings, circular cylindrical shells, and spherical shells. In each case numerical examples have been given, illustrating the magnitude of the effects of transverse shear and normal stress deformation.
NASA Astrophysics Data System (ADS)
Bayramov, Emil; Mammadov, Ramiz
2016-07-01
The main goals of this research are the object-based land-cover classification of LANDSAT-8 multi-spectral satellite images in 2014 and 2015, quantification of Normalized Difference Vegetation Index (NDVI) rates within the land-cover classes, change detection analysis between the NDVIs derived from the multi-temporal LANDSAT-8 satellite images, quantification of those changes within the land-cover classes, and detection of changes between land-cover classes. The object-based classification accuracy of the land-cover classes was validated through the standard confusion matrix, which revealed 80% land-cover classification accuracy for both years. The analysis revealed that the area of agricultural lands increased from 30911 sq. km. in 2014 to 31999 sq. km. in 2015. The area of barelands increased from 3933 sq. km. in 2014 to 4187 sq. km. in 2015. The area of forests increased from 8211 sq. km. in 2014 to 9175 sq. km. in 2015. The area of grasslands decreased from 27176 sq. km. in 2014 to 23294 sq. km. in 2015. The area of urban areas increased from 12479 sq. km. in 2014 to 12956 sq. km. in 2015. The decrease in the area of grasslands was mainly explained by land-use shifts of grasslands to agricultural and urban lands. The quantification of low and medium NDVI rates revealed an increase within the agricultural, urban and forest land-cover classes in 2015. However, the high NDVI rates within the agricultural, urban and forest land-cover classes were lower in 2015 relative to 2014. The change detection analysis between the land-cover types of 2014 and 2015 determined that 7740 sq. km. of grasslands shifted to the agricultural land-cover type, whereas 5442 sq. km. of agricultural lands shifted to rangelands. This reflects spatio-temporal shifts in agricultural activity in Azerbaijan: some areas reduced agricultural activity, whereas others changed their land use to agriculture. Based on the achieved results, it is possible to conclude that the area of agricultural lands in Azerbaijan increased from 2014 to 2015. Crop productivity also increased in the croplands; however, some areas showed lower productivity in 2015 relative to 2014.
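A hedged sketch of the NDVI and change-detection arithmetic the abstract relies on, using the standard LANDSAT-8 band assignment (band 4 = red, band 5 = near infrared); the reflectance arrays and the class mask below are synthetic stand-ins for the classified scenes.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Synthetic reflectance tiles for 2014 and 2015 (values in [0, 1]).
rng = np.random.default_rng(1)
red14, nir14 = rng.uniform(0.05, 0.2, (100, 100)), rng.uniform(0.2, 0.6, (100, 100))
red15, nir15 = rng.uniform(0.05, 0.2, (100, 100)), rng.uniform(0.2, 0.6, (100, 100))

ndvi_change = ndvi(nir15, red15) - ndvi(nir14, red14)

# Per-class summary, e.g. mean NDVI change inside a (synthetic) agriculture mask.
agriculture_mask = rng.random((100, 100)) > 0.5
print(f"mean NDVI change over 'agriculture' pixels: {ndvi_change[agriculture_mask].mean():+.3f}")
```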
Spatial estimation from remotely sensed data via empirical Bayes models
NASA Technical Reports Server (NTRS)
Hill, J. R.; Hinkley, D. V.; Kostal, H.; Morris, C. N.
1984-01-01
Multichannel satellite image data, available as LANDSAT imagery, are recorded as a multivariate time series (four channels, multiple passovers) in two spatial dimensions. The application of parametric empirical Bayes theory to classification of, and estimating the probability of, each crop type at each of a large number of pixels is considered. This theory involves both the probability distribution of imagery data, conditional on crop types, and the prior spatial distribution of crop types. For the latter Markov models indexed by estimable parameters are used. A broad outline of the general theory reveals several questions for further research. Some detailed results are given for the special case of two crop types when only a line transect is analyzed. Finally, the estimation of an underlying continuous process on the lattice is discussed which would be applicable to such quantities as crop yield.
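A much-simplified sketch of the Bayes classification step the abstract describes: combine a class-conditional likelihood of the multichannel pixel value with a prior over crop types to obtain posterior class probabilities. The Gaussian likelihoods and the flat, non-spatial prior below are illustrative stand-ins for the paper's parametric empirical Bayes machinery and Markov spatial prior, which are not reproduced here; all means, covariances, and priors are invented.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical per-crop means and covariances for a 4-channel pixel (e.g., LANDSAT channels).
crops = {
    "wheat":  (np.array([40., 35., 60., 90.]),  np.eye(4) * 25.0),
    "corn":   (np.array([35., 30., 70., 110.]), np.eye(4) * 30.0),
    "fallow": (np.array([55., 50., 45., 50.]),  np.eye(4) * 20.0),
}
prior = {"wheat": 0.4, "corn": 0.4, "fallow": 0.2}   # illustrative, non-spatial prior

def posterior(pixel):
    """Posterior crop probabilities for one pixel: prior x Gaussian likelihood, normalized."""
    unnorm = {c: prior[c] * multivariate_normal.pdf(pixel, mean=m, cov=S)
              for c, (m, S) in crops.items()}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}

print(posterior(np.array([42., 36., 62., 95.])))
```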
Foundations to the unified psycho-cognitive engine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, Michael Lewis; Bier, Asmeret Brooke; Backus, George A.
This document outlines the key features of the SNL psychological engine. The engine is designed to be a generic presentation of cognitive entities interacting among themselves and with the external world. The engine combines the most accepted theories of behavioral psychology with those of behavioral economics to produce a unified simulation of human response from stimuli through executed behavior. The engine explicitly recognizes emotive and reasoned contributions to behavior and simulates the dynamics associated with cue processing, learning, and choice selection. Most importantly, the model parameterization can come from available media or survey information, as well as subject-matter-expert information. The framework design allows the use of uncertainty quantification and sensitivity analysis to manage confidence in using the analysis results for intervention decisions.
Method and Apparatus for the Quantification of Particulate Adhesion Forces on Various Substrates
NASA Technical Reports Server (NTRS)
Wohl, Christopher J.; Atkins, Brad M.; Connell, John W.
2011-01-01
Mitigation strategies for lunar dust adhesion have typically been limited to qualitative analysis. This technical memorandum describes the generation and operation of an adhesion testing device capable of quantitative assessment of adhesion forces between particulates and substrates. An aerosolization technique is described to coat a surface with a monolayer of particulates. Agitation of this surface, via sonication, causes particles to dislodge and be gravitationally fed into an optical particle counter. Experimentally determined adhesion force values are compared to forces calculated from van der Waals interactions and are used to calculate the work of adhesion using Johnson-Kendall-Roberts (JKR) theory. Preliminary results indicate that a reduction in surface energy and available surface area, through topographical modification, improve mitigation of particulate adhesion.
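A minimal sketch of the JKR relation implied by the abstract: for a sphere of radius R on a flat substrate, the JKR pull-off force is F_c = (3/2)·π·R·W, so the work of adhesion can be recovered as W = 2·F_c/(3·π·R). The particle radius and measured detachment force below are invented for illustration; they are not values from the memorandum.

```python
import math

def work_of_adhesion_jkr(pull_off_force_N, particle_radius_m):
    """Invert the JKR pull-off relation F_c = (3/2) * pi * R * W
    to obtain the work of adhesion W in J/m^2."""
    return 2.0 * pull_off_force_N / (3.0 * math.pi * particle_radius_m)

# Hypothetical values: a 10-micrometre-radius particle detaching at 50 nN.
W = work_of_adhesion_jkr(pull_off_force_N=50e-9, particle_radius_m=10e-6)
print(f"work of adhesion: {W * 1e3:.2f} mJ/m^2")
```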
Gaussian geometric discord in terms of Hellinger distance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suciu, Serban, E-mail: serban.suciu@theory.nipne.ro; Isar, Aurelian
2015-12-07
In the framework of the theory of open systems based on completely positive quantum dynamical semigroups, we address the quantification of general non-classical correlations in Gaussian states of continuous variable systems from a geometric perspective. We give a description of the Gaussian geometric discord by using the Hellinger distance as a measure for quantum correlations between two non-interacting non-resonant bosonic modes embedded in a thermal environment. We evaluate the Gaussian geometric discord by taking two-mode squeezed thermal states as initial states of the system and show that it has finite values between 0 and 1 and that it decays asymptotically to zero in time under the effect of the thermal bath.
Entropy changes in brain function.
Rosso, Osvaldo A
2007-04-01
The traditional way of analyzing brain electrical activity, on the basis of electroencephalography (EEG) records, relies mainly on visual inspection and years of training. Although it is quite useful, of course, one has to acknowledge its subjective nature that hardly allows for a systematic protocol. In the present work quantifiers based on information theory and wavelet transform are reviewed. The "relative wavelet energy" provides information about the relative energy associated with different frequency bands present in the EEG and their corresponding degree of importance. The "normalized total wavelet entropy" carries information about the degree of order-disorder associated with a multi-frequency signal response. Their application in the analysis and quantification of short duration EEG signals (event-related potentials) and epileptic EEG records are summarized.
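A hedged sketch of the two quantifiers named in the abstract: the relative wavelet energy p_j = E_j/E_tot (E_j summed over the squared coefficients of resolution level j) and the normalized total wavelet entropy S = -Σ p_j ln p_j / ln N. It uses PyWavelets on a synthetic signal; the wavelet, decomposition depth, and signal are placeholders, and the approximation band is included in the energy distribution here, which may differ from the paper's exact convention.

```python
import numpy as np
import pywt

def wavelet_entropy(signal, wavelet="db4", level=5):
    """Return (relative wavelet energies, normalized total wavelet entropy)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)   # [cA_L, cD_L, ..., cD_1]
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()                         # relative wavelet energy per band
    entropy = -np.sum(p * np.log(p)) / np.log(len(p))     # normalized to [0, 1]
    return p, entropy

# Synthetic EEG-like trace: two oscillations plus noise.
t = np.linspace(0, 2, 1024)
x = (np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
     + 0.2 * np.random.default_rng(2).normal(size=t.size))
p, S = wavelet_entropy(x)
print("relative energies:", np.round(p, 3), " normalized entropy:", round(S, 3))
```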
NASA Astrophysics Data System (ADS)
Balasuriya, Sanjeeva
2016-12-01
State-dependent time-impulsive perturbations to a two-dimensional autonomous flow with stable and unstable manifolds are analysed by posing the problem in terms of an integral equation which is valid in both forwards- and backwards-time. The impulses destroy the smooth invariant manifolds, necessitating new definitions for stable and unstable pseudo-manifolds. Their time-evolution is characterised by solving a Volterra integral equation of the second kind with discontinuous inhomogeneity. A criterion for heteroclinic trajectory persistence in this impulsive context is developed, as is a quantification of an instantaneous flux across broken heteroclinic manifolds. Several examples, including a kicked Duffing oscillator and an underwater explosion in the vicinity of an eddy, are used to illustrate the theory.
2017-02-02
Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron microscopy (EM) provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods [...] a consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy - Virus Quantification (STEM-VQ), which simplifies [...]
Examining Achievement Goals and Causal Attributions Together as Predictors of Academic Functioning
ERIC Educational Resources Information Center
Wolters, Christopher A.; Fan, Weihua; Daugherty, Stacy G.
2013-01-01
This study was designed to forge stronger theoretical and empirical links between achievement goal theory and attribution theory. High school students ("N" = 224) completed a self-report survey that assessed 3 types of achievement goals, 7 types of attributions, and self-efficacy. Results indicated that students' adoption of achievement…
Cognitive Load Theory: How Many Types of Load Does It Really Need?
ERIC Educational Resources Information Center
Kalyuga, Slava
2011-01-01
Cognitive load theory has been traditionally described as involving three separate and additive types of load. Germane load is considered as a learning-relevant load complementing extraneous and intrinsic load. This article argues that, in its traditional treatment, germane load is essentially indistinguishable from intrinsic load, and therefore…
[Consanguinity between meridian theory and Bianque's pulse theory].
Huang, Longxiang
2015-05-01
The complete meridian theory is composed of five parts, including meridian course, syndrome, diagnostic method, treating principle and treatment, with meridian syndrome at its core. Multiple lines of evidence show that the meridian syndrome induced by pathological change in a meridian, and the death syndrome of the pulse penetrating or attaching to it, both originate from Bianque's facial-color and pulse diagnosis. Regarding the pulse syndrome, there were many different interpretations based on the theory of yin-yang in the four seasons before the Han Dynasty. The emergence of the Biaoben diagnostic method in Bianque's pulse method and its extensive clinical application promoted a new theoretical interpretation in which the connection of meridians interprets the pulse syndrome directly. Moreover, along with the new development of the blood-pulse theory of Bianque's medicine, meridian theory itself was transformed, its theoretical paradigm turning from a "tree" type to a "ring" type. In other words, Bianque's medicine not only gave birth to meridian theory but also determined its final development.
Superconformal indices of generalized Argyres-Douglas theories from 2d TQFT
Song, Jaewon
2016-02-05
We present superconformal indices of 4d N = 2 class S theories with certain irregular punctures called type I_{k,N}. This class of theories includes generalized Argyres-Douglas theories of type (A_{k-1}, A_{N-1}) and more. We conjecture the superconformal indices in certain simplified limits based on the TQFT structure of the class S theories by writing an expression for the wave function corresponding to the puncture I_{k,N}. We write the Schur limit of the wave function when k and N are coprime. When k = 2, we also conjecture a closed-form expression for the Hall-Littlewood index and the Macdonald index for odd N. From the index, we argue that a certain short multiplet which can appear in the OPE of the stress-energy tensor is absent in the (A_1, A_{2n}) theory. In addition, we discuss the mixed Schur indices for the N = 1 class S theories with irregular punctures.
[The medical theory of Lee Je-ma and its character].
Lee, Kyung-Lock
2005-12-01
Lee Je-ma (1837-1900) was a prominent scholar as well as a Korean physician. He classified people into four distinctive types: greater yang [tai yang] person, lesser yang [shao yang] person, greater yin [tai yin] person, and lesser yin [shao yin] person. This theory dictates the proper treatment for each type in accordance with individual differences in physique and temperament. Using these four types he created The Medical Science of Four Types. This article looks into the connection between Lee Je-ma's 'The Medical Science of Four Types' and 'The Modern' by organizing his ideas about the human body and the human being. Through The Modern, the theory of the human being underwent a complete change: the human being of The Premodern, determined by sex, age and social status, was replaced by the individual human being, characterized by equality. Lee Je-ma's medical theory of The Medical Science of Four Types can be analyzed as follows. His concept of the human body is oriented toward observable objectivity; on the other hand, it retains the transcendent status of a medical science subordinated to philosophy. According to Lee Je-ma's theory of the human being, the human is an equal individual in the modern sense, not a part of a hierarchical group; on the other hand, his concept of the human body does not completely shed the moral aspect of virtue and vice. The common factor in Lee Je-ma's ideas about the human body and the human being is a dualism of mind and body, meaning that all kinds of status and outcomes depend on each individual. As stated above, Lee Je-ma's medical theory has many aspects of The Modern, and it shows that Korean traditional medicine could be modernized by itself.
Testing Components of a Self-Management Theory in Adolescents With Type 1 Diabetes Mellitus.
Verchota, Gwen; Sawin, Kathleen J
The role of self-management in adolescents with type 1 diabetes mellitus is not well understood. The purpose of the research was to examine the relationship of key context and process variables from individual and family self-management theory to proximal (self-management behaviors) and distal (hemoglobin A1c and diabetes-specific health-related quality of life) outcomes in adolescents with type 1 diabetes. A correlational, cross-sectional study was conducted to identify factors contributing to outcomes in adolescents with type 1 diabetes and to examine potential relationships between context, process, and outcome variables delineated in individual and family self-management theory. Participants were 103 adolescent-parent dyads (adolescents ages 12-17) with type 1 diabetes from a Midwest outpatient diabetes clinic. The dyads completed a self-report survey including instruments intended to measure context, process, and outcome variables from individual and family self-management theory. Using hierarchical multiple regression, context (depressive symptoms) and process (communication) variables explained 37% of the variance in self-management behaviors. Regimen complexity, the only significant predictor, explained 11% of the variance in hemoglobin A1c; neither process variables nor self-management behaviors were significant. For the diabetes-specific health-related quality of life outcome, context (regimen complexity and depressive symptoms) explained 26% of the variance at step 1; an additional 9% of the variance was explained when process (self-efficacy and communication) variables were added at step 2; and 52% of the variance was explained when self-management behaviors were added at step 3. In the final model, three variables were significant predictors: depressive symptoms, self-efficacy, and self-management behaviors. The individual and family self-management theory can serve as a cogent theory for understanding key concepts, processes, and outcomes essential to self-management in adolescents and families dealing with type 1 diabetes mellitus.
1993-08-01
[Species-table residue; recoverable entries include Juncus pelocarpus, Myriophyllum spicatum, Najas flexilis, Najas marina, Nitella sp., Nuphar luteum, and Nymphaea odorata.] [...] Onondaga Lake sediments: Elodea canadensis (EC), Myriophyllum spicatum (MS), Nymphaea odorata (NO), Potamogeton crispus (PC), P. nodosus (PN) [...]. Nymphaea odorata, S. rigida, and T. latifolia all did poorly on Onondaga Lake sediments and/or water, and are not recommended for transplant efforts.
Non-Invasive NIR Sensor for Quantification of Deep Tissue Oxygenation. Phase 1.
1995-10-01
[...] setting when a suitable human monitor is developed. Several potential investigations are possible depending on final penetration depth and ability to [...]. (Final report, Phase I; prepared for the U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland.)
Program Theory Evaluation: Logic Analysis
ERIC Educational Resources Information Center
Brousselle, Astrid; Champagne, Francois
2011-01-01
Program theory evaluation, which has grown in use over the past 10 years, assesses whether a program is designed in such a way that it can achieve its intended outcomes. This article describes a particular type of program theory evaluation--logic analysis--that allows us to test the plausibility of a program's theory using scientific knowledge.…
Kongmanas, Kessiri; Xu, Hongbin; Yaghoubian, Arman; Franchini, Laura; Panza, Luigi; Ronchetti, Fiamma; Faull, Kym; Tanphaichitr, Nongnuj
2010-12-01
Seminolipid, also known as sulfogalactosylglycerolipid (SGG), plays important roles in male reproduction. Therefore, an accurate and sensitive method for SGG quantification in testes and sperm is needed. Here we compare SGG quantitation by the traditional colorimetric Azure A assay with LC-ESI-MS/MS using multiple reaction monitoring (MRM). Inclusion of deuterated SGG as the internal standard endowed accuracy to the MRM method. The results showed reasonable agreement between the two procedures for purified samples, but for crude lipid extracts, the colorimetric assay significantly overestimated the SGG content. Using ESI-MS/MS MRM, C16:0-alkyl/C16:0-acyl SGG of Cgt(+/⁻) mice was quantified to be 406.06 ± 23.63 μg/g testis and 0.13 ± 0.02 μg/million sperm, corresponding to 78% and 87% of the wild-type values, respectively. CGT (ceramide galactosyltransferase) is a critical enzyme in the SGG biosynthesis pathway. Cgt⁻/⁻ males depleted of SGG are infertile due to spermatogenesis arrest. However, Cgt(+/⁻) males sire offspring. The higher than 50% expression level of SGG in Cgt(+/⁻) animals, compared with the wild-type expression, might be partly due to compensatory translation of the active CGT enzyme. The results also indicated that 78% of SGG levels in Cgt(+/⁻) mice were sufficient for normal spermatogenesis.
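A hedged sketch of the isotope-dilution step the abstract relies on: with a deuterated SGG internal standard spiked at a known amount, the analyte amount follows from the MRM peak-area ratio, optionally corrected by a relative response factor. All peak areas and the spike level below are illustrative, not data from the study.

```python
def sgg_amount_ug(area_sgg, area_d_sgg, spiked_d_sgg_ug, relative_response=1.0):
    """Isotope-dilution quantification: amount = (A_analyte / A_IS) * amount_IS / RRF.
    RRF ~ 1 is often assumed for a co-eluting deuterated analogue."""
    return (area_sgg / area_d_sgg) * spiked_d_sgg_ug / relative_response

# Hypothetical MRM peak areas for one testis extract spiked with 5 ug deuterated SGG.
amount = sgg_amount_ug(area_sgg=3.1e5, area_d_sgg=3.8e5, spiked_d_sgg_ug=5.0)
print(f"SGG in extract: {amount:.2f} ug")
```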
Usability of calcium carbide gas pressure method in hydrological sciences
NASA Astrophysics Data System (ADS)
Arsoy, S.; Ozgur, M.; Keskin, E.; Yilmaz, C.
2013-10-01
Soil moisture is a key engineering variable with major influence on ecological and hydrological processes as well as on climate, weather, agricultural, civil and geotechnical applications. Methods for quantification of soil moisture fall into three main groups: (i) measurement by remote sensing, (ii) estimation via (soil water balance) simulation models, and (iii) measurement in the field (ground-based). Remote sensing and simulation modeling require rapid ground truthing with one of the ground-based methods. The calcium carbide gas pressure (CCGP) method is a rapid measurement procedure for obtaining soil moisture that relies on the chemical reaction of the calcium carbide reagent with the water in soil pores. However, the method is overlooked in hydrological science applications. Therefore, the purpose of this study is to evaluate the usability of the CCGP method in comparison with standard oven-drying and dielectric methods in terms of accuracy, time efficiency, operational ease, cost effectiveness and safety for quantification of soil moisture over a wide range of soil types. The research involved over 250 tests carried out on 15 different soil types. It was found that the accuracy of the method is mostly within a ±1% soil-moisture deviation range relative to oven-drying, and that the CCGP method has significant advantages over dielectric methods in terms of accuracy, cost, operational ease and time efficiency for the purpose of ground truthing.
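The accuracy comparison in the abstract is against gravimetric oven-drying; a minimal sketch of that bookkeeping, the gravimetric water content and the deviation of a CCGP reading from it, is shown below with invented masses and readings. Depending on the gauge, CCGP testers may report moisture on a wet-mass basis and need conversion to the dry-mass basis; that conversion is omitted here.

```python
def gravimetric_moisture_pct(mass_wet_g, mass_dry_g):
    """Oven-drying reference: water content as a percentage of dry soil mass."""
    return 100.0 * (mass_wet_g - mass_dry_g) / mass_dry_g

# Hypothetical paired measurements on one specimen.
oven = gravimetric_moisture_pct(mass_wet_g=152.4, mass_dry_g=131.0)
ccgp_reading_pct = 17.1   # moisture read from the CCGP tester gauge (illustrative)
print(f"oven-drying: {oven:.1f}%  CCGP: {ccgp_reading_pct:.1f}%  "
      f"deviation: {ccgp_reading_pct - oven:+.1f}%")
```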
Mihailov, Rossen; Stoeva, Dilyana; Pencheva, Blagovesta; Pentchev, Eugeni
2018-03-01
In a number of cases the monitoring of patients with type 1 diabetes mellitus requires measurement of exogenous insulin levels. For the purpose of a clinical investigation of the efficacy of a medical device for the application of exogenous insulin aspart, verification of the method for measuring this synthetic analogue of the hormone was needed. The information in the available medical literature on the measurement of the different exogenous insulin analogues is insufficient, so verification was required in compliance with the active standards in the Republic of Bulgaria. A manufacturer's method developed for the ADVIA Centaur XP immunoassay (Siemens Healthcare) was used, which we verified using standard solutions and a patient serum pool spiked with the appropriate quantity of exogenous insulin aspart. The method was verified in accordance with the bioanalytical method verification criteria and the regulatory requirements for using a standard method: the CLIA chemiluminescence immunoassay on the ADVIA Centaur® XP. The following parameters were determined and monitored: intra-day precision and accuracy, inter-day precision and accuracy, limit of detection and lower limit of quantification, linearity, and analytical recovery. The routine application of the method for measurement of immunoreactive insulin on the ADVIA Centaur® XP analyzer is directed at measurement of endogenous insulin. The method is applicable for measuring different types of exogenous insulin, including insulin aspart.
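A hedged sketch of two of the verification figures listed in the abstract, intra-day precision (coefficient of variation) and analytical recovery of spiked insulin aspart, computed from replicate measurements; the replicate values, units, and spike level below are invented for illustration.

```python
import statistics

def cv_percent(replicates):
    """Intra-day precision expressed as coefficient of variation (%)."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

def recovery_percent(measured_spiked, measured_blank, spiked_amount):
    """Analytical recovery: (measured in spiked pool - endogenous background) / amount added."""
    return 100.0 * (measured_spiked - measured_blank) / spiked_amount

# Hypothetical replicate results (mIU/L) for a serum pool spiked with 50 mIU/L insulin aspart.
replicates = [54.1, 52.8, 55.0, 53.6, 54.4]
print(f"CV: {cv_percent(replicates):.1f}%")
print(f"recovery: {recovery_percent(measured_spiked=54.0, measured_blank=6.5, spiked_amount=50.0):.1f}%")
```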
RAHIMIRAD, Amir; MAALEKINEJAD, Hassan; OSTADI, Araz; YEGANEH, Samal; FAHIMI, Samira
2014-01-01
Background Aflatoxin M1 (AFM1), a carcinogenic substance, is found in milk and dairy products. The effect of season and type of dairy product on the AFM1 level in northern Iran was investigated in this study. Methods Three hundred samples (75 samples per season), including raw and pasteurized milk, yoghurt, cheese, and cream, were collected from three distinct milk-producing farms. The samples were subjected to chemical and solid-phase extractions and were analyzed using HPLC. Recovery percentages, limit of detection and limit of quantification values were determined. Results Seventy percent and 98% were the minimum and maximum recoveries, for cheese and raw milk respectively, and 0.021 and 0.063 ppb were the limit of detection and limit of quantification values for AFM1. We found the highest level of AFM1 (0.121 ppb) in autumn and winter cheese and cream samples and failed to detect any AFM1 in spring samples. Interestingly, our data showed that the yoghurt samples had the lowest level of AFM1 in all seasons. Conclusion There are significant differences between the AFM1 levels in dairy products across seasons and product types, suggesting that spring and summer yoghurt samples are the safest products from the standpoint of AFM1 level. PMID:25927044
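The abstract reports limit-of-detection and limit-of-quantification values; one common way to derive them from a calibration curve (ICH-style, LOD = 3.3·σ/S and LOQ = 10·σ/S) is sketched below with invented calibration data. The paper may have used a different convention, and the numbers produced here are not the study's values.

```python
import numpy as np

def lod_loq_from_calibration(concentrations_ppb, responses):
    """LOD = 3.3 * sigma / slope and LOQ = 10 * sigma / slope, where sigma is the
    standard deviation of the calibration residuals and slope comes from a linear fit."""
    slope, intercept = np.polyfit(concentrations_ppb, responses, 1)
    residuals = np.asarray(responses) - (slope * np.asarray(concentrations_ppb) + intercept)
    sigma = residuals.std(ddof=2)          # ddof=2: two fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical AFM1 calibration (HPLC peak area vs. concentration in ppb).
conc = [0.01, 0.02, 0.05, 0.1, 0.2]
area = [12.0, 23.5, 60.2, 118.9, 241.3]
lod, loq = lod_loq_from_calibration(conc, area)
print(f"LOD ~ {lod:.3f} ppb, LOQ ~ {loq:.3f} ppb")
```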
Kang, Keren; Wu, Peidian; Li, Wenmei; Tang, Shixing; Wang, Jihua; Luo, Xiaochun; Xie, Mingquan
2015-01-01
The aim was to develop a rapid, sensitive and specific assay for quantification of serum heart-type fatty acid binding protein (H-FABP) based on immunofluorescence with specific monoclonal antibodies. We generated novel H-FABP-directed monoclonal antibodies by cloning spleen cells of mice immunized with H-FABP. Epitopes were mapped and antigen affinity was assessed by surface plasmon resonance (SPR). The H-FABP-specific monoclonal antibodies were coupled to fluorescent beads and sprayed onto a nitrocellulose membrane, facilitating quantification of H-FABP by immunofluorescence. Reagent cross-reactivity, interference resistance, accuracy and sensitivity were examined. A total of 103 clinical samples were used to compare the sensitivity and specificity of the new assay to a commercially available Randox kit. The new assay could be completed within 15 min, with sensitivity reaching 1 ng/ml. In a trial of 103 clinical serum samples, the new testing kit results were highly correlated with those from the Randox kit (R(2) = 0.9707). Using the Randox kit as the reference, the sensitivity of the new assay was 98.25% and its specificity was 100%. An immunofluorescence-based H-FABP assay employing novel monoclonal antibodies could rapidly, specifically and sensitively detect H-FABP in serum samples, providing an effective method for rapid assessment of H-FABP in the clinic.
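A hedged sketch of the comparison arithmetic reported in the abstract: sensitivity and specificity of the new assay against the Randox kit as the reference, computed from a 2x2 agreement table. The counts below are hypothetical, chosen only so that they sum to 103 samples and reproduce the reported 98.25% and 100%; they are not the study's actual tallies.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP), with the reference
    kit defining which samples count as 'true' positives and negatives."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 2x2 table for 103 samples classified by both assays.
sens, spec = sensitivity_specificity(tp=56, fn=1, tn=46, fp=0)
print(f"sensitivity: {sens:.2%}, specificity: {spec:.2%}")
```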
HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.
2015-05-01
This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.
Oña-Ruales, Jorge O; Ruiz-Morales, Yosadara; Wise, Stephen A
2016-04-15
A methodology for the characterization of groups of polycyclic aromatic hydrocarbons (PAHs) using a combination of normal-phase liquid chromatography with ultraviolet-visible spectroscopy (NPLC/UV-vis) and gas chromatography with mass spectrometry (GC/MS) was used for the identification and quantification of seven-ring C26H14 peri-condensed benzenoid PAHs in Standard Reference Material (SRM) 1597a, a complex mixture of PAHs from coal tar. The NPLC/UV-vis step isolated the fractions based on the number of aromatic carbons, and the GC/MS step allowed the identification and quantification of five of the nine C26H14 PAH isomers: naphtho[1,2,3,4-ghi]perylene, dibenzo[b,ghi]perylene, dibenzo[b,pqr]perylene, naphtho[8,1,2-bcd]perylene, and dibenzo[cd,lm]perylene, using retention-time comparison with authentic reference standards. For the other four benzenoid isomers with no available reference standards, the following two approaches were used. First, the annellation theory was used to achieve the potential identification of benzo[qr]naphtho[3,2,1,8-defg]chrysene, and second, the elution distribution in the GC fractions was used to support the potential identification of benzo[qr]naphtho[3,2,1,8-defg]chrysene and to reach the tentative identifications of dibenzo[a,ghi]perylene, naphtho[7,8,1,2,3-pqrst]pentaphene, and anthra[2,1,9,8-opqra]naphthacene. This is the first time that naphtho[1,2,3,4-ghi]perylene, dibenzo[b,ghi]perylene, dibenzo[b,pqr]perylene, naphtho[8,1,2-bcd]perylene, and dibenzo[cd,lm]perylene have been quantified, and the first time that benzo[qr]naphtho[3,2,1,8-defg]chrysene has been potentially identified, in any sample, in any context. Copyright © 2016 Elsevier B.V. All rights reserved.