Sample records for quantification theory type

  1. Aerosol-type retrieval and uncertainty quantification from OMI data

    NASA Astrophysics Data System (ADS)

    Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine the AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by the posterior probability distribution reflects the difficulty in model
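
    The model-averaging step described above is straightforward to prototype. Below is a minimal Python sketch of Bayesian model averaging over per-model AOD posteriors, assuming Gaussian posteriors and precomputed model evidences; the aerosol model names and all numbers are illustrative placeholders, not values from the paper.

      import numpy as np

      # Hypothetical per-model AOD posteriors (mean, std) and model evidences.
      models = {
          "weakly_absorbing": (0.42, 0.05, 1.8e-3),
          "biomass_burning":  (0.47, 0.07, 1.2e-3),
          "dust":             (0.39, 0.06, 0.4e-3),
      }

      evidences = np.array([m[2] for m in models.values()])
      weights = evidences / evidences.sum()   # posterior model probabilities (equal priors)

      grid = np.linspace(0.0, 1.0, 1001)      # AOD grid
      dx = grid[1] - grid[0]
      mixture = np.zeros_like(grid)
      for (mu, sigma, _), w in zip(models.values(), weights):
          mixture += w * np.exp(-0.5 * ((grid - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

      aod_mean = np.sum(grid * mixture) * dx  # BMA point estimate
      aod_std = np.sqrt(np.sum((grid - aod_mean) ** 2 * mixture) * dx)
      print(f"BMA AOD = {aod_mean:.3f} +/- {aod_std:.3f}")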

  2. Applications of Jungian Type Theory to Counselor Education.

    ERIC Educational Resources Information Center

    Dilley, Josiah S.

    1987-01-01

    Describes Carl Jung's theory of psychological type and the Myers-Briggs Type Indicator (MBTI), an instrument to assess Jungian type. Cites sources of information on the research and application of the theory and the MBTI. Explores how knowledge of type theory can be useful to counselor educators. (Author)

  3. Activity Theory as a Theoretical Framework for Health Self-Quantification: A Systematic Review of Empirical Studies

    PubMed Central

    2016-01-01

    Background Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers’ activities and their contextual components or constructs to pursue these health-related goals. Establishing such a framework is important because it is the first step toward operationalizing health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers’ health systematically. Objective The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? Methods A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. Results The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and

  4. Activity Theory as a Theoretical Framework for Health Self-Quantification: A Systematic Review of Empirical Studies.

    PubMed

    Almalki, Manal; Gray, Kathleen; Martin-Sanchez, Fernando

    2016-05-27

    Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers' activities and their contextual components or constructs to pursue these health-related goals. Establishing such a framework is important because it is the first step toward operationalizing health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers' health systematically. The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and constructs of health SQ activity.

  5. Theory of Type 3 and Type 2 Solar Radio Emissions

    NASA Technical Reports Server (NTRS)

    Robinson, P. A.; Cairns, I. H.

    2000-01-01

    The main features of some current theories of type III and type II bursts are outlined. Among the most common solar radio bursts, type III bursts are produced at frequencies of 10 kHz to a few GHz when electron beams are ejected from solar active regions, entering the corona and solar wind at typical speeds of 0.1c. These beams provide energy to generate Langmuir waves via a streaming instability. In the current stochastic-growth theory, Langmuir waves grow in clumps associated with random low-frequency density fluctuations, leading to the observed spiky waves. Nonlinear wave-wave interactions then lead to secondary emission of observable radio waves near the fundamental and harmonic of the plasma frequency. Subsequent scattering processes modify the dynamic radio spectra, while back-reaction of Langmuir waves on the beam causes it to fluctuate about a state of marginal stability. Theories based on these ideas can account for the observed properties of type III bursts, including the in situ waves and the dynamic spectra of the radiation. Type II bursts are associated with shock waves propagating through the corona and interplanetary space and radiating from roughly 30 kHz to 1 GHz. Their basic emission mechanisms are believed to be similar to those of type III events and radiation from Earth's foreshock. However, several sub-classes of type II bursts may exist with different source regions and detailed characteristics. Theoretical models for type II bursts are briefly reviewed, focusing on a model with emission from a foreshock region upstream of the shock for which observational evidence has just been reported.

  6. Type 2 and type 3 burst theory

    NASA Technical Reports Server (NTRS)

    Smith, D. F.

    1973-01-01

    The present state of the theory of type 3 bursts is reviewed by dividing the problem into the exciting agency, radiation source, and propagation of radiation between the source and the observer. In-situ measurements indicate that the excitors are electron streams of energy about 40 keV which are continuously relaxing. An investigation of neutralization of an electron stream indicates that n_s is much less than 100,000 n_e, where n_s is the stream density and n_e the coronal electron density. In situ observations are consistent with this result. An analysis of propagation of electrons in the current sheets of coronal streamers shows that such propagation at heights greater than 1 solar radius is impossible. The mechanisms for radiation are reviewed; it is shown that fundamental radiation at high frequencies (approximately 100 MHz) is highly beamed in the radial direction and that near the earth second harmonic radiation must be dominant. Because of beaming of the fundamental at high frequencies, it can often be quite weak near the limb so that the second harmonic is dominant. In considering propagation to the observer, the results of scattering of radiation are discussed. The present state of the theory of type 2 bursts is reviewed in the same manner as type 3 bursts.

  7. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
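
    To make the propagation step concrete, here is a minimal Python sketch: a Gaussian-process emulator is trained on a few runs of a stand-in "expensive model", and an assumed Gaussian parameter posterior is pushed through it. The one-parameter model and all numbers are placeholders, not the paper's Skyrme functional or its MCMC posterior.

      import numpy as np

      rng = np.random.default_rng(0)

      def expensive_model(theta):
          # Stand-in for a costly calculation (one observable vs. one parameter).
          return 1600.0 + 8.0 * theta + 0.5 * theta ** 2

      # Train the emulator on a handful of design runs.
      X = np.linspace(-2.0, 2.0, 8)
      y = expensive_model(X)
      y_mean = y.mean()

      def rbf(a, b, ell=1.0, s2=25.0):
          # Squared-exponential kernel on 1-D inputs.
          return s2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

      K = rbf(X, X) + 1e-6 * np.eye(len(X))
      alpha = np.linalg.solve(K, y - y_mean)

      def emulate(t):
          # GP posterior mean at new points t (zero-mean GP on centred data).
          return y_mean + rbf(t, X) @ alpha

      # Assumed Gaussian posterior over the single model parameter.
      theta_samples = rng.normal(loc=0.3, scale=0.4, size=5000)
      pred = emulate(theta_samples)
      print(f"propagated prediction: {pred.mean():.2f} +/- {pred.std():.2f}")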

  8. Predominant Lactobacillus species types of vaginal microbiota in pregnant Korean women: quantification of the five Lactobacillus species and two anaerobes.

    PubMed

    Kim, Jeong Hyun; Yoo, Seung Min; Sohn, Yong Hak; Jin, Chan Hee; Yang, Yun Suk; Hwang, In Taek; Oh, Kwan Young

    2017-10-01

    To investigate the predominant Lactobacillus species types (LSTs) of vaginal microbiota in pregnant Korean women by quantifying five Lactobacillus species and two anaerobes. In all, 168 pregnant Korean women under antenatal care at Eulji University Hospital and local clinics were enrolled in the prospective cohort study during pregnancy (10-14 weeks). Vaginal samples were collected with Eswab for quantitative polymerase chain reaction (qPCR) and stored in a -80 °C freezer. qPCR was performed for five Lactobacillus species and two anaerobes. To identify the predominant LSTs, quantifications were analyzed by the Cluster and Tree View programs of Eisen Lab. The quantifications were also compared among the classified groups. L. crispatus and L. iners were most commonly found in pregnant Korean women, followed by L. gasseri and L. jensenii; L. vaginalis was nearly absent. Five types (four predominant LSTs and one predominant anaerobe type without predominant Lactobacillus species) were classified. Five predominant LSTs were identified in the vaginal microbiota of pregnant Korean women. The L. crispatus and L. iners predominant types comprised a large proportion.

  9. Toward a Theory of Psychological Type Congruence for Advertisers.

    ERIC Educational Resources Information Center

    McBride, Michael H.; And Others

    Focusing on the impact of advertisers' persuasive selling messages on consumers, this paper discusses topics relating to the theory of psychological type congruence. Based on an examination of persuasion theory and relevant psychological concepts, including recent cognitive stability and personality and needs theory and the older concept of…

  10. New type IIB backgrounds and aspects of their field theory duals

    NASA Astrophysics Data System (ADS)

    Caceres, Elena; Macpherson, Niall T.; Núñez, Carlos

    2014-08-01

    In this paper we study aspects of geometries in Type IIA and Type IIB String theory and elaborate on their field theory dual pairs. The backgrounds are associated with reductions to Type IIA of solutions with G_2 holonomy in eleven dimensions. We classify these backgrounds according to their G-structure, perform a non-Abelian T-duality on them and find new Type IIB configurations presenting dynamical SU(2)-structure. We study some aspects of the associated field theories defined by these new backgrounds. Various technical details are clearly spelled out.

  11. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE PAGES

    McDonnell, J. D.; Schunck, N.; Higdon, D.; ...

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  12. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonnell, J. D.; Schunck, N.; Higdon, D.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. As a result, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  13. Cognitive Type Theory & Learning Style, A Teacher's Guide.

    ERIC Educational Resources Information Center

    Mamchur, Carolyn

    This guide provides a practical explanation of cognitive type theory and learning style that will help teachers meet students' needs and discover their own strengths as teachers and colleagues. The introduction provides an overview of the book from the perspective of a high school classroom teacher. Part One introduces the theory of psychological…

  14. Use of multiple competitors for quantification of human immunodeficiency virus type 1 RNA in plasma.

    PubMed

    Vener, T; Nygren, M; Andersson, A; Uhlén, M; Albert, J; Lundeberg, J

    1998-07-01

    Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing the same PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions resulting in exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens.
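
    The per-tube internal calibration described above reduces to a small regression. The following Python sketch uses invented competitor copy numbers and peak areas purely to show the arithmetic; it is not the authors' analysis code.

      import numpy as np

      competitor_copies = np.array([1e2, 1e3, 1e4, 1e5])            # known input copies
      competitor_areas = np.array([310., 2900., 33000., 288000.])   # measured peak areas
      target_area = 1.25e4                                          # viral-RNA peak in the same tube

      # Fit log(area) = a * log(copies) + b within this single reaction.
      a, b = np.polyfit(np.log10(competitor_copies), np.log10(competitor_areas), 1)
      target_copies = 10 ** ((np.log10(target_area) - b) / a)
      print(f"estimated HIV-1 RNA: {target_copies:.0f} copies/reaction")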

  15. Quantification of Cannabinoid Content in Cannabis

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
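
    The resulting calibration is a simple univariate regression. A Python sketch with made-up reflectance/THC pairs illustrates the idea; the 695 nm band choice comes from the study, the numbers do not.

      import numpy as np

      reflectance_695 = np.array([0.31, 0.28, 0.35, 0.22, 0.26, 0.19])  # leaf samples
      thc_percent = np.array([0.9, 1.4, 0.5, 2.6, 1.8, 3.1])            # lab-assayed THC

      slope, intercept = np.polyfit(reflectance_695, thc_percent, 1)
      new_sample = 0.24
      print(f"predicted THC: {slope * new_sample + intercept:.2f} %")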

  16. Levels of theory and types of theoretical explanation in theoretical physics

    NASA Astrophysics Data System (ADS)

    Flores, Francisco J.

    In Newtonian physics, there is a clear distinction between a 'framework theory', a collection of general physical principles and definitions of physical terms, and theories that describe specific causal interactions such as gravitation, i.e., 'interaction theories'. I argue that this distinction between levels of theory can also be found in the context of Special Relativity and that recognizing it is essential for a philosophical account of how laws are explained in this theory. As a case study, I consider the history of derivations of mass-energy equivalence which shows, I argue, that there are two distinct types of theoretical explanations (i.e., explanations of laws) in physics. One type is best characterized by the 'top-down' account of scientific explanation, while the other is more accurately described by the 'bottom-up' account. What is significant, I argue, is that the type of explanation a law receives depends on whether it is part of the framework theory or part of an interaction theory. The former only receive 'top-down' explanations while the latter can also receive 'bottom-up' explanations. Thus, I argue that current debates regarding 'top-down' vs 'bottom-up' views of scientific explanation can be clarified by recognizing the distinction between two levels of physical theory.

  17. Timoshenko-Type Theory in the Stability Analysis of Corrugated Cylindrical Shells

    NASA Astrophysics Data System (ADS)

    Semenyuk, N. P.; Neskhodovskaya, N. A.

    2002-06-01

    A technique is proposed for the stability analysis of longitudinally corrugated shells under axial compression. The technique employs the equations of the Timoshenko-type nonlinear theory of shells. The geometrical parameters of the shells are specified on a discrete set of points and are approximated by segments of Fourier series. Infinite systems of homogeneous algebraic equations are derived from a variational equation written in displacements to determine the critical loads and buckling modes. Specific types of corrugated isotropic metal and fiberglass shells are considered. The calculated results are compared with those obtained within the framework of the classical theory of shells. It is shown that the Timoshenko-type theory extends significantly the possibility of exact allowance for the geometrical parameters and material properties of corrugated shells compared with the Kirchhoff-Love theory.

  18. Towards deconstruction of the Type D (2,0) theory

    NASA Astrophysics Data System (ADS)

    Bourget, Antoine; Pini, Alessandro; Rodriguez-Gomez, Diego

    2017-12-01

    We propose a four-dimensional supersymmetric theory that deconstructs, in a particular limit, the six-dimensional (2, 0) theory of type D_k. This 4d theory is defined by a necklace quiver with alternating gauge nodes O(2k) and Sp(k). We test this proposal by comparing the 6d half-BPS index to the Higgs branch Hilbert series of the 4d theory. In the process, we overcome several technical difficulties, such as Hilbert series calculations for non-complete intersections, and the choice of O versus SO gauge groups. Consistently, the result matches the Coulomb branch formula for the mirror theory upon reduction to 3d.

  19. A Generalized Perturbation Theory Solver In Rattlesnake Based On PETSc With Application To TREAT Steady State Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schunert, Sebastian; Wang, Congjian; Wang, Yaqi

    Rattlesnake and MAMMOTH are the designated TREAT analysis tools currently being developed at the Idaho National Laboratory. Concurrent with the development of the multi-physics, multi-scale capabilities, sensitivity analysis and uncertainty quantification (SA/UQ) capabilities are required for predictive modeling of the TREAT reactor. For steady-state SA/UQ, which is essential for setting initial conditions for the transients, generalized perturbation theory (GPT) will be used. This work describes the implementation of a PETSc-based solver for the generalized adjoint equations that constitute an inhomogeneous, rank-deficient problem. The standard approach is to use an outer iteration strategy with repeated removal of the fundamental mode contamination. The described GPT algorithm directly solves the GPT equations without the need for an outer iteration procedure by using Krylov subspaces that are orthogonal to the operator's nullspace. Three test problems are solved and provide sufficient verification for Rattlesnake's GPT capability. We conclude with a preliminary example evaluating the impact of the boron distribution in the TREAT reactor using perturbation theory.
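
    The nullspace-aware Krylov idea can be illustrated on a toy problem. The Python sketch below runs a conjugate-gradient iteration on a singular symmetric system while projecting every vector onto the orthogonal complement of the known nullspace; it shows the general technique only and is not the Rattlesnake/PETSc implementation.

      import numpy as np

      n = 6
      A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
      A[0, 0] = A[-1, -1] = 1.0           # 1-D Neumann Laplacian: nullspace = constant vectors
      null = np.ones(n) / np.sqrt(n)

      def deflate(v):
          # Remove the nullspace component so iterates stay in range(A).
          return v - null * (null @ v)

      b = deflate(np.array([1.0, -2.0, 0.5, 0.0, 3.0, -2.5]))  # compatible right-hand side

      x = np.zeros(n)
      r = deflate(b - A @ x)
      p = r.copy()
      for _ in range(200):
          Ap = deflate(A @ p)
          alpha = (r @ r) / (p @ Ap)
          x = deflate(x + alpha * p)
          r_new = r - alpha * Ap
          if np.linalg.norm(r_new) < 1e-12:
              break
          p = r_new + ((r_new @ r_new) / (r @ r)) * p
          r = r_new
      print("residual:", np.linalg.norm(A @ x - b))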

  20. Use of Multiple Competitors for Quantification of Human Immunodeficiency Virus Type 1 RNA in Plasma

    PubMed Central

    Vener, Tanya; Nygren, Malin; Andersson, AnnaLena; Uhlén, Mathias; Albert, Jan; Lundeberg, Joakim

    1998-01-01

    Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing the same PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions resulting in exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens. PMID:9650926

  21. A quantification model for the structure of clay materials.

    PubMed

    Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian

    2016-07-04

    In this paper, the quantification of clay structure is explicitly explained, and the approach and goals of quantification are also discussed. The authors consider that the purpose of quantifying clay structure is to determine parameters that can be used to quantitatively characterize the impact of clay structure on macro-mechanical behaviour. According to system theory and the law of energy conservation, a quantification model for the structure characteristics of clay materials is established, three quantitative parameters (i.e., deformation structure potential, strength structure potential and comprehensive structure potential) are proposed, and the corresponding tests are conducted. The experimental results show that these quantitative parameters accurately reflect, respectively, the influence of clay structure on deformation behaviour, on strength behaviour, and the relative magnitude of the structural influence on the two former behaviours. These quantitative parameters have explicit mechanical meanings and can be used to characterize the structural influence of clay on its mechanical behaviour.

  22. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    NASA Astrophysics Data System (ADS)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitatively based modelling approach, as a possible analysis and simulation tool that bridges the gap between social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  23. Quantification of scaling exponent with Crossover type phenomena for different types of forcing in DC glow discharge plasma

    NASA Astrophysics Data System (ADS)

    Saha, Debajyoti; Shaw, Pankaj Kumar; Ghosh, Sabuj; Janaki, M. S.; Sekar Iyengar, A. N.

    2018-01-01

    We have carried out a detailed study of the scaling region using a detrended fractal analysis test, applying different forcings (noise, sinusoidal, and square) to the floating potential fluctuations acquired under different pressures in a DC glow discharge plasma. The transition in the dynamics is observed through recurrence plot techniques, an efficient method for detecting critical regime transitions in the dynamics. The complexity of the nonlinear fluctuation has been revealed with the help of recurrence quantification analysis, a suitable tool for investigating recurrence, a ubiquitous feature that provides deep insight into the dynamics of real dynamical systems. An informal test for stationarity, which checks the compatibility of nonlinear approximations to the dynamics made in different segments of a time series, has been proposed. When sinusoidal, noise, or square forcing is applied to the fluctuation acquired at P = 0.12 mbar, only one dominant scaling region is observed, whereas for forcing applied to the fluctuation acquired at P = 0.04 mbar, two prominent scaling regions have been identified reliably using different forcing amplitudes, indicating the signature of a crossover phenomenon. Furthermore, persistent long-range behaviour has been observed in one of these scaling regions. A comprehensive study of the quantification of scaling exponents has been carried out with increasing amplitude and frequency of the sinusoidal and square types of forcing. The scaling exponent is envisaged to be the roughness of the time series. The method provides a single quantitative measure of the scaling exponent to quantify the correlation properties of a signal.
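
    For readers unfamiliar with the technique, a bare-bones detrended fluctuation analysis (the standard variant of the detrended analysis used above) looks like the Python sketch below, run here on synthetic data for illustration only. A crossover would show up as a change of slope when the log-log fit is done separately over two ranges of scales.

      import numpy as np

      rng = np.random.default_rng(1)
      signal = np.cumsum(rng.standard_normal(4096))   # stand-in for the floating potential

      profile = np.cumsum(signal - signal.mean())
      scales = np.unique(np.logspace(1.0, 3.0, 20).astype(int))
      flucts = []
      for s in scales:
          n_seg = len(profile) // s
          segs = profile[:n_seg * s].reshape(n_seg, s)
          t = np.arange(s)
          rms = []
          for seg in segs:
              coef = np.polyfit(t, seg, 1)             # local linear detrending
              rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
          flucts.append(np.sqrt(np.mean(rms)))
      alpha = np.polyfit(np.log(scales), np.log(flucts), 1)[0]
      print(f"scaling exponent alpha ~ {alpha:.2f}")   # ~1.5 for integrated white noise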

  24. Information theoretic quantification of diagnostic uncertainty.

    PubMed

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes' theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
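
    A small Python sketch of the core computation: Bayes' rule gives the post-test probabilities, and binary entropy expresses the pre-test and expected post-test uncertainty in bits. The test characteristics are illustrative, not taken from the paper.

      import numpy as np

      def H(p):
          # Binary entropy in bits, clipped for numerical safety.
          p = np.clip(p, 1e-12, 1 - 1e-12)
          return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

      pretest, sens, spec = 0.30, 0.90, 0.80

      # Posterior disease probability after a positive and a negative result.
      p_pos = sens * pretest / (sens * pretest + (1 - spec) * (1 - pretest))
      p_neg = (1 - sens) * pretest / ((1 - sens) * pretest + spec * (1 - pretest))

      p_result_pos = sens * pretest + (1 - spec) * (1 - pretest)
      expected_post_H = p_result_pos * H(p_pos) + (1 - p_result_pos) * H(p_neg)
      print(f"pre-test uncertainty : {H(pretest):.3f} bits")
      print(f"expected post-test   : {expected_post_H:.3f} bits")
      print(f"information gained   : {H(pretest) - expected_post_H:.3f} bits")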

  25. [DNA quantification of blood samples pre-treated with pyramidon].

    PubMed

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. The blood samples of ten unrelated individuals were anticoagulated in EDTA, and blood stains were made on filter paper. The samples were divided into six groups according to storage time: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h after pre-treatment with pyramidon. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA quantification was performed by fluorescent quantitative PCR, and STR typing was detected by PCR-STR fluorescent technology. For a given DNA extraction method, the sample DNA decreased gradually with time after pre-treatment with pyramidon. For a given storage time, the DNA quantities obtained with the different extraction methods differed significantly. DNA typing at all sixteen loci was obtained in 90.56% of the samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can be achieved within 24 h. Magnetic bead-based extraction is the best of the three methods for STR profiling and DNA extraction.

  26. Increasing the feasibility of minimally invasive procedures in type A aortic dissections: a framework for segmentation and quantification.

    PubMed

    Morariu, Cosmin Adrian; Terheiden, Tobias; Dohle, Daniel Sebastian; Tsagakis, Konstantinos; Pauli, Josef

    2016-02-01

    Our goal is to provide precise measurements of the aortic dimensions in case of dissection pathologies. Quantification of surface lengths and aortic radii/diameters, together with visualization of the dissection membrane, represents a crucial prerequisite for enabling minimally invasive treatment of type A dissections, which always also involve the ascending aorta. We seek a measure invariant to luminance and contrast for aortic outer wall segmentation. Therefore, we propose a 2D graph-based approach using phase congruency combined with additional features. Phase congruency is extended to 3D by designing a novel conic directional filter and adding a lowpass component to the 3D Log-Gabor filterbank for extracting the fine dissection membrane, which separates the true lumen from the false one within the aorta. The result of the outer wall segmentation is compared with manually annotated axial slices belonging to 11 CTA datasets. Quantitative assessment of our novel 2D/3D membrane extraction algorithms has been obtained for 10 datasets and reveals subvoxel accuracy in all cases. Aortic inner and outer surface lengths, determined within 2 cadaveric CT datasets, are validated against manual measurements performed by a vascular surgeon on excised aortas of the body donors. This contribution proposes a complete pipeline for segmentation and quantification of aortic dissections. Validation against ground truth of the 3D contour lengths quantification represents a significant step toward custom-designed stent-grafts.

  27. The use of self-quantification systems for personal health information: big data management activities and prospects.

    PubMed

    Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin

    2015-01-01

    Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Self-quantification in personal health maintenance

  28. Type IIB flux vacua from G-theory II

    NASA Astrophysics Data System (ADS)

    Candelas, Philip; Constantin, Andrei; Damian, Cesar; Larfors, Magdalena; Morales, Jose Francisco

    2015-02-01

    We find analytic solutions of type IIB supergravity on geometries that locally take the form Mink × M_4 × ℂ with M_4 a generalised complex manifold. The solutions involve the metric, the dilaton, NSNS and RR flux potentials (oriented along M_4) parametrised by functions varying only over ℂ. Under this assumption, the supersymmetry equations are solved using the formalism of pure spinors in terms of a finite number of holomorphic functions. Alternatively, the solutions can be viewed as vacua of maximally supersymmetric supergravity in six dimensions with a set of scalar fields varying holomorphically over ℂ. For a class of solutions characterised by up to five holomorphic functions, we outline how the local solutions can be completed to four-dimensional flux vacua of type IIB theory. A detailed study of this global completion for solutions with two holomorphic functions has been carried out in the companion paper [1]. The fluxes of the global solutions are, as in F-theory, entirely codified in the geometry of an auxiliary K3 fibration over ℂℙ¹. The results provide a geometric construction of fluxes in F-theory.

  29. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy

    PubMed Central

    2016-01-01

    Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to a significant bias due to matrix effect. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample as well as in a series of samples added with melamine standards was calculated and then the Euclidean norms of series standards were used to build a straightforward univariate regression model. The analysis results of 10 different brands/types of milk powders with melamine levels 0~0.12% (w/w) indicate that SANAS obtained accurate results with the root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is to visualize and control the possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders. PMID:27525154
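
    Once the NAS norm is in hand, the quantification is classical standard-addition regression. The Python sketch below uses invented signal values to show the extrapolation step; it is a generic illustration, not the SANAS code from the paper.

      import numpy as np

      added = np.array([0.00, 0.02, 0.04, 0.06])    # % (w/w) melamine added
      signal = np.array([0.18, 0.33, 0.49, 0.63])   # ||NAS|| for each spiked sample

      slope, intercept = np.polyfit(added, signal, 1)
      unknown = intercept / slope                   # magnitude of the x-intercept
      print(f"estimated melamine in sample: {unknown:.3f} % (w/w)")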

  30. The dopant type and amount governs the electrochemical performance of graphene platforms for the antioxidant activity quantification

    NASA Astrophysics Data System (ADS)

    Hui, Kai Hwee; Ambrosi, Adriano; Sofer, Zdeněk; Pumera, Martin; Bonanni, Alessandra

    2015-05-01

    Graphene doped with heteroatoms can show new or improved properties as compared to the original undoped material. It has been reported that the type of heteroatoms and the doping conditions can have a strong influence on the electronic and electrochemical properties of the resulting material. Here, we wish to compare the electrochemical behavior of two n-type and two p-type doped graphenes, namely boron-doped graphenes and nitrogen-doped graphenes containing different amounts of heteroatoms. We show that the boron-doped graphene containing a higher amount of dopants provides the best electroanalytical performance in terms of calibration sensitivity, selectivity and linearity of response for the detection of gallic acid normally used as the standard probe for the quantification of antioxidant activity of food and beverages. Our findings demonstrate that the type and amount of heteroatoms used for the doping have a profound influence on the electrochemical detection of gallic acid rather than the structural properties of the materials such as amounts of defects, oxygen functionalities and surface area. This finding has a profound influence on the application of doped graphenes in the field of analytical chemistry.

  31. Gauge Theory on a Space with Linear Lie Type Fuzziness

    NASA Astrophysics Data System (ADS)

    Khorrami, Mohammad; Fatollahi, Amir H.; Shariati, Ahmad

    2013-03-01

    The U(1) gauge theory on a space with Lie type noncommutativity is constructed. The construction is based on the group of translations in Fourier space, which in contrast to space itself is commutative. In analogy with lattice gauge theory, the object playing the role of flux of field strength per plaquette, as well as the action, is constructed. It is observed that the theory, in comparison with ordinary U(1) gauge theory, has an extra gauge field component. This phenomenon is reminiscent of similar ones in the formulation of SU(N) gauge theory in a space with canonical noncommutativity, and also of the appearance of a gauge field component in the discrete direction of Connes' construction of the Standard Model.

  32. On the theory of the type III burst exciter

    NASA Technical Reports Server (NTRS)

    Smith, R. A.; Goldstein, M. L.; Papadopoulos, K.

    1976-01-01

    In situ satellite observations of type III burst exciters at 1 AU show that the beam does not evolve into a plateau in velocity space, contrary to the prediction of quasilinear theory. The observations can be explained by a theory that includes mode coupling effects due to excitation of the parametric oscillating two-stream instability and its saturation by anomalous resistivity. The time evolution of the beam velocity distribution is included in the analysis.

  33. The use of self-quantification systems for personal health information: big data management activities and prospects

    PubMed Central

    2015-01-01

    Background Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. Objectives In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. Method We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. Findings We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities.

  34. Stochastic Growth Theory of Type 3 Solar Radio Emission

    NASA Technical Reports Server (NTRS)

    Robinson, P. A.; Cairns, I. H.

    1993-01-01

    The recently developed stochastic growth theory of type 3 radio sources is extended to predict their electromagnetic volume emissivities and brightness temperatures. Predicted emissivities are consistent with spacecraft observations and independent theoretical constraints.

  35. Quantification of DNA using the luminescent oxygen channeling assay.

    PubMed

    Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S

    2000-09-01

    Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI™) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.
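
    The real-time format rests on the linear relationship between threshold cycle and log target number. The Python sketch below shows the standard-curve arithmetic with illustrative values; it is generic real-time quantification bookkeeping, not the LOCI-specific assay code.

      import numpy as np

      log_copies = np.log10([1e2, 1e3, 1e4, 1e5, 1e6, 1e7])    # standards
      ct = np.array([31.1, 27.8, 24.4, 21.0, 17.7, 14.3])      # measured threshold cycles

      slope, intercept = np.polyfit(log_copies, ct, 1)         # slope ~ -3.3 at 100% efficiency
      unknown_ct = 25.6
      print(f"estimated targets: {10 ** ((unknown_ct - intercept) / slope):.0f} copies")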

  36. Stereomicroscopic imaging technique for the quantification of cold flow in drug-in-adhesive type of transdermal drug delivery systems.

    PubMed

    Krishnaiah, Yellela S R; Katragadda, Usha; Khan, Mansoor A

    2014-05-01

    Cold flow is a phenomenon occurring in drug-in-adhesive type transdermal drug delivery systems (DIA-TDDS) because of migration of the DIA coat beyond the edge. Excessive cold flow can affect their therapeutic effectiveness, make removal of the DIA-TDDS from the pouch difficult, and potentially decrease the available dose if any drug remains adhered to the pouch. There are no compendial or noncompendial methods available for quantification of this critical quality attribute. The objective was to develop a method for quantification of cold flow using a stereomicroscopic imaging technique. Cold flow was induced by applying a 1 kg force to punched-out samples of a marketed estradiol DIA-TDDS (model product) stored at 25°C, 32°C, and 40°C/60% relative humidity (RH) for 1, 2, or 3 days. At the end of the testing period, the dimensional change in the area of the DIA-TDDS samples was measured using image analysis software and expressed as percent cold flow. The percent cold flow significantly decreased (p < 0.001) with increasing size of the punched-out DIA-TDDS samples and increased (p < 0.001) with increasing cold flow induction temperature and time. This first-ever report suggests that the dimensional change in the area of punched-out samples stored at 32°C/60%RH for 2 days under a 1 kg force could be used for quantification of cold flow in DIA-TDDS. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
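
    The underlying measurement is a relative area change between images taken before and after cold-flow induction. A Python sketch on synthetic binary masks shows the arithmetic; segmentation, thresholding, and pixel-to-mm calibration are omitted, and the mask geometry is invented.

      import numpy as np

      # Synthetic stand-ins for segmented patch masks (circular punched-out sample).
      mask_before = np.zeros((400, 400), dtype=bool)
      mask_after = np.zeros((400, 400), dtype=bool)
      yy, xx = np.ogrid[:400, :400]
      mask_before[(yy - 200) ** 2 + (xx - 200) ** 2 <= 120 ** 2] = True  # day 0
      mask_after[(yy - 200) ** 2 + (xx - 200) ** 2 <= 127 ** 2] = True   # after induction

      area0, area1 = mask_before.sum(), mask_after.sum()
      print(f"cold flow: {100 * (area1 - area0) / area0:.1f} % area increase")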

  37. Convex geometry of quantum resource quantification

    NASA Astrophysics Data System (ADS)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, a measure of k-coherence which generalises the …
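
    As a hedged illustration of the convex-geometric style of quantifier this framework unifies, one standard measure from the literature (not necessarily the paper's exact definition or normalization) is the generalized robustness of a state \rho with respect to the set \mathcal{F} of resource-free states:

      R_{\mathcal{F}}(\rho) \;=\; \min_{\sigma} \left\{\, s \geq 0 \;:\; \frac{\rho + s\,\sigma}{1 + s} \in \mathcal{F} \right\},

    where the minimization runs over all quantum states \sigma. For closed convex \mathcal{F}, this quantity vanishes exactly on \mathcal{F} (faithfulness) and is monotone under free operations, which are precisely the kinds of properties the paper's criteria are designed to verify.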

  38. Quantification of genetically modified soybeans using a combination of a capillary-type real-time PCR system and a plasmid reference standard.

    PubMed

    Toyota, Akie; Akiyama, Hiroshi; Sugimura, Mitsunori; Watanabe, Takahiro; Kikuchi, Hiroyuki; Kanamori, Hisayuki; Hino, Akihiro; Esaka, Muneharu; Maitani, Tamio

    2006-04-01

    Because in many countries the labeling of grains and feed- and foodstuffs is mandatory if the content of approved genetically modified organism (GMO) varieties exceeds a certain level, there is a need for a rapid and useful method of GMO quantification in food samples. In this study, a rapid detection system was developed for Roundup Ready Soybean (RRS) quantification using a combination of a capillary-type real-time PCR system, the LightCycler real-time PCR system, and plasmid DNA as the reference standard. In addition, we showed for the first time that plasmid and genomic DNA behave similarly in the established detection system, because the PCR efficiencies obtained with plasmid DNA and with genomic DNA were not significantly different. The conversion factor (Cf) used to calculate RRS content (%) was further determined from the average value analyzed in three laboratories. The accuracy and reproducibility of this system for RRS quantification at the 5.0% level were within a range of 4.46 to 5.07% for RRS content and within a range of 2.0% to 7.0% for the relative standard deviation (RSD), respectively. This system enables rapid monitoring for labeling compliance with allowable levels of accuracy and precision.
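
    The quantification arithmetic behind such assays is a copy-number ratio scaled by the conversion factor. The Python sketch below is a generic illustration in which the endogenous reference gene name and every numerical value are assumptions, not data from the study.

      # GMO content as (event-specific copies / endogenous copies) / Cf, where Cf is
      # the same ratio measured on a 100% RRS sample. All values are hypothetical.
      rrs_copies = 1.9e4       # event-specific target, from the plasmid standard curve
      lectin_copies = 8.0e5    # soybean endogenous reference gene (assumed choice)
      cf = 0.48                # conversion factor for pure RRS (assumed)

      rrs_percent = (rrs_copies / lectin_copies) / cf * 100
      print(f"RRS content: {rrs_percent:.2f} %")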

  19. Superposition Quantification

    NASA Astrophysics Data System (ADS)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002, the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182.
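
    The abstract does not spell the measure out; for orientation, a closely related and widely used quantifier of this kind is the l1 norm of coherence, which sums the off-diagonal weight of the density matrix in a fixed basis:

        \[
          C_{\ell_1}(\rho) \;=\; \sum_{i \neq j} \lvert \rho_{ij} \rvert
        \]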

  20. Electrical detection and quantification of single and mixed DNA nucleotides in suspension

    NASA Astrophysics Data System (ADS)

    Ahmad, Mahmoud Al; Panicker, Neena G.; Rizvi, Tahir A.; Mustafa, Farah

    2016-09-01

    High-speed sequential identification of the building blocks of DNA (deoxyribonucleotides, or nucleotides for short), without labeling or processing, in long reads of DNA is the need of the hour. This can be accomplished by exploiting their unique electrical properties. In this study, the four different types of nucleotides that constitute a DNA molecule were suspended in a buffer, followed by several types of electrical measurements. These electrical parameters were then used to quantify the suspended DNA nucleotides. Thus, we present a purely electrical counting scheme based on semiconductor theory that allows one to determine the number of nucleotides in a solution by measuring their capacitance-voltage dependency. The nucleotide count was observed to be close to the product of the corresponding dopant concentration and the Debye volume after de-embedding the buffer contribution. The presented approach allows for a fast and label-free quantification of single and mixed nucleotides in a solution.
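
    A schematic sketch of the counting rule stated above (concentration times Debye volume); the permittivity of water, the temperature, and the use of a spherical Debye volume are illustrative assumptions, not the paper's parameters:

        import math

        EPS0 = 8.854e-12   # vacuum permittivity, F/m
        KB = 1.381e-23     # Boltzmann constant, J/K
        Q = 1.602e-19      # elementary charge, C

        def debye_length(n_per_m3, eps_r=78.5, temp=300.0):
            """Debye screening length (m) for a monovalent carrier density."""
            return math.sqrt(EPS0 * eps_r * KB * temp / (Q**2 * n_per_m3))

        def nucleotide_count(n_per_m3):
            """Count estimate: carrier concentration x spherical Debye volume."""
            lam = debye_length(n_per_m3)
            return n_per_m3 * (4.0 / 3.0) * math.pi * lam**3

        print(f"{nucleotide_count(1e21):.3g}")  # ~157 for this toy density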

  1. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    PubMed

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimized the stochastic effects experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. A rapid chemiluminescent slot blot immunoassay for the detection and quantification of Clostridium botulinum neurotoxin type E, in cultures.

    PubMed

    Cadieux, Brigitte; Blanchfield, Burke; Smith, James P; Austin, John W

    2005-05-01

    A simple, rapid, cost-effective in vitro slot blot immunoassay was developed for the detection and quantification of botulinum neurotoxin type E (BoNT/E) in cultures. Culture supernatants of 36 strains of clostridia, including 12 strains of Clostridium botulinum type E, 12 strains of other C. botulinum neurotoxin serotypes, and 12 strains of other clostridial species, were tested. Samples containing BoNT/E were detected using affinity-purified polyclonal rabbit antisera prepared against BoNT/E, with subsequent detection of secondary antibodies using chemiluminescence. All strains of C. botulinum type E tested positive, while all non-C. botulinum type E strains tested negative. The sensitivity of the slot blot immunoassay for detection of BoNT/E was approximately four mouse lethal doses (MLD). The intensity of chemiluminescence was directly correlated with the concentration of BoNT/E up to 128 MLD, allowing quantification of BoNT/E between 4 and 128 MLD. The slot blot immunoassay was compared to the mouse bioassay for detection of BoNT/E using cultures derived from fish samples inoculated with C. botulinum type E, and cultures derived from naturally contaminated environmental samples. A total of 120 primary enrichment cultures derived from fish samples, of which 103 were inoculated with C. botulinum type E and 17 were uninoculated controls, were assayed. Of the 103 primary enrichment cultures derived from inoculated fish samples, all were positive by mouse bioassay, while 94 were also positive by slot blot immunoassay, resulting in a 7.5% false-negative rate. All 17 primary enrichment cultures derived from the uninoculated fish samples were negative by both mouse bioassay and slot blot immunoassay. A total of twenty-six primary enrichment cultures derived from environmental samples were tested by mouse bioassay and slot blot immunoassay. Of 13 primary enrichment cultures positive by mouse bioassay, 12 were also positive by slot blot immunoassay, resulting in a 3…

  3. Advantages of a validated UPLC-MS/MS standard addition method for the quantification of A-type dimeric and trimeric proanthocyanidins in cranberry extracts in comparison with well-known quantification methods.

    PubMed

    van Dooren, Ines; Foubert, Kenn; Theunis, Mart; Naessens, Tania; Pieters, Luc; Apers, Sandra

    2018-01-30

    The berries of Vaccinium macrocarpon, cranberry, are widely used for the prevention of urinary tract infections. This species contains A-type proanthocyanidins (PACs), which intervene in the initial phase of the development of urinary tract infections by preventing the adherence of Escherichia coli, via their P-type fimbriae, to uroepithelial cells. Unfortunately, the existing clinical studies used different cranberry preparations, which were poorly standardized. Because of this the results were hard to compare, which sometimes led to conflicting conclusions. Currently, PACs are quantified using the rather non-specific spectrophotometric 4-dimethylaminocinnamaldehyde (DMAC) method. In addition, a normal-phase HPTLC-densitometric method, an HPLC-UV method and three LC-MS/MS methods for quantification of procyanidin A2 were recently published. All these methods contain some shortcomings and errors. Hence, the development and validation of a fast and sensitive standard addition LC-MS/MS method for the simultaneous quantification of A-type dimers and trimers in a cranberry dry extract was carried out. A linear calibration model could be adopted for dimers and, after logarithmic transformation, for trimers. The maximal interday and interconcentration precision was found to be 4.86% and 4.28% for procyanidin A2, and 5.61% and 7.65% for trimeric PACs, which are all acceptable values for an analytical method using LC-MS/MS. In addition, twelve different cranberry extracts were analyzed by means of the newly validated method and other widely used methods. There appeared to be an enormous variation in dimeric and trimeric PAC content. Comparison of these results with LC-MS/MS analysis without standard addition showed the presence of matrix effects for some of the extracts and proved the necessity of standard addition. A comparison of the well-known and widely used DMAC method, the butanol-HCl assay and this newly developed LC-MS/MS method clearly indicated the need for a reliable …
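
    Standard addition itself is a short calculation: spike equal aliquots with known analyte amounts, fit signal against added concentration, and read the unknown off the x-intercept. A minimal sketch with illustrative numbers:

        import numpy as np

        added = np.array([0.0, 0.5, 1.0, 1.5, 2.0])        # spiked conc. (ug/mL)
        signal = np.array([0.42, 0.63, 0.84, 1.05, 1.26])  # peak areas (a.u.)

        slope, intercept = np.polyfit(added, signal, 1)
        conc_unknown = intercept / slope   # magnitude of the x-intercept
        print(f"estimated concentration: {conc_unknown:.2f} ug/mL")  # 1.00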

  4. [Triple-type theory of statistics and its application in the scientific research of biomedicine].

    PubMed

    Hu, Liang-ping; Liu, Hui-gang

    2005-07-20

    To point out the crux of why so many people fail to grasp statistics, and to bring forth a "triple-type theory of statistics" to solve the problem in a creative way. Based on long experience in teaching and research in statistics, the triple-type theory was raised and clarified. Examples were provided to demonstrate that the three types, i.e., the expressive type, the prototype and the standardized type, are the essentials for people to apply statistics rationally both in theory and practice; moreover, it is demonstrated by some instances that the three types are correlated with each other. The theory can help people to see the essence when interpreting and analyzing the problems of experimental designs and statistical analyses in medical research work. Investigations reveal that for some questions the three types are mutually identical; for some questions the prototype is also the standardized type; while for some others the three types are distinct from each other. It was shown that in some multifactor experimental research no standardized type corresponding to the prototype exists at all, because the researchers committed the mistake of "incomplete control" in setting up experimental groups; this is a problem which should be solved by the concept and method of "division". Once the triple type for each question is clarified, a proper experimental design and statistical method can be arrived at easily. The triple-type theory of statistics can help people avoid statistical mistakes, or at least decrease the misuse rate dramatically, and improve the quality, level and speed of biomedical research during the process of applying statistics. It can also help to improve the quality of statistical textbooks and the teaching of statistics, and it has demonstrated how to advance biomedical statistics.

  5. Quantification of gait parameters in freely walking wild type and sensory deprived Drosophila melanogaster

    PubMed Central

    Mendes, César S; Bartos, Imre; Akay, Turgay; Márka, Szabolcs; Mann, Richard S

    2013-01-01

    Coordinated walking in vertebrates and multi-legged invertebrates such as Drosophila melanogaster requires a complex neural network coupled to sensory feedback. An understanding of this network will benefit from systems such as Drosophila that have the ability to genetically manipulate neural activities. However, the fly's small size makes it challenging to analyze walking in this system. In order to overcome this limitation, we developed an optical method coupled with high-speed imaging that allows the tracking and quantification of gait parameters in freely walking flies with high temporal and spatial resolution. Using this method, we present a comprehensive description of many locomotion parameters, such as gait, tarsal positioning, and intersegmental and left-right coordination for wild type fruit flies. Surprisingly, we find that inactivation of sensory neurons in the fly's legs, to block proprioceptive feedback, led to deficient step precision, but interleg coordination and the ability to execute a tripod gait were unaffected. DOI: http://dx.doi.org/10.7554/eLife.00231.001 PMID:23326642

  6. MOTIVATION INTERNALIZATION AND SIMPLEX STRUCTURE IN SELF-DETERMINATION THEORY.

    PubMed

    Ünlü, Ali; Dettweiler, Ulrich

    2015-12-01

    Self-determination theory, as proposed by Deci and Ryan, postulates different types of motivation regulation. As to the introjected and identified regulation of extrinsic motivation, their internalizations were described as "somewhat external" and "somewhat internal" and remained undetermined in the theory. This paper introduces a constrained regression analysis that allows these vaguely expressed motivations to be estimated in an "optimal" manner in any given empirical context; a sketch of the idea follows below. The approach was further generalized and applied to simplex structure analysis in self-determination theory. The technique was exemplified with an empirical study comparing science teaching in a classical school class versus an expeditionary outdoor program. Based on a sample of 84 German pupils (43 girls, 41 boys, 10 to 12 years old), data were collected using the German version of the Academic Self-Regulation Questionnaire. The science-teaching format was seen not to influence the pupils' internalization of identified regulation. The internalization of introjected regulation differed and shifted more toward the external pole in the outdoor teaching format. The quantification approach supported the simplex structure of self-determination theory, whereas correlations may disconfirm the simplex structure.
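
    One way to picture such a constrained estimate (a sketch under assumed data and model form, not the authors' specification): treat introjected regulation as a convex combination of the external and intrinsic scale scores and estimate the mixing weight on [0, 1] by bounded least squares.

        import numpy as np
        from scipy.optimize import lsq_linear

        rng = np.random.default_rng(0)
        external = rng.normal(3.0, 0.8, 84)     # simulated scale scores
        intrinsic = rng.normal(4.0, 0.7, 84)
        introjected = 0.6 * external + 0.4 * intrinsic + rng.normal(0, 0.2, 84)

        # introjected ~ w*external + (1-w)*intrinsic, with 0 <= w <= 1,
        # rearranged so the single unknown w can be fit directly:
        A = (external - intrinsic).reshape(-1, 1)
        b = introjected - intrinsic
        res = lsq_linear(A, b, bounds=(0.0, 1.0))
        print(f"estimated internalization weight: {res.x[0]:.2f}")  # ~0.60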

  7. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those within the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis …

  8. Automating Access Control Logics in Simple Type Theory with LEO-II

    NASA Astrophysics Data System (ADS)

    Benzmüller, Christoph

    Garg and Abadi recently proved that prominent access control logics can be translated in a sound and complete way into modal logic S4. We have previously outlined how normal multimodal logics, including monomodal logics K and S4, can be embedded in simple type theory, and we have demonstrated that the higher-order theorem prover LEO-II can automate reasoning in and about them. In this paper we combine these results and describe a sound (and complete) embedding of different access control logics in simple type theory. Employing this framework we show that the off-the-shelf theorem prover LEO-II can be applied to automate reasoning in and about prominent access control logics.

  9. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV-Vis spectroscopy was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical system that could be used as a Process Analytical Tool (PAT) in biorefineries utilizing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
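
    Quantification of this kind comes down to a linear calibration at a chosen wavelength (Beer-Lambert linear range assumed). A minimal sketch with illustrative numbers:

        import numpy as np

        conc = np.array([0.0, 20.0, 40.0, 60.0, 80.0])         # mg/L standards
        absorbance = np.array([0.01, 0.22, 0.43, 0.64, 0.85])  # e.g. at 280 nm

        slope, intercept = np.polyfit(conc, absorbance, 1)
        a_unknown = 0.53
        print(f"lignin: {(a_unknown - intercept) / slope:.1f} mg/L")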

  10. R^4 couplings in M- and type II theories on Calabi-Yau spaces

    NASA Astrophysics Data System (ADS)

    Antoniadis, I.; Ferrara, S.; Minasian, R.; Narain, K. S.

    1997-02-01

    We discuss several implications of R^4 couplings in M-theory when compactified on Calabi-Yau (CY) manifolds. In particular, these couplings can be predicted by supersymmetry from the mixed gauge-gravitational Chern-Simons couplings in five dimensions and are related to the one-loop holomorphic anomaly in four-dimensional N = 2 theories. We find a new contribution to the Einstein term in five dimensions proportional to the Euler number of the internal CY threefold, which corresponds to a one-loop correction of the hypermultiplet geometry. This correction is reproduced by a direct computation in type II string theories. Finally, we discuss a universal non-perturbative correction to the type IIB hyper-metric.

  11. Bianchi Type VI1 Viscous Fluid Cosmological Model in Wesson's Theory of Gravitation

    NASA Astrophysics Data System (ADS)

    Khadekar, G. S.; Avachar, G. R.

    2007-03-01

    Field equations of the scale invariant theory of gravitation proposed by Wesson [1, 2] are obtained in the presence of a viscous fluid for a Bianchi type VIh space-time with a time-dependent gauge function (Dirac gauge). It is found that a Bianchi type VIh (h = 1) space-time with viscous fluid is feasible in this theory, whereas Bianchi type VIh (h = -1, 0) space-times are not, even in the presence of viscosity. For the feasible case, by assuming a relation connecting viscosity and the metric coefficient, we have obtained a nonsingular radiating model. We have discussed some physical and kinematical properties of the models.

  12. Resource Theory of Superposition

    NASA Astrophysics Data System (ADS)

    Theurer, T.; Killoran, N.; Egloff, D.; Plenio, M. B.

    2017-12-01

    The superposition principle lies at the heart of many nonclassical properties of quantum mechanics. Motivated by this, we introduce a rigorous resource theory framework for the quantification of superposition of a finite number of linearly independent states. This theory is a generalization of resource theories of coherence. We determine the general structure of operations which do not create superposition, find a fundamental connection to unambiguous state discrimination, and propose several quantitative superposition measures. Using this theory, we show that trace-decreasing operations can be completed for free which, when specialized to the theory of coherence, resolves an outstanding open question and is used to address the free probabilistic transformation between pure states. Finally, we prove that linearly independent superposition is a necessary and sufficient condition for the faithful creation of entanglement in discrete settings, establishing a strong structural connection between our theory of superposition and entanglement theory.

  13. Testing Components of a Self-Management Theory in Adolescents With Type 1 Diabetes Mellitus.

    PubMed

    Verchota, Gwen; Sawin, Kathleen J

    The role of self-management in adolescents with type 1 diabetes mellitus is not well understood. The purpose of the research was to examine the relationship of key context and process variables from the individual and family self-management theory to proximal (self-management behaviors) and distal (hemoglobin A1c and diabetes-specific health-related quality of life) outcomes in adolescents with type 1 diabetes. A correlational, cross-sectional study was conducted to identify factors contributing to outcomes in adolescents with type 1 diabetes and examine potential relationships between context, process, and outcome variables delineated in the individual and family self-management theory. Participants were 103 adolescent-parent dyads (adolescents ages 12-17) with type 1 diabetes from a Midwest outpatient diabetes clinic. The dyads completed a self-report survey including instruments intended to measure context, process, and outcome variables from the individual and family self-management theory. Using hierarchical multiple regression, context (depressive symptoms) and process (communication) variables explained 37% of the variance in self-management behaviors. Regimen complexity, the only significant predictor, explained 11% of the variance in hemoglobin A1c; neither process variables nor self-management behaviors were significant. For the diabetes-specific health-related quality of life outcome, context (regimen complexity and depressive symptoms) explained 26% of the variance at step 1; an additional 9% of the variance was explained when process (self-efficacy and communication) variables were added at step 2; and 52% of the variance was explained when self-management behaviors were added at step 3. In the final model, three variables were significant predictors: depressive symptoms, self-efficacy, and self-management behaviors. The individual and family self-management theory can serve as a cogent theory for understanding key concepts, processes, and outcomes essential to self…

  14. Biased ligand quantification in drug discovery: from theory to high throughput screening to identify new biased μ opioid receptor agonists

    PubMed Central

    Winpenny, David; Clark, Mellissa

    2016-01-01

    Background and Purpose Biased GPCR ligands are able to engage with their target receptor in a manner that preferentially activates distinct downstream signalling and offers potential for next generation therapeutics. However, accurate quantification of ligand bias in vitro is complex, and current best practice is not amenable to testing large numbers of compounds. We have therefore sought to apply ligand bias theory to an industrial-scale screening campaign for the identification of new biased μ receptor agonists. Experimental Approach μ receptor assays with appropriate dynamic range were developed for both Gαi-dependent signalling and β-arrestin2 recruitment. Δlog(Emax/EC50) analysis was validated as an alternative to the operational model of agonism for calculating pathway bias towards Gαi-dependent signalling. The analysis was applied to a high throughput screen to characterize the prevalence and nature of pathway bias among a diverse set of compounds with μ receptor agonist activity. Key Results A high throughput screening campaign yielded 440 hits with greater than 10-fold bias relative to DAMGO. To validate these results, we quantified the pathway bias of a subset of hits using the operational model of agonism. The high degree of correlation across these biased hits confirmed that Δlog(Emax/EC50) is a suitable method for identifying genuine biased ligands within a large collection of diverse compounds. Conclusions and Implications This work demonstrates that using Δlog(Emax/EC50), drug discovery can apply the concept of biased ligand quantification on a large scale and accelerate the deliberate discovery of novel therapeutics acting via this complex pharmacology. PMID:26791140
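
    The Δlog(Emax/EC50) calculation itself is compact; a sketch with illustrative potency and efficacy values (not screening data):

        import math

        def log_emax_ec50(emax, ec50_nM):
            return math.log10(emax / ec50_nM)

        def bias_factor(test_g, test_arr, ref_g, ref_arr):
            """10**DeltaDelta-log(Emax/EC50), G-protein vs beta-arrestin2,
            each pathway normalized to the reference agonist (DAMGO)."""
            d_g = log_emax_ec50(*test_g) - log_emax_ec50(*ref_g)
            d_arr = log_emax_ec50(*test_arr) - log_emax_ec50(*ref_arr)
            return 10 ** (d_g - d_arr)   # >1: biased toward Gi signalling

        # (Emax %, EC50 nM) for a test compound and DAMGO in each assay
        print(round(bias_factor((95, 12.0), (40, 300.0),
                                (100, 15.0), (100, 30.0)), 1))  # ~29.7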

  15. Predictive Game Theory

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game: there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy, one must use a loss function external to the game's players. The framework provides a quantification of any strategy's rationality, and it can be proven that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  16. pyQms enables universal and accurate quantification of mass spectrometry data.

    PubMed

    Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian

    2017-10-01

    Quantitative mass spectrometry (MS) is a key technique in many research areas, including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software package for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching, which offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well in accurately quantifying partially labeled proteomes at large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  17. Quantification of three-dimensional cell-mediated collagen remodeling using graph theory.

    PubMed

    Bilgin, Cemal Cagatay; Lund, Amanda W; Can, Ali; Plopper, George E; Yener, Bülent

    2010-09-30

    Cell cooperation is a critical event during tissue development. We present the first precise metrics to quantify the interaction between mesenchymal stem cells (MSCs) and the extracellular matrix (ECM). In particular, we describe the cooperative collagen alignment process with respect to the spatio-temporal organization and function of mesenchymal stem cells in three dimensions. We defined two precise metrics, the Collagen Alignment Index and the Cell Dissatisfaction Level, for quantitatively tracking type I collagen and fibrillogenesis remodeling by mesenchymal stem cells over time. Computation of these metrics was based on graph theory and vector calculus. The cells and their three-dimensional type I collagen microenvironment were modeled by three-dimensional cell-graphs, and collagen fiber organization was calculated from gradient vectors. With the enhancement of mesenchymal stem cell differentiation, acceleration through different phases was quantitatively demonstrated. The phases were clustered in a statistically significant manner based on collagen organization, with late phases of remodeling by untreated cells clustering strongly with early phases of remodeling by differentiating cells. The experiments were repeated three times to conclude that the metrics could successfully identify critical phases of collagen remodeling that were dependent upon cooperativity within the cell population. Definition of early metrics that are able to predict long-term functionality by linking engineered tissue structure to function is an important step toward optimizing biomaterials for the purposes of regenerative medicine.
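
    The abstract does not give the Collagen Alignment Index in closed form; as a generic stand-in, an orientation order parameter over fiber gradient directions (axial data, hence doubled angles) captures the same idea:

        import numpy as np

        def alignment_index(angles_rad):
            """1.0 = perfectly aligned fibers, ~0.0 = isotropic."""
            doubled = 2.0 * np.asarray(angles_rad)   # fibers lack head/tail
            return float(np.hypot(np.cos(doubled).mean(),
                                  np.sin(doubled).mean()))

        rng = np.random.default_rng(1)
        print(alignment_index(rng.normal(0.3, 0.05, 500)))   # ~1.0, aligned
        print(alignment_index(rng.uniform(0, np.pi, 500)))   # ~0.0, isotropic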

  18. A new approach for the quantification of synchrony of multivariate non-stationary psychophysiological variables during emotion eliciting stimuli

    PubMed Central

    Kelava, Augustin; Muma, Michael; Deja, Marlene; Dagdagan, Jack Y.; Zoubir, Abdelhak M.

    2015-01-01

    Emotion eliciting situations are accompanied by changes of multiple variables associated with subjective, physiological and behavioral responses. The quantification of the overall simultaneous synchrony of psychophysiological reactions plays a major role in emotion theories and has received increased attention in recent years. From a psychometric perspective, the reactions represent multivariate non-stationary intra-individual time series. In this paper, a new time-frequency based latent variable approach for the quantification of the synchrony of the responses is presented. The approach is applied to empirical data, collected during an emotion eliciting situation. The results are compared with a complementary inter-individual approach of Hsieh et al. (2011). Finally, the proposed approach is discussed in the context of emotion theories, and possible future applications and limitations are provided. PMID:25653624

  19. Quantification of complex modular architecture in plants.

    PubMed

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  20. Adjoint-Based Uncertainty Quantification with MCNP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
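
    In its simplest form, sensitivity-based propagation of nuclear-data uncertainty follows the standard "sandwich" rule (a generic statement, not necessarily the exact expressions derived in the report):

        \[
          \operatorname{var}(k) \;\approx\; \mathbf{S}^{\mathsf{T}} \mathbf{C}\,\mathbf{S},
          \qquad
          S_i = \frac{\partial k}{\partial \sigma_i},
        \]

    where S collects the sensitivities of the figure of merit k to the nuclear data σ_i and C is their covariance matrix.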

  1. Contiguous triple spinal dysraphism associated with Chiari malformation Type II and hydrocephalus: an embryological conundrum between the unified theory of Pang and the unified theory of McLone.

    PubMed

    Dhandapani, Sivashanmugam; Srinivasan, Anirudh

    2016-01-01

    Triple spinal dysraphism is extremely rare. There are published reports of multiple discrete neural tube defects with intervening normal segments that are explained by the multisite closure theory of primary neurulation, having an association with Chiari malformation Type II consistent with the unified theory of McLone. The authors report on a 1-year-old child with contiguous myelomeningocele and lipomyelomeningocele centered on Type I split cord malformation with Chiari malformation Type II and hydrocephalus. This composite anomaly is probably due to select abnormalities of the neurenteric canal during gastrulation, with a contiguous cascading impact on both dysjunction of the neural tube and closure of the neuropore, resulting in a small posterior fossa, probably bringing the unified theory of McLone closer to the unified theory of Pang.

  2. Uncertainty quantification and propagation in nuclear density functional theory

    DOE PAGES

    Schunck, N.; McDonnell, J. D.; Higdon, D.; ...

    2015-12-23

    Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While ongoing efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this study, we review recent efforts to quantify the related uncertainties and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties and Bayesian inference methods. Illustrative examples are taken from the literature.
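
    As a toy illustration of the Bayesian calibration step reviewed here, a one-parameter grid posterior under a Gaussian likelihood and flat prior (purely schematic; real functional calibrations involve many parameters and expensive model evaluations):

        import numpy as np

        def model(theta):
            return 2.0 * theta          # stand-in for a computed observable

        theta = np.linspace(0.0, 2.0, 401)
        data, sigma = 1.9, 0.1          # pseudo-measurement and its error

        log_like = -0.5 * ((data - model(theta)) / sigma) ** 2
        post = np.exp(log_like - log_like.max())
        post /= np.trapz(post, theta)

        mean = np.trapz(theta * post, theta)
        std = np.sqrt(np.trapz((theta - mean) ** 2 * post, theta))
        print(f"theta = {mean:.3f} +/- {std:.3f}")   # ~0.950 +/- 0.050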

  3. Differential Models for B-Type Open-Closed Topological Landau-Ginzburg Theories

    NASA Astrophysics Data System (ADS)

    Babalic, Elena Mirela; Doryn, Dmitry; Lazaroiu, Calin Iuliu; Tavakol, Mehdi

    2018-05-01

    We propose a family of differential models for B-type open-closed topological Landau-Ginzburg theories defined by a pair (X,W), where X is any non-compact Calabi-Yau manifold and W is any holomorphic complex-valued function defined on X whose critical set is compact. The models are constructed at cochain level using smooth data, including the twisted Dolbeault algebra of polyvector-valued forms and a twisted Dolbeault category of holomorphic factorizations of W. We give explicit proposals for cochain level versions of the bulk and boundary traces and for the bulk-boundary and boundary-bulk maps of the Landau-Ginzburg theory. We prove that most of the axioms of an open-closed TFT (topological field theory) are satisfied on cohomology and conjecture that the remaining two axioms (namely non-degeneracy of bulk and boundary traces and the topological Cardy constraint) are also satisfied.

  4. Antibody-free PRISM-SRM for multiplexed protein quantification: Is this the new competition for immunoassays in bioanalysis?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Qian, Weijun

    2013-02-01

    Highly sensitive technologies for the multiplexed quantification of large numbers of candidate proteins will play an increasingly important role in clinical biomarker discovery, systems biology, and general biomedical research. Herein we introduce the new PRISM-SRM technology, a highly sensitive multiplexed approach capable of simultaneously quantifying many low-abundance proteins without the need for affinity reagents. We also highlight the versatility of antibody-free PRISM-SRM for quantifying various types of targets, including protein isoforms, protein modifications, and metabolites, thus offering new competition with immunoassays.

  5. Nilpotent symmetries and Curci-Ferrari-type restrictions in 2D non-Abelian gauge theory: Superfield approach

    NASA Astrophysics Data System (ADS)

    Srinivas, N.; Malik, R. P.

    2017-11-01

    We derive the off-shell nilpotent symmetries of the two (1 + 1)-dimensional (2D) non-Abelian 1-form gauge theory by using the theoretical techniques of the geometrical superfield approach to the Becchi-Rouet-Stora-Tyutin (BRST) formalism. For this purpose, we exploit the augmented version of the superfield approach (AVSA) and derive theoretically useful nilpotent (anti-)BRST and (anti-)co-BRST symmetries and Curci-Ferrari (CF)-type restrictions for the self-interacting 2D non-Abelian 1-form gauge theory (where there is no interaction with matter fields). The derivation of the (anti-)co-BRST symmetries and all possible CF-type restrictions are completely novel results within the framework of the AVSA to the BRST formalism, where the ordinary 2D non-Abelian theory is generalized onto an appropriately chosen (2, 2)-dimensional supermanifold. The latter is parametrized by the superspace coordinates Z^M = (x^μ, θ, θ̄), where x^μ (with μ = 0, 1) are the bosonic coordinates and the pair of Grassmannian variables (θ, θ̄) obeys the relationships θ² = θ̄² = 0, θθ̄ + θ̄θ = 0. The topological nature of our 2D theory allows the existence of a tower of CF-type restrictions.

  6. Traditional Chinese medicine: potential approaches from modern dynamical complexity theories.

    PubMed

    Ma, Yan; Zhou, Kehua; Fan, Jing; Sun, Shuchen

    2016-03-01

    Despite the widespread use of traditional Chinese medicine (TCM) in clinical settings, proving its effectiveness via scientific trials is still a challenge. TCM views the human body as a complex dynamical system, and focuses on the balance of the human body, both internally and with its external environment. Such fundamental concepts require investigation using system-level quantification approaches, which are beyond conventional reductionism. Only methods that quantify dynamical complexity can bring new insights into the evaluation of TCM. In a previous article, we briefly introduced the potential value of Multiscale Entropy (MSE) analysis in TCM. This article aims to explain the existing challenges in TCM quantification, to introduce the consistency between dynamical complexity theories and TCM theories, and to inspire future system-level research on health and disease.

  7. Quantification of Noise Sources in EMI Surveys

    DTIC Science & Technology

    2012-04-09

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/6110--12-9400, ESTCP MR-0508 Final Guidance Report. Authors: … Barrow, Jonathan T. Miller, and Thomas H. Bell (Naval Research Laboratory, Code 6110).

  8. Cognitive Load Theory: How Many Types of Load Does It Really Need?

    ERIC Educational Resources Information Center

    Kalyuga, Slava

    2011-01-01

    Cognitive load theory has been traditionally described as involving three separate and additive types of load. Germane load is considered as a learning-relevant load complementing extraneous and intrinsic load. This article argues that, in its traditional treatment, germane load is essentially indistinguishable from intrinsic load, and therefore…

  9. BIonic system: Extraction of Lovelock gravity from a Born-Infeld-type theory

    NASA Astrophysics Data System (ADS)

    Naimi, Yaghoob; Sepehri, Alireza; Ghaffary, Tooraj; Ghaforyan, Hossein; Ebrahimzadeh, Majid

    It was shown that both Lovelock gravity and Born-Infeld (BI) electrodynamics can be obtained from the low-energy effective limit of string theory. Motivated by this common origin of the gauge and gravity theories, we look for a close relation between them. In this research, we start from the Lagrangian of a BI-type nonlinear electrodynamics with an exponential form to extract the action of Lovelock gravity. We investigate the origin of Lovelock gravity in a system of branes which are connected with each other by different wormholes through a BIonic system. These wormholes are produced by the nonlinear electrodynamics that emerges on the interacting branes. As the branes approach each other, the wormholes dissolve into the branes and Lovelock gravity is generated. Also, the throats of some wormholes become smaller than their horizons and they transit to black holes. Generalizing the calculations to M-theory, it is found that by compactifying Mp-branes, Lovelock gravity changes to nonlinear electrodynamics, and thus both have the same origin. This result is consistent with the prediction of the BIonic model in string theory.

  10. Quantification of confocal images of biofilms grown on irregular surfaces

    PubMed Central

    Ross, Stacy Sommerfeld; Tu, Mai Han; Falsetta, Megan L.; Ketterer, Margaret R.; Kiedrowski, Megan R.; Horswill, Alexander R.; Apicella, Michael A.; Reinhardt, Joseph M.; Fiegel, Jennifer

    2014-01-01

    Bacterial biofilms grow on many types of surfaces, including flat surfaces such as glass and metal and irregular surfaces such as rocks, biological tissues and polymers. While laser scanning confocal microscopy can provide high-resolution images of biofilms grown on any surface, quantification of biofilm-associated bacteria is currently limited to bacteria grown on flat surfaces. This can limit researchers studying irregular surfaces to qualitative analysis or quantification of only the total bacteria in an image. In this work, we introduce a new algorithm called modified connected volume filtration (MCVF) to quantify bacteria grown on top of an irregular surface that is fluorescently labeled or reflective. Using the MCVF algorithm, two new quantification parameters are introduced. The modified substratum coverage parameter enables quantification of the connected-biofilm bacteria on top of the surface and on the imaging substratum. The utility of MCVF and the modified substratum coverage parameter were shown with Pseudomonas aeruginosa and Staphylococcus aureus biofilms grown on human airway epithelial cells. A second parameter, the percent association, provides quantified data on the colocalization of the bacteria with a labeled component, including bacteria within a labeled tissue. The utility of quantifying the bacteria associated with the cell cytoplasm was demonstrated with Neisseria gonorrhoeae biofilms grown on cervical epithelial cells. This algorithm provides more flexibility and quantitative ability to researchers studying biofilms grown on a variety of irregular substrata. PMID:24632515

  11. Scale relativity theory and integrative systems biology: 2. Macroscopic quantum-type mechanics.

    PubMed

    Nottale, Laurent; Auffray, Charles

    2008-05-01

    In these two companion papers, we provide an overview and a brief history of the multiple roots, current developments and recent advances of integrative systems biology and identify multiscale integration as its grand challenge. Then we introduce the fundamental principles and the successive steps that have been followed in the construction of the scale relativity theory, which aims at describing the effects of a non-differentiable and fractal (i.e., explicitly scale dependent) geometry of space-time. The first paper of this series was devoted, in this new framework, to the construction from first principles of scale laws of increasing complexity, and to the discussion of some tentative applications of these laws to biological systems. In this second review and perspective paper, we describe the effects induced by the internal fractal structures of trajectories on motion in standard space. Their main consequence is the transformation of classical dynamics into a generalized, quantum-like self-organized dynamics. A Schrödinger-type equation is derived as an integral of the geodesic equation in a fractal space. We then indicate how gauge fields can be constructed from a geometric re-interpretation of gauge transformations as scale transformations in fractal space-time. Finally, we introduce a new tentative development of the theory, in which quantum laws would hold also in scale space, introducing complexergy as a measure of organizational complexity. Initial possible applications of this extended framework to the processes of morphogenesis and the emergence of prokaryotic and eukaryotic cellular structures are discussed. Having founded elements of the evolutionary, developmental, biochemical and cellular theories on the first principles of scale relativity theory, we introduce proposals for the construction of an integrative theory of life and for the design and implementation of novel macroscopic quantum-type experiments and devices, and discuss their potential

  12. Cliophysics: Socio-Political Reliability Theory, Polity Duration and African Political (In)stabilities

    PubMed Central

    Cherif, Alhaji; Barley, Kamal

    2010-01-01

    Quantification of historical sociological processes has recently gained attention among theoreticians in the effort to provide a solid theoretical understanding of the behaviors and regularities present in socio-political dynamics. Here we present a reliability theory of polity processes with emphasis on the individual political dynamics of African countries. We found that the structural properties of polity failure rates successfully capture the risk of political vulnerability and instabilities, in which …, …, …, and … of the countries with monotonically increasing, unimodal, U-shaped and monotonically decreasing polity failure rates, respectively, have high levels of state fragility indices. The quasi-U-shaped relationship between average polity duration and regime types corroborates historical precedents and explains the stability of autocracies and democracies. PMID:21206911

  13. Type II string theory on Calabi-Yau manifolds with torsion and non-Abelian discrete gauge symmetries

    DOE PAGES

    Braun, Volker; Cvetič, Mirjam; Donagi, Ron; ...

    2017-07-26

    Here, we provide the first explicit example of a Type IIB string theory compactification on a globally defined Calabi-Yau threefold with torsion which results in a four-dimensional effective theory with a non-Abelian discrete gauge symmetry. Our example is based on a particular Calabi-Yau manifold, the quotient of a product of three elliptic curves by a fixed-point-free action of Z2 × Z2. Its cohomology contains torsion classes in various degrees. The main technical novelty is in determining the multiplicative structure of the (torsion part of) the cohomology ring, and in particular showing that the cup product of second cohomology torsion elements goes non-trivially to the fourth cohomology. This specifies a non-Abelian, Heisenberg-type discrete symmetry group of the four-dimensional theory.

  15. Quantification of brain lipids by FTIR spectroscopy and partial least squares regression

    NASA Astrophysics Data System (ADS)

    Dreissig, Isabell; Machill, Susanne; Salzer, Reiner; Krafft, Christoph

    2009-01-01

    Brain tissue is characterized by a high lipid content, which decreases, and whose composition changes, during the transformation from normal brain tissue to tumor. Therefore, the analysis of brain lipids might complement the existing diagnostic tools to determine the tumor type and tumor grade. The objective of this work is to extract lipids from the gray matter and white matter of porcine brain tissue, record infrared (IR) spectra of these extracts, and develop a quantification model for the main lipids based on partial least squares (PLS) regression. IR spectra of the pure lipids cholesterol, cholesterol ester, phosphatidic acid, phosphatidylcholine, phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, sphingomyelin, galactocerebroside and sulfatide were used as references. Two lipid mixtures were prepared for training and validation of the quantification model. The composition of lipid extracts predicted by the PLS regression of IR spectra was compared with lipid quantification by thin layer chromatography.
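
    The PLS step maps measured spectra to known reference compositions. A self-contained sketch with synthetic stand-in spectra (real X and Y would be the extract spectra and reference lipid fractions described above):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        n_mix, n_wavenumbers, n_lipids = 40, 600, 10
        pure = rng.random((n_lipids, n_wavenumbers))      # reference spectra
        Y = rng.dirichlet(np.ones(n_lipids), size=n_mix)  # known fractions
        X = Y @ pure + rng.normal(0, 0.005, (n_mix, n_wavenumbers))

        pls = PLSRegression(n_components=8).fit(X, Y)
        print("R^2 on calibration mixtures:", round(pls.score(X, Y), 3))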

  16. Resonant modal group theory of membrane-type acoustical metamaterials for low-frequency sound attenuation

    NASA Astrophysics Data System (ADS)

    Ma, Fuyin; Wu, Jiu Hui; Huang, Meng

    2015-09-01

    In order to overcome the influence of structural resonance on continuous structures and obtain a lightweight thin-layer structure which can effectively isolate low-frequency noise, an elastic membrane structure was proposed. In the low-frequency range below 500 Hz, the sound transmission loss (STL) of this membrane-type structure is considerably higher than that of EVA (ethylene-vinyl acetate copolymer), the current sound insulation material used in vehicles, so it is possible to replace EVA with the membrane-type metamaterial structure in practical engineering. Based on the band structure and modal shapes, as well as sound transmission simulations, the sound insulation mechanism of the designed membrane-type acoustic metamaterial was analyzed from a new perspective, which was validated experimentally. It is suggested that in the frequency range above 200 Hz for this membrane-mass type structure, the sound insulation effect is principally due not to the low-level locally resonant mode of the mass block, but to the continuous vertical resonant modes of the localized membrane. Based on this physical property, a resonant modal group theory is proposed in this paper. In addition, the sound insulation mechanisms of the membrane-type structure and the thin-plate structure are unified by the membrane/plate resonant theory.

  17. RNA-Skim: a rapid method for RNA-Seq quantification at transcript level

    PubMed Central

    Zhang, Zhaojun; Wang, Wei

    2014-01-01

    Motivation: The RNA-Seq technique has been demonstrated as a revolutionary means for exploring transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification has proven to be an efficient alternative to the Microarray technique in gene expression studies, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignment of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has been recently proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable to any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents a >100× speedup over the state-of-the-art alignment-based methods, while delivering …
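
    The core sig-mer idea fits in a few lines: keep only the k-mers unique to one transcript cluster and count reads against those. A toy sketch (RNA-Skim's real pipeline adds similarity-based clustering and an EM abundance estimator on top):

        from collections import Counter

        def kmers(seq, k):
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        clusters = {"c1": ["ACGTACGTGA", "ACGTACGTCC"],
                    "c2": ["TTGCAATGCA"]}
        k = 5
        ckmers = {c: set().union(*(kmers(t, k) for t in ts))
                  for c, ts in clusters.items()}

        # sig-mers of a cluster: k-mers that occur in no other cluster
        sig = {c: ks - set().union(*(v for o, v in ckmers.items() if o != c))
               for c, ks in ckmers.items()}

        read = "ACGTACGTG"
        hits = Counter(c for c, ks in sig.items()
                       for i in range(len(read) - k + 1)
                       if read[i:i + k] in ks)
        print(hits)   # the read is assigned to cluster c1 via its sig-mers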

  18. Quantification of lithium at ppm level in geological samples using nuclear reaction analysis.

    PubMed

    De La Rosa, Nathaly; Kristiansson, Per; Nilsson, E J Charlotta; Ros, Linus; Pallon, Jan; Skogby, Henrik

    2018-01-01

    The proton-induced (p,α) reaction is one type of nuclear reaction analysis (NRA) suitable especially for light element quantification. In the case of the lithium quantification presented in this work, accelerated protons with an energy of about 850 keV were used to induce the ^7Li(p,α)^4He reaction in standard reference and geological samples such as tourmaline and other Li-minerals. It is shown that this technique for lithium quantification allowed for the measurement of concentrations down to below one ppm. The possibility of relating the lithium content to the boron content in a single analysis was also demonstrated using tourmaline samples, both in absolute concentration and in lateral distribution. In addition, particle-induced X-ray emission (PIXE) was utilized as a complementary IBA technique for simultaneous mapping of elements heavier than sodium.

  19. Statistical image quantification toward optimal scan fusion and change quantification

    NASA Astrophysics Data System (ADS)

    Potesil, Vaclav; Zhou, Xiang Sean

    2007-03-01

    Recent advances in imaging technology have brought new challenges and opportunities for the automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical errors. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, study their interactions, and introduce a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error, and there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we will achieve a lower variance than naïve averaging. Simulated experiments are used to validate theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
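
    At its heart, the optimal linear fusion argument is inverse-variance weighting. Assuming unbiased per-scan estimates x_i with variances σ_i² (a minimal form, not the paper's full derivation):

        \[
          \hat{x} = \sum_i w_i x_i,
          \qquad
          w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2},
          \qquad
          \operatorname{var}(\hat{x}) = \Bigl(\sum_i 1/\sigma_i^2\Bigr)^{-1} \le \min_i \sigma_i^2,
        \]

    so a scan whose voxel geometry is better matched to the lesion elongation has a smaller σ_i and correctly receives a larger weight than under naive averaging.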

  20. Preschoolers' Generation of Different Types of Counterfactual Statements and Theory of Mind Understanding

    ERIC Educational Resources Information Center

    Guajardo, Nicole R.; Turley-Ames, Kandi Jo

    2004-01-01

    Two studies examined associations between theory of mind performance and counterfactual thinking using both antecedent and consequent counterfactual tasks. Moreover, the studies examined children's abilities to generate different types of counterfactual statements in terms of direction and structure. Participants were 3-, 4-, and 5-year-old…

  1. Reconciling Experiment and Theory in the Use of Aryl-Extended Calix[4]pyrrole Receptors for the Experimental Quantification of Chloride–π Interactions in Solution

    PubMed Central

    Bauzá, Antonio; Quiñonero, David; Frontera, Antonio; Ballester, Pablo

    2015-01-01

    In this manuscript we consider from a theoretical point of view the recently reported experimental quantification of anion–π interactions (the attractive force between electron deficient aromatic rings and anions) in solution using aryl extended calix[4]pyrrole receptors as model systems. Experimentally, two series of calix[4]pyrrole receptors functionalized, respectively, with two and four aryl rings at the meso positions, were used to assess the strength of chloride–π interactions in acetonitrile solution. As a result of these studies the contribution of each individual chloride–π interaction was quantified to be very small (<1 kcal/mol). This result is in contrast with the values derived from most theoretical calculations. Herein we report a theoretical study using high-level density functional theory (DFT) calculations that provides a plausible explanation for the observed disagreement between theory and experiment. The study reveals the existence of molecular interactions between solvent molecules and the aromatic walls of the receptors that strongly modulate the chloride–π interaction. In addition, the obtained theoretical results also suggest that the chloride-calix[4]pyrrole complex used as reference to dissect experimentally the contribution of the chloride–π interactions to the total binding energy for both the two and four-wall aryl-extended calix[4]pyrrole model systems is probably not ideal. PMID:25913375

  2. Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.

    PubMed

    Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew

    2018-02-01

    Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, a new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The results obtained suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.
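
    For illustration only, a minimal sketch of how 2-plex reporter-ion quantification reduces to arithmetic once the two reporter intensities have been extracted from the tandem MS spectrum; the intensities and the normalisation factor are assumed placeholders, not data from the study.

        def relative_abundance(reporter_1, reporter_2, norm_factor=1.0):
            # The ratio of the two reporter-ion intensities estimates the relative
            # peptide abundance between the two labelled samples; norm_factor
            # corrects for labelling efficiency / loading bias (assumed known).
            return (reporter_1 / reporter_2) * norm_factor

        print(relative_abundance(1.8e5, 9.0e4))   # -> 2.0: channel 1 twice as abundant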

  3. Type II superstring field theory: geometric approach and operadic description

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav; Münster, Korbinian

    2013-04-01

    We outline the construction of type II superstring field theory leading to a geometric and algebraic BV master equation, analogous to Zwiebach's construction for the bosonic string. The construction uses the small Hilbert space. Elementary vertices of the non-polynomial action are described with the help of a properly formulated minimal area problem. They give rise to an infinite tower of superstring field products defining an N = 1 generalization of a loop homotopy Lie algebra, the genus zero part generalizing a homotopy Lie algebra. Finally, we give an operadic interpretation of the construction.

  4. Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone

    PubMed Central

    Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.

    2015-01-01

    Confocal microscopy is the method of choice for analyzing the localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool suitable for testing this hypothesis for both hematopoietic and stromal cells is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636
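
    The random-positioning test described above can be sketched as a simple Monte Carlo permutation: re-draw the positions of one cell type at random many times and compare the observed contact fraction against this null distribution. The sketch below is a schematic stand-in for the protocol's simulation tool; the coordinates, contact radius, and cell counts are invented.

        import numpy as np

        rng = np.random.default_rng(1)

        def contact_fraction(pos_a, pos_b, radius):
            # fraction of type-A cells with at least one type-B cell within 'radius'
            d = np.linalg.norm(pos_a[:, None, :] - pos_b[None, :, :], axis=2)
            return np.mean((d < radius).any(axis=1))

        obs_a = rng.uniform(0, 500, size=(40, 2))     # placeholder segmented positions (um)
        obs_b = rng.uniform(0, 500, size=(200, 2))
        observed = contact_fraction(obs_a, obs_b, radius=15.0)

        # null distribution: random repositioning of type-A cells in the same field
        null = np.array([contact_fraction(rng.uniform(0, 500, size=(40, 2)), obs_b, 15.0)
                         for _ in range(1000)])
        print(observed, np.mean(null >= observed))    # empirical p-value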

  5. Fluorescent quantification of melanin.

    PubMed

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is typically performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Fluorescence spectroscopy is therefore the better method for melanin quantification, as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. Impaired Theory of Mind and psychosocial functioning among pediatric patients with Type I versus Type II bipolar disorder.

    PubMed

    Schenkel, Lindsay S; Chamberlain, Todd F; Towne, Terra L

    2014-03-30

    Deficits in Theory of Mind (ToM) have been documented among pediatric patients with Bipolar Disorder (BD). However, fewer studies have directly examined differences between type I and type II patients and whether or not ToM deficits are related to psychosocial difficulties. Therefore, the aim of this study was to compare type I versus type II pediatric bipolar patients and matched Healthy Controls (HC) on ToM and interpersonal functioning tasks. All participants completed the Revised Mind in the Eyes Task (MET), the Cognitive and Emotional Perspective Taking Task (CEPTT), and the Index of Peer Relations (IPR). Type I BD patients reported greater peer difficulties on the IPR compared to HC, and also performed more poorly on the MET and the cognitive condition of the CEPTT, but did not differ significantly on the emotional condition. There were no significant group differences between type II BD patients and HC. More impaired ToM performance was associated with poorer interpersonal functioning. Type I BD patients show deficits in the ability to understand another's mental state, irrespective of emotional valence. Deficits in understanding others' mental states could be an important treatment target for type I pediatric patients with BD. © 2013 Elsevier Ireland Ltd. All rights reserved.

  7. Higher derivatives in Type II and M-theory on Calabi-Yau threefolds

    NASA Astrophysics Data System (ADS)

    Grimm, Thomas W.; Mayer, Kilian; Weissenbacher, Matthias

    2018-02-01

    The four- and five-dimensional effective actions of Calabi-Yau threefold compactifications are derived with a focus on terms involving up to four space-time derivatives. The starting points for these reductions are the ten- and eleven-dimensional supergravity actions supplemented with the known eight-derivative corrections that have been inferred from Type II string amplitudes. The corrected background solutions are determined and the fluctuations of the Kähler structure of the compact space and the form-field background are discussed. It is concluded that the two-derivative effective actions for these fluctuations only take the expected supergravity form if certain additional ten- and eleven-dimensional higher-derivative terms for the form fields are included. The main results on the four-derivative terms include a detailed treatment of higher-derivative gravity coupled to Kähler structure deformations. This is supplemented by a derivation of the vector sector in reductions to five dimensions. While the general result is only given as an expansion in the fluctuations, a complete treatment of the one-Kähler-modulus case is presented for both Type II theories and M-theory.

  8. HPAEC-PAD quantification of Haemophilus influenzae type b polysaccharide in upstream and downstream samples.

    PubMed

    van der Put, Robert M F; de Haan, Alex; van den IJssel, Jan G M; Hamidi, Ahd; Beurret, Michel

    2015-11-27

    Due to the rapidly increasing introduction of Haemophilus influenzae type b (Hib) and other conjugate vaccines worldwide during the last decade, reliable and robust analytical methods are needed for the quantitative monitoring of intermediate samples generated during fermentation (upstream processing, USP) and purification (downstream processing, DSP) of polysaccharide vaccine components. This study describes the quantitative characterization of in-process control (IPC) samples generated during the fermentation and purification of the capsular polysaccharide (CPS), polyribosyl-ribitol-phosphate (PRP), derived from Hib. Reliable quantitative methods are necessary for all stages of production; otherwise, accurate process monitoring and validation are not possible. Prior to the availability of high performance anion exchange chromatography methods, this polysaccharide was predominantly quantified either with immunochemical methods or with the colorimetric orcinol method, which shows interference from fermentation medium components and reagents used during purification. Next to an improved high performance anion exchange chromatography-pulsed amperometric detection (HPAEC-PAD) method, using a modified gradient elution, both the orcinol assay and high performance size exclusion chromatography (HPSEC) analyses were evaluated. For DSP samples, it was found that the correlation between the results obtained by HPAEC-PAD specific quantification of the PRP monomeric repeat unit released by alkaline hydrolysis and those from the orcinol method was high (R² = 0.8762), and that it was lower between HPAEC-PAD and HPSEC results. Additionally, HPSEC analysis of USP samples yielded surprisingly comparable results to those obtained by HPAEC-PAD. In the early part of the fermentation, medium components interfered with the different types of analysis, but quantitative HPSEC data could still be obtained, although lacking the specificity of the HPAEC-PAD method. Thus, the HPAEC

  9. Quantification and Segmentation of Brain Tissues from MR Images: A Probabilistic Neural Network Approach

    PubMed Central

    Wang, Yue; Adalý, Tülay; Kung, Sun-Yuan; Szabo, Zsolt

    2007-01-01

    This paper presents a probabilistic neural network-based technique for unsupervised quantification and segmentation of brain tissues from magnetic resonance images. It is shown that this problem can be solved by distribution learning and relaxation labeling, resulting in an efficient method that may be particularly useful in quantifying and segmenting abnormal brain tissues, where the number of tissue types is unknown and the distributions of tissue types heavily overlap. The new technique uses suitable statistical models for both the pixel and context images and formulates the problem in terms of model-histogram fitting and global consistency labeling. The quantification is achieved by probabilistic self-organizing mixtures and the segmentation by a probabilistic constraint relaxation network. The experimental results show the efficient and robust performance of the new algorithm and that it outperforms conventional classification-based approaches. PMID:18172510
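
    As a schematic of the histogram-fitting idea, the sketch below substitutes a standard EM-fitted Gaussian mixture for the paper's probabilistic self-organizing mixtures: the mixture weights quantify tissue fractions and the per-pixel predictions drive segmentation. The intensities are synthetic and the three-class setup is an assumption for illustration.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(2)
        # synthetic pixel intensities from three overlapping "tissue" classes
        pixels = np.concatenate([rng.normal(60, 8, 3000),
                                 rng.normal(110, 10, 5000),
                                 rng.normal(160, 9, 2000)]).reshape(-1, 1)

        gmm = GaussianMixture(n_components=3, random_state=0).fit(pixels)
        print(np.sort(gmm.means_.ravel()))   # recovered class means
        print(gmm.weights_)                  # class fractions = quantification
        labels = gmm.predict(pixels)         # per-pixel labels = segmentation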

  10. Discovering cell types in flow cytometry data with random matrix theory

    NASA Astrophysics Data System (ADS)

    Shen, Yang; Nussenblatt, Robert; Losert, Wolfgang

    Flow cytometry is a widely used experimental technique in immunology research. During the experiments, peripheral blood mononuclear cells (PBMC) from a single patient, labeled with multiple fluorescent stains that bind to different proteins, are illuminated by a laser. The intensity of each stain on a single cell is recorded and reflects the amount of protein expressed by that cell. The data analysis focuses on identifying specific cell types related to a disease. Different cell types can be identified by the type and amount of protein they express. To date, this has most often been done manually, by labelling a protein as expressed or not while ignoring the amount of expression. The cross-correlation matrix of stain intensities, which contains information on both the proteins expressed and their amounts, has been largely ignored by researchers because it suffers from measurement noise. Here we present an algorithm to identify cell types in flow cytometry data which uses random matrix theory (RMT) to reduce the noise in the cross-correlation matrix. We demonstrate our method using a published flow cytometry data set. Compared with previous analysis techniques, we were able to rediscover relevant cell types in an automatic way.
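
    A common RMT recipe consistent with this description, sketched here on synthetic data rather than the authors' exact pipeline, is to keep only the eigenmodes of the stain cross-correlation matrix that exceed the Marchenko-Pastur noise edge:

        import numpy as np

        rng = np.random.default_rng(3)
        n_cells, n_stains = 5000, 12
        intensities = rng.normal(size=(n_cells, n_stains))   # placeholder stain data
        corr = np.corrcoef(intensities, rowvar=False)

        evals, evecs = np.linalg.eigh(corr)
        lam_max = (1 + np.sqrt(n_stains / n_cells))**2       # Marchenko-Pastur upper edge
        keep = evals > lam_max                               # signal eigenmodes only

        denoised = (evecs[:, keep] * evals[keep]) @ evecs[:, keep].T
        np.fill_diagonal(denoised, 1.0)
        print(int(keep.sum()), "eigenmodes kept above the noise edge")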

  11. On SYM theory and all order bulk singularity structures of BPS strings in type II theory

    NASA Astrophysics Data System (ADS)

    Hatefi, Ehsan

    2018-06-01

    The complete forms of the S-matrix elements of a transverse scalar field, two world-volume gauge fields, and a potential Cn-1 Ramond-Ramond (RR) form field are investigated. In order to find an infinite number of t-, s-, and (t + s + u)-channel bulk singularity structures of this particular mixed open-closed amplitude, we employ conformal field theory techniques, exploring the entire correlation functions and all-order α′ contact interactions to these supersymmetric Yang-Mills (SYM) couplings. Singularity and contact-term comparisons with the other symmetric analyses are also carried out in detail. Various couplings from the pull-back of branes, Myers terms, and several generalized Bianchi identities should be taken into account to reconstruct all-order α′ bulk singularities of type IIB (IIA) superstring theory. Finally, we comment on how to derive, without any ambiguity, all-order α′ contact terms of this S-matrix, which carry the momentum of the RR field in the transverse directions.

  12. Type synthesis for 4-DOF parallel press mechanism using GF set theory

    NASA Astrophysics Data System (ADS)

    He, Jun; Gao, Feng; Meng, Xiangdun; Guo, Weizhong

    2015-07-01

    Parallel mechanisms are used in large-capacity servo presses to avoid the over-constraint of traditional redundant actuation. Current research mainly focuses on performance analysis for specific parallel press mechanisms; the type synthesis and evaluation of parallel press mechanisms, especially four-degrees-of-freedom (DOF) press mechanisms, is seldom studied. Here, the type synthesis of 4-DOF parallel press mechanisms is carried out based on generalized function (GF) set theory. Five design criteria for 4-DOF parallel press mechanisms are first proposed. The general procedure for the type synthesis of parallel press mechanisms is obtained, which includes number synthesis, symmetrical synthesis of constraint GF sets, decomposition of motion GF sets, and design of limbs. Nine combinations of constraint GF sets of 4-DOF parallel press mechanisms, ten combinations of GF sets of active limbs, and eleven combinations of GF sets of passive limbs are synthesized. Thirty-eight kinds of press mechanisms are presented, and different structures of kinematic limbs are then designed. Finally, the geometrical constraint complexity (GCC), kinematic pair complexity (KPC), and type complexity (TC) are proposed to evaluate the press types, and the optimal press type is identified. General methodologies of type synthesis and evaluation for parallel press mechanisms are suggested.

  13. Monte Carlo Modeling-Based Digital Loop-Mediated Isothermal Amplification on a Spiral Chip for Absolute Quantification of Nucleic Acids.

    PubMed

    Xia, Yun; Yan, Shuangqian; Zhang, Xian; Ma, Peng; Du, Wei; Feng, Xiaojun; Liu, Bi-Feng

    2017-03-21

    Digital loop-mediated isothermal amplification (dLAMP) is an attractive approach for the absolute quantification of nucleic acids with high sensitivity and selectivity. Theoretical and numerical analysis of dLAMP provides necessary guidance for the design and analysis of dLAMP devices. In this work, a mathematical model was proposed on the basis of the Monte Carlo method and the theories of Poisson statistics and chemometrics. To examine the established model, we fabricated a spiral chip with 1200 uniform and discrete reaction chambers (9.6 nL) for absolute quantification of pathogenic DNA samples by dLAMP. Under the optimized conditions, dLAMP analysis on the spiral chip realized quantification of nucleic acids spanning over 4 orders of magnitude in concentration, with sensitivity as low as 8.7 × 10⁻² copies/μL in 40 min. The experimental results were consistent with the proposed mathematical model, which could provide useful guidelines for the future development of dLAMP devices.
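
    The Poisson backbone of such digital assays is compact enough to state directly. The sketch below is an illustration, taking only the 9.6 nL chamber volume from the chip description; it converts the positive-chamber fraction into an absolute concentration.

        import numpy as np

        def dlamp_concentration(n_positive, n_total, chamber_nl=9.6):
            # Poisson loading: P(chamber negative) = exp(-lambda), so the mean
            # number of copies per chamber is lambda = -ln(1 - positive fraction).
            p = n_positive / n_total
            lam = -np.log(1.0 - p)
            return lam / (chamber_nl * 1e-3)      # copies per microliter

        print(dlamp_concentration(300, 1200))     # ~30 copies/uL for this invented example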

  14. (2,2) and (0,4) supersymmetric boundary conditions in 3d N = 4 theories and type IIB branes

    NASA Astrophysics Data System (ADS)

    Chung, Hee-Joong; Okazaki, Tadashi

    2017-10-01

    The half-BPS boundary conditions preserving N = (2,2) and N = (0,4) supersymmetry in 3d N = 4 supersymmetric gauge theories are examined. The BPS equations admit decomposition of the bulk supermultiplets into specific boundary supermultiplets of preserved supersymmetry. Nahm-like equations arise in the vector multiplet BPS boundary condition preserving N = (0,4) supersymmetry, and Robin-type boundary conditions appear for the hypermultiplet coupled to the vector multiplet when N = (2,2) supersymmetry is preserved. The half-BPS boundary conditions are realized in the brane configurations of type IIB string theory.

  15. Development of a situation-specific theory for explaining health-related quality of life among older South Korean adults with type 2 diabetes.

    PubMed

    Chang, Sun Ju; Im, Eun-Ok

    2014-01-01

    The purpose of the study was to develop a situation-specific theory for explaining health-related quality of life (QOL) among older South Korean adults with type 2 diabetes. To develop a situation-specific theory, three sources were considered: (a) the conceptual model of health promotion and QOL for people with chronic and disabling conditions (an existing theory related to QOL in patients with chronic diseases); (b) a literature review using multiple databases, including the Cumulative Index for Nursing and Allied Health Literature (CINAHL), PubMed, PsycINFO, and two Korean databases; and (c) findings from our structural equation modeling study on health-related QOL in older South Korean adults with type 2 diabetes. The proposed situation-specific theory is constructed with six major concepts: barriers, resources, perceptual factors, psychosocial factors, health-promoting behaviors, and health-related QOL. The theory also specifies the interrelationships among these concepts. Health care providers and nurses could incorporate the proposed situation-specific theory into the development of diabetes education programs for improving health-related QOL in older South Korean adults with type 2 diabetes.

  16. Processing and domain selection: Quantificational variability effects

    PubMed Central

    Harris, Jesse A.; Clifton, Charles; Frazier, Lyn

    2014-01-01

    Three studies investigated how readers interpret sentences with variable quantificational domains, e.g., The army was mostly in the capital, where mostly may quantify over individuals or parts (Most of the army was in the capital) or over times (The army was in the capital most of the time). It is proposed that a general conceptual economy principle, No Extra Times (Majewski 2006, in preparation), discourages the postulation of potentially unnecessary times, and thus favors the interpretation quantifying over parts. Disambiguating an ambiguously quantified sentence to a quantification over times interpretation was rated as less natural than disambiguating it to a quantification over parts interpretation (Experiment 1). In an interpretation questionnaire, sentences with similar quantificational variability were constructed so that both interpretations of the sentence would require postulating multiple times; this resulted in the elimination of the preference for a quantification over parts interpretation, suggesting the parts preference observed in Experiment 1 is not reducible to a lexical bias of the adverb mostly (Experiment 2). An eye movement recording study showed that, in the absence of prior evidence for multiple times, readers exhibit greater difficulty when reading material that forces a quantification over times interpretation than when reading material that allows a quantification over parts interpretation (Experiment 3). These experiments contribute to understanding readers’ default assumptions about the temporal properties of sentences, which is essential for understanding the selection of a domain for adverbial quantifiers and, more generally, for understanding how situational constraints influence sentence processing. PMID:25328262

  17. Kinetic quantification of plyometric exercise intensity.

    PubMed

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

    Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011-Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phases of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects of exercise type for all outcome variables (p ≤ 0.05) and an interaction between gender and exercise type for peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program.
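
    Some of these outcome variables reduce to simple force-plate arithmetic. As a hedged illustration, and not the authors' processing code, jump height can be estimated from flight time via h = g·t²/8, and peak power from a published regression such as the Sayers equation; the inputs below are invented.

        def jump_height_from_flight_time(t_flight, g=9.81):
            # projectile motion with equal takeoff/landing heights: h = g * t^2 / 8
            return g * t_flight**2 / 8.0

        def peak_power_sayers(jump_height_cm, body_mass_kg):
            # Sayers et al. (1999) regression estimate of peak power in watts
            return 60.7 * jump_height_cm + 45.3 * body_mass_kg - 2055.0

        h = jump_height_from_flight_time(0.55)       # ~0.37 m for a 0.55 s flight
        print(h, peak_power_sayers(h * 100, 75.0))   # ~3600 W for a 75 kg subject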

  18. Effectiveness of training on preventative nutritional behaviors for type-2 diabetes among the female adolescents: Examination of theory of planned behavior.

    PubMed

    Maleki, Farzaneh; Hosseini Nodeh, Zahra; Rahnavard, Zahra; Arab, Masoume

    2016-01-01

    Since type-2 diabetes is the most common chronic disease among Iranian female adolescents, we applied the theory of planned behavior to examine the effect of training on the intention to adopt preventive nutritional behaviors for type-2 diabetes among female adolescents. In this experimental study, 200 girls aged 11-14 years from 8 schools in Tehran (100 each in the intervention and control groups) were recruited using a two-stage cluster sampling method. For the intervention group, an educational program was designed based on the theory of planned behavior and presented in 6 workshop sessions to prevent type-2 diabetes. The data were collected before and two months after the workshops using a valid and reliable (α=0.72 and r=0.80) author-made questionnaire based on Ajzen's TPB questionnaire manual. The data were analyzed using the t-test, chi-square test, and analysis of covariance. Findings indicate that the two groups were homogeneous regarding demographic characteristics before the education, although the mean scores of the theory components (attitudes, subjective norms, perceived behavioral control, and intention) were higher in the control group. Results also showed that all of the theory components significantly increased after the education in the intervention group (p<0.001). Training based on the theory of planned behavior enhances the intention to adhere to preventive nutritional behaviors for type-2 diabetes among the studied female adolescents.

  19. Prevalence, quantification and typing of adenoviruses detected in river and treated drinking water in South Africa.

    PubMed

    van Heerden, J; Ehlers, M M; Heim, A; Grabow, W O K

    2005-01-01

    The risk of infection may have implications for the management of drinking water quality. This study is unique, as it is the first report on the quantification and typing of human adenoviruses (HAds) in treated drinking water and river water. These baseline data are necessary for a meaningful assessment of the potential risk of infection posed by these viruses.

  20. Design and validation of an immunoaffinity LC-MS/MS assay for the quantification of a collagen type II neoepitope peptide in human urine: application as a biomarker of osteoarthritis.

    PubMed

    Nemirovskiy, Olga; Li, Wenlin Wendy; Szekely-Klepser, Gabriella

    2010-01-01

    Biomarkers play an increasingly important role in drug efficacy and safety evaluation at all stages of drug development. It is especially important to develop and validate sensitive and selective biomarkers for diseases where the onset is very slow and/or the progression is hard to follow, such as osteoarthritis (OA). The degradation of type II collagen has been associated with the disease state of OA. Matrix metalloproteinases (MMPs) are enzymes that catalyze the degradation of collagen and are therefore pursued as potential targets for the treatment of OA. Peptide biomarkers of MMP activity related to type II collagen degradation were identified, and the presence of these peptides in MMP digests of human articular cartilage (HAC) explants and in human urine was confirmed. An immunoaffinity LC-MS/MS assay for the quantification of the most abundant urinary type II collagen neoepitope (uTIINE) peptide, a 45-mer with 5 hydroxyproline residues, was developed and clinically validated. The assay has subsequently been applied to analyze human urine samples from clinical studies. We have shown that the assay is able to differentiate between symptomatic OA and normal subjects, indicating that uTIINE can be used as a potential biomarker for OA. This chapter discusses the assay procedure and provides information on the validation experiments used to evaluate accuracy, precision, and selectivity, with attention to the specific challenges related to the quantification of endogenous protein/peptide biomarker analytes. The generalized approach can be used as a follow-up to studies in which proteomics-based urinary biomarkers are identified and an assay needs to be developed. Considerations for the validation of such an assay are described.

  1. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  2. Coordinated encoding between cell types in the retina: insights from the theory of phase transitions

    NASA Astrophysics Data System (ADS)

    Sharpee, Tatyana

    2015-03-01

    In this talk I will describe how the emergence of some types of neurons in the brain can be quantitatively described by the theory of transitions between different phases of matter. The two key parameters that control the separation of neurons into subclasses are the mean and standard deviation of noise levels among neurons in the population. The mean noise level plays the role of temperature in the classic theory of phase transitions, whereas the standard deviation is equivalent to pressure, in the case of liquid-gas transitions, or to magnetic field for magnetic transitions. Our results account for properties of two recently discovered types of salamander OFF retinal ganglion cells, as well as the absence of multiple types of ON cells. We further show that, across visual stimulus contrasts, retinal circuits continued to operate near the critical point whose quantitative characteristics matched those expected near a liquid-gas critical point and described by the nearest-neighbor Ising model in three dimensions. Because the retina needs to operate under changing stimulus conditions, the observed parameters of cell types corresponded to metastable states in the region between the spinodal line and the line describing maximally informative solutions. Such properties of neural circuits can maximize information transmission in a given environment while retaining the ability to quickly adapt to a new environment.

  3. Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.

    2018-02-01

    In this study, a collaborative contest focused on LIBS data processing was conducted in an original way: the participants did not analyze the same samples on their own LIBS experiments but instead shared a set of LIBS spectra obtained from one single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. A parametric study was then conducted to investigate the influence of each step of the data processing. This study was based on several analytical figures of merit, such as the determination coefficient, uncertainty, limit of quantification, and prediction ability (i.e., trueness). It was then possible to interpret the results provided by the participants, emphasizing the fact that the type of data extraction, the baseline modeling, and the calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure, with the aim of optimizing the methodological steps toward the standardization of LIBS.
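
    Two of the figures of merit used in the contest are easy to make explicit. The sketch below uses invented calibration data and assumes the common 10·σ limit-of-quantification convention; neither reflects the contest's actual spectra or protocols.

        import numpy as np

        conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])             # standards, wt% (assumed)
        signal = np.array([120., 236., 470., 1180., 2350.])     # baseline-corrected counts

        slope, intercept = np.polyfit(conc, signal, 1)          # linear calibration model
        pred = slope * conc + intercept
        r2 = 1.0 - np.sum((signal - pred)**2) / np.sum((signal - signal.mean())**2)

        sigma_blank = 8.0                     # std. dev. of the blank signal (assumed)
        loq = 10.0 * sigma_blank / slope      # limit of quantification, same units as conc
        print(r2, loq)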

  4. An information theory account of cognitive control.

    PubMed

    Fan, Jin

    2014-01-01

    Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.
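
    The information-theoretic core of this account is Shannon entropy as the measure of the uncertainty that cognitive control must resolve. A minimal worked example, with invented outcome probabilities:

        import math

        def entropy_bits(probs):
            # H = -sum p*log2(p): average information needed to resolve the choice
            return -sum(p * math.log2(p) for p in probs if p > 0)

        print(entropy_bits([0.5, 0.5]))    # 1.0 bit  (binary choice)
        print(entropy_bits([0.25] * 4))    # 2.0 bits (four equiprobable responses)
        print(entropy_bits([0.9, 0.1]))    # ~0.47 bits (predictable, lower control demand)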

  6. Bianchi type-II String Cosmological Model with Magnetic Field in Scale-Covariant Theory of Gravitation

    NASA Astrophysics Data System (ADS)

    Sharma, N. K.; Singh, J. K.

    2014-12-01

    The spatially homogeneous and totally anisotropic Bianchi type-II cosmological solutions of massive strings have been investigated in the presence of a magnetic field in the framework of the scale-covariant theory of gravitation formulated by Canuto et al. (Phys. Rev. Lett. 39, 429, 1977). With the help of the special law of variation for Hubble's parameter proposed by Berman (Nuovo Cimento 74, 182, 1983), a string cosmological model is obtained in this theory. We use the power-law relation between the scalar field ϕ and the scale factor R to find the solutions. Some physical and kinematical properties of the model are also discussed.

  7. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    NASA Astrophysics Data System (ADS)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods, and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation are critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio: when multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that in the case of a lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process, as it reduces the number of assumptions during the elicitation process, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and Dempster-Shafer theory of evidence are better suited for the quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for
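
    A minimal sketch of the interval-analysis half of this proposal, assuming experts supply only lower/upper bounds and that technology impacts combine additively; both the numbers and the combination rule are assumptions for illustration.

        from dataclasses import dataclass

        @dataclass
        class Interval:
            lo: float
            hi: float
            def __add__(self, other):
                # epistemic bounds combine without assuming any distribution
                return Interval(self.lo + other.lo, self.hi + other.hi)

        # experts' bounds on the fuel-burn impact (%) of two technologies
        tech_a = Interval(-3.0, -1.0)
        tech_b = Interval(-2.5, -0.5)
        print(tech_a + tech_b)   # Interval(lo=-5.5, hi=-1.5): a bound, not a distribution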

  8. Virus detection and quantification using electrical parameters

    NASA Astrophysics Data System (ADS)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was close to the absolute value of the ratio of the change in the dopant concentration of the virus suspension relative to the mock suspension, to the change in the Debye volume of the virus suspension relative to the mock. The virus type was identified by constructing a concentration-mobility relationship, which is unique for each kind of virus, allowing for fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique, and the results obtained by the two approaches corroborated well. We further demonstrate that the electrical technique can be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free, electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
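
    One plausible reading of that empirical relation, written out as code; the placeholder numbers and the use of relative (fractional) changes are assumptions, since the abstract does not specify the exact normalisation.

        def virus_count(dopant_virus, dopant_mock, debye_virus, debye_mock):
            # |relative change in dopant concentration| over
            # |relative change in Debye volume|, both versus the mock suspension
            dn = (dopant_virus - dopant_mock) / dopant_mock
            dv = (debye_virus - debye_mock) / debye_mock
            return abs(dn / dv)

        print(virus_count(2.4e12, 2.0e12, 1.32e-15, 1.10e-15))   # arbitrary demo values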

  9. 43 CFR 11.71 - Quantification phase-service reduction quantification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-discharge-or-release condition. (c) Contents of the quantification. The following factors should be included...; and (6) Factors identified in the specific guidance in paragraphs (h), (i), (j), (k), and (l) of this section dealing with the different kinds of natural resources. (d) Selection of resources, services, and...

  10. 43 CFR 11.71 - Quantification phase-service reduction quantification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-discharge-or-release condition. (c) Contents of the quantification. The following factors should be included...; and (6) Factors identified in the specific guidance in paragraphs (h), (i), (j), (k), and (l) of this section dealing with the different kinds of natural resources. (d) Selection of resources, services, and...

  11. An accurate proteomic quantification method: fluorescence labeling absolute quantification (FLAQ) using multidimensional liquid chromatography and tandem mass spectrometry.

    PubMed

    Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin

    2012-08-01

    A facile proteomic quantification method, fluorescence labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, FLAQ is a chromatography-based quantification combined with MS identification. A multidimensional liquid chromatography (MDLC) system with high-accuracy laser-induced fluorescence (LIF) detection and tandem MS was employed for FLAQ. Several requirements should be met for fluorescent labeling in MS identification: labeling completeness, minimal side reactions, simple MS spectra, and no extra tandem MS fragmentation for structure elucidation. A fluorescent dye, 5-iodoacetamidofluorescein, was chosen to label proteins on all cysteine residues. The dye was compatible with the process of trypsin digestion and MALDI MS identification. Quantitative labeling was achieved by optimizing the reaction conditions. A synthesized peptide and model proteins, BSA (35 cysteines) and OVA (five cysteines), were used to verify the completeness of labeling. Proteins were separated by MDLC and quantified based on fluorescence intensities, followed by MS identification. High accuracy (RSD < 1.58%) and wide quantification linearity (1-10⁵) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. As a demonstration, a subset of proteins in the human liver proteome was quantified using FLAQ. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Uncertainty quantification and experimental design based on unsupervised machine learning identification of contaminant sources and groundwater types using hydrogeochemical data

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.

    2017-12-01

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical species. Numerous geochemical constituents and processes may need to be simulated in these models, which further complicates the analyses. As a result, these types of model analyses are typically extremely challenging. Here, we demonstrate a new contaminant source identification approach that performs decomposition of the observed mixtures based on the Nonnegative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the number of groundwater types and (b) the original geochemical concentrations of the contaminant sources from measured geochemical mixtures with unknown mixing ratios, without any additional site information. We also demonstrate how NMFk can be extended to perform uncertainty quantification and experimental design related to real-world site characterization. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios). The NMFk algorithm has been extensively tested on synthetic datasets; NMFk analyses have also been actively performed on real-world data collected at the Los Alamos National Laboratory.
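
    The BSS step can be sketched with an off-the-shelf NMF on a synthetic mixing problem. Note that this sketch fixes the number of sources at 3; NMFk's distinguishing feature, estimating that number via clustering over repeated randomized runs, is not reproduced here.

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(4)
        sources = rng.uniform(0.0, 1.0, size=(3, 10))   # 3 source signatures, 10 species
        mixing = rng.dirichlet(np.ones(3), size=50)     # unknown mixing ratios at 50 wells
        observed = mixing @ sources                     # observed geochemical mixtures

        model = NMF(n_components=3, init='nndsvda', max_iter=2000, random_state=0)
        ratios = model.fit_transform(observed)          # estimated mixing ratios
        signatures = model.components_                  # estimated source signatures
        print(np.linalg.norm(observed - ratios @ signatures))   # small reconstruction error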

  13. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a historymore » of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is

  15. Demonstration of a viable quantitative theory for interplanetary type II radio bursts

    NASA Astrophysics Data System (ADS)

    Schmidt, J. M.; Cairns, Iver H.

    2016-03-01

    Between 29 November and 1 December 2013 the two widely separated spacecraft STEREO A and B observed a long-lasting, intermittent, type II radio burst over the extended frequency range ≈ 4 MHz to 30 kHz, including an intensification when the shock wave of the associated coronal mass ejection (CME) reached STEREO A. We demonstrate for the first time our ability to quantitatively and accurately simulate the fundamental (F) and harmonic (H) emission of type II bursts from the higher corona (near 11 solar radii) to 1 AU. Our modeling requires the combination of data-driven three-dimensional magnetohydrodynamic simulations for the CME and plasma background, carried out with the BATS-R-US code, with an analytic quantitative kinetic model for both F and H radio emission, including the electron reflection at the shock, the growth of Langmuir waves and radio waves, and the radiation's propagation to an arbitrary observer. The intensities and frequencies of the observed radio emissions vary hugely, by factors of ≈ 10⁶ and ≈ 10³, respectively; the theoretical predictions are impressively accurate, being typically in error by less than a factor of 10 and 20%, respectively, for both STEREO A and B. We also obtain accurate predictions for the timing and characteristics of the shock and local radio onsets at STEREO A, the lack of such onsets at STEREO B, and the z-component of the magnetic field at STEREO A ahead of the shock and in the sheath. Very strong support is provided by these multiple agreements for the theory, the efficacy of the BATS-R-US code, and the vision of using type IIs and associated data-theory iterations to predict whether a CME will impact Earth's magnetosphere and drive space weather events.

  16. Dirac Theory on a Space with Linear Lie Type Fuzziness

    NASA Astrophysics Data System (ADS)

    Shariati, Ahmad; Khorrami, Mohammad; Fatollahi, Amir H.

    2012-08-01

    A spinor theory on a space with linear Lie type noncommutativity among spatial coordinates is presented. The model is based on the Fourier space corresponding to spatial coordinates, as this Fourier space is commutative. When the group is compact, the real space exhibits lattice characteristics (as the eigenvalues of space operators are discrete), and the similarity of such a lattice with ordinary lattices is manifested, among other things, in a phenomenon resembling the famous fermion doubling problem. A projection is introduced to make the dynamical number of spinors equal to that corresponding to the ordinary space. The actions for free and interacting spinors (with Fermi-like interactions) are presented. The Feynman rules are extracted and 1-loop corrections are investigated.

  17. A Variational Statistical-Field Theory for Polar Liquid Mixtures

    NASA Astrophysics Data System (ADS)

    Zhuang, Bilin; Wang, Zhen-Gang

    Using a variational field-theoretic approach, we derive a molecularly based theory for polar liquid mixtures. The resulting theory consists of simple algebraic expressions for the free energy of mixing and the dielectric constant as functions of mixture composition. Using only the dielectric constants and the molar volumes of the pure liquid constituents, the theory evaluates the mixture dielectric constants in good agreement with the experimental values for a wide range of liquid mixtures, without using adjustable parameters. In addition, the theory predicts that liquids with similar dielectric constants and molar volumes dissolve well in each other, while sufficient disparity in these parameters results in phase separation. The calculated miscibility map on the dielectric constant-molar volume axes agrees well with known experimental observations for a large number of liquid pairs. The theory thus provides a quantification of the well-known empirical "like-dissolves-like" rule.

  18. Sex and Theories of Deviance: Toward a Functional Theory of Deviant Type-Scripts

    ERIC Educational Resources Information Center

    Harris, Anthony R.

    1977-01-01

    Asserts that the continuing failure to consider women has critically weakened contemporary criminal deviance theory, examines the major paradigms in criminal deviance, argues that the inclusion of sex as a variable has more or less disastrous consequences for those paradigms, and argues that the primary purpose of labeling theory is to detect…

  19. General N=1 supersymmetric flux vacua of massive type IIA string theory.

    PubMed

    Behrndt, Klaus; Cvetic, Mirjam

    2005-07-08

    We derive conditions for the existence of four-dimensional N=1 supersymmetric flux vacua of massive type IIA string theory with general supergravity fluxes turned on. For an SU(3) singlet Killing spinor, we show that such flux vacua exist when the internal geometry is nearly Kähler. The geometry is not warped, all the allowed fluxes are proportional to the mass parameter, and the dilaton is fixed by a ratio of (quantized) fluxes. The four-dimensional cosmological constant, while negative, becomes small in the vacuum with the weak string coupling.

  20. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    NASA Astrophysics Data System (ADS)

    Seebauer, Matthias

    2014-03-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas (GHG) emissions and emission removals from the land-use sector. In order to evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After a cluster analysis to identify different farm typologies, GHG quantification was carried out using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with those of 2011, when farmers had adopted sustainable land management (SALM) practices. The results demonstrate the variation in the magnitude of the estimated GHG emissions per ha both between different smallholder farm typologies and between the two accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals, with mitigation benefits ranging between 4 and 6.5 tCO₂ ha⁻¹ yr⁻¹ and differing significantly depending on the typologies of the crop-livestock systems, their different agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty related to the emission factors applied by the accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between the different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.

  1. Constraints on Nonlinear and Stochastic Growth Theories for Type 3 Solar Radio Bursts from the Corona to 1 AU

    NASA Technical Reports Server (NTRS)

    Cairns, Iver H.; Robinson, P. A.

    1998-01-01

    Existing, competing theories for coronal and interplanetary type III solar radio bursts appeal to one or more of modulational instability, electrostatic (ES) decay processes, or stochastic growth physics to preserve the electron beam, limit the levels of Langmuir-like waves driven by the beam, and produce wave spectra capable of coupling nonlinearly to generate the observed radio emission. Theoretical constraints exist on the wavenumbers and relative sizes of the wave bandwidth and nonlinear growth rate for which Langmuir waves are subject to modulational instability and the parametric and random phase versions of ES decay. A constraint also exists on whether stochastic growth theory (SGT) is appropriate. These constraints are evaluated here using the beam, plasma, and wave properties (1) observed in specific interplanetary type III sources, (2) predicted nominally for the corona, and (3) predicted at heliocentric distances greater than a few solar radii by power-law models based on interplanetary observations. It is found that the Langmuir waves driven directly by the beam have wavenumbers that are almost always too large for modulational instability but are appropriate to ES decay. Even for waves scattered to lower wavenumbers (by ES decay, for instance), the wave bandwidths are predicted to be too large and the nonlinear growth rates too small for modulational instability to occur for the specific interplanetary events studied or the great majority of Langmuir wave packets in type III sources at arbitrary heliocentric distances. Possible exceptions are for very rare, unusually intense, narrowband wave packets, predominantly close to the Sun, and for the front portion of very fast beams traveling through unusually dilute, cold solar wind plasmas. Similar arguments demonstrate that the ES decay should proceed almost always as a random phase process rather than a parametric process, with similar exceptions. These results imply that it is extremely rare for

  2. Methods for the physical characterization and quantification of extracellular vesicles in biological samples.

    PubMed

    Rupert, Déborah L M; Claudio, Virginia; Lässer, Cecilia; Bally, Marta

    2017-01-01

    Our body fluids contain a multitude of cell-derived vesicles, secreted by most cell types, commonly referred to as extracellular vesicles. They have attracted considerable attention for their function as intercellular communication vehicles in a broad range of physiological processes and pathological conditions. Extracellular vesicles and especially the smallest type, exosomes, have also generated a lot of excitement in view of their potential as disease biomarkers or as carriers for drug delivery. In this context, state-of-the-art techniques capable of comprehensively characterizing vesicles in biological fluids are urgently needed. This review presents the arsenal of techniques available for quantification and characterization of physical properties of extracellular vesicles, summarizes their working principles, discusses their advantages and limitations and further illustrates their implementation in extracellular vesicle research. The small size and physicochemical heterogeneity of extracellular vesicles make their physical characterization and quantification an extremely challenging task. Currently, structure, size, buoyant density, optical properties and zeta potential have most commonly been studied. The concentration of vesicles in suspension can be expressed in terms of biomolecular or particle content depending on the method at hand. In addition, common quantification methods may either provide a direct quantitative measurement of vesicle concentration or solely allow for relative comparison between samples. The combination of complementary methods capable of detecting, characterizing and quantifying extracellular vesicles at a single particle level promises to provide new exciting insights into their modes of action and to reveal the existence of vesicle subpopulations fulfilling key biological tasks. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. New approach for the quantification of processed animal proteins in feed using light microscopy.

    PubMed

    Veys, P; Baeten, V

    2010-07-01

    A revision of the European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precise counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions of the correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved easy to apply. The results obtained were very close to the expected contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  4. Subnuclear foci quantification using high-throughput 3D image cytometry

    NASA Astrophysics Data System (ADS)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in their imaging and image analysis techniques, and most studies still rely on manual counting or classification. Hence they are limited to counting a low number of foci per cell (about 5 foci per nucleus), as the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged, in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended-maxima-transform-based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
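
    As a concrete illustration of the counting step, the sketch below applies an extended-maxima (h-maxima) transform to a synthetic 3D stack and counts the connected components that survive. This is a minimal sketch in the spirit of the algorithm described above, not the authors' pipeline; the h value and the synthetic foci are assumptions.

    ```python
    # Hedged sketch of foci counting with an h-maxima (extended maxima)
    # transform on a 3D intensity volume (z, y, x).
    import numpy as np
    from skimage.morphology import h_maxima
    from skimage.measure import label

    rng = np.random.default_rng(0)
    stack = rng.normal(100, 5, size=(16, 64, 64))   # background noise
    for z, y, x in [(8, 20, 20), (8, 40, 45), (4, 10, 50)]:
        stack[z-1:z+2, y-2:y+3, x-2:x+3] += 80      # three bright "foci"

    # Keep only local maxima standing at least h units above their
    # surroundings, then label connected components and count them.
    h = 40.0
    maxima = h_maxima(stack, h)
    n_foci = label(maxima).max()
    print("foci counted:", n_foci)                  # expect 3 here
    ```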

  5. Quantification of Neural Ethanol and Acetaldehyde Using Headspace GC-MS

    PubMed Central

    Heit, Claire; Eriksson, Peter; Thompson, David C; Fritz, Kristofer S; Vasiliou, Vasilis

    2016-01-01

    BACKGROUND There is controversy regarding the active agent responsible for alcohol addiction. The theory that ethanol itself was the agent in alcohol drinking behavior was widely accepted until acetaldehyde was found in the brain. The role of acetaldehyde formation in the brain is still subject to speculation due to the lack of a method to assay acetaldehyde levels directly and accurately. A highly sensitive GC-MS method to determine acetaldehyde concentrations reliably is needed to address whether neural acetaldehyde is indeed responsible for increased alcohol consumption. METHODS A headspace gas chromatograph coupled to selected ion monitoring mass spectrometry was utilized to develop a quantitative assay for acetaldehyde and ethanol. Our GC-MS approach was carried out using a Bruker Scion 436-GC SQ MS. RESULTS Our approach yields limits of detection of acetaldehyde in the nanomolar range and limits of quantification in the low micromolar range. Our linear calibration includes 5 concentrations with a least-squares regression coefficient greater than 0.99 for both acetaldehyde and ethanol. Tissue analyses using this method revealed the capacity to quantify ethanol and acetaldehyde in blood, brain, and liver tissue from mice. CONCLUSIONS By allowing quantification of very low concentrations, this method may be used to examine the formation of ethanol metabolites, specifically acetaldehyde, in murine brain tissue in alcohol research. PMID:27501276
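
    The five-point linear calibration described above is straightforward to reproduce numerically. Below is a hedged sketch of the general procedure: fit instrument response against standard concentrations by least squares, check the regression coefficient, and invert the line for an unknown. The concentrations and peak areas are invented for illustration.

    ```python
    # Minimal sketch of a five-point linear calibration and
    # back-calculation, as used in quantitative GC-MS assays.
    import numpy as np

    conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])     # micromolar standards
    area = np.array([0.9, 5.2, 9.8, 51.0, 99.5])       # instrument response

    slope, intercept = np.polyfit(conc, area, 1)        # least-squares fit
    r2 = np.corrcoef(conc, area)[0, 1] ** 2             # should exceed 0.99
    print(f"R^2 = {r2:.4f}")

    unknown_area = 23.7
    unknown_conc = (unknown_area - intercept) / slope   # invert calibration
    print(f"estimated concentration: {unknown_conc:.2f} uM")
    ```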

  6. Quantification of type I error probabilities for heterogeneity LOD scores.

    PubMed

    Abreu, Paula C; Hodge, Susan E; Greenberg, David A

    2002-02-01

    Locus heterogeneity is a major confounding factor in linkage analysis. When no prior knowledge of linkage exists, and one aims to detect linkage and heterogeneity simultaneously, classical distribution theory of log-likelihood ratios does not hold. Despite some theoretical work on this problem, no generally accepted practical guidelines exist. Nor has anyone rigorously examined the combined effect of testing for linkage and heterogeneity and simultaneously maximizing over two genetic models (dominant, recessive). The effect of linkage phase represents another uninvestigated issue. Using computer simulation, we investigated the type I error (P value) of the "admixture" heterogeneity LOD (HLOD) score, i.e., the LOD score maximized over both the recombination fraction θ and the admixture parameter α, and we compared this with the P values when one maximizes only with respect to θ (i.e., the standard LOD score). We generated datasets of phase-known and -unknown nuclear families, sizes k = 2, 4, and 6 children, under fully penetrant autosomal dominant inheritance. We analyzed these datasets (1) assuming a single genetic model, and maximizing the HLOD over θ and α; and (2) maximizing the HLOD additionally over two dominance models (dominant vs. recessive), then subtracting a 0.3 correction. For both (1) and (2), P values increased with family size k; rose less for phase-unknown families than for phase-known ones, with the former approaching the latter as k increased; and did not exceed the one-sided mixture distribution ξ = (1/2)χ²₁ + (1/2)χ²₂. Thus, maximizing the HLOD over θ and α appears to add considerably less than an additional degree of freedom to the associated χ²₁ distribution. We conclude with practical guidelines for linkage investigators. Copyright 2002 Wiley-Liss, Inc.
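
    For readers who want to apply the mixture bound above, the sketch below converts an HLOD score into a likelihood-ratio statistic via the usual factor 2·ln(10) and evaluates the one-sided mixture (1/2)χ²₁ + (1/2)χ²₂. This is a generic illustration of that distributional bound, not the authors' simulation code; the example HLOD value is arbitrary.

    ```python
    # P-value under the mixture distribution (1/2)*chi2_1 + (1/2)*chi2_2.
    from math import log
    from scipy.stats import chi2

    def mixture_pvalue(hlod):
        x = 2.0 * log(10.0) * hlod    # LOD score -> likelihood-ratio statistic
        return 0.5 * chi2.sf(x, df=1) + 0.5 * chi2.sf(x, df=2)

    print(mixture_pvalue(3.0))        # e.g. HLOD = 3.0
    ```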

  7. Simultaneous quantification and semi-quantification of ginkgolic acids and their metabolites in rat plasma by UHPLC-LTQ-Orbitrap-MS and its application to pharmacokinetics study.

    PubMed

    Qian, Yiyun; Zhu, Zhenhua; Duan, Jin-Ao; Guo, Sheng; Shang, Erxin; Tao, Jinhua; Su, Shulan; Guo, Jianming

    2017-01-15

    A highly sensitive method using ultra-high-pressure liquid chromatography coupled with linear ion trap-Orbitrap tandem mass spectrometry (UHPLC-LTQ-Orbitrap-MS) has been developed and validated for the simultaneous identification and quantification of ginkgolic acids and semi-quantification of their metabolites in rat plasma. For the five selected ginkgolic acids, the method showed good linearity (r > 0.9991), good intra- and inter-day precision (RSD < 15%), and good accuracy (RE from -10.33% to 4.92%). Extraction recoveries, matrix effects and stabilities for rat plasma samples were within the required limits. The validated method was successfully applied to investigate the pharmacokinetics of the five ginkgolic acids in rat plasma after oral administration to 3 dosage groups (900 mg/kg, 300 mg/kg and 100 mg/kg). Meanwhile, six metabolites of GA (15:1) and GA (17:1) were identified by comparison of MS data with reported values. The results of validation in terms of linear ranges, precisions and stabilities were established for semi-quantification of the metabolites. Curves of the relative changes of these metabolites during the metabolic process were constructed by plotting the peak area ratios of metabolites to salicylic acid (internal standard, IS). Double peaks were observed in all 3 dose groups. Different types of metabolites and different dosages of each metabolite resulted in different Tmax values. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Quantification of Poly(I:C)-Mediated Protection against Genital Herpes Simplex Virus Type 2 Infection

    PubMed Central

    Herbst-Kralovetz, Melissa M.; Pyles, Richard B.

    2006-01-01

    Alternative strategies for controlling the growing herpes simplex virus type 2 (HSV-2) epidemic are needed. A novel class of immunomodulatory microbicides has shown promise as antiherpetics, including intravaginally applied CpG-containing oligodeoxynucleotides that stimulate toll-like receptor 9 (TLR9). In the current study, we quantified protection against experimental genital HSV-2 infection provided by an alternative nucleic acid-based TLR agonist, polyinosine-poly(C) (PIC) (TLR3 agonist). Using a protection quantification paradigm, groups of mice were PIC treated and then subdivided into groups challenged with escalating doses of HSV-2. Using this paradigm, a temporal window of PIC efficacy for single applications was defined as 1 day prior to (prophylactic) through 4 h after (therapeutic) viral challenge. PIC treatment within this window protected against 10-fold-higher HSV-2 challenges, as indicated by increased 50% infectious dose values relative to those for vehicle-treated controls. Disease resolution and survival were significantly enhanced by repetitive PIC doses. Using optimal PIC regimens, cytokine induction was evaluated in murine vaginal lavages and in human vaginal epithelial cells. Similar induction patterns were observed, with kinetics that explained the limited durability of PIC-afforded protection. Daily PIC delivery courses did not generate sustained cytokine levels in murine vaginal fluids that would be indicative of local immunotoxicity. No evidence of immunotoxicity was observed in selected organs that were analyzed following repetitive vaginal PIC doses. Animal and in vitro data indicate that PIC may prove to be a valuable preventative microbicide and/or therapeutic agent against genital herpes by increasing resistance to HSV-2 and enhancing disease resolution following a failure of prevention. PMID:17005677

  9. A new approach to comprehensive quantification of linear landscape elements using biotope types on a regional scale

    NASA Astrophysics Data System (ADS)

    Hirt, Ulrike; Mewes, Melanie; Meyer, Burghard C.

    The structure of a landscape is highly relevant for research and planning (such as fulfilling the requirements of the Water Framework Directive (WFD) and implementing comprehensive catchment planning). There is a high potential for restoration of linear landscape elements in most European landscapes. In implementing the WFD in Germany, the restoration of linear landscape elements could be a valuable measure, for example to reduce nutrient input into rivers. Despite this importance of landscape structures for water and nutrient fluxes, biodiversity and the appearance of a landscape, specific studies of the linear elements are rare for larger catchment areas. Existing studies are limited because they either use remote sensing data, which does not adequately differentiate all types of linear landscape elements, or they focus only on a specific type of linear element. To address these limitations, we developed a framework allowing comprehensive quantification of linear landscape elements for catchment areas, using publicly available biotope type data. We analysed the dependence of landscape structures on natural regions and regional soil characteristics. Three data sets (differing in biotopes, soil parameters and natural regions) were generated for the catchment area of the middle Mulde River (2700 km²) in Germany, using overlay processes in geographic information systems (GIS), followed by statistical evaluation. The linear landscape components of the total catchment area are divided into roads (55%), flowing water (21%), tree rows (14%), avenues (5%), and hedges (2%). The occurrence of these landscape components varies regionally among natural units and different soil regions. For example, mixed deciduous stands (3.5 m/ha overall) are far more frequent in the foothills (6 m/ha) than in the hill country (0.9 m/ha). In contrast, fruit trees are more frequent in the hill country (5.2 m/ha) than in the cooler foothills (0.5 m/ha). Some 70% of avenues, and 40% of tree rows

  10. Integrability of generalised type II defects in affine Toda field theory

    NASA Astrophysics Data System (ADS)

    Bristow, Rebecca

    2017-11-01

    The Liouville integrability of the generalised type II defects is investigated. Full integrability is not considered, only the existence of an infinite number of conserved quantities associated with a system containing a defect. For defects in affine Toda field theories (ATFTs) it is shown that momentum conservation is very likely to be a necessary condition for integrability. The defect Lax matrices which guarantee zero curvature, and so an infinite number of conserved quantities, are calculated for the momentum conserving Tzitzéica defect and the momentum conserving D4 ATFT defect. Some additional calculations pertaining to the D4 defect are also carried out to find a more complete set of defect potentials than has appeared previously.

  11. Development of a Protein Standard Absolute Quantification (PSAQ™) assay for the quantification of Staphylococcus aureus enterotoxin A in serum.

    PubMed

    Adrait, Annie; Lebert, Dorothée; Trauchessec, Mathieu; Dupuis, Alain; Louwagie, Mathilde; Masselon, Christophe; Jaquinod, Michel; Chevalier, Benoît; Vandenesch, François; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-06-06

    Enterotoxin A (SEA) is a staphylococcal virulence factor which is suspected to worsen septic shock prognosis. However, the presence of SEA in the blood of sepsis patients has never been demonstrated. We have developed a mass spectrometry-based assay for the targeted and absolute quantification of SEA in serum. To enhance sensitivity and specificity, we combined an immunoaffinity-based sample preparation with mass spectrometry analysis in the selected reaction monitoring (SRM) mode. Absolute quantification of SEA was performed using the PSAQ™ method (Protein Standard Absolute Quantification), which uses a full-length isotope-labeled SEA as internal standard. The lower limit of detection (LLOD) and lower limit of quantification (LLOQ) were estimated at 352 pg/mL and 1057 pg/mL, respectively. SEA recovery after immunocapture was determined to be 7.8 ± 1.4%. Therefore, we assumed that less than 1 femtomole of each SEA proteotypic peptide was injected on the liquid chromatography column before SRM analysis. From a 6-point titration experiment, quantification accuracy was determined to be 77% and precision at LLOQ was lower than 5%. With this sensitive PSAQ-SRM assay, we expect to contribute to decipher the pathophysiological role of SEA in severe sepsis. This article is part of a Special Issue entitled: Proteomics: The clinical link. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Quantification of multiple gene expression in individual cells.

    PubMed

    Peixoto, António; Monteiro, Marta; Rocha, Benedita; Veiga-Fernandes, Henrique

    2004-10-01

    Quantitative gene expression analysis aims to define the gene expression patterns determining cell behavior. So far, these assessments can only be performed at the population level. Therefore, they determine the average gene expression within a population, overlooking possible cell-to-cell heterogeneity that could lead to different cell behaviors/cell fates. Understanding individual cell behavior requires multiple gene expression analyses of single cells, and may be fundamental for the understanding of all types of biological events and/or differentiation processes. We here describe a new reverse transcription-polymerase chain reaction (RT-PCR) approach allowing the simultaneous quantification of the expression of 20 genes in the same single cell. This method has broad application, in different species and with any combination of genes. RT efficiency is evaluated. Uniform and maximized amplification conditions for all genes are provided. Abundance relationships are maintained, allowing the precise quantification of the absolute number of mRNA molecules per cell, ranging from 2 to 1.28 × 10⁹ for each individual gene. We evaluated the impact of this approach on functional genetic read-outs by studying an apparently homogeneous population (monoclonal T cells recovered 4 d after antigen stimulation), using either this method or conventional real-time RT-PCR. Single-cell studies revealed considerable cell-to-cell variation: not all T cells expressed all the individual genes. Gene coexpression patterns were very heterogeneous. mRNA copy numbers varied between different transcripts and in different cells. As a consequence, this single-cell assay introduces new and fundamental information regarding functional genomic read-outs. By comparison, we also show that conventional quantitative assays determining population averages supply insufficient information, and may even be highly misleading.

  13. Colour thresholding and objective quantification in bioimaging

    NASA Technical Reports Server (NTRS)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (256 intensity levels, 0-255), the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
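
    To make the idea concrete, the sketch below thresholds a synthetic RGB image in HSV space, keeping pixels that fall within a hue band and above a saturation floor, which is one common way to implement colour thresholding. The image, hue window, and cutoffs are illustrative assumptions.

    ```python
    # Illustrative colour-thresholding sketch: isolate a hue band in
    # HSV space and report the stained area fraction.
    import numpy as np
    from skimage.color import rgb2hsv

    rgb = np.zeros((64, 64, 3), dtype=float)
    rgb[..., 0] = 0.8                       # reddish background
    rgb[20:40, 20:40] = (0.2, 0.1, 0.6)     # a bluish "stained" region

    hsv = rgb2hsv(rgb)
    hue, sat = hsv[..., 0], hsv[..., 1]

    # Select pixels whose hue falls in the blue band and that are
    # saturated enough to exclude near-grey pixels.
    mask = (hue > 0.55) & (hue < 0.75) & (sat > 0.2)
    print("stained area fraction:", mask.mean())
    ```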

  14. Cues, quantification, and agreement in language comprehension.

    PubMed

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  15. Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana

    2018-01-01

    The standard-curve based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimations at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines) and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.
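
    The absolute quantification that ddPCR provides rests on Poisson statistics: the mean number of target copies per droplet is recovered from the fraction of negative droplets, with no standard curve. The sketch below shows that calculation; the droplet counts are invented, and the ~0.85 nL droplet volume is an assumption typical of common ddPCR systems.

    ```python
    # Poisson correction behind absolute ddPCR quantification.
    import math

    def ddpcr_concentration(n_total, n_positive, droplet_volume_ul=0.00085):
        """Return target copies per microliter of reaction."""
        n_negative = n_total - n_positive
        lam = -math.log(n_negative / n_total)   # mean copies per droplet
        return lam / droplet_volume_ul

    print(ddpcr_concentration(n_total=15000, n_positive=4200))
    ```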

  16. Assessing Spontaneous Combustion Instability with Recurrence Quantification Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, Chad J.; Casiano, Matthew J.

    2016-01-01

    Spontaneous instabilities can pose a significant challenge to verification of combustion stability, and characterizing their onset is an important avenue of improvement for stability assessments of liquid propellant rocket engines. Recurrence Quantification Analysis (RQA) is used here to explore nonlinear combustion dynamics that might give insight into instability. Multiple types of patterns representative of different dynamical states are identified within fluctuating chamber pressure data, and markers for impending instability are found. A class of metrics which describe these patterns is also calculated. RQA metrics are compared with and interpreted against another metric from nonlinear time series analysis, the Hurst exponent, to help better distinguish between stable and unstable operation.
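
    For orientation, the sketch below builds a recurrence matrix from a scalar time series and computes two standard RQA metrics, recurrence rate and determinism, using a simplified diagonal-line count. It is a generic RQA illustration under assumed thresholds, not the chamber-pressure analysis itself; phase-space embedding is omitted for brevity.

    ```python
    # Minimal recurrence quantification sketch on a toy signal.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0, 20 * np.pi, 400)
    x = np.sin(t) + 0.1 * rng.normal(size=t.size)    # noisy oscillation

    eps = 0.2 * np.std(x)                            # recurrence threshold
    R = np.abs(x[:, None] - x[None, :]) < eps        # recurrence matrix

    recurrence_rate = R.mean()

    # Determinism: fraction of recurrent points lying on diagonal line
    # segments of length >= 2 (counted on the upper triangle; the
    # factor 2 accounts for symmetry).
    diag_points = 0
    for k in range(1, x.size):
        d = np.diagonal(R, offset=k).astype(np.int8)
        edges = np.diff(np.r_[0, d, 0])
        lengths = np.flatnonzero(edges == -1) - np.flatnonzero(edges == 1)
        diag_points += lengths[lengths >= 2].sum()
    determinism = 2 * diag_points / max(int(R.sum()) - int(np.trace(R)), 1)

    print(f"RR = {recurrence_rate:.3f}, DET = {determinism:.3f}")
    ```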

  17. Automated quantification of myocardial perfusion SPECT using simplified normal limits.

    PubMed

    Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido

    2005-01-01

    To simplify development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m-sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new method of quantification (TPD) was compared with our previously developed quantification system and with visual scoring. The receiver operating characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 ± 0.02) was higher than by visual scoring (0.83 ± 0.03) (P = .039) or standard quantification (0.82 ± 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 ± 0.02) than for standard quantification (0.85 ± 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85%, respectively, for visual scoring; and 80% and 73%, respectively, for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.
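
    The comparison-to-normal-limits step can be pictured with a small amount of code: z-score each polar-map sample against a normal database, flag samples beyond a threshold, and summarize extent and severity. The sketch below does this with synthetic maps and a 2.5-SD cutoff; the maps, the cutoff, and the "TPD-like" severity index are illustrative assumptions, not the published definitions.

    ```python
    # Hedged sketch of polar-map comparison against normal limits.
    import numpy as np

    rng = np.random.default_rng(2)
    normal_mean = np.full((36, 36), 100.0)      # normal-database mean counts
    normal_sd = np.full((36, 36), 8.0)          # normal-database SD

    patient = normal_mean + rng.normal(0, 4, normal_mean.shape)
    patient[10:18, 10:18] -= 40                 # simulated perfusion defect

    z = (patient - normal_mean) / normal_sd     # per-pixel z-scores
    abnormal = z < -2.5

    defect_extent = abnormal.mean() * 100                    # % of polar map
    tpd_like = (np.clip(-z, 0, None)[abnormal].sum()
                / (z.size * 2.5)) * 100                      # extent*severity
    print(f"extent = {defect_extent:.1f}%, TPD-like = {tpd_like:.1f}%")
    ```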

  18. Optimizing total reflection X-ray fluorescence for direct trace element quantification in proteins I: Influence of sample homogeneity and reflector type

    NASA Astrophysics Data System (ADS)

    Wellenreuther, G.; Fittschen, U. E. A.; Achard, M. E. S.; Faust, A.; Kreplin, X.; Meyer-Klaucke, W.

    2008-12-01

    Total reflection X-ray fluorescence (TXRF) is a very promising method for the direct, quick and reliable multi-elemental quantification of trace elements in protein samples. With the introduction of an internal standard consisting of two reference elements, scandium and gallium, a wide range of proteins can be analyzed, regardless of their salt content, buffer composition, additives and amino acid composition. This strategy also enables quantification of matrix effects. Two potential issues associated with drying have been considered in this study: (1) formation of heterogeneous residues of varying thickness and/or density; and (2) separation of the internal standard and protein during drying (which has to be prevented to allow accurate quantification). These issues were investigated by microbeam X-ray fluorescence (μXRF), with special emphasis on (I) the influence of the sample support and (II) the protein/buffer system used. In the first part, a model protein was studied on well-established sample supports used in TXRF, PIXE and XRF (Mylar, siliconized quartz, Plexiglas and silicon). In the second part we imaged proteins of different molecular weight, oligomerization state, bound metals and solubility. A partial separation of protein and internal standard was only observed with untreated silicon, suggesting it may not be an adequate support material. Siliconized quartz proved to be the least prone to heterogeneous drying of the sample and yielded the most reliable results.

  19. Nonlinear stability of solar type 3 radio bursts. 1: Theory

    NASA Technical Reports Server (NTRS)

    Smith, R. A.; Goldstein, M. L.; Papadopoulos, K.

    1978-01-01

    A theory of the excitation of solar type 3 bursts is presented. Electrons initially unstable to the linear bump-in-tail instability are shown to rapidly amplify Langmuir waves to energy densities characteristic of strong turbulence. The three-dimensional equations which describe the strong coupling (wave-wave) interactions are derived. For parameters characteristic of the interplanetary medium the equations reduce to one dimension. In this case, the oscillating two stream instability (OTSI) is the dominant nonlinear instability, and it is stabilized through the production of nonlinear ion density fluctuations that efficiently scatter Langmuir waves out of resonance with the electron beam. An analytical model of the electron distribution function is also developed and used to estimate the total energy losses suffered by the electron beam as it propagates from the solar corona to 1 AU and beyond.

  20. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    PubMed

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.
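
    Distance-based readout reduces quantification to measuring a length and inverting a calibration. The sketch below fits colour-change distance against standard concentrations and back-calculates an unknown; the standards and distances are made-up numbers, and the linear response is an assumption (real devices may need a nonlinear fit).

    ```python
    # Sketch of distance-based quantification on a paper channel.
    import numpy as np

    conc_mM = np.array([1.0, 2.5, 5.0])        # K+ standards
    distance_mm = np.array([4.0, 9.5, 18.0])   # colour-change lengths

    slope, intercept = np.polyfit(conc_mM, distance_mm, 1)

    def concentration_from_distance(d_mm):
        """Invert the calibration line for an unknown sample."""
        return (d_mm - intercept) / slope

    print(f"{concentration_from_distance(12.0):.2f} mM")
    ```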

  1. Renormalization group theory for percolation in time-varying networks.

    PubMed

    Karschau, Jens; Zimmerling, Marco; Friedrich, Benjamin M

    2018-05-22

    Motivated by multi-hop communication in unreliable wireless networks, we present a percolation theory for time-varying networks. We develop a renormalization group theory for a prototypical network on a regular grid, where individual links switch stochastically between active and inactive states. The question whether a given source node can communicate with a destination node along paths of active links is equivalent to a percolation problem. Our theory maps the temporal existence of multi-hop paths on an effective two-state Markov process. We show analytically how this Markov process converges towards a memoryless Bernoulli process as the hop distance between source and destination node increases. Our work extends classical percolation theory to the dynamic case and elucidates temporal correlations of message losses. Quantification of temporal correlations has implications for the design of wireless communication and control protocols, e.g. in cyber-physical systems such as self-organized swarms of drones or smart traffic networks.
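
    The model above is easy to simulate: each link flips between active and inactive states as a two-state Markov chain, and a multi-hop path exists only when all of its links are active. The sketch below estimates path availability and lag-1 autocorrelation for increasing hop counts, illustrating the trend toward memoryless behavior; the switching probabilities are assumptions.

    ```python
    # Toy simulation of stochastically switching links on a serial path.
    import numpy as np

    rng = np.random.default_rng(3)

    def simulate_path(n_links, p_up=0.5, p_down=0.05, steps=20000):
        """Boolean time series: is every link of the path active?"""
        state = np.ones(n_links, dtype=bool)          # start all links active
        path_up = np.empty(steps, dtype=bool)
        for t in range(steps):
            turn_on = (~state) & (rng.random(n_links) < p_up)
            turn_off = state & (rng.random(n_links) < p_down)
            state = (state | turn_on) & ~turn_off
            path_up[t] = state.all()
        return path_up

    for n in (1, 4, 16):
        s = simulate_path(n).astype(float)
        lag1 = np.corrcoef(s[:-1], s[1:])[0, 1]       # memory in availability
        print(f"{n:2d} hops: availability={s.mean():.3f}, lag-1 corr={lag1:.3f}")
    ```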

  2. Simultaneous quantification of amoxicillin and potassium clavulanate in different commercial drugs using PIXE technique

    NASA Astrophysics Data System (ADS)

    Bejjani, A.; Roumié, M.; Akkad, S.; El-Yazbi, F.; Nsouli, B.

    2016-03-01

    We have demonstrated, in previous studies, that Particle Induced X-ray Emission (PIXE) is one of the most rapid and accurate choices for quantification of an active ingredient in a solid drug, from the reactions induced on its specific heteroatom, using pellets made from the original tablets. In this work, PIXE is used, for the first time, for simultaneous quantification of two active ingredients, amoxicillin trihydrate and potassium clavulanate, in six different commercial antibiotic drugs. Since the quality control process of a drug covers a large number of samples, the scope of this study was also to find the most rapid and lowest-cost sample preparation needed to analyze these drugs with good precision. The chosen drugs were analyzed in their tablets' "as received" form, in pellets made from the powder of the tablets, and in pellets made from the powder of the tablets after being heated at 70 °C, to avoid any molecular destruction, until constant weight and removal of humidity. The quantification validity related to each sample preparation (homogeneity of the drug components and humidity) is presented and discussed.

  3. Quantification of maltol in Korean ginseng (Panax ginseng) products by high-performance liquid chromatography-diode array detector

    PubMed Central

    Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won

    2015-01-01

    Background: Maltol, a type of phenolic compound, is produced by the browning reaction during the high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification, coupled with a liquid-liquid extraction (LLE) method, was developed and validated in terms of linearity, precision, and accuracy. The HPLC separation was performed on a C18 column. Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R² = 1.00). The limit of detection for maltol was 0.26 μg/mL, and the limit of quantification was 0.79 μg/mL. The relative standard deviations (RSDs) of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35-101.75%, with an RSD of 0.21-1.65%. The developed method was applied successfully to quantify the maltol in three ginseng products manufactured by different methods. Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method is useful for the quantification of maltol in various ginseng products. PMID:26246746

  4. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and the quantification of its uncertainty is necessary for reliability certification. In quantifying uncertainty, the most important tasks are to analyze how uncertainties arise and propagate, and how simulations develop from benchmark models to new models. Based on the practical needs of engineering and the techniques of verification and validation, a framework for QU (quantification of uncertainty) is put forward for the case in which simulation of a detonation system is used for scientific prediction. An example is offered to illustrate the general idea of quantification of simulation uncertainties.

  5. Theory of p-type Zinc Oxide

    NASA Astrophysics Data System (ADS)

    Zhang, Shengbai

    2002-03-01

    Recent advances in bipolar doping of wide gap semiconductors challenge our understanding of impurity and defect properties in these materials, as theories based on equilibrium thermodynamics cannot keep up with these recent developments. For ZnO, the puzzling experimental results involve doping with nitrogen [M. Joseph, H. Tabata, and T. Kawai, Jpn. J. Appl. Phys. 38, L1205 (1999)], arsenic [Y. R. Ryu, S. Zhu, D. C. Look, J. M. Wrobel, H. M. Jeong, and H. W. White, J. Cryst. Growth 216, 330 (2000)], and phosphorus [T. Aoki, Y. Hatanaka, and D. C. Look, Appl. Phys. Lett. 76, 3257 (2000)]. In this talk, I will discuss some recent theoretical efforts trying to explain the experiments by first-principles total energy calculations. I will first discuss the acceptor level positions for group I and group V impurities. A general trend is observed [C. H. Park, S. B. Zhang, and S.-H. Wei, submitted to Phys. Rev. B] that substitutional group V impurities on O range from relatively deep (e.g. N) to very deep (e.g. P and As) with high formation energies, whereas substitutional group I impurities on Zn are shallow acceptors. However, substitutional group I impurities are unstable against the formation of interstitials that are donors. A careful examination of the nitrogen-doping experiments suggests that one can kinetically suppress the formation of N2 molecules by engineering dopant sources [Y. Yan, S. B. Zhang, and S. T. Pantelides, Phys. Rev. Lett. 86, 5723 (2001)]. This leads to significantly enhanced N solubility and hence p-type ZnO. For As, our preliminary studies show that the formation energy of AsO is so high that it is an exothermic process to form low-energy complexes that act effectively as relatively shallow acceptors.

  6. Development and application of bio-sample quantification to evaluate stability and pharmacokinetics of inulin-type fructo-oligosaccharides from Morinda Officinalis.

    PubMed

    Chi, Liandi; Chen, Lingxiao; Zhang, Jiwen; Zhao, Jing; Li, Shaoping; Zheng, Ying

    2018-07-15

    Inulin-type fructooligosaccharides (FOS) purified from Morinda officinalis, with degrees of polymerization (DP) from 3 to 9, have been approved in China as an oral prescription drug for mild and moderate depressive episodes, but the stability and oral absorption of this FOS mixture are largely unknown. As the main active component and quality-control marker for the FOS, DP5 was selected as the representative FOS in this study. A desalting method using ion exchange resin was developed to treat bio-samples, followed by separation and quantification by high performance liquid chromatography with a charged aerosol detector. Results showed that DP5 was hydrolyzed stepwise in simulated gastric fluid and by gut microbiota, while remaining stable in intestinal fluid. DP5 has poor permeability across the Caco-2 monolayer, with a Papp of 5.22 × 10⁻⁷ cm/s, and very poor oral absorption, with a bioavailability of (0.50 ± 0.12)% in rats. In conclusion, FOS in Morinda officinalis demonstrated poor chemical stability in simulated gastric fluid and human gut microbiota, and low oral absorption in rats. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Rapid quantification and sex determination of forensic evidence materials.

    PubMed

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.

  8. [Consanguinity between meridian theory and Bianque's pulse theory].

    PubMed

    Huang, Longxiang

    2015-05-01

    The integral meridian theory is composed of five parts: meridian course, syndrome, diagnostic method, treating principle, and treatment, and its core is the meridian syndrome. Multiple lines of evidence show that the meridian syndrome induced by pathological change in a meridian, and the death syndrome of the pulse penetrating or attaching to it, both originated from Bianque's facial-color and pulse diagnosis. Regarding the pulse syndrome, there were many different interpretations based on the theory of yin-yang in the four seasons before the Han Dynasty. The emergence of the Biaoben diagnostic method in Bianque's pulse method, and its extensive clinical application, promoted a new theoretical interpretation in which the connection of the meridians interprets the pulse syndrome directly. In addition, with the new development of the blood-pulse theory of Bianque's medicine, a revolution in meridian theory was set off as well, its theoretical paradigm turning from a "tree" type to a "ring" type. In other words, Bianque's medicine not only gave birth to meridian theory but also determined its final development.

  9. Information Theory Applied to Dolphin Whistle Vocalizations with Possible Application to SETI Signals

    NASA Astrophysics Data System (ADS)

    Doyle, Laurance R.; McCowan, Brenda; Hanser, Sean F.

    2002-01-01

    Information theory allows a quantification of the complexity of a given signaling system. We are applying information theory to dolphin whistle vocalizations, humpback whale songs, squirrel monkey chuck calls, and several other animal communication systems in order to develop a quantitative and objective way to compare the complexity of communication systems across species. Once signaling units have been correctly classified, the communication system must obey certain statistical distributions in order to contain complexity, whether it is a human language, dolphin whistle vocalizations, or even a system of communication signals received from an extraterrestrial source.
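
    One of the simplest such quantifications is the first-order Shannon entropy of a classified signal sequence, which measures the average information per signaling unit. The sketch below computes it for a toy sequence of whistle-type labels; the labels are hypothetical.

    ```python
    # First-order Shannon entropy of a sequence of classified signals.
    from collections import Counter
    from math import log2

    def shannon_entropy(sequence):
        counts = Counter(sequence)
        n = len(sequence)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    calls = ["A", "B", "A", "C", "A", "B", "A", "A", "D", "B"]
    print(f"H1 = {shannon_entropy(calls):.3f} bits per signal")
    ```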

  10. Quantification of nitrogenous bases, DNA and Collagen type I for the estimation of the postmortem interval in bone remains.

    PubMed

    Pérez-Martínez, Cristina; Pérez-Cárceles, María D; Legaz, Isabel; Prieto-Bonete, Gemma; Luna, Aurelio

    2017-12-01

    Estimating the postmortem interval (PMI) is an important goal in forensic medicine and continues to be one of the most difficult tasks of the forensic investigator. Few accurate methods exist to determine the time since death of skeletonized human remains, due to the great number of intrinsic and external factors that may alter the normal course of postmortem change. The purpose of this research was to assess the usefulness of various biochemical parameters, such as nitrogenous bases (adenine, guanine, purines, cytosine, thymine, pyrimidines, hypoxanthine and xanthine), DNA and Collagen type I peptides, for estimating PMI. These parameters were analysed in the cortical bone of 80 long bones from 80 corpses (50 males, 30 females) with a mean age of 68.31 years (S.D.=18.021, range=20-97). The bones were removed from the cement niches of a cemetery in Murcia (south-eastern Spain), where they had lain for between 5 and 47 years (mean time 23.83 years, S.D.=10.85). Our results show a significant decrease in adenine (p=0.0004), guanine (p=0.0001), purines (p=0.0001), cytosine (p=0.0001), thymine (p=0.0226), pyrimidines (p=0.0002) and the number of peptides of Collagen type I (p=0.0053) in those with a PMI≥20 years. In a curvilinear regression analysis, the results show that 30.6% of the variable PMI could be explained by guanine concentration in bones with a PMI<20 years, while in cases of a PMI≥20 years, the variable that best explained membership of this group was adenine (38.0%). In the discriminant analysis applied to all the variables as a function of PMI, when two groups were established, 86.7% of the cases were correctly classified. These results show that the quantification of Collagen type I proteins and nitrogenous bases could be used as a complementary tool, together with other analyses, in the estimation of PMI. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Use of a medication quantification scale for comparison of pain medication usage in patients with complex regional pain syndrome (CRPS).

    PubMed

    Gallizzi, Michael A; Khazai, Ravand S; Gagnon, Christine M; Bruehl, Stephen; Harden, R Norman

    2015-03-01

    To correlate the amount and types of pain medications prescribed to CRPS patients, using the Medication Quantification Scale, with patients' subjective pain levels. An international, multisite, retrospective review. University medical centers in the United States, Israel, Germany, and the Netherlands. A total of 89 subjects were enrolled from four different countries: 27 from the United States, 20 from Germany, 18 from the Netherlands, and 24 from Israel. The main outcome measures used were the Medication Quantification Scale III and the numerical analog pain scale. There was no statistically significant correlation between the Medication Quantification Scale and the visual analog scale for any site except for a moderate positive correlation at the German sites. The Medication Quantification Scale mean differences between the United States and Germany, the Netherlands, and Israel were 9.793 (P < 0.002), 10.389 (P < 0.001), and 4.984 (P = 0.303), respectively. There appears to be only a weak correlation between the amount of pain medication prescribed and patients' reported subjective pain intensity within this limited patient population. The Medication Quantification Scale is a viable tool for the analysis of pharmaceutical treatment of CRPS patients and would be useful in further prospective studies of pain medication prescription practices in the CRPS population worldwide. Wiley Periodicals, Inc.

  12. Development and validation of an RP-HPLC method for quantification of cinnamic acid derivatives and kaurane-type diterpenes in Mikania laevigata and Mikania glomerata.

    PubMed

    Bertolucci, Suzan Kelly; Pereira, Ana Bárbara; Pinto, José Eduardo; de Aquino Ribeiro, José Antônio; de Oliveira, Alaíde Braga; Braga, Fernão Castro

    2009-02-01

    Mikania glomerata and Mikania laevigata (Asteraceae) are medicinal plants popularly named 'guaco' in Brazil. The leaves of both species are used to treat respiratory diseases, with coumarin (CO) and kaurane-type diterpenes being regarded as the bioactive constituents. A new and simple RP-HPLC method was developed and validated for the simultaneous quantification of CO, o-coumaric (OC), benzoylgrandifloric (BA), cinnamoylgrandifloric (CA) and kaurenoic (KA) acids in the species. Optimal separation was achieved with an alternating gradient elution of methanol and acetonitrile, and detection was carried out by DAD at three different wavelengths: 210 nm for CO, OC and KA; 230 nm for BA; and 270 nm for CA. The extracts showed good stability during 42 hours under normal laboratory conditions (temperature of 23 ± 2 °C). The standard curves were linear over the ranges 0.5-5.0 μg (CO), 0.25-4.0 μg (OC), 1.0-8.0 μg (BA), 0.5-3.0 μg (CA) and 0.8-12.0 μg (KA), with R² > 0.999 for all compounds. The method showed good precision for intra-day (RSD < 4.6%) and inter-day assays (RSD < 4.4%). The recovery was between 99.9 and 105.3%, except for CO and OC in M. glomerata (73.2-91.6% and 86.3-117.4%, respectively). The limits of quantification and detection were in the ranges 0.025-0.800 μg and 0.007-0.240 μg. The method was tested with new and old columns, with temperature variation (26 and 28 °C) and by different operators in the same laboratory. The method was successfully applied to samples of both species.

  13. A universal real-time PCR assay for the quantification of group-M HIV-1 proviral load.

    PubMed

    Malnati, Mauro S; Scarlatti, Gabriella; Gatto, Francesca; Salvatori, Francesca; Cassina, Giulia; Rutigliano, Teresa; Volpi, Rosy; Lusso, Paolo

    2008-01-01

    Quantification of human immunodeficiency virus type-1 (HIV-1) proviral DNA is increasingly used to measure the HIV-1 cellular reservoirs, a helpful marker to evaluate the efficacy of antiretroviral therapeutic regimens in HIV-1-infected individuals. Furthermore, the proviral DNA load represents a specific marker for the early diagnosis of perinatal HIV-1 infection and might be predictive of HIV-1 disease progression independently of plasma HIV-1 RNA levels and CD4(+) T-cell counts. The high degree of genetic variability of HIV-1 poses a serious challenge for the design of a universal quantitative assay capable of detecting all the genetic subtypes within the main (M) HIV-1 group with similar efficiency. Here, we describe a highly sensitive real-time PCR protocol that allows for the correct quantification of virtually all group-M HIV-1 strains with a higher degree of accuracy compared with other methods. The protocol involves three stages, namely DNA extraction/lysis, cellular DNA quantification and HIV-1 proviral load assessment. Owing to the robustness of the PCR design, this assay can be performed on crude cellular extracts, and therefore it may be suitable for the routine analysis of clinical samples even in developing countries. An accurate quantification of the HIV-1 proviral load can be achieved within 1 d from blood withdrawal.
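
    Real-time PCR quantification of this kind rests on a standard curve: Ct is linear in log10 of the input copy number, and the fitted line is inverted for unknowns (amplification efficiency follows from the slope). The sketch below shows that arithmetic with invented Ct values; it illustrates the general method, not this assay's calibration data.

    ```python
    # Standard-curve quantification for real-time PCR.
    import numpy as np

    std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])   # calibrating standards
    std_ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])  # invented Ct values

    slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1     # ~1.0 means 100% efficient

    def copies_from_ct(ct):
        """Invert the standard curve for an unknown sample."""
        return 10 ** ((ct - intercept) / slope)

    print(f"efficiency = {efficiency:.2f}")
    print(f"unknown with Ct 27.5 -> {copies_from_ct(27.5):.0f} copies")
    ```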

  14. Quantification and characterization of grouped type I myofibers in human aging.

    PubMed

    Kelly, Neil A; Hammond, Kelley G; Stec, Michael J; Bickel, C Scott; Windham, Samuel T; Tuggle, S Craig; Bamman, Marcas M

    2018-01-01

    Myofiber type grouping is a histological hallmark of age-related motor unit remodeling. Despite the accepted concept that denervation-reinnervation events lead to myofiber type grouping, the completeness of those conversions remains unknown. Type I myofiber grouping was assessed in vastus lateralis biopsies from Young (26 ± 4 years; n = 27) and Older (66 ± 4 years; n = 91) adults. Grouped and ungrouped type I myofibers were evaluated for phenotypic differences. Higher type I grouping in Older versus Young was driven by more myofibers per group (i.e., larger group size) (P < 0.05). In Older only, grouped type I myofibers displayed larger cross-sectional area, more myonuclei, lower capillary supply, and more sarco(endo)plasmic reticulum calcium ATPase I (SERCA I) expression (P < 0.05) than ungrouped type I myofibers. Grouped type I myofibers retain type II characteristics suggesting that conversion during denervation-reinnervation events is either progressive or incomplete. Muscle Nerve 57: E52-E59, 2018. © 2017 Wiley Periodicals, Inc.

  15. Collaborative study on saccharide quantification of the Haemophilus influenzae type b component in liquid vaccine presentations.

    PubMed

    Rosskopf, U; Daas, A; Terao, E; von Hunolstein, C

    2017-01-01

    Before release onto the market, it must be demonstrated that the total and free polysaccharide (poly ribosyl-ribitol-phosphate, PRP) content of Haemophilus influenzae type b (Hib) vaccine complies with requirements. However, manufacturers use different methods to assay PRP content: a national control laboratory must establish and validate the relevant manufacturer methodology before using it to determine PRP content. An international study was organised by the World Health Organization (WHO), in collaboration with the Biological Standardisation Programme (BSP) of the Council of Europe/European Directorate for the Quality of Medicines & HealthCare (EDQM) and of the European Union Commission, to verify the suitability of a single method for determining PRP content in liquid pentavalent vaccines (DTwP-HepB-Hib) containing a whole-cell pertussis component. It consists of HCl hydrolysis followed by chromatographic separation and quantification of ribitol on a CarboPac MA1 column using high-performance anion exchange chromatography coupled with pulsed amperometric detection (HPAEC-PAD). The unconjugated, free, PRP is separated from the total PRP using C4 solid-phase extraction cartridges (SPE C4). Ten quality control laboratories performed two independent analyses applying the proposed analytical test protocol to five vaccine samples, including a vaccine lot with sub-potent PRP content and very high free PRP content. Both WHO PRP standard and ribitol reference standard were included as calibrating standards. A significant bias between WHO PRP standard and ribitol reference standard was observed. Study results showed that the proposed analytical method is, in principle, suitable for the intended use provided that a validation is performed as usually expected from quality control laboratories.

  16. Physical activity: The importance of the extended theory of planned behavior, in type 2 diabetes patients.

    PubMed

    Ferreira, Gabriela; Pereira, M Graça

    2017-09-01

    This study focused on the contribution of the extended theory of planned behavior regarding intention to perform physical activity, adherence to physical activity, and its mediator role in the relationship between trust in the physician and adherence to physical activity, in a sample of 120 patients with type 2 diabetes. The results revealed that positive attitudes and perception of control predicted a stronger intention to do physical activity. The intention to do physical activity was the only predictor of adherence to physical activity. Planning mediated the relationship between trust in the physician and adherence. Implications for patients with type 2 diabetes are discussed.

  17. Survey of Existing Uncertainty Quantification Capabilities for Army Relevant Problems

    DTIC Science & Technology

    2017-11-27

    ARL-TR-8218, November 2017. US Army Research Laboratory. Survey of Existing Uncertainty Quantification Capabilities for Army-Relevant Problems, by James J Ramsey.

  18. Application of type synthesis theory to the redesign of a complex surgical instrument.

    PubMed

    Lim, Jonas J B; Erdman, Arthur G

    2002-06-01

    Surgical instruments consist of basic mechanical components such as gears, links, pivots, and sliders, which are common in mechanical design. This paper describes the application of a method to the analysis and design of complex surgical instruments such as those employed in laparoscopic surgery. This is believed to be the first application of type synthesis theory to a complex medical instrument. Type synthesis is a methodology that can be applied during the conceptual phase of mechanical design. A handle assembly from a patented laparoscopic surgical stapler is used to illustrate the application of the design method developed. Type synthesis is applied to specific subsystems of the mechanism within the handle assembly, where alternative design concepts are generated. Chosen concepts are then combined to form a new conceptual design for the handle assembly. The new handle assembly is improved because it has fewer parts, a simpler design, and easier assembly. Surgical instrument designers may use the methodology presented here to analyze the mechanical subsystems within complex instruments and to create new options that may offer improvements over the original design.

  19. Abelian gauge symmetries in F-theory and dual theories

    NASA Astrophysics Data System (ADS)

    Song, Peng

    In this dissertation, we focus on important physical and mathematical aspects, especially abelian gauge symmetries, of F-theory compactifications and their dual formulations within type IIB and heterotic string theory. F-theory is a non-perturbative formulation of type IIB string theory which enjoys important dualities with other string theories such as M-theory and E8 x E8 heterotic string theory. One of the main strengths of F-theory is its geometrization of many physical problems in the dual string theories. In particular, its study requires many mathematical tools, such as advanced techniques in algebraic geometry. Thus, it has also received a lot of interest among mathematicians and is a vivid area of research within both the physics and the mathematics community. Although F-theory is a long-standing theory, abelian gauge symmetry in F-theory had rarely been studied until recently. Within the mathematics community, in 2009, Grassi and Perduca first discovered the possibility of constructing elliptically fibered varieties with non-trivial toric Mordell-Weil group. In the physics community, in 2012, Morrison and Park made a major advancement by constructing general F-theory compactifications with U(1) abelian gauge symmetry. They found that in such cases, the elliptically fibered Calabi-Yau manifold on which F-theory is compactified has as its fiber a generic elliptic curve in the blow-up of the weighted projective space P(1,1,2) at one point. In subsequent developments, Cvetic, Klevers and Piragua extended the work of Morrison and Park and constructed general F-theory compactifications with U(1) x U(1) abelian gauge symmetry. They found that in the U(1) x U(1) case, the elliptically fibered Calabi-Yau manifold on which F-theory is compactified has as its fiber a generic elliptic curve in the del Pezzo surface dP2. In chapter 2 of this dissertation, I bring this a step further by

  20. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary
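
    Because exact quantification hinges on matching amplification efficiencies between sample and reference, the standard slope-to-efficiency calculation is sketched below (the formula E = 10^(-1/slope) - 1 is the textbook one; the dilution data are invented):

        # PCR efficiency from a standard-curve slope (illustrative data)
        import numpy as np

        log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])   # serial dilution
        cq = np.array([18.1, 21.5, 24.9, 28.3, 31.8])        # measured Cq values

        slope, _ = np.polyfit(log10_copies, cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0              # 1.0 means 100%

        # A slope near -3.32 (efficiency near 100%) for both sample and
        # standard reference material is the prerequisite noted above.
        print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")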

  1. Development of defined microbial population standards using fluorescence activated cell sorting for the absolute quantification of S. aureus using real-time PCR.

    PubMed

    Martinon, Alice; Cronin, Ultan P; Wilkinson, Martin G

    2012-01-01

    In this article, four types of standards were assessed in a SYBR Green-based real-time PCR procedure for the quantification of Staphylococcus aureus (S. aureus) in DNA samples. The standards were purified S. aureus genomic DNA (type A), circular plasmid DNA containing a thermonuclease (nuc) gene fragment (type B), and DNA extracted from defined populations of S. aureus cells generated by Fluorescence Activated Cell Sorting (FACS) technology with (type C) or without purification of DNA by boiling (type D). The optimal efficiency of 2.016 was obtained on Roche LightCycler® 4.1 software for type C standards, whereas the lowest efficiency (1.682) corresponded to type D standards. Type C standards appeared to be more suitable for quantitative real-time PCR because of the use of defined populations for the construction of standard curves. Overall, the Fieller Confidence Interval algorithm may be improved for replicates having a low standard deviation in Cycle Threshold values, such as found for type B and C standards. Stabilities of diluted PCR standards stored at -20°C were compared after 0, 7, 14 and 30 days and were lower for type A and C standards compared with type B standards. However, FACS-generated standards may be useful for bacterial quantification in real-time PCR assays once optimal storage and temperature conditions are defined.

  2. Sequence optimization to reduce velocity offsets in cardiovascular magnetic resonance volume flow quantification - A multi-vendor study

    PubMed Central

    2011-01-01

    Purpose Eddy current induced velocity offsets are of concern for accuracy in cardiovascular magnetic resonance (CMR) volume flow quantification. However, currently known theoretical aspects of eddy current behavior have not led to effective guidelines for the optimization of flow quantification sequences. This study is aimed at identifying correlations between protocol parameters and the resulting velocity error in clinical CMR flow measurements in a multi-vendor study. Methods Nine 1.5T scanners of three different types/vendors were studied. Measurements were performed on a large stationary phantom. Starting from a clinical breath-hold flow protocol, several protocol parameters were varied. Acquisitions were made in three clinically relevant orientations. Additionally, a time delay between the bipolar gradient and read-out, asymmetric versus symmetric velocity encoding, and gradient amplitude and slew rate were studied in adapted sequences as exploratory measurements beyond the protocol. Image analysis determined the worst-case offset for a typical great-vessel flow measurement. Results The results showed a great variation in offset behavior among scanners (standard deviation among samples of 0.3, 0.4, and 0.9 cm/s for the three different scanner types), even for small changes in the protocol. Considering the absolute values, none of the tested protocol settings consistently reduced the velocity offsets below the critical level of 0.6 cm/s neither for all three orientations nor for all three scanner types. Using multilevel linear model analysis, oblique aortic and pulmonary slices showed systematic higher offsets than the transverse aortic slices (oblique aortic 0.6 cm/s, and pulmonary 1.8 cm/s higher than transverse aortic). The exploratory measurements beyond the protocol yielded some new leads for further sequence development towards reduction of velocity offsets; however those protocols were not always compatible with the time-constraints of breath

  3. Hydrologic Impacts of Climate Change: Quantification of Uncertainties (Alexander von Humboldt Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Mujumdar, Pradeep P.

    2014-05-01

    with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.

  4. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing for quantitative purposes as well, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, especially for mixed samples. New approaches are also needed for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.

  5. Prediction of autosomal STR typing success in ancient and Second World War bone samples.

    PubMed

    Zupanič Pajnič, Irena; Zupanc, Tomaž; Balažic, Jože; Geršak, Živa Miriam; Stojković, Oliver; Skadrić, Ivan; Črešnar, Matija

    2017-03-01

    Human-specific quantitative PCR (qPCR) has been developed for forensic use in the last 10 years and is the preferred DNA quantification technique, since it is very accurate, sensitive, objective, time-effective and automatable. The amount of information that can be gleaned from a single quantification reaction using commercially available quantification kits has increased from the quantity of nuclear DNA to the amount of male DNA, the presence of inhibitors and, most recently, the degree of DNA degradation. In skeletal remains samples from disaster victims, missing persons and war conflict victims, the DNA is usually degraded. Therefore the new commercial qPCR kits able to assess the degree of degradation are potentially able to predict the success of downstream short tandem repeat (STR) typing. The goal of this study was to verify the quantification step using the PowerQuant kit with regard to its suitability as a screening method for autosomal STR typing success on ancient and Second World War (WWII) skeletal remains. We analysed 60 skeletons excavated from five archaeological sites and four WWII mass graves in Slovenia. The bones were cleaned, surface contamination was removed and the bones were ground to a powder. Genomic DNA was obtained from 0.5 g of bone powder after total demineralization. The DNA was purified using a Biorobot EZ1 device. Following PowerQuant quantification, DNA samples were subjected to autosomal STR amplification using the NGM kit. Up to 2.51 ng DNA/g of powder were extracted. No inhibition was detected in any of the bones analysed. 82% of the WWII bones gave full profiles, while 73% of the ancient bones gave profiles not suitable for interpretation. Four bone extracts yielded no detectable amplification or zero quantification results, and no profiles were obtained from any of them. Full or useful partial profiles were produced only from bone extracts where short autosomal (Auto) and long degradation (Deg) PowerQuant targets were detected. It is
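
    As a rough illustration of how a short-target/long-target kit output can be turned into a screening decision (the ratio cutoff below is an arbitrary assumption for illustration, not the PowerQuant specification):

        # Degradation-based screen for STR typing success (illustrative logic)
        def predict_str_success(auto_ng_ul, deg_ng_ul, ratio_cutoff=2.0):
            if auto_ng_ul == 0:   # short target undetected: no DNA to type
                return "no profile expected"
            if deg_ng_ul == 0:    # long target dropped out entirely
                return "profile unlikely (severe degradation)"
            ratio = auto_ng_ul / deg_ng_ul  # rises as DNA degrades
            return "degraded sample" if ratio > ratio_cutoff else "intact sample"

        print(predict_str_success(0.05, 0.01))   # ratio 5 -> "degraded sample"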

  6. Bianchi type-I domain walls with negative constant deceleration parameter in Brans-Dicke theory

    NASA Astrophysics Data System (ADS)

    Katore, S. D.

    2011-04-01

    Bianchi type-I space-time is considered in the presence of a domain wall source in the scalar-tensor theory of gravitation proposed by Brans and Dicke (C.H. Brans and R.H. Dicke, Phys. Rev. 124, 925 (1961)). With the help of the special law of variation for Hubble's parameter proposed by Berman (M.S. Berman, Nuovo Cimento B 74, 182 (1983)), a cosmological model with negative constant deceleration parameter is obtained in the presence of domain walls. Some physical properties of the model are also discussed.

  7. Recent application of quantification II in Japanese medical research.

    PubMed Central

    Suzuki, T; Kudo, A

    1979-01-01

    Hayashi's Quantification II is a method of multivariate discrimination analysis that handles attribute data as predictor variables. It is very useful in the medical research field for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on a multiplicity of attribute data. In Japan, this method is so well known that most computer program packages include the Hayashi Quantification, but the method still seems to be unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles on recent applications of Quantification II in Japanese medical research. In reviewing these papers, special mention is made of how well the findings provided by the method served the researchers. At the same time, some recommendations are made about terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587

  8. Game Theory and Uncertainty Quantification for Cyber Defense Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna

    Cyber-system defenders face the challenging task of protecting critical assets and information continually against multiple types of malicious attackers. Defenders typically operate within resource constraints while attackers operate at relatively low costs. As a result, design and development of resilient cyber-systems that can support mission goals under attack while accounting for the dynamics between attackers and defenders is an important research problem.

  9. Quantification of differential gene expression by multiplexed targeted resequencing of cDNA

    PubMed Central

    Arts, Peer; van der Raadt, Jori; van Gestel, Sebastianus H.C.; Steehouwer, Marloes; Shendure, Jay; Hoischen, Alexander; Albers, Cornelis A.

    2017-01-01

    Whole-transcriptome or RNA sequencing (RNA-Seq) is a powerful and versatile tool for functional analysis of different types of RNA molecules, but sample reagent and sequencing cost can be prohibitive for hypothesis-driven studies where the aim is to quantify differential expression of a limited number of genes. Here we present an approach for quantification of differential mRNA expression by targeted resequencing of complementary DNA using single-molecule molecular inversion probes (cDNA-smMIPs) that enable highly multiplexed resequencing of cDNA target regions of ∼100 nucleotides and counting of individual molecules. We show that accurate estimates of differential expression can be obtained from molecule counts for hundreds of smMIPs per reaction and that smMIPs are also suitable for quantification of relative gene expression and allele-specific expression. Compared with low-coverage RNA-Seq and a hybridization-based targeted RNA-Seq method, cDNA-smMIPs are a cost-effective high-throughput tool for hypothesis-driven expression analysis in large numbers of genes (10 to 500) and samples (hundreds to thousands). PMID:28474677

  10. The parallel reaction monitoring method contributes to a highly sensitive polyubiquitin chain quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsuchiya, Hikaru; Tanaka, Keiji, E-mail: tanaka-kj@igakuken.or.jp; Saeki, Yasushi, E-mail: saeki-ys@igakuken.or.jp

    2013-06-28

    Highlights: •The parallel reaction monitoring method was applied to ubiquitin quantification. •The ubiquitin PRM method is highly sensitive even in biological samples. •Using the method, we revealed that Ufd4 assembles the K29-linked ubiquitin chain. -- Abstract: Ubiquitylation is an essential posttranslational protein modification that is implicated in a diverse array of cellular functions. Although cells contain eight structurally distinct types of polyubiquitin chains, the detailed function of several chain types, including K29-linked chains, has remained largely unclear. Current mass spectrometry (MS)-based quantification methods are highly inefficient for low-abundance atypical chains, such as K29- and M1-linked chains, in complex mixtures that typically contain highly abundant proteins. In this study, we applied parallel reaction monitoring (PRM), a quantitative, high-resolution MS method, to quantify ubiquitin chains. The ubiquitin PRM method allows us to quantify 100 attomole amounts of all possible ubiquitin chains in cell extracts. Furthermore, we quantified ubiquitylation levels of ubiquitin-proline-β-galactosidase (Ub-P-βgal), a historically known model substrate of the ubiquitin fusion degradation (UFD) pathway. In wild-type cells, Ub-P-βgal is modified with ubiquitin chains consisting of 21% K29- and 78% K48-linked chains. In contrast, K29-linked chains are not detected in UFD4 knockout cells, suggesting that Ufd4 assembles the K29-linked ubiquitin chain(s) on Ub-P-βgal in vivo. Thus, the ubiquitin PRM is a novel, useful, quantitative method for analyzing the highly complicated ubiquitin system.

  11. Current trends in nursing theories.

    PubMed

    Im, Eun-Ok; Chang, Sun Ju

    2012-06-01

    To explore current trends in nursing theories through an integrated literature review. The literature related to nursing theories during the past 10 years was searched through multiple databases and reviewed to determine themes reflecting current trends in nursing theories. The trends can be categorized into six themes: (a) foci on specifics; (b) coexistence of various types of theories; (c) close links to research; (d) international collaborative works; (e) integration into practice; and (f) selective evolution. We need to continue our efforts to link research and practice to theories, to identify the specifics of our theories, to develop diverse types of theories, and to conduct international collaborative work. Our paper has implications for future theoretical development in diverse clinical areas of nursing research and practice.

  12. Artifacts Quantification of Metal Implants in MRI

    NASA Astrophysics Data System (ADS)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study, an image gradient based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact is then quantified in terms of its extent, as an image area percentage, by an automated cross entropy thresholding method. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman's rho = 0.62 and 0.802 in the case of titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
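
    A compact sketch of the described pipeline, assuming scikit-image's Li minimum cross entropy threshold as the automated step (the paper's exact implementation may differ):

        # Gradient-based susceptibility artifact quantification (sketch)
        import numpy as np
        from skimage.filters import threshold_li

        def artifact_area_percent(image):
            gy, gx = np.gradient(image.astype(float))  # abrupt signal changes
            grad_mag = np.hypot(gx, gy)
            t = threshold_li(grad_mag)          # minimum cross entropy threshold
            mask = grad_mag > t                 # candidate artifact pixels
            return 100.0 * mask.sum() / mask.size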

  13. Uncertainty quantification for optical model parameters

    DOE PAGES

    Lovell, A. E.; Nunes, F. M.; Sarich, J.; ...

    2017-02-21

    Although uncertainty quantification has been making its way into nuclear theory, these methods have yet to be explored in the context of reaction theory. For example, it is well known that different parameterizations of the optical potential can result in different cross sections, but these differences have not been systematically studied and quantified. The purpose of our work is to investigate the uncertainties in nuclear reactions that result from fitting a given model to elastic-scattering data, as well as to study how these uncertainties propagate to the inelastic and transfer channels. We use statistical methods to determine a best fit and create corresponding 95% confidence bands. A simple model of the process is fit to elastic-scattering data and used to predict either inelastic or transfer cross sections. In this initial work, we assume that our model is correct, and the only uncertainties come from the variation of the fit parameters. Here, we study a number of reactions involving neutron and deuteron projectiles with energies in the range of 5–25 MeV/u, on targets with mass A=12–208. We investigate the correlations between the parameters in the fit. The case of deuterons on 12C is discussed in detail: the elastic-scattering fit and the prediction of 12C(d,p)13C transfer angular distributions, using both uncorrelated and correlated χ2 minimization functions. The general features for all cases are compiled in a systematic manner to identify trends. This work shows that, in many cases, the correlated χ2 functions (in comparison to the uncorrelated χ2 functions) provide a more natural parameterization of the process. These correlated functions do, however, produce broader confidence bands. Further optimization may require improvement in the models themselves and/or more information included in the fit.
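
    For readers unfamiliar with the two objective functions being contrasted, a minimal sketch follows (generic numpy, not the authors' code):

        # Uncorrelated vs. correlated chi-squared objective functions
        import numpy as np

        def chi2_uncorrelated(residuals, sigma):
            return np.sum((residuals / sigma) ** 2)

        def chi2_correlated(residuals, cov):
            # r^T C^{-1} r; off-diagonal covariance couples the data points,
            # and the expression reduces to the uncorrelated form when C is
            # diagonal with entries sigma**2.
            return residuals @ np.linalg.solve(cov, residuals)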

  14. Collagen Quantification in Tissue Specimens.

    PubMed

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  15. Use of the Hage framework for theory construction: Factors affecting glucose control in the college-aged student with type 1 diabetes.

    PubMed

    Meyer, Rebecca A; Fish, Anne F; Lou, Qinqing

    2017-10-01

    This article describes the Hage framework for theory construction and its application to the clinical problem of glycemic control in college-aged students with type 1 diabetes. College-aged students with type 1 diabetes struggle to self-manage their condition. Glycated hemoglobin (HbA1c), if controlled within acceptable limits (6-8%), is associated with the prevention or delay of serious diabetic complications such as kidney and cardiovascular disease. Diabetes educators provide knowledge and skills, but young adults must self-manage their condition on a daily basis, independent of parents. The Hage framework includes five tasks of theory construction: narrowing and naming the concepts, specifying the definitions, creating the theoretical statements, specifying the linkages, and ordering components in preparation for model building. During the process, concepts within the theory were revised as the literature was reviewed, and measures and hypotheses, foundational to research, were generated. We were successful in applying the framework and creating a model of factors affecting glycemic control, emphasizing that physical activity, thought of as a normal part of wellness, can be a two-edged sword, producing positive effects but also serious negative effects in some college-aged students with type 1 diabetes. Contextual factors important to self-management in college-aged students are emphasized. The Hage framework, already used to a small extent in nursing curricula, deserves more attention and, because of its generic nature, may be used as a template for theory construction to examine a wide variety of nursing topics.

  16. Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures

    NASA Technical Reports Server (NTRS)

    Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo

    2014-01-01

    This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high frequency guided wave interrogation. The proposed methodologies can be considered as first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to the considered type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. Simulation data is utilized to enhance the information provided by instantaneous and local wavenumbers and mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through the dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.

  17. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359
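
    As one concrete example of such peptide-to-protein post-processing, the widely used "top-N" roll-up is sketched below; it is an illustrative choice only, not necessarily the rule the authors recommend:

        # "Top-N" protein roll-up from peptide-level MS1 intensities
        import numpy as np

        def protein_intensity(peptide_intensities, n=3):
            """Mean of the n most intense peptides assigned to the protein."""
            top = sorted(peptide_intensities, reverse=True)[:n]
            return float(np.mean(top))

        print(protein_intensity([4.1e6, 9.8e5, 3.3e6, 1.2e5]))  # uses top 3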

  18. The early life origin theory in the development of cardiovascular disease and type 2 diabetes.

    PubMed

    Lindblom, Runa; Ververis, Katherine; Tortorella, Stephanie M; Karagiannis, Tom C

    2015-04-01

    Life expectancy has been examined from a variety of perspectives in recent history. Epidemiology is one perspective which examines causes of morbidity and mortality at the population level. Over the past few hundred years there have been dramatic shifts in the major causes of death and expected life length. This change has been inconsistent across time and space, with vast inequalities observed between population groups. Currently in focus is the challenge of rising non-communicable diseases (NCDs), such as cardiovascular disease and type 2 diabetes mellitus. In the search for methods to combat the rising incidence of these diseases, a number of new theories on the development of morbidity have arisen. A pertinent example is the hypothesis published by David Barker in 1995, which postulates the prenatal and early developmental origin of adult onset disease and highlights the importance of the maternal environment. This theory has been subject to criticism; however, it has gradually gained acceptance. In addition, the relatively new field of epigenetics is contributing evidence in support of the theory. This review aims to explore the implications and limitations of the developmental origin hypothesis, via an historical perspective, in order to enhance understanding of the increasing incidence of NCDs, and to facilitate an improvement in planning public health policy.

  19. Group field theory with noncommutative metric variables.

    PubMed

    Baratin, Aristide; Oriti, Daniele

    2010-11-26

    We introduce a dual formulation of group field theories as a type of noncommutative field theory, making their simplicial geometry manifest. For Ooguri-type models, the Feynman amplitudes are simplicial path integrals for BF theories. We give a new definition of the Barrett-Crane model for gravity by imposing the simplicity constraints directly at the level of the group field theory action.

  20. THE EFFECT OF FORWARD SPEED ON DAMPING FOR A VARIETY OF SHIP TYPES AS CALCULATED BY THIN SHIP THEORY,

    DTIC Science & Technology

    Since the damping coefficients play a predominant role in the motion response of ships in pitch and heave at resonant frequencies in a seaway, use was made of two computer programs recently developed at M.I.T. to calculate, by thin ship theory, the effect of ship speed on the damping coefficients in pitch and heave for four diverse types of ship hulls--cargo ship, tanker, destroyer, and trawler. Results indicate that, for all four hull types

  1. Quantification of micro stickies

    Treesearch

    Mahendra Doshi; Jeffrey Dyer; Salman Aziz; Kristine Jackson; Said M. Abubakr

    1997-01-01

    The objective of this project was to compare the different methods for the quantification of micro stickies. The hydrophobic materials investigated in this project for the collection of micro stickies were Microfoam* (polypropylene packing material), low density polyethylene film (LDPE), high density polyethylene (HDPE; a flat piece from a square plastic bottle), paper...

  2. Quantification of human responses

    NASA Technical Reports Server (NTRS)

    Steinlage, R. C.; Gantner, T. E.; Lim, P. Y. W.

    1992-01-01

    Human perception is a complex phenomenon which is difficult to quantify with instruments. For this reason, large panels of people are often used to elicit and aggregate subjective judgments. Print quality, taste, smell, sound quality of a stereo system, softness, and grading Olympic divers and skaters are some examples of situations where subjective measurements or judgments are paramount. We usually express what is in our mind through language as a medium but languages are limited in available choices of vocabularies, and as a result, our verbalizations are only approximate expressions of what we really have in mind. For lack of better methods to quantify subjective judgments, it is customary to set up a numerical scale such as 1, 2, 3, 4, 5 or 1, 2, 3, ..., 9, 10 for characterizing human responses and subjective judgments with no valid justification except that these scales are easy to understand and convenient to use. But these numerical scales are arbitrary simplifications of the complex human mind; the human mind is not restricted to such simple numerical variations. In fact, human responses and subjective judgments are psychophysical phenomena that are fuzzy entities and therefore difficult to handle by conventional mathematics and probability theory. The fuzzy mathematical approach provides a more realistic insight into understanding and quantifying human responses. This paper presents a method for quantifying human responses and subjective judgments without assuming a pattern of linear or numerical variation for human responses. In particular, quantification and evaluation of linguistic judgments was investigated.
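
    A toy sketch of the fuzzy alternative described here, with triangular membership functions and centroid defuzzification chosen purely for brevity:

        # A judgment as a fuzzy set over the quality scale, not a single score
        import numpy as np

        scale = np.linspace(0, 10, 101)

        def triangular(x, a, b, c):
            """Membership rising from a to a peak at b, falling to zero at c."""
            return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0, 1)

        judge1 = triangular(scale, 5, 7, 9)    # "fairly soft"
        judge2 = triangular(scale, 6, 8, 10)   # "quite soft"

        aggregate = np.fmax(judge1, judge2)    # union of the two judgments
        centroid = np.sum(scale * aggregate) / np.sum(aggregate)
        print(f"defuzzified panel judgment: {centroid:.2f}")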

  3. Troubling Theory in Case Study Research

    ERIC Educational Resources Information Center

    Hammersley, Martyn

    2012-01-01

    The article begins by examining the variety of meanings that can be given to the word "theory", the different attitudes that may be taken towards theories of these various types and some of the problems associated with them. The second half of the article focuses on one of these types, explanatory theory, and the question of what is required if…

  4. Toward an integrative account of social cognition: marrying theory of mind and interactionism to study the interplay of Type 1 and Type 2 processes

    PubMed Central

    Bohl, Vivian; van den Bos, Wouter

    2012-01-01

    Traditional theory of mind (ToM) accounts for social cognition have been at the basis of most studies in the social cognitive neurosciences. However, in recent years, the need to go beyond traditional ToM accounts for understanding real life social interactions has become all the more pressing. At the same time it remains unclear whether alternative accounts, such as interactionism, can yield a sufficient description and explanation of social interactions. We argue that instead of considering ToM and interactionism as mutually exclusive opponents, they should be integrated into a more comprehensive account of social cognition. We draw on dual process models of social cognition that contrast two different types of social cognitive processing. The first type (labeled Type 1) refers to processes that are fast, efficient, stimulus-driven, and relatively inflexible. The second type (labeled Type 2) refers to processes that are relatively slow, cognitively laborious, flexible, and may involve conscious control. We argue that while interactionism captures aspects of social cognition mostly related to Type 1 processes, ToM is more focused on those based on Type 2 processes. We suggest that real life social interactions are rarely based on either Type 1 or Type 2 processes alone. On the contrary, we propose that in most cases both types of processes are simultaneously involved and that social behavior may be sustained by the interplay between these two types of processes. Finally, we discuss how the new integrative framework can guide experimental research on social interaction. PMID:23087631

  6. Spatial distribution of sediment storage types and quantification of valley fill deposits in an alpine basin, Reintal, Bavarian Alps, Germany

    NASA Astrophysics Data System (ADS)

    Schrott, Lothar; Hufschmidt, Gabi; Hankammer, Martin; Hoffmann, Thomas; Dikau, Richard

    2003-09-01

    Spatial patterns of sediment storage types and associated volumes, using a novel approach for quantifying valley fill deposits, are presented for a small alpine catchment (17 km²) in the Bavarian Alps. The different sediment storage types were analysed with respect to geomorphic coupling and sediment flux activity. In terms of surface area, the dominant landforms in the valley were found to be talus slopes (sheets and cones), followed by rockfall deposits and alluvial fans and plains. More than two-thirds of the talus slopes are relict landforms, completely decoupled from the geomorphic system. Notable sediment transport is limited to avalanche tracks, debris flows, and floodplains. Sediment volumes were calculated using a combination of polynomial functions of cross sections, seismic refraction, and GIS modelling. A total of 66 seismic refraction profiles were carried out throughout the valley for a more precise determination of sediment thicknesses and to check the bedrock data generated from geomorphometric analysis. We calculated the overall sediment volume of the valley fill deposits to be 0.07 km³. This corresponds to a mean sediment thickness of 23.3 m. The seismic refraction data showed that large floodplains and sedimentation areas, which developed through damming effects from large rockfalls, are in general characterised by shallow sediment thicknesses (<20 m). By contrast, the thickness of several talus slopes is more than twice as much. For some locations (e.g., narrow sections of the valley), the polynomial-generated cross sections resulted in overestimations of up to one order of magnitude, whereas in sections with a moderate valley shape the modelled cross sections are in good accordance with the obtained seismic data. For the quantification of valley fill deposits, a combined application of bedrock data derived from polynomials and geophysical prospecting is highly recommended.

  7. Validated RP-HPLC/DAD Method for the Quantification of Insect Repellent Ethyl 2-Aminobenzoate in Membrane-Moderated Matrix Type Monolithic Polymeric Device.

    PubMed

    Islam, Johirul; Zaman, Kamaruz; Chakrabarti, Srijita; Sharma Bora, Nilutpal; Mandal, Santa; Pratim Pathak, Manash; Srinivas Raju, Pakalapati; Chattopadhyay, Pronobesh

    2017-07-01

    A simple, accurate and sensitive reversed-phase high-performance liquid chromatographic (RP-HPLC) method has been developed for the estimation of ethyl 2-aminobenzoate (EAB) in a matrix type monolithic polymeric device and validated as per the International Conference on Harmonization guidelines. The analysis was performed isocratically on a ZORBAX Eclipse Plus C18 analytical column (250 × 4.4 mm, 5 μm) with a diode array detector (DAD), using acetonitrile and water (75:25 v/v) as the mobile phase at a constant flow rate of 1.0 mL/min. Excipients did not interfere with the determination of EAB. Inter- and intra-day relative standard deviations were not higher than 2%. Mean recovery was between 98.7 and 101.3%. The calibration curve was linear in the concentration range of 0.5-10 µg/mL. Limits of detection and quantification were 0.19 and 0.60 µg/mL, respectively. Thus, the present report puts forward a novel method for the estimation of EAB, an emerging insect repellent, using the RP-HPLC technique.
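
    The reported limits are consistent with the standard ICH formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the calibration line and S its slope; a worked sketch with invented calibration data:

        # LOD/LOQ from a calibration curve per the ICH Q2 approach (sketch)
        import numpy as np

        conc = np.array([0.5, 1.0, 2.5, 5.0, 10.0])           # ug/mL
        area = np.array([51.0, 103.0, 255.0, 511.0, 1019.0])  # detector response

        slope, intercept = np.polyfit(conc, area, 1)
        residuals = area - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)        # 2 fitted parameters

        lod = 3.3 * sigma / slope
        loq = 10.0 * sigma / slope
        print(f"LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")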

  8. Lesion Quantification in Dual-Modality Mammotomography

    NASA Astrophysics Data System (ADS)

    Li, Heng; Zheng, Yibin; More, Mitali J.; Goodale, Patricia J.; Williams, Mark B.

    2007-02-01

    This paper describes a novel x-ray/SPECT dual-modality breast imaging system that provides 3D structural and functional information. While only a limited number of views on one side of the breast can be acquired due to mechanical and time constraints, we developed a technique to compensate for the limited-angle artifact in reconstructed images and to accurately estimate both the lesion size and the radioactivity concentration. Various angular sampling strategies were evaluated using both simulated and experimental data. It was demonstrated that quantification of lesion size to an accuracy of 10% and quantification of radioactivity to an accuracy of 20% are feasible from limited-angle data acquired with clinically practical dosage and acquisition time.

  9. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    PubMed Central

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168

  10. Protein quantification using a cleavable reporter peptide.

    PubMed

    Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno

    2015-02-06

    Peptide and protein quantification based on isotope dilution and mass spectrometry analysis are widely employed for the measurement of biomarkers and in system biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope labeled standards. Although the quantification using stable-isotope labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable isotope labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as surrogate for the protein of interest) and a universal reporter peptide during the trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples and compared with results obtained with conventional stable isotope labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described.

  11. Phylogenetic Quantification of Intra-tumour Heterogeneity

    PubMed Central

    Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian

    2014-01-01

    Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184

  12. Objective quantification of the tinnitus decompensation by synchronization measures of auditory evoked single sweeps.

    PubMed

    Strauss, Daniel J; Delb, Wolfgang; D'Amelio, Roberto; Low, Yin Fen; Falkai, Peter

    2008-02-01

    Large-scale neural correlates of the tinnitus decompensation might be used for an objective evaluation of therapies and neurofeedback based therapeutic approaches. In this study, we try to identify large-scale neural correlates of the tinnitus decompensation using wavelet phase stability criteria of single sweep sequences of late auditory evoked potentials as synchronization stability measure. The extracted measure provided an objective quantification of the tinnitus decompensation and allowed for a reliable discrimination between a group of compensated and decompensated tinnitus patients. We provide an interpretation for our results by a neural model of top-down projections based on the Jastreboff tinnitus model combined with the adaptive resonance theory which has not been applied to model tinnitus so far. Using this model, our stability measure of evoked potentials can be linked to the focus of attention on the tinnitus signal. It is concluded that the wavelet phase stability of late auditory evoked potential single sweeps might be used as objective tinnitus decompensation measure and can be interpreted in the framework of the Jastreboff tinnitus model and adaptive resonance theory.

  13. Scoliosis: review of types of curves, etiological theories and conservative treatment.

    PubMed

    Shakil, Halima; Iqbal, Zaheen A; Al-Ghadir, Ahmad H

    2014-01-01

    Scoliosis is a deviation of the normally vertical spine. Although there are numerous studies available about treatment approaches for scoliosis, the number of studies that address its etiology and pathology is limited. The aim of this study was to discuss the different types of scoliosis, its curves and etiological theories, and to note their implications for its treatment. We examined various electronic databases including PubMed, Medline, CINAHL, the Cochrane library and Google Scholar using the key words "scoliosis", "etiology", "pathology" and "conservative treatment". References of the obtained articles were also examined for cross references. The search was limited to articles in the English language. A total of 145 papers about the prevalence, history, symptoms, classification, biomechanics, pathogenesis, kinematics and treatment of scoliosis were identified as relevant. To choose the appropriate treatment approach for scoliosis we need to understand its etiology and pathogenesis first. Early intervention with conservative treatment such as physiotherapy and bracing can prevent surgery.

  14. Ultrasensitive Quantification of Hepatitis B Virus A1762T/G1764A Mutant by a SimpleProbe PCR Using a Wild-Type-Selective PCR Blocker and a Primer-Blocker-Probe Partial-Overlap Approach ▿

    PubMed Central

    Nie, Hui; Evans, Alison A.; London, W. Thomas; Block, Timothy M.; Ren, Xiangdong David

    2011-01-01

    Hepatitis B virus (HBV) carrying the A1762T/G1764A double mutation in the basal core promoter (BCP) region is associated with HBe antigen seroconversion and increased risk of liver cirrhosis and hepatocellular carcinoma (HCC). Quantification of the mutant viruses may help in predicting the risk of HCC. However, the viral genome tends to have nucleotide polymorphism, which makes it difficult to design hybridization-based assays including real-time PCR. Ultrasensitive quantification of the mutant viruses at the early developmental stage is even more challenging, as the mutant is masked by excessive amounts of the wild-type (WT) viruses. In this study, we developed a selective inhibitory PCR (siPCR) using a locked nucleic acid-based PCR blocker to selectively inhibit the amplification of the WT viral DNA but not the mutant DNA. At the end of siPCR, the proportion of the mutant could be increased by about 10,000-fold, making the mutant more readily detectable by downstream applications such as real-time PCR and DNA sequencing. We also describe a primer-probe partial overlap approach which significantly simplified the melting curve patterns and minimized the influence of viral genome polymorphism on assay accuracy. Analysis of 62 patient samples showed a complete match of the melting curve patterns with the sequencing results. More than 97% of HBV BCP sequences in the GenBank database can be correctly identified by the melting curve analysis. The combination of siPCR and the SimpleProbe real-time PCR enabled mutant quantification in the presence of a 100,000-fold excess of the WT DNA. PMID:21562108

  15. Quantification of sensory and food quality: the R-index analysis.

    PubMed

    Lee, Hye-Seong; van Hout, Danielle

    2009-08-01

    The accurate quantification of sensory difference/similarity between foods, as well as consumer acceptance/preference and concepts, is greatly needed to optimize and maintain food quality. The R-Index is one class of measures of the degree of difference/similarity, and was originally developed for sensory difference tests for food quality control, product development, and so on. The index is based on signal detection theory and is free of the response bias that can invalidate difference testing protocols, including categorization and same-different and A-Not A tests. It is also a nonparametric analysis, making no assumptions about sensory distributions, and is simple to compute and understand. The R-Index is also flexible in its application. Methods based on R-Index analysis have been used as detection and sensory difference tests, as simple alternatives to hedonic scaling, and for the measurement of consumer concepts. This review indicates the various computational strategies for the R-Index and its practical applications to consumer and sensory measurements in food science.
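
    A minimal sketch of the R-Index computation from rating data: the proportion of signal-noise sample pairs ranked in the correct order, with ties counted half, which is equivalent to the area under the ROC curve (the ratings below are invented):

        # R-Index from ratings of "signal" and "noise" samples
        def r_index(signal_ratings, noise_ratings):
            wins = ties = 0
            for s in signal_ratings:
                for n in noise_ratings:
                    if s > n:
                        wins += 1
                    elif s == n:
                        ties += 1
            return (wins + 0.5 * ties) / (len(signal_ratings) * len(noise_ratings))

        # Ratings on a 6-point sureness scale
        print(r_index([6, 5, 5, 4, 3], [4, 3, 2, 2, 1]))   # 0.92: clear difference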

  16. Quantification of protein carbonylation.

    PubMed

    Wehr, Nancy B; Levine, Rodney L

    2013-01-01

    Protein carbonylation is the most commonly used measure of oxidative modification of proteins. It is most often measured spectrophotometrically or immunochemically by derivatizing proteins with the classical carbonyl reagent 2,4-dinitrophenylhydrazine (DNPH). We present protocols for the derivatization and quantification of protein carbonylation with these two methods, including a newly described dot blot with greatly increased sensitivity.

  17. Detection and quantification of genetically modified organisms using very short, locked nucleic acid TaqMan probes.

    PubMed

    Salvi, Sergio; D'Orso, Fabio; Morelli, Giorgio

    2008-06-25

    Many countries have introduced mandatory labeling requirements for foods derived from genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (PCR) based upon the TaqMan probe chemistry has become the method most widely used to support these regulations; moreover, event-specific PCR is the preferred method in GMO detection because of its high specificity, based on the flanking sequence of the exogenous integrant. The aim of this study was to evaluate the use of very short (eight-nucleotide long), locked nucleic acid (LNA) TaqMan probes in 5'-nuclease PCR assays for the detection and quantification of GMOs. Classic TaqMan and LNA TaqMan probes were compared for the analysis of the maize MON810 transgene. The performance of the two types of probes was tested on the maize endogenous reference gene hmga, the CaMV 35S promoter, and the hsp70/cryIA(b) construct, as well as on the event-specific 5'-integration junction of MON810, using plasmids as standard reference molecules. The results of our study demonstrate that the LNA 5'-nuclease PCR assays represent a valid and reliable analytical system for the detection and quantification of transgenes. Application of very short LNA TaqMan probes to GMO quantification can simplify the design of 5'-nuclease assays.

  18. Quantification of regional cerebral blood flow and volume with dynamic susceptibility contrast-enhanced MR imaging.

    PubMed

    Rempp, K A; Brix, G; Wenz, F; Becker, C R; Gückel, F; Lorenz, W J

    1994-12-01

    To quantify regional cerebral blood flow (rCBF) and volume (rCBV) with dynamic magnetic resonance (MR) imaging, rapid T2*-weighted gradient-echo images of two sections were acquired after bolus administration of a paramagnetic contrast medium, allowing the simultaneous creation of concentration-time curves in the brain-feeding arteries and in brain tissue. Absolute rCBF and rCBV values were determined for gray and white brain matter in 12 subjects using principles of the indicator dilution theory. The mean rCBF value was 69.7 mL/min +/- 29.7 per 100 g tissue in gray matter and 33.6 mL/min +/- 11.5 per 100 g tissue in white matter; the average rCBV was 8.0 mL +/- 3.1 and 4.2 mL +/- 1.0 per 100 g tissue, respectively. An age-related decrease in rCBF and rCBV for gray and white matter was observed. Preliminary data demonstrate that the proposed technique allows the quantification of rCBF and rCBV. Although the results are in good agreement with data from positron emission tomography studies, further evaluation is needed to establish the validity of the method.
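
    A simplified numerical sketch of the indicator dilution principles used above, with synthetic gamma-variate curves standing in for measured data and a first-moment mean transit time (no deconvolution or recirculation correction, and arbitrary units rather than absolute mL/100 g values):

    ```python
    import numpy as np

    t = np.linspace(0.0, 60.0, 601)  # seconds

    def gamma_variate(t0, a, b):
        """Crude bolus-passage shape; zero before bolus arrival at t0."""
        tau = np.clip(t - t0, 0.0, None)
        return tau ** a * np.exp(-tau / b)

    c_aif = gamma_variate(5.0, 3.0, 1.5)         # arterial input function
    c_tis = 0.04 * gamma_variate(7.0, 3.0, 3.0)  # tissue curve

    # Central volume principle: rCBV ~ area ratio, rCBF = rCBV / MTT.
    area_tis, area_aif = np.trapz(c_tis, t), np.trapz(c_aif, t)
    rcbv = area_tis / area_aif
    mtt = np.trapz(t * c_tis, t) / area_tis - np.trapz(t * c_aif, t) / area_aif
    rcbf = rcbv / mtt
    print(f"rCBV = {rcbv:.3f} (a.u.), MTT = {mtt:.1f} s, rCBF = {rcbf:.4f} 1/s")
    ```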

  19. Evaluation of digital PCR for absolute RNA quantification.

    PubMed

    Sanders, Rebecca; Mason, Deborah J; Foy, Carole A; Huggett, Jim F

    2013-01-01

    Gene expression measurements detailing mRNA quantities are widely employed in molecular biology and are increasingly important in diagnostic fields. Reverse transcription (RT), necessary for generating complementary DNA, can be both inefficient and imprecise, but remains a quintessential RNA analysis tool using qPCR. This study developed a Transcriptomic Calibration Material and assessed the RT reaction using digital (d)PCR for RNA measurement. While many studies characterise dPCR capabilities for DNA quantification, less work has been performed investigating similar parameters using RT-dPCR for RNA analysis. RT-dPCR measurement using three one-step RT-qPCR kits was evaluated in single and multiplex formats when measuring endogenous and synthetic RNAs. The best-performing kit was compared to UV quantification, and sensitivity and technical reproducibility were investigated. Our results demonstrate that RT-dPCR measurements were assay and kit dependent and differed significantly from UV quantification. Different values were reported by different kits for each target, despite evaluation of identical samples using the same instrument. RT-dPCR did not display the strong inter-assay agreement previously described when analysing DNA. This study demonstrates that, as with DNA measurement, RT-dPCR is capable of accurate quantification of low-copy RNA targets, but the results are both kit and target dependent, supporting the need for calibration controls.
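
    As context for how dPCR yields absolute quantification without a calibration curve: each partition is scored positive or negative, and the mean copies per partition is recovered from Poisson statistics. The partition counts and droplet volume below are assumed values for illustration:

    ```python
    import math

    def dpcr_concentration(n_positive, n_total, partition_vol_nl=0.85):
        """Absolute target concentration from digital PCR counts:
        lambda = -ln(1 - p) mean copies per partition (Poisson)."""
        p = n_positive / n_total
        lam = -math.log(1.0 - p)
        return lam, lam / (partition_vol_nl * 1e-3)  # copies per microlitre

    lam, conc = dpcr_concentration(12000, 20000)
    print(f"{lam:.3f} copies/partition -> {conc:.0f} copies/uL")
    ```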

  20. Diagnosis of dementia--automatic quantification of brain structures.

    PubMed

    Engedal, Knut; Brækhus, Anne; Andreassen, Ole A; Nakstad, Per Hj

    2012-08-21

    The aim of the present study was to examine the usefulness of a fully automatic quantification of brain structures by means of magnetic resonance imaging (MRI) for diagnosing dementia of the Alzheimer's type (DAT). MRI scans of the brains of 122 patients, referred to a memory clinic, were analysed using Neuroquant® software, which quantifies the volume of various brain structures. Clinical diagnoses were made by two doctors without knowledge of the MRI results. We performed receiver operating characteristic analyses and calculated the area under the curve (AUC); a value of 1 means that all ill patients have been diagnosed as diseased and no patient has been falsely diagnosed as diseased. The mean age of the patients was 67.2 years (SD 10.5 years), 60% were men, 63 had DAT, 24 had another type of dementia, 25 had mild cognitive impairment (MCI) and ten had subjective cognitive impairment (SCI). In the comparison between DAT patients and patients with SCI or MCI, seven of eleven volumes yielded AUCs significantly greater than 0.5. Positive and negative likelihood ratios were less than 5 and more than 0.2, respectively, for the best cut-off values of the volumes. Apart from the cerebellum (AUC 0.67), none of the brain structures yielded an AUC significantly different from 0.5 for patients with dementia conditions other than dementia of the Alzheimer's type. MRI scans with Neuroquant analyses cannot be used alone to distinguish between persons with dementia of the Alzheimer's type and persons without dementia.
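
    For orientation, the AUC used here is equivalent to the Mann-Whitney probability that a randomly chosen patient/control pair is ranked correctly, and the likelihood ratios follow from sensitivity and specificity at a chosen volume cut-off. A sketch with invented hippocampal volumes:

    ```python
    import numpy as np

    def auc(controls, patients):
        """P(control volume > patient volume), ties counted 1/2 --
        sensible when disease-related atrophy shrinks the structure."""
        c = np.asarray(controls, float)[:, None]
        p = np.asarray(patients, float)[None, :]
        return ((c > p).sum() + 0.5 * (c == p).sum()) / (c.size * p.size)

    dat = [2.1, 2.4, 2.0, 2.6, 2.3]   # hypothetical volumes (mL), DAT
    sci = [3.0, 2.8, 2.5, 3.2, 2.9]   # hypothetical volumes (mL), SCI/MCI
    print(f"AUC = {auc(sci, dat):.2f}")

    sens, spec = 0.80, 0.75           # illustrative cut-off performance
    print(f"LR+ = {sens / (1 - spec):.1f}, LR- = {(1 - sens) / spec:.2f}")
    ```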

  1. Demographic and Motivation Differences Among Online Sex Offenders by Type of Offense: An Exploration of Routine Activities Theories.

    PubMed

    Navarro, Jordana N; Jasinski, Jana L

    2015-01-01

    This article presents an analysis of the relationship between online sexual offenders' demographic background and characteristics indicative of motivation and offense type. Specifically, we investigate whether these characteristics can distinguish different online sexual offender groups from one another as well as inform routine activity theorists on what potentially motivates perpetrators. Using multinomial logistic regression, this study found that online sexual offenders' demographic backgrounds and characteristics indicative of motivation do vary by offense types. Two important implications of this study are that the term "online sexual offender" encompasses different types of offenders, including some who do not align with mainstream media's characterization of "predators," and that the potential offender within routine activity theory can be the focus of empirical investigation rather than taken as a given in research.

  2. Flory-type theories of polymer chains under different external stimuli

    NASA Astrophysics Data System (ADS)

    Budkov, Yu A.; Kiselev, M. G.

    2018-01-01

    In this Review, we present a critical analysis of various applications of the Flory-type theories to a theoretical description of the conformational behavior of single polymer chains in dilute polymer solutions under a few external stimuli. Different theoretical models of flexible polymer chains in the supercritical fluid are discussed and analysed. Different points of view on the conformational behavior of the polymer chain near the liquid-gas transition critical point of the solvent are presented. A theoretical description of the co-solvent-induced coil-globule transitions within the implicit-solvent-explicit-co-solvent models is discussed. Several explicit-solvent-explicit-co-solvent theoretical models of the coil-to-globule-to-coil transition of the polymer chain in a mixture of good solvents (co-nonsolvency) are analysed and compared with each other. Finally, a new theoretical model of the conformational behavior of the dielectric polymer chain under the external constant electric field in the dilute polymer solution with an explicit account for the many-body dipole correlations is discussed. The polymer chain collapse induced by many-body dipole correlations of monomers in the context of statistical thermodynamics of dielectric polymers is analysed.

  3. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    PubMed

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2018-01-01

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed, and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review via Medline, PubMed, CINHAL, and the Cochrane database was performed, using key terms for pelvic floor anatomy and function cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text peer-reviewed articles in English involving female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology (direct or indirect measures), and the benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies for determining PFM strength were described, including digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are allowing for greater sensitivity. However, the most common methods of quantification remain digital palpation and perineometry, techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by the inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  4. Transition operators in electromagnetic-wave diffraction theory - General theory

    NASA Technical Reports Server (NTRS)

    Hahne, G. E.

    1992-01-01

    A formal theory is developed for the scattering of time-harmonic electromagnetic waves from impenetrable immobile obstacles with given linear, homogeneous, and generally nonlocal boundary conditions of Leontovich (impedance) type for the wave on the obstacle's surface. The theory is modeled on the complete Green's function and the transition (T) operator in time-independent formal scattering theory of nonrelativistic quantum mechanics. An expression for the differential scattering cross section for plane electromagnetic waves is derived in terms of certain matrix elements of the T operator for the obstacle.

  5. From Quantification to Visualization: A Taxonomy of Uncertainty Visualization Approaches

    PubMed Central

    Potter, Kristin; Rosen, Paul; Johnson, Chris R.

    2014-01-01

    Quantifying uncertainty is an increasingly important topic across many domains. The uncertainties present in data come with many diverse representations having originated from a wide variety of disciplines. Communicating these uncertainties is a task often left to visualization without clear connection between the quantification and visualization. In this paper, we first identify frequently occurring types of uncertainty. Second, we connect those uncertainty representations to ones commonly used in visualization. We then look at various approaches to visualizing this uncertainty by partitioning the work based on the dimensionality of the data and the dimensionality of the uncertainty. We also discuss noteworthy exceptions to our taxonomy along with future research directions for the uncertainty visualization community. PMID:25663949

  6. Automated lobar quantification of emphysema in patients with severe COPD.

    PubMed

    Revel, Marie-Pierre; Faivre, Jean-Baptiste; Remy-Jardin, Martine; Deken, Valérie; Duhamel, Alain; Marquette, Charles-Hugo; Tacelli, Nunzia; Bakai, Anne-Marie; Remy, Jacques

    2008-12-01

    Automated lobar quantification of emphysema has not yet been evaluated. Unenhanced 64-slice MDCT was performed in 47 patients evaluated before bronchoscopic lung-volume reduction. CT images reconstructed with a standard (B20) and high-frequency (B50) kernel were analyzed using a dedicated prototype software (MevisPULMO) allowing lobar quantification of emphysema extent. Lobar quantification was obtained following (a) a fully automatic delineation of the lobar limits by the software and (b) a semiautomatic delineation with manual correction of the lobar limits when necessary, and was compared with the visual scoring of emphysema severity per lobe. No statistically significant difference existed between automated and semiautomated lobar quantification (p > 0.05 in the five lobes), with differences ranging from 0.4 to 3.9%. The agreement between the two methods (intraclass correlation coefficient, ICC) was excellent for left upper lobe (ICC = 0.94), left lower lobe (ICC = 0.98), and right lower lobe (ICC = 0.80). The agreement was good for right upper lobe (ICC = 0.68) and moderate for middle lobe (ICC = 0.53). The Bland-Altman plots confirmed these results. A good agreement was observed between the software and visually assessed lobar predominance of emphysema (kappa 0.78; 95% CI 0.64-0.92). Automated and semiautomated lobar quantifications of emphysema are concordant and show good agreement with visual scoring.

  7. Installation Restoration Program, Phase II - Confirmation/Quantification Stage I, Moody Air Force Base, Georgia.

    DTIC Science & Technology

    1985-12-01

    Phase II Confirmation/Quantification, Stage 1, Moody AFB, Georgia; personal authors Steinberg, J.A. and Thiess, W.G. Only fragments of the scanned report are legible, including a soils note (2.3.2 Soils: on the high-ground western portion of the base, the surface soils are mostly in the Tifton series, with a profile consisting of about 2 to ...) and part of the abbreviation list (Florida Department of Environmental Regulation; FWQS, Florida Water Quality Standards; gpd, gallons per day; gpm, gallons per minute; GC, gas chromatograph).

  8. STEM VQ Method, Using Scanning Transmission Electron Microscopy (STEM) for Accurate Virus Quantification

    DTIC Science & Technology

    2017-02-02

    Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron microscopy (EM) provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods ... a consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy - Virus Quantification (STEM-VQ), which simplifies ...

  9. Theory of mind in children with Neurofibromatosis Type 1.

    PubMed

    Payne, Jonathan M; Porter, Melanie; Pride, Natalie A; North, Kathryn N

    2016-05-01

    Neurofibromatosis Type I (NF1) is a single gene disorder associated with cognitive and behavioral deficits. While there is clear evidence for poorer social outcomes in NF1, the factors underlying reduced social function are not well understood. This study examined theory of mind (ToM) in children with NF1 and unaffected controls. ToM was assessed in children with NF1 (n = 26) and unaffected controls (n = 36) aged 4-12 years using a nonverbal picture sequencing task. The task assessed understanding of ToM (unrealized goals, false belief, pretence, intention), while controlling for social script knowledge and physical cause-and-effect reasoning. Children with NF1 made significantly more errors than unaffected controls on most ToM stories while demonstrating no difficulty sequencing physical cause-and-effect stories. Performance on the picture sequencing task was not related to lower intellectual function, symptoms of attention deficit-hyperactivity disorder (ADHD), or parent ratings of executive function. Results suggest a generalized ToM deficit in children with NF1 that appears to be independent of general cognitive abilities and ADHD symptoms. The study refines understanding of the clinical presentation of NF1 and identifies psychological constructs that may contribute to the higher prevalence of social dysfunction in children with NF1. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    PubMed

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader, Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow exact quantification as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
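
    A sketch of the standard-curve arithmetic such a qPCR protocol rests on; the dilution series, Cq values, and fragment lengths below are invented for illustration:

    ```python
    import numpy as np

    std_conc = np.array([10.0, 1.0, 0.1, 0.01, 0.001])   # standards, pM
    std_cq = np.array([14.2, 17.6, 21.0, 24.4, 27.8])    # measured Cq

    slope, intercept = np.polyfit(np.log10(std_conc), std_cq, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0            # 1.0 == 100%
    print(f"slope = {slope:.2f}, PCR efficiency = {efficiency:.0%}")

    def quantify(cq):
        """Interpolate a library's concentration (pM) from its Cq."""
        return 10.0 ** ((cq - intercept) / slope)

    raw_pm = quantify(19.3)
    adj_pm = raw_pm * 452.0 / 350.0   # size adjustment: standard bp / library bp
    print(f"library: {raw_pm:.2f} pM raw, {adj_pm:.2f} pM size-adjusted")
    ```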

  11. Breaking Ground: A Study of Gestalt Therapy Theory and Holland's Theory of Vocational Choice.

    ERIC Educational Resources Information Center

    Hartung, Paul J.

    In both Gestalt therapy and Holland's theory of vocational choice, person-environment interaction receives considerable emphasis. Gestalt therapy theory suggests that people make contact (that is, meet needs) through a characteristic style of interacting with the environment. Holland identifies six personality types in his theory and asserts that…

  12. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    NASA Astrophysics Data System (ADS)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
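
    As background for the POD modeling step, a common choice is a log-normal (probit in ln a) POD curve; the parameters below are illustrative, not those fitted to the lap-joint data:

    ```python
    import math

    mu, sigma = math.log(1.2), 0.45      # ln(mm); a50 = 1.2 mm (assumed)

    def pod(a_mm):
        """POD(a) = Phi((ln a - mu) / sigma), a log-normal hit/miss model."""
        z = (math.log(a_mm) - mu) / sigma
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    for a in (0.5, 1.0, 2.0, 3.0):
        print(f"POD({a:.1f} mm) = {pod(a):.2f}")

    a90 = math.exp(mu + 1.2816 * sigma)  # mean-curve 90% detectable size;
    print(f"a90 = {a90:.2f} mm")         # certification adds a 95% bound (a90/95)
    ```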

  13. Determination of statin drugs in hospital effluent with dispersive liquid-liquid microextraction and quantification by liquid chromatography.

    PubMed

    Martins, Ayrton F; Frank, Carla da S; Altissimo, Joseline; de Oliveira, Júlia A; da Silva, Daiane S; Reichert, Jaqueline F; Souza, Darliana M

    2017-08-24

    Statins are classified as being amongst the most prescribed agents for treating hypercholesterolaemia and preventing vascular diseases. In this study, a rapid and effective liquid chromatography method, assisted by diode array detection, was designed and validated for the simultaneous quantification of atorvastatin (ATO) and simvastatin (SIM) in hospital effluent samples. The solid phase extraction (SPE) of the analytes was optimized with regard to sorbent material and pH, and the dispersive liquid-liquid microextraction (DLLME) in terms of pH, ionic strength, and type and volume of the extraction/dispersive solvents. The performance of both extraction procedures was evaluated in terms of linearity, quantification limits, accuracy (recovery %), precision and matrix effects for each analyte. The methods proved to be linear in the concentration range considered; the quantification limits were 0.45 µg L-1 for ATO and 0.75 µg L-1 for SIM; the matrix effect was almost absent in both methods; the average recoveries remained between 81.5 and 90.0%; and the RSD values were <20%. The validated methods were applied to the quantification of the statins in real samples of hospital effluent; the concentrations ranged from 18.8 µg L-1 to 35.3 µg L-1 for ATO, and from 30.3 µg L-1 to 38.5 µg L-1 for SIM. Since the calculated risk quotient reached values as high as 192, the occurrence of ATO and SIM in hospital effluent poses a potentially serious risk to human health and the aquatic ecosystem.
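
    For context, risk quotients of this kind are usually computed as the measured environmental concentration over a predicted no-effect concentration (PNEC), with RQ >= 1 flagging potential risk; the PNEC below is a placeholder, not a value from the study:

    ```python
    mec_ug_per_l = 35.3     # highest ATO concentration reported above
    pnec_ug_per_l = 0.18    # hypothetical PNEC for the most sensitive taxon

    rq = mec_ug_per_l / pnec_ug_per_l
    print(f"RQ = {rq:.0f} -> {'potential risk' if rq >= 1 else 'low risk'}")
    ```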

  14. Einstein-aether theory: dynamics of relativistic particles with spin or polarization in a Gödel-type universe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balakin, Alexander B.; Popov, Vladimir A., E-mail: alexander.balakin@kpfu.ru, E-mail: vladipopov@mail.ru

    In the framework of the Einstein-aether theory we consider a cosmological model, which describes the evolution of the unit dynamic vector field with an activated rotational degree of freedom. We discuss exact solutions of the Einstein-aether theory for which the space-time is of the Gödel type and the velocity four-vector of the aether motion is characterized by a non-vanishing vorticity, so that the rotational vectorial modes can be associated with the source of the universe rotation. The main goal of our paper is to study the motion of test relativistic particles with a vectorial internal degree of freedom (spin or polarization), which is coupled to the unit dynamic vector field. The particles are considered as test ones in the given space-time background of the Gödel type; the spin (polarization) coupling to the unit dynamic vector field is modeled using exact solutions of three types. The first exact solution describes the aether with arbitrary Jacobson coupling constants; the second one relates to the case when the Jacobson constant responsible for the vorticity vanishes; the third exact solution is obtained using three constraints on the coupling constants. The analysis of the exact expressions obtained for the particle momentum and for the spin (polarization) four-vector components shows that the interaction of the spin (polarization) with the unit vector field induces a rotation, which is additional to the geodesic precession of the spin (polarization) associated with the universe rotation as a whole.

  15. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    PubMed

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.

  16. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    PubMed Central

    Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification. PMID:25152923

  17. Comparison of machine learning and semi-quantification algorithms for (I123)FP-CIT classification: the beginning of the end for semi-quantification?

    PubMed

    Taylor, Jonathan Christopher; Fenner, John Wesley

    2017-11-29

    Semi-quantification methods are well established in the clinic for assisted reporting of (I123)Ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. Machine learning algorithms were based on support vector machine classifiers with three different sets of features: (1) voxel intensities; (2) principal components of image voxel intensities; and (3) striatal binding ratios from the putamen and caudate. Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) minimum of age-matched controls; (2) mean minus 1/1.5/2 standard deviations from age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); and (4) selection of the optimum operating point on the receiver operator characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 for local data and between 0.95 and 0.97 for PPMI data. Classification
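
    To make the semi-quantification side concrete: an SBR is specific striatal uptake over non-displaceable reference uptake, compared against a normal limit such as the mean minus 2 standard deviations of age-matched controls (one of the cut-off strategies listed above). All numbers below are invented:

    ```python
    def sbr(striatal_counts, reference_counts):
        """Striatal binding ratio: (specific - reference) / reference."""
        return (striatal_counts - reference_counts) / reference_counts

    normal_mean, normal_sd = 2.6, 0.45           # assumed control statistics
    lower_limit = normal_mean - 2.0 * normal_sd  # mean - 2 SD cut-off

    for striatal, reference in ((190.0, 60.0), (88.0, 60.0)):
        value = sbr(striatal, reference)
        verdict = "abnormal" if value < lower_limit else "normal"
        print(f"SBR = {value:.2f} ({verdict}; limit {lower_limit:.2f})")
    ```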

  18. Quantification of pericardial effusions by echocardiography and computed tomography.

    PubMed

    Leibowitz, David; Perlman, Gidon; Planer, David; Gilon, Dan; Berman, Philip; Bogot, Naama

    2011-01-15

    Echocardiography is a well-accepted tool for the diagnosis and quantification of pericardial effusion (PEff). Given the increasing use of computed tomographic (CT) scanning, more PEffs are being initially diagnosed by computed tomography. No study has compared quantification of PEff by computed tomography and echocardiography. The objective of this study was to assess the accuracy of quantification of PEff by 2-dimensional echocardiography and computed tomography compared to the amount of pericardial fluid drained at pericardiocentesis. We retrospectively reviewed an institutional database to identify patients who underwent chest computed tomography and echocardiography before percutaneous pericardiocentesis with documentation of the amount of fluid withdrawn. Digital 2-dimensional echocardiographic and CT images were retrieved, and PEff volume was quantified by applying the formula for the volume of a prolate ellipsoid, V = 4/3 × π × (maximal long-axis dimension/2) × (maximal transverse dimension/2) × (maximal anteroposterior dimension/2), to the pericardial sac and to the heart. Nineteen patients meeting the study criteria were included. The amount of PEff drained was 200 to 1,700 ml (mean 674 ± 340). Echocardiographically calculated effusion volume correlated relatively well with the drained volume (r = 0.73, p <0.001, mean difference -41 ± 225 ml). There was only moderate correlation between CT volume quantification and the actual volume drained (r = 0.4, p = 0.004, mean difference 158 ± 379 ml). In conclusion, echocardiography appears a more accurate imaging technique than computed tomography in quantitative assessment of nonloculated PEffs and should continue to be the primary imaging in these patients. Copyright © 2011 Elsevier Inc. All rights reserved.
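
    A direct transcription of the prolate-ellipsoid formula used in the study, applied to hypothetical measurements; the effusion estimate is the pericardial-sac volume minus the heart volume:

    ```python
    import math

    def prolate_ellipsoid_ml(long_cm, transverse_cm, ap_cm):
        """V = 4/3 * pi * (L/2) * (T/2) * (AP/2); 1 cm^3 == 1 mL."""
        return (4.0 / 3.0) * math.pi * (long_cm / 2) * (transverse_cm / 2) * (ap_cm / 2)

    v_sac = prolate_ellipsoid_ml(14.0, 11.0, 9.0)    # hypothetical sac axes
    v_heart = prolate_ellipsoid_ml(11.5, 8.5, 7.0)   # hypothetical heart axes
    print(f"estimated effusion volume = {v_sac - v_heart:.0f} mL")
    ```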

  19. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

    Purpose of review: Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated perfusion quantitative measures that are used. Recent findings: New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of normal MPI scans may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary: Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354

  20. Assessment of primer/template mismatch effects on real-time PCR amplification of target taxa for GMO quantification.

    PubMed

    Ghedira, Rim; Papazova, Nina; Vuylsteke, Marnik; Ruttink, Tom; Taverniers, Isabel; De Loose, Marc

    2009-10-28

    GMO quantification, based on real-time PCR, relies on the amplification of an event-specific transgene assay and a species-specific reference assay. The uniformity of the nucleotide sequences targeted by both assays across various transgenic varieties is an important prerequisite for correct quantification. Single nucleotide polymorphisms (SNPs) frequently occur in the maize genome and might lead to nucleotide variation in regions used to design primers and probes for reference assays. Further, they may affect the annealing of the primer to the template and reduce the efficiency of DNA amplification. We assessed the effect of a minor DNA template modification, such as a single base pair mismatch in the primer attachment site, on real-time PCR quantification. A model system was used based on the introduction of artificial mismatches between the forward primer and the DNA template in the reference assay targeting the maize starch synthase (SSIIb) gene. The results show that the presence of a mismatch between the primer and the DNA template causes partial to complete failure of the amplification of the initial DNA template depending on the type and location of the nucleotide mismatch. With this study, we show that the presence of a primer/template mismatch affects the estimated total DNA quantity to a varying degree.
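
    The quantification consequence of a mismatch can be sketched as compounding per-cycle efficiency loss: if a matched template amplifies at efficiency e1 and a mismatched one at e2, the yield ratio after n cycles is ((1+e1)/(1+e2))^n. The efficiencies here are assumptions, not the study's measurements:

    ```python
    def fold_underestimation(n_cycles, eff_match=0.95, eff_mismatch=0.75):
        """Apparent loss of template caused by a primer/template mismatch
        that lowers the per-cycle amplification efficiency."""
        return ((1.0 + eff_match) / (1.0 + eff_mismatch)) ** n_cycles

    for n in (20, 25, 30):
        print(f"after {n} cycles: {fold_underestimation(n):.0f}-fold low")
    ```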

  1. Three-dimensional gauge theories and gravitational instantons from string theory

    NASA Astrophysics Data System (ADS)

    Cherkis, Sergey Alexander

    Various realizations of gauge theories in string theory allow an identification of their spaces of vacua with gravitational instantons. Also, they provide a correspondence of vacua of gauge theories with nonabelian monopole configurations and solutions of a system of integrable equations called Nahm equations. These identifications make it possible to apply powerful techniques of differential and algebraic geometry to solve the gauge theories in question. In other words, it becomes possible to find the exact metrics on their moduli spaces of vacua with all quantum corrections included. As another outcome we obtain for the first time the description of a series of all Dk-type gravitational instantons.

  2. Installation Restoration Program. Confirmation/Quantification Stage 1. Phase 2

    DTIC Science & Technology

    1985-03-07

    Installation Restoration Program, Phase II - Confirmation/Quantification, Stage 1: Kirtland AFB, Kirtland AFB, New Mexico 87117. Prepared by Science Applications International Corporation, 505 Marquette NW, Suite 1200, Albuquerque, New Mexico 87102, March 1985. Final report covering February 1983 to March 1985, prepared for Headquarters Military Airlift Command, Command Surgeon's Office (HQ MAC).

  3. Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay

    PubMed Central

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant overestimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has a significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997

  4. The Predictive Effects of Protection Motivation Theory on Intention and Behaviour of Physical Activity in Patients with Type 2 Diabetes

    PubMed Central

    Ali Morowatisharifabad, Mohammad; Abdolkarimi, Mahdi; Asadpour, Mohammad; Fathollahi, Mahmood Sheikh; Balaee, Parisa

    2018-01-01

    INTRODUCTION: Theory-based education tailored to the target behaviour and group can be effective in promoting physical activity. AIM: The purpose of this study was to examine the predictive power of Protection Motivation Theory on the intention and behaviour of physical activity in patients with type 2 diabetes. METHODS: This descriptive study was conducted on 250 patients in Rafsanjan, Iran. A researcher-made questionnaire, whose validity and reliability were confirmed, was used to score the Protection Motivation Theory constructs. The level of physical activity was measured by the International Short-form Physical Activity Inventory, whose validity and reliability were also confirmed. Data were analysed by statistical tests including correlation coefficient, chi-square, logistic regression and linear regression. RESULTS: There was a significant correlation between all the Protection Motivation Theory constructs and the intention to do physical activity. The theory constructs were able to predict 60% of the variance in physical activity intention. Logistic regression demonstrated that an increase in physical activity intention and self-efficacy scores raised the odds of a higher level of physical activity by factors of 3.4 and 1.5, respectively (OR = 3.39 and 1.54). CONCLUSION: Considering the ability of the Protection Motivation Theory constructs to explain physical activity behaviour, interventions based on the structures of this theory are suggested, especially to improve self-efficacy as the most powerful predictor of physical activity intention and behaviour. PMID:29731945

  5. The Predictive Effects of Protection Motivation Theory on Intention and Behaviour of Physical Activity in Patients with Type 2 Diabetes.

    PubMed

    Ali Morowatisharifabad, Mohammad; Abdolkarimi, Mahdi; Asadpour, Mohammad; Fathollahi, Mahmood Sheikh; Balaee, Parisa

    2018-04-15

    Theory-based education tailored to the target behaviour and group can be effective in promoting physical activity. The purpose of this study was to examine the predictive power of Protection Motivation Theory on the intention and behaviour of physical activity in patients with type 2 diabetes. This descriptive study was conducted on 250 patients in Rafsanjan, Iran. A researcher-made questionnaire, whose validity and reliability were confirmed, was used to score the Protection Motivation Theory constructs. The level of physical activity was measured by the International Short-form Physical Activity Inventory, whose validity and reliability were also confirmed. Data were analysed by statistical tests including correlation coefficient, chi-square, logistic regression and linear regression. There was a significant correlation between all the Protection Motivation Theory constructs and the intention to do physical activity. The theory constructs were able to predict 60% of the variance in physical activity intention. Logistic regression demonstrated that an increase in physical activity intention and self-efficacy scores raised the odds of a higher level of physical activity by factors of 3.4 and 1.5, respectively (OR = 3.39 and 1.54). Considering the ability of the Protection Motivation Theory constructs to explain physical activity behaviour, interventions based on the structures of this theory are suggested, especially to improve self-efficacy as the most powerful predictor of physical activity intention and behaviour.

  6. Theory of type 3b solar radio bursts. [plasma interaction and electron beams

    NASA Technical Reports Server (NTRS)

    Smith, R. A.; Delanoee, J.

    1975-01-01

    During the initial space-time evolution of an electron beam injected into the corona, the strong beam-plasma interaction occurs at the head of the beam, leading to the amplification of a quasi-monochromatic large-amplitude plasma wave that stabilizes by trapping the beam particles. Oscillation of the trapped particles in the wave troughs amplifies sideband electrostatic waves. The sidebands and the main wave subsequently decay to observable transverse electromagnetic waves through the parametric decay instability. This process gives rise to the elementary striation bursts. Owing to velocity dispersion in the beam and the density gradient of the corona, the entire process may repeat at a finite number of discrete plasma levels, producing chains of elementary bursts. All the properties of the type IIIb bursts are accounted for in the context of the theory.

  7. Minimal string theories and integrable hierarchies

    NASA Astrophysics Data System (ADS)

    Iyer, Ramakrishnan

    Well-defined, non-perturbative formulations of the physics of string theories in specific minimal or superminimal model backgrounds can be obtained by solving matrix models in the double scaling limit. They provide us with the first examples of completely solvable string theories. Despite being relatively simple compared to higher dimensional critical string theories, they furnish non-perturbative descriptions of interesting physical phenomena such as geometrical transitions between D-branes and fluxes, tachyon condensation and holography. The physics of these theories in the minimal model backgrounds is succinctly encoded in a non-linear differential equation known as the string equation, along with an associated hierarchy of integrable partial differential equations (PDEs). The bosonic string in (2,2m-1) conformal minimal model backgrounds and the type 0A string in (2,4 m) superconformal minimal model backgrounds have the Korteweg-de Vries system, while type 0B in (2,4m) backgrounds has the Zakharov-Shabat system. The integrable PDE hierarchy governs flows between backgrounds with different m. In this thesis, we explore this interesting connection between minimal string theories and integrable hierarchies further. We uncover the remarkable role that an infinite hierarchy of non-linear differential equations plays in organizing and connecting certain minimal string theories non-perturbatively. We are able to embed the type 0A and 0B (A,A) minimal string theories into this single framework. The string theories arise as special limits of a rich system of equations underpinned by an integrable system known as the dispersive water wave hierarchy. We find that there are several other string-like limits of the system, and conjecture that some of them are type IIA and IIB (A,D) minimal string backgrounds. We explain how these and several other string-like special points arise and are connected. In some cases, the framework endows the theories with a non

  8. Using the Advocacy Coalition Framework and Multiple Streams policy theories to examine the role of evidence, research and other types of knowledge in drug policy.

    PubMed

    Ritter, Alison; Hughes, Caitlin Elizabeth; Lancaster, Kari; Hoppe, Robert

    2018-04-17

    The prevailing 'evidence-based policy' paradigm emphasizes a technical-rational relationship between alcohol and drug research evidence and subsequent policy action. However, policy process theories do not start with this premise, and hence provide an opportunity to consider anew the ways in which evidence, research and other types of knowledge impact upon policy. This paper presents a case study, the police deployment of drug detection dogs, to highlight how two prominent policy theories [the Advocacy Coalition Framework (ACF) and the Multiple Streams (MS) approach] explicate the relationship between evidence and policy. The two theories were interrogated with reference to their descriptions and framings of evidence, research and other types of knowledge. The case study methodology was employed to extract data concerned with evidence and other types of knowledge from a previous detailed historical account and analysis of drug detection dogs in one Australian state (New South Wales). Different types of knowledge employed across the case study were identified and coded, and then analysed with reference to each theory. A detailed analysis of one key 'evidence event' within the case study was also undertaken. Five types of knowledge were apparent in the case study: quantitative program data; practitioner knowledge; legal knowledge; academic research; and lay knowledge. The ACF highlights how these various types of knowledge are only influential inasmuch as they provide the opportunity to alter the beliefs of decision-makers. The MS highlights how multiple types of knowledge may or may not form part of the strategy of policy entrepreneurs to forge the confluence of problems, solutions and politics. Neither the Advocacy Coalition Framework nor the Multiple Streams approach presents an uncomplicated linear relationship between evidence and policy action, nor do they preference any one type of knowledge. The implications for research and practice include the contestation

  9. An Examination of the Four-Part Theory of the Chinese Self: The Differentiation and Relative Importance of the Different Types of Social-Oriented Self

    PubMed Central

    Sun, Chien-Ru

    2017-01-01

    Because culture has a deep and far-reaching influence, individuals who grew up within different cultures tend to develop different basic self-constructions. With respect to the Chinese under the influence of Chinese culture, Yang proposed the concepts of the individual-oriented self and the social-oriented self. He argued that, besides the individual-oriented self, the social-oriented self of the Chinese contains three types of self: the relationship-oriented self, the familistic (group-)oriented self, and the other-oriented self. The theory proposed that the Chinese self is adequately covered only through this four-part theory of the Chinese self. However, this remained to be tested: whether these three types of sub-level "selves" can be effectively triggered, and what their relative importance is. This study examines the four-part theory of the Chinese self. Through photo priming, Experiment 1 shows that the three types of social-oriented self are differentiated from each other and can be individually triggered. In Experiment 2, the importance of the three types of self was investigated, adopting the concept of limited self-regulation resources to design scenarios. The participants were asked to make counterarguments against the notion of each of the three types of self, with performance in the subsequent task serving as the main dependent variable. In Experiment 3, the relative importance of the three types of self was examined by investigating the choices made by individuals within the context of conflict under the three orientations of the social-oriented self. Overall, the results of the experiments showed that the Chinese have a four-part self, with the other-oriented self being the most important. PMID:28713310

  10. Theory of chromatography of partially cyclic polymers: Tadpole-type and manacle-type macromolecules.

    PubMed

    Vakhrushev, Andrey V; Gorbunov, Alexei A

    2016-02-12

    A theory of chromatography is developed for partially cyclic polymers of tadpole- and manacle-shaped topological structures. We present exact equations for the distribution coefficient K at different adsorption interactions; simpler approximate formulae are also derived, relevant to the conditions of size-exclusion, adsorption, and critical chromatography. Theoretical chromatograms of heterogeneous partially cyclic polymers are simulated, and conditions for good separation by topology are predicted. According to the theory, an effective SEC-radius of tadpoles and manacles is mostly determined by the molar mass M and by the linear-cyclic composition. In the interactive chromatography, the effect of molecular topology on the retention becomes significant. At the critical interaction point, the partial dependences K(Mlin) and K(Mring) are qualitatively different: while being almost independent of Mlin, K increases with Mring. This behavior could be realized in critical chromatography for the separation of partially cyclic polymers by the number and molar mass of cyclic elements. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Electrophoresis Gel Quantification with a Flatbed Scanner and Versatile Lighting from a Screen Scavenged from a Liquid Crystal Display (LCD) Monitor

    ERIC Educational Resources Information Center

    Yeung, Brendan; Ng, Tuck Wah; Tan, Han Yen; Liew, Oi Wah

    2012-01-01

    The use of different types of stains in the quantification of proteins separated on gels using electrophoresis offers the capability of deriving good outcomes in terms of linear dynamic range, sensitivity, and compatibility with specific proteins. An inexpensive, simple, and versatile lighting system based on liquid crystal display backlighting is…

  12. Nambu-Poisson gauge theory

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav; Schupp, Peter; Vysoký, Jan

    2014-06-01

    We generalize noncommutative gauge theory using Nambu-Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg-Witten map. We construct a covariant Nambu-Poisson gauge theory action, give its first order expansion in the Nambu-Poisson tensor and relate it to a Nambu-Poisson matrix model.

  13. Relative quantification of biomarkers using mixed-isotope labeling coupled with MS

    PubMed Central

    Chapman, Heidi M; Schutt, Katherine L; Dieter, Emily M; Lamos, Shane M

    2013-01-01

    The identification and quantification of important biomarkers is a critical first step in the elucidation of biological systems. Biomarkers take many forms as cellular responses to stimuli and can be manifested during transcription, translation, and/or metabolic processing. Increasingly, researchers have relied upon mixed-isotope labeling (MIL) coupled with MS to perform relative quantification of biomarkers between two or more biological samples. MIL effectively tags biomarkers of interest for ease of identification and quantification within the mass spectrometer by using isotopic labels that introduce a heavy and light form of the tag. In addition to MIL coupled with MS, a number of other approaches have been used to quantify biomarkers including protein gel staining, enzymatic labeling, metabolic labeling, and several label-free approaches that generate quantitative data from the MS signal response. This review focuses on MIL techniques coupled with MS for the quantification of protein and small-molecule biomarkers. PMID:23157360

  14. Quantification of Efficiency of Beneficiation of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Trigwell, Steve; Lane, John; Captain, James; Weis, Kyle; Quinn, Jacqueline; Watanabe, Fumiya

    2011-01-01

    Electrostatic beneficiation of lunar regolith is being researched at Kennedy Space Center to enhance the ilmenite concentration of the regolith for the production of oxygen in in-situ resource utilization on the lunar surface. Ilmenite enrichment of up to 200% was achieved using lunar simulants. For the most accurate quantification of the regolith particles, standard petrographic methods are typically followed, but in order to optimize the process, many hundreds of samples were generated in this study, which made the standard analysis methods time prohibitive. In the current studies, X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy/energy dispersive spectroscopy (SEM/EDS) were used, which could automatically and quickly analyze many separated fractions of lunar simulant. In order to test the accuracy of the quantification, test mixture samples of known quantities of ilmenite (2, 5, 10, and 20 wt%) in silica (pure quartz powder) were analyzed by XPS and EDS. The results showed that quantification of low concentrations of ilmenite in silica could be accurately achieved by both XPS and EDS, given the known limitations of the techniques.

  15. Assessment of cardiac fibrosis: a morphometric method comparison for collagen quantification.

    PubMed

    Schipke, Julia; Brandenberger, Christina; Rajces, Alexandra; Manninger, Martin; Alogna, Alessio; Post, Heiner; Mühlfeld, Christian

    2017-04-01

    Fibrotic remodeling of the heart is a frequent condition linked to various diseases and cardiac dysfunction. Collagen quantification is an important objective in cardiac fibrosis research; however, a variety of different histological methods are currently used that may differ in accuracy. Here, frequently applied collagen quantification techniques were compared. A porcine model of early stage heart failure with preserved ejection fraction was used as an example. Semiautomated threshold analyses were imprecise, mainly due to inclusion of noncollagen structures or failure to detect certain collagen deposits. In contrast, collagen assessment by automated image analysis and light microscopy (LM)-stereology was more sensitive. Depending on the quantification method, the amount of estimated collagen varied and influenced intergroup comparisons. PicroSirius Red, Masson's trichrome, and Azan staining protocols yielded similar results, whereas the measured collagen area increased with increasing section thickness. Whereas none of the LM-based methods showed significant differences between the groups, electron microscopy (EM)-stereology revealed a significant collagen increase between cardiomyocytes in the experimental group, but not at other localizations. In conclusion, in contrast to the staining protocol, section thickness and the quantification method being used directly influence the estimated collagen content and thus, possibly, intergroup comparisons. EM in combination with stereology is a precise and sensitive method for collagen quantification if certain prerequisites are considered. For subtle fibrotic alterations, consideration of collagen localization may be necessary. Among LM methods, LM-stereology and automated image analysis are appropriate to quantify fibrotic changes, the latter depending on careful control of algorithm and comparable section staining. NEW & NOTEWORTHY Direct comparison of frequently applied histological fibrosis assessment techniques

  16. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples.

    PubMed

    Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S

    2016-03-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).

  17. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples

    USGS Publications Warehouse

    Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David

    2016-01-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay.

  18. Microbial quantification in activated sludge: the hits and misses.

    PubMed

    Hall, S J; Keller, J; Blackall, L L

    2003-01-01

    Since the implementation of the activated sludge process for treating wastewater, there has been a reliance on chemical and physical parameters to monitor the system. However, in biological nutrient removal (BNR) processes, the microorganisms responsible for some of the transformations should be used to monitor the processes, with the overall goal of achieving better treatment performance. The development of in situ identification and rapid quantification techniques for the key microorganisms involved in BNR is required to achieve this goal. This study explored the quantification of Nitrospira, a key organism in the oxidation of nitrite to nitrate in BNR. Two molecular genetic microbial quantification techniques were evaluated: real-time polymerase chain reaction (PCR) and fluorescence in situ hybridisation (FISH) followed by digital image analysis. A correlation between the Nitrospira quantitative data and the nitrate production rate, determined in batch tests, was attempted. The advantages and disadvantages of both methods are discussed.

  19. A surrogate accelerated multicanonical Monte Carlo method for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Wu, Keyi; Li, Jinglai

    2016-09-01

    In this work we consider a class of uncertainty quantification problems where the system performance or reliability is characterized by a scalar parameter y. The performance parameter y is random due to the presence of various sources of uncertainty in the system, and our goal is to estimate the probability density function (PDF) of y. We propose to use the multicanonical Monte Carlo (MMC) method, a special type of adaptive importance sampling algorithm, to compute the PDF of interest. Moreover, we develop an adaptive algorithm to construct local Gaussian process surrogates to further accelerate the MMC iterations. With numerical examples we demonstrate that the proposed method can achieve several orders of magnitude of speedup over standard Monte Carlo methods.
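
    A minimal, self-contained sketch of the multicanonical idea on a toy two-parameter problem, without the paper's Gaussian-process surrogate acceleration: Metropolis sampling with weight 1/c(y) flattens the histogram of y, and the converged weights recover its PDF. The toy model, bins, and all settings below are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):                        # toy "system performance" parameter y
    return x[0] ** 2 + 0.5 * np.sin(3.0 * x[1])

def log_prior(x):                # input uncertainty: x ~ N(0, I)
    return -0.5 * np.sum(x ** 2)

edges = np.linspace(-1.0, 8.0, 46)     # histogram bins covering the y-range
logc = np.zeros(len(edges) - 1)        # log of the running PDF estimate

def b(y):                              # bin index of a y value
    return np.clip(np.searchsorted(edges, y) - 1, 0, logc.size - 1)

x = np.zeros(2)
for stage in range(10):                # MMC iterations
    hist = np.zeros_like(logc)
    for _ in range(20000):             # Metropolis sampling with weight 1/c(y)
        xp = x + 0.5 * rng.standard_normal(2)
        if np.log(rng.random()) < (log_prior(xp) - logc[b(f(xp))]) - \
                                  (log_prior(x) - logc[b(f(x))]):
            x = xp
        hist[b(f(x))] += 1
    logc[hist > 0] += np.log(hist[hist > 0])   # flat histogram = fixed point
    logc -= logc.max()                         # numerical normalisation

pdf = np.exp(logc) / (np.exp(logc) * np.diff(edges)).sum()
print(pdf.round(3))                    # estimated PDF of y over the bins
```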

  20. Inverse regression-based uncertainty quantification algorithms for high-dimensional models: Theory and practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lin, Guang; Li, Bing

    2016-09-01

    A well-known challenge in uncertainty quantification (UQ) is the "curse of dimensionality". However, many high-dimensional UQ problems are essentially low-dimensional, because the randomness of the quantity of interest (QoI) is caused only by uncertain parameters varying within a low-dimensional subspace, known as the sufficient dimension reduction (SDR) subspace. Motivated by this observation, we propose and demonstrate in this paper an inverse regression-based UQ approach (IRUQ) for high-dimensional problems. Specifically, we use an inverse regression procedure to estimate the SDR subspace and then convert the original problem to a low-dimensional one, which can be efficiently solved by building a response surface model such as a polynomial chaos expansion. The novelty and advantages of the proposed approach are seen in its computational efficiency and practicality. Compared with Monte Carlo, the traditionally preferred approach for high-dimensional UQ, IRUQ with a comparable cost generally gives much more accurate solutions even for high-dimensional problems, and even when the dimension reduction is not exactly sufficient. Theoretically, IRUQ is proved to converge twice as fast as the approach it uses to seek the SDR subspace. For example, while a sliced inverse regression method converges to the SDR subspace at the rate of O(n^{-1/2}), the corresponding IRUQ converges at O(n^{-1}). IRUQ also provides several desired conveniences in practice. It is non-intrusive, requiring only a simulator to generate realizations of the QoI, and there is no need to compute the high-dimensional gradient of the QoI. Finally, error bars can be derived for the estimation results reported by IRUQ.
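
    The core of the approach is the inverse regression step. Below is a hedged numpy sketch of sliced inverse regression (SIR) recovering a one-dimensional SDR subspace on a toy 20-dimensional model; the helper name, toy model, and all settings are invented for illustration and are not the authors' code.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=2):
    """Estimate sufficient-dimension-reduction directions by sliced
    inverse regression. Returns columns spanning the estimated SDR
    subspace, mapped back to the original (unwhitened) scale."""
    n, p = X.shape
    mu = X.mean(0)
    cov = np.cov(X, rowvar=False)
    L = np.linalg.cholesky(np.linalg.inv(cov))   # whitening: Z = (X-mu) L
    Z = (X - mu) @ L
    order = np.argsort(y)                        # slice on the response
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(0)                       # slice mean of Z
        M += len(idx) / n * np.outer(m, m)
    _, vecs = np.linalg.eigh(M)                  # leading eigenvectors span
    return L @ vecs[:, -n_dirs:]                 # the SDR subspace

# Toy 20-dimensional model whose QoI depends on a 1D subspace only
rng = np.random.default_rng(1)
X = rng.standard_normal((5000, 20))
beta = np.zeros(20); beta[0] = 1.0
y = np.sin(X @ beta) + 0.01 * rng.standard_normal(5000)
B = sir_directions(X, y, n_dirs=1)
print(np.abs(B[:, 0] / np.linalg.norm(B[:, 0])))  # ~ unit vector e_1
```

    Once the subspace is found, a response surface (e.g. a polynomial chaos expansion) is fitted to the QoI as a function of the projected coordinates X @ B.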

  1. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and builds a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral counting is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with better dynamic range.
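
    freeQuant's exact algorithms are not reproduced here, but the flavor of spectral-count quantification that combines counts with protein sequence length can be sketched with the common NSAF measure. The protein names and counts below are hypothetical.

```python
# Normalised spectral abundance factor (NSAF): a common way to combine
# MS/MS spectral counts with protein sequence length, in the spirit of
# (but not identical to) the counting used by freeQuant.
proteins = {            # hypothetical: protein -> (spectral count, length)
    "ATP5A1": (152, 553),
    "NDUFS1": (87, 727),
    "COX4I1": (33, 169),
}

saf = {p: count / length for p, (count, length) in proteins.items()}
total = sum(saf.values())
nsaf = {p: v / total for p, v in saf.items()}

for p, v in sorted(nsaf.items(), key=lambda kv: -kv[1]):
    print(f"{p}: NSAF = {v:.3f}")
```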

  2. Quantification of Fibrosis and Osteosclerosis in Myeloproliferative Neoplasms: A Computer-Assisted Image Study

    PubMed Central

    Teman, Carolin J.; Wilson, Andrew R.; Perkins, Sherrie L.; Hickman, Kimberly; Prchal, Josef T.; Salama, Mohamed E.

    2010-01-01

    Evaluation of bone marrow fibrosis and osteosclerosis in myeloproliferative neoplasms (MPN) is subject to interobserver inconsistency. Performance data for currently utilized fibrosis grading systems are lacking, and classification scales for osteosclerosis do not exist. Digital imaging can serve as a quantification method for fibrosis and osteosclerosis. We used digital imaging techniques for trabecular area assessment and reticulin-fiber quantification. Patients with all Philadelphia-negative MPN subtypes had higher trabecular volume than controls (p ≤ 0.0015). Results suggest that the degree of osteosclerosis helps differentiate primary myelofibrosis from other MPN. Numerical quantification of fibrosis correlated highly with subjective scores, and interobserver correlation was satisfactory. Digital imaging provides accurate quantification of osteosclerosis and fibrosis. PMID:20122729

  3. Single cell genomic quantification by non-fluorescence nonlinear microscopy

    NASA Astrophysics Data System (ADS)

    Kota, Divya; Liu, Jing

    2017-02-01

    Human epidermal growth factor receptor 2 (Her2) is a gene which plays a major role in breast cancer development. The quantification of Her2 expression in single cells is limited by several drawbacks of existing fluorescence-based single-molecule techniques, such as low signal-to-noise ratio (SNR), strong autofluorescence, and background signals from biological components. For rigorous genomic quantification, a robust method of orthogonal detection is highly desirable, and we demonstrated it with two non-fluorescent imaging techniques: transient absorption microscopy (TAM) and second harmonic generation (SHG). In TAM, gold nanoparticles (AuNPs) are chosen as orthogonal probes for the detection of single molecules, which gives background-free quantification of single mRNA transcripts. In SHG, emission from barium titanium oxide (BTO) nanoprobes was demonstrated, which allows stable signal beyond the autofluorescence window. Her2 mRNA was specifically labeled with nanoprobes conjugated with antibodies or oligonucleotides and quantified at single-copy sensitivity in cancer cells and tissues. Furthermore, a non-fluorescent super-resolution concept, termed second harmonic super-resolution microscopy (SHaSM), was proposed to quantify individual Her2 transcripts in cancer cells beyond the diffraction limit. These non-fluorescent imaging modalities will provide new dimensions in biomarker quantification at single-molecule sensitivity in turbid biological samples, offering a strong cross-platform strategy for clinical monitoring at single-cell resolution.

  4. Reference tissue quantification of DCE-MRI data without a contrast agent calibration

    NASA Astrophysics Data System (ADS)

    Walker-Samuel, Simon; Leach, Martin O.; Collins, David J.

    2007-02-01

    The quantification of dynamic contrast-enhanced (DCE) MRI data conventionally requires a conversion from signal intensity to contrast agent concentration by measuring a change in the tissue longitudinal relaxation rate, R1. In this paper, it is shown that the use of a spoiled gradient-echo acquisition sequence (optimized so that signal intensity scales linearly with contrast agent concentration) in conjunction with a reference tissue-derived vascular input function (VIF) avoids the need for the conversion to Gd-DTPA concentration. This study evaluates how to optimize such sequences and which dynamic time-series parameters are most suitable for this type of analysis. It is shown that signal difference and relative enhancement provide useful alternatives when full contrast agent quantification cannot be achieved, but that pharmacokinetic parameters derived from both contain sources of error (such as those caused by differences between reference tissue and region-of-interest proton density and native T1 values). It is shown in a rectal cancer study that these sources of uncertainty are smaller when using signal difference than when using relative enhancement (15 ± 4% compared with 33 ± 4%). Both of these uncertainties are of the order of those associated with the conversion to Gd-DTPA concentration, according to literature estimates.
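
    For reference, the two dynamic time-series parameters compared in the study are conventionally defined as below, with S(t) the dynamic signal intensity and S_0 the mean pre-contrast baseline signal (standard definitions; the notation is assumed here, not quoted from the paper):

```latex
% SD: signal difference, RE: relative enhancement
\[
  \mathrm{SD}(t) = S(t) - S_0,
  \qquad
  \mathrm{RE}(t) = \frac{S(t) - S_0}{S_0}
\]
```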

  5. Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.

    PubMed

    Mujika, Iñigo

    2017-04-01

    Training quantification is fundamental to evaluating an endurance athlete's responses to training loads, ensuring an adequate stress/recovery balance, and determining the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Generally used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include speed measurement and the measurement of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, the rating of perceived exertion stands out because of its wide use. Concurrent assessment of the various quantification methods allows researchers and practitioners to evaluate the stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.
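
    As a concrete instance of a subjective internal-load method of the kind reviewed above, the sketch below computes Foster's session-RPE load together with weekly monotony and strain; the week of sessions is hypothetical.

```python
import statistics

# One week of hypothetical sessions: (duration in minutes, session RPE 0-10)
week = [(90, 6), (60, 4), (120, 7), (45, 3), (100, 6), (30, 2), (0, 0)]

loads = [dur * rpe for dur, rpe in week]   # session load in arbitrary units
weekly_load = sum(loads)
# Monotony: mean daily load over its day-to-day standard deviation
monotony = statistics.mean(loads) / statistics.pstdev(loads)
strain = weekly_load * monotony

print(f"weekly load = {weekly_load} AU")
print(f"monotony    = {monotony:.2f}")
print(f"strain      = {strain:.0f} AU")
```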

  6. Detection and quantification of extracellular microRNAs in murine biofluids

    PubMed Central

    2014-01-01

    Background MicroRNAs (miRNAs) are short RNA molecules which regulate gene expression in eukaryotic cells, and are abundant and stable in biofluids such as blood serum and plasma. As such, there has been heightened interest in the utility of extracellular miRNAs as minimally invasive biomarkers for diagnosis and monitoring of a wide range of human pathologies. However, quantification of extracellular miRNAs is subject to a number of specific challenges, including the relatively low RNA content of biofluids, the possibility of contamination with serum proteins (including RNases and PCR inhibitors), hemolysis, platelet contamination/activation, a lack of well-established reference miRNAs and the biochemical properties of miRNAs themselves. Protocols for the detection and quantification of miRNAs in biofluids are therefore of high interest. Results The following protocol was validated by quantifying miRNA abundance in C57 (wild-type) and dystrophin-deficient (mdx) mice. Important differences in miRNA abundance were observed depending on whether blood was taken from the jugular or tail vein. Furthermore, efficiency of miRNA recovery was reduced when sample volumes greater than 50 μl were used. Conclusions Here we describe robust and novel procedures to harvest murine serum/plasma, extract biofluid RNA, amplify specific miRNAs by RT-qPCR and analyze the resulting data, enabling the determination of relative and absolute miRNA abundance in extracellular biofluids with high accuracy, specificity and sensitivity. PMID:24629058
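
    For the relative-quantification step, a common analysis of RT-qPCR data such as this is the 2^-ΔΔCt (Livak) method, sketched below. The miRNA names, the choice of spike-in reference, and all Ct values are hypothetical, and (as the abstract cautions) the choice of reference miRNA is itself nontrivial.

```python
# Relative miRNA abundance by the 2^-ddCt (Livak) method.
# Hypothetical Ct means; a cel-miR-39 spike-in serves as the reference.
ct = {
    ("mdx", "miR-206"): 24.1, ("mdx", "miR-39"): 19.0,
    ("wt",  "miR-206"): 28.6, ("wt",  "miR-39"): 19.2,
}

dct_mdx = ct[("mdx", "miR-206")] - ct[("mdx", "miR-39")]   # normalise to reference
dct_wt  = ct[("wt",  "miR-206")] - ct[("wt",  "miR-39")]
fold_change = 2 ** -(dct_mdx - dct_wt)                     # ddCt -> fold change

print(f"miR-206 fold change (mdx vs wild type): {fold_change:.1f}x")
```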

  7. Flavor structure in F-theory compactifications

    NASA Astrophysics Data System (ADS)

    Hayashi, Hirotaka; Kawano, Teruhiko; Tsuchiya, Yoichi; Watari, Taizan

    2010-08-01

    F-theory is one of the frameworks in string theory in which supersymmetric grand unification is accommodated and all the Yukawa couplings and Majorana masses of right-handed neutrinos are generated. Yukawa couplings of charged fermions are generated at codimension-3 singularities, and the contribution from a given singularity point is known to be approximately rank 1. Thus, the approximate rank of the Yukawa matrices in the low-energy effective theory of a generic F-theory compactification is the smaller of the number of generations N_gen = 3 and the number of singularity points of certain types. If there were a geometry with only one E_6-type point and one D_6-type point over the entire 7-brane for SU(5) gauge fields, F-theory compactified on such a geometry would reproduce the approximately rank-1 Yukawa matrices of the real world. We found, however, that there is no such geometry. Thus, it is a problem how to generate hierarchical Yukawa eigenvalues in F-theory compactifications. A solution in the literature so far is to take an appropriate factorization limit. In this article, we propose an alternative solution to the hierarchical structure problem (which requires tuning some parameters) by studying how zero-mode wavefunctions depend on complex structure moduli. In this solution, the N_gen × N_gen CKM matrix is predicted to have only N_gen entries of order unity without extra tuning of parameters, and lepton flavor anarchy is predicted for the lepton mixing matrix. The hierarchy among the Yukawa eigenvalues of the down-type and charged-lepton sectors is predicted to be smaller than that of the up-type sector, and the Majorana masses of left-handed neutrinos generated through the see-saw mechanism have a small hierarchy. All of these predictions agree with what we observe in the real world. We also obtained a precise description of zero-mode wavefunctions near the E_6-type singularity points, where the up-type Yukawa couplings are generated.

  8. Accurate frequency domain measurement of the best linear time-invariant approximation of linear time-periodic systems including the quantification of the time-periodic distortions

    NASA Astrophysics Data System (ADS)

    Louarroudi, E.; Pintelon, R.; Lataire, J.

    2014-10-01

    Time-periodic (TP) phenomena occurring, for instance, in wind turbines, helicopters, anisotropic shaft-bearing systems, and cardiovascular/respiratory systems are often not addressed when classical frequency response function (FRF) measurements are performed. As the traditional FRF concept is based on linear time-invariant (LTI) system theory, it is only approximately valid for systems with varying dynamics. Accordingly, the quantification of any deviation from this ideal LTI framework is more than welcome. The "measure of deviation" allows us to define the notion of the best LTI (BLTI) approximation, which yields the best (in the mean-square sense) LTI description of a linear time-periodic (LTP) system. By taking the TP effects into consideration, it is shown in this paper that the variability of the BLTI measurement can be reduced significantly compared with that of classical FRF estimators. From a single experiment, the proposed identification methods can handle (non-)linear time-periodic [(N)LTP] systems in open loop with a quantification of (i) the noise and/or the NL distortions, (ii) the TP distortions and (iii) the transient (leakage) errors. In addition, a geometrical interpretation of the BLTI approximation is provided, leading to a framework called vector FRF analysis. The theory presented is supported by numerical simulations as well as real measurements mimicking the well-known mechanical Mathieu oscillator.
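
    The BLTI approximation generalizes the classical FRF that the abstract contrasts it with. As a baseline, here is a sketch of the standard H1 FRF estimator on a toy time-invariant second-order system using scipy; the plant, noise level, and processing settings are invented for illustration.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 1000.0
n = 100000
t = np.arange(n) / fs
u = rng.standard_normal(n)                    # broadband excitation

# Toy LTI plant standing in for the device under test (unit DC gain,
# natural frequency 40 rad/s ~ 6.4 Hz, damping ratio 0.1)
plant = signal.TransferFunction([1600.0], [1.0, 8.0, 1600.0])
_, y, _ = signal.lsim(plant, u, t)
y += 0.01 * rng.standard_normal(n)            # measurement noise

# Classical H1 estimate of the FRF: Suy(f) / Suu(f)
f, Suu = signal.welch(u, fs=fs, nperseg=4096)
f, Suy = signal.csd(u, y, fs=fs, nperseg=4096)
H1 = Suy / Suu
print(f"resonance near {f[np.argmax(np.abs(H1))]:.1f} Hz (expected ~6.4 Hz)")
```

    For a time-periodic plant, this averaged estimator would smear the TP effects; quantifying that deviation is exactly what the BLTI framework addresses.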

  9. Identification and quantification of selected chemicals in laser pyrolysis products of mammalian tissues

    NASA Astrophysics Data System (ADS)

    Spleiss, Martin; Weber, Lothar W.; Meier, Thomas H.; Treffler, Bernd

    1995-01-01

    Liver and muscle tissue were irradiated with a surgical CO2 laser. The prefiltered fumes were adsorbed on different sorbents (activated charcoal of NIOSH type and Carbotrap) and desorbed with different solvents (carbon disulphide and acetone). Analysis was performed by gas chromatography/mass spectrometry. An updated list of identified substances is given. Typical Maillard reaction products, as found in warmed-over flavour, such as aldehydes, aromatics, and heterocyclic and sulphur compounds, were detected. The quantification of some toxicologically relevant substances is presented. The amounts of these substances are given in relation to the laser parameters and the different tissues for further toxicological assessment.

  10. Intrusive Method for Uncertainty Quantification in a Multiphase Flow Solver

    NASA Astrophysics Data System (ADS)

    Turnquist, Brian; Owkes, Mark

    2016-11-01

    Uncertainty quantification (UQ) is a necessary, interesting, and often neglected aspect of fluid flow simulations. To determine the significance of uncertain initial and boundary conditions, a multiphase flow solver is being created which extends a single-phase, intrusive, polynomial chaos scheme to multiphase flows. Reliably estimating the impact of input uncertainty on design criteria can help identify and minimize unwanted variability in critical areas, and has the potential to help advance knowledge of atomizing jets, jet engines, pharmaceuticals, and food processing. Use of an intrusive polynomial chaos method has been shown to significantly reduce computational cost compared with non-intrusive methods such as Monte Carlo. The method requires transforming the model equations into a weak form through substitution of stochastic (random) variables. Ultimately, the model deploys a stochastic Navier-Stokes equation, a stochastic conservative level set approach including reinitialization, as well as stochastic normals and curvature. By implementing these approaches together in one framework, basic problems may be investigated which shed light on model expansion, uncertainty theory, and fluid flow in general. NSF Grant Number 1511325.
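
    The intrusive polynomial chaos machinery can be shown on a far smaller problem than stochastic Navier-Stokes. The sketch below performs a Galerkin projection of a scalar decay equation du/dt = -k(ξ)u with an uncertain rate k, using probabilists' Hermite polynomials; all parameters are illustrative and this is not the solver described above.

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

P = 5                                   # highest PCE mode
# Uncertain decay rate k(xi) = 3 + 0.5*xi with xi ~ N(0,1)
k = np.zeros(P + 1); k[0], k[1] = 3.0, 0.5

# Triple products <psi_i psi_j psi_l> via Gauss-Hermite quadrature
x, w = He.hermegauss(30)
w = w / w.sum()                         # normalise to the N(0,1) measure
psi = np.array([He.hermeval(x, np.eye(P + 1)[i]) for i in range(P + 1)])
C = np.einsum("q,iq,jq,lq->ijl", w, psi, psi, psi)
norms = np.array([factorial(i) for i in range(P + 1)], dtype=float)  # <psi_i^2> = i!

# Galerkin projection of du/dt = -k(xi)*u, u(0) = 1:
#   du_i/dt = -(1/<psi_i^2>) * sum_{j,l} C[i,j,l] * k_j * u_l
u = np.zeros(P + 1); u[0] = 1.0
dt = 1e-3
for _ in range(1000):                   # integrate to t = 1 (forward Euler)
    u = u - dt * np.einsum("ijl,j,l->i", C, k, u) / norms

mean = u[0]                             # mean and variance from the modes
var = np.sum(norms[1:] * u[1:] ** 2)
print(f"mean(u(1)) = {mean:.4f} (exact {np.exp(-3 + 0.5**2 / 2):.4f}), var = {var:.2e}")
```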

  11. A note on local BRST cohomology of Yang-Mills type theories with free Abelian factors

    NASA Astrophysics Data System (ADS)

    Barnich, Glenn; Boulanger, Nicolas

    2018-05-01

    We extend previous work on antifield dependent local Becchi-Rouet-Stora-Tyutin (BRST) cohomology for matter coupled gauge theories of Yang-Mills type to the case of gauge groups that involve free Abelian factors. More precisely, we first investigate in a model independent way how the dynamics enters the computation of the cohomology for a general class of Lagrangians in general spacetime dimensions. We then discuss explicit solutions in the case of specific models. Our analysis has implications for the structure of characteristic cohomology and for consistent deformations of the classical models, as well as for divergences/counterterms and for gauge anomalies that may appear during perturbative quantization.

  12. In-Gel Stable-Isotope Labeling (ISIL): a strategy for mass spectrometry-based relative quantification.

    PubMed

    Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C

    2006-01-01

    Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.
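
    Downstream of the labeling chemistry, the LC-MS relative quantification reduces to light/heavy intensity ratios per peptide pair, summarized per protein. A minimal sketch with hypothetical peptide intensities (not the authors' processing pipeline):

```python
import statistics

# Hypothetical LC-MS intensities for light/heavy isotope-labelled peptide
# pairs from one gel band (condition A = light label, condition B = heavy).
pairs = {                  # peptide -> (light intensity, heavy intensity)
    "LVTDLTK":          (8.1e6, 4.2e6),
    "AGFAGDDAPR":       (2.3e7, 1.1e7),
    "SYELPDGQVITIGNER": (5.6e6, 3.0e6),
}

ratios = {pep: light / heavy for pep, (light, heavy) in pairs.items()}
protein_ratio = statistics.median(ratios.values())   # robust per-protein summary
print(f"median light/heavy ratio = {protein_ratio:.2f} (condition A / B)")
```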

  13. Effects of information type on children's interrogative suggestibility: is Theory-of-Mind involved?

    PubMed

    Hünefeldt, Thomas; Rossi-Arnaud, Clelia; Furia, Augusta

    2009-08-01

    This research was aimed at learning more about the different psychological mechanisms underlying children's suggestibility to leading questions, on the one hand, and children's suggestibility to negative feedback, on the other, by distinguishing between interview questions concerning different types of information. Results showed that, unlike the developmental pattern of children's suggestibility to leading questions, the developmental pattern of children's suggestibility to negative feedback differed depending on whether the interview questions concerned external facts (physical states and events) or internal facts (mental states and events). This difference was not manifested in response to questions concerning central versus peripheral facts. Results are interpreted in terms of the hypothesis that children's suggestibility to negative feedback is differently affected by "Theory-of-Mind" abilities than children's suggestibility to leading questions. Further research is needed in order to test this hypothesis.

  14. Program Theory Evaluation: Logic Analysis

    ERIC Educational Resources Information Center

    Brousselle, Astrid; Champagne, Francois

    2011-01-01

    Program theory evaluation, which has grown in use over the past 10 years, assesses whether a program is designed in such a way that it can achieve its intended outcomes. This article describes a particular type of program theory evaluation--logic analysis--that allows us to test the plausibility of a program's theory using scientific knowledge.…

  15. Clinical applications of MS-based protein quantification.

    PubMed

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and are nowadays already commonly used in several areas of routine diagnostics, including therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use remains rare, despite excellent analytical specificity and good sensitivity. Here, we give an overview of current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach are discussed, with a focus on feasibility for routine diagnostic use.

  16. Integrative analysis with ChIP-seq advances the limits of transcript quantification from RNA-seq

    PubMed Central

    Liu, Peng; Sanalkumar, Rajendran; Bresnick, Emery H.; Keleş, Sündüz; Dewey, Colin N.

    2016-01-01

    RNA-seq is currently the technology of choice for global measurement of transcript abundances in cells. Despite its successes, isoform-level quantification remains difficult because short RNA-seq reads are often compatible with multiple alternatively spliced isoforms. Existing methods rely heavily on uniquely mapping reads, which are not available for numerous isoforms that lack regions of unique sequence. To improve quantification accuracy in such difficult cases, we developed a novel computational method, prior-enhanced RSEM (pRSEM), which uses a complementary data type in addition to RNA-seq data. We found that ChIP-seq data of RNA polymerase II and histone modifications were particularly informative in this approach. In qRT-PCR validations, pRSEM was shown to be superior to competing methods in estimating relative isoform abundances within or across conditions. Data-driven simulations suggested that pRSEM has a greatly decreased false-positive rate at the expense of a small increase in false-negative rate. In aggregate, our study demonstrates that pRSEM transforms existing capacity to precisely estimate transcript abundances, especially at the isoform level. PMID:27405803

  17. Quantification and Formalization of Security

    DTIC Science & Technology

    2010-02-01

    [Report front matter; only fragments of the table of contents and abstract survive.] Contents include "Quantification of Information Flow" (§2.3) and "Language Semantics" (§2.4). The surviving text concerns system behavior observed by users holding low clearances; this policy, or a variant of it, is enforced by many programming language-based mechanisms and is illustrated with a particular programming language (while-programs plus probabilistic choice). The model is extended in §2.5 to programs in which

  18. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information-theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.

  19. Constraining f(R) theories with cosmography

    NASA Astrophysics Data System (ADS)

    Anabella Teppa Pannia, Florencia; Esteban Perez Bergliaffa, Santiago

    2013-08-01

    A method to set constraints on the parameters of extended theories of gravitation is presented. It is based on the comparison of two series expansions of any observable that depends on H(z). The first expansion is of the cosmographical type, while the second uses the dependence of H with z furnished by a given type of extended theory. When applied to f(R) theories together with the redshift drift, the method yields limits on the parameters of two examples (the theory of Hu and Sawicki [1], and the exponential gravity introduced by Linder [2]) that are compatible with or more stringent than the existing ones, as well as a limit for a previously unconstrained parameter.

  20. N=2 gauge theories and degenerate fields of Toda theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanno, Shoichi; Matsuo, Yutaka; Shiba, Shotaro

    We discuss the correspondence between degenerate fields of the W_N algebra and punctures of Gaiotto's description of the Seiberg-Witten curve of N=2 superconformal gauge theories. Namely, we find that the type of degenerate fields of the W_N algebra, with null states at level one, is classified by Young diagrams with N boxes, and that the singular behavior of the Seiberg-Witten curve near the puncture agrees with that of W_N generators. We also find how to translate mass parameters of the gauge theory to the momenta of the Toda theory.

  1. Performance of Density Functional Theory Procedures for the Calculation of Proton-Exchange Barriers: Unusual Behavior of M06-Type Functionals.

    PubMed

    Chan, Bun; Gilbert, Andrew T B; Gill, Peter M W; Radom, Leo

    2014-09-09

    We have examined the performance of a variety of density functional theory procedures for the calculation of complexation energies and proton-exchange barriers, with a focus on the Minnesota class of functionals, which are generally robust and show good accuracy. A curious observation is that M05-type and M06-type methods show an atypical decrease in calculated barriers with an increasing proportion of Hartree-Fock exchange. To obtain a clearer picture of the performance of the underlying components of M05-type and M06-type functionals, we have investigated combinations of MPW-type and PBE-type exchange with B95-type and PBE-type correlation procedures. We find that, for the extensive E3 test set, the general performance of the various hybrid-DFT procedures improves in the following order: PBE1-B95 → PBE1-PBE → MPW1-PBE → PW6-B95. As M05-type and M06-type procedures are related to PBE1-B95, it would be of interest to formulate and examine the general performance of an alternative Minnesota DFT method related to PW6-B95.

  2. Exploration of Action Figure Appeals Using Evaluation Grid Method and Quantification Theory Type I

    ERIC Educational Resources Information Center

    Chang, Hua-Cheng; Chen, Hung-Yuan

    2017-01-01

    Contemporary toy is characterized by accelerating social, cultural and technological change. An attractive action figure can grab consumers' attention, influence the latent consuming preference and evoke their pleasure. However, traditional design of action figure is always dependent on designer's opinion, subjective experience and preference. It…

  3. Quantification of mRNA expression by competitive PCR using non-homologous competitors containing a shifted restriction site

    PubMed Central

    Watzinger, Franz; Hörth, Elfriede; Lion, Thomas

    2001-01-01

    Despite the recent introduction of real-time PCR methods, competitive PCR techniques continue to play an important role in nucleic acid quantification because of the significantly lower cost of equipment and consumables. Here we describe a shifted restriction-site competitive PCR (SRS-cPCR) assay based on a modified type of competitor. The competitor fragments are designed to contain a recognition site for a restriction endonuclease that is also present in the target sequence to be quantified, but in a different position. Upon completion of the PCR, the amplicons are digested in the same tube with a single restriction enzyme, without the need to purify PCR products. The generated competitor- and target-specific restriction fragments display different sizes, and can be readily separated by electrophoresis and quantified by image analysis. Suboptimal digestion affects competitor- and target-derived amplicons to the same extent, thus eliminating the problem of incorrect quantification as a result of incomplete digestion of PCR products. We have established optimized conditions for a panel of 20 common restriction endonucleases permitting efficient digestion in PCR buffer. It is possible, therefore, to find a suitable restriction site for competitive PCR in virtually any sequence of interest. The assay presented is inexpensive, widely applicable, and permits reliable and accurate quantification of nucleic acid targets. PMID:11376164
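
    The quantification logic of such a competitive PCR titration can be sketched as follows: the log ratio of target-specific to competitor-specific band intensities is approximately linear in the log of competitor input, and the equivalence point (ratio = 1) estimates the target amount. All band intensities below are hypothetical.

```python
import numpy as np

# Hypothetical titration: fixed target cDNA co-amplified with a dilution
# series of the non-homologous competitor, digested with the restriction
# enzyme, electrophoresed, and band intensities read by image analysis.
competitor  = np.array([1e2, 1e3, 1e4, 1e5, 1e6])    # molecules added
target_band = np.array([950.0, 820.0, 430.0, 95.0, 12.0])
compet_band = np.array([14.0, 90.0, 410.0, 880.0, 990.0])

logx = np.log10(competitor)
logr = np.log10(target_band / compet_band)

# log(ratio) is linear in log(competitor); ratio == 1 at equivalence
slope, intercept = np.polyfit(logx, logr, 1)
equivalence = 10 ** (-intercept / slope)
print(f"estimated target: ~{equivalence:,.0f} molecules")
```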

  4. Inverse regression-based uncertainty quantification algorithms for high-dimensional models: Theory and practice

    NASA Astrophysics Data System (ADS)

    Li, Weixuan; Lin, Guang; Li, Bing

    2016-09-01

    Many uncertainty quantification (UQ) approaches suffer from the curse of dimensionality, that is, their computational costs become intractable for problems involving a large number of uncertainty parameters. In these situations, the classic Monte Carlo method often remains the preferred choice because its convergence rate O(n^{-1/2}), where n is the required number of model simulations, does not depend on the dimension of the problem. However, many high-dimensional UQ problems are intrinsically low-dimensional, because the variation of the quantity of interest (QoI) is often caused by only a few latent parameters varying within a low-dimensional subspace, known as the sufficient dimension reduction (SDR) subspace in the statistics literature. Motivated by this observation, we propose two inverse regression-based UQ algorithms (IRUQ) for high-dimensional problems. Both algorithms use inverse regression to convert the original high-dimensional problem to a low-dimensional one, which is then efficiently solved by building a response surface for the reduced model, for example via the polynomial chaos expansion. The first algorithm, for situations in which an exact SDR subspace exists, is proved to converge at the rate O(n^{-1}), hence much faster than MC. The second algorithm, which does not require an exact SDR subspace, employs the reduced model as a control variate to reduce the error of the MC estimate. The accuracy gain can still be significant, depending on how well the reduced model approximates the original high-dimensional one. IRUQ also provides several additional practical advantages: it is non-intrusive; it does not require computing the high-dimensional gradient of the QoI; and it reports an error bar so the user knows how reliable the result is.

  5. Quantification is Neither Necessary Nor Sufficient for Measurement

    NASA Astrophysics Data System (ADS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-09-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

  6. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    PubMed

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.

  7. Recent advances in stable isotope labeling based techniques for proteome relative quantification.

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2014-10-24

    The large-scale relative quantification of all proteins expressed in biological samples under different states is of great importance for discovering proteins with important biological functions, as well as for screening disease-related biomarkers and drug targets. Therefore, the accurate quantification of proteins at the proteome level has become one of the key issues in protein science. Herein, recent advances in stable isotope labeling-based techniques for proteome relative quantification are reviewed, from the aspects of metabolic labeling, chemical labeling, and enzyme-catalyzed labeling. Furthermore, future research directions in this field are discussed.

  8. The Scientific Status of Learning Styles Theories

    ERIC Educational Resources Information Center

    Willingham, Daniel T.; Hughes, Elizabeth M.; Dobolyi, David G.

    2015-01-01

    Theories of learning styles suggest that individuals think and learn best in different ways. These are not differences of ability but rather preferences for processing certain types of information or for processing information in certain types of way. If accurate, learning styles theories could have important implications for instruction because…

  9. In-vivo segmentation and quantification of coronary lesions by optical coherence tomography images for a lesion type definition and stenosis grading.

    PubMed

    Celi, Simona; Berti, Sergio

    2014-10-01

    Optical coherence tomography (OCT) is a catheter-based medical imaging technique that produces cross-sectional images of blood vessels. This technique is particularly useful for studying coronary atherosclerosis. In this paper, we present a new framework that allows segmentation and quantification of OCT images of coronary arteries to define the plaque type and stenosis grading. These analyses are usually carried out online on the OCT workstation, where measurement is mainly operator-dependent and mouse-based. The aim of this program is to simplify and improve the processing of OCT images for morphometric investigations and to present a fast procedure for obtaining 3D geometrical models that can also be used for external purposes, such as finite element simulations. The main phases of our toolbox are lumen segmentation and the identification of the main tissues in the artery wall. We validated the proposed method against identification and segmentation performed manually by expert OCT readers. The method was evaluated on ten datasets from clinical routine, and the validation was performed on 210 images randomly extracted from the pullbacks. Our results show that automated segmentation of the vessel and of the tissue components is possible offline with a precision that is comparable to manual segmentation for the tissue components and to the proprietary OCT console for the lumen segmentation. Several OCT sections have been processed to provide clinical outcomes.

  10. Recurrence quantification analysis of electrically evoked surface EMG signal.

    PubMed

    Liu, Chunling; Wang, Xu

    2005-01-01

    The recurrence plot is a useful tool in time-series analysis, in particular for detecting unstable periodic orbits embedded in a chaotic dynamical system. This paper introduces the structure of the recurrence plot and how it is constructed, and then defines how the recurrence plot is quantified. One possible application of the recurrence quantification analysis (RQA) strategy, the analysis of electrically evoked surface EMG, is presented. The results show that percent determinism increases with stimulation intensity.
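
    A compact sketch of a recurrence plot and two standard RQA measures (recurrence rate and percent determinism) on a toy periodic signal is given below; the embedding and threshold choices are illustrative, and the paper's EMG-specific settings are not reproduced.

```python
import numpy as np

def recurrence_matrix(x, dim=3, tau=2, eps=0.2):
    """Recurrence plot of a scalar series after time-delay embedding."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dist < eps).astype(int)

def diag_runs(diag, lmin):
    """Total length of runs of ones that are at least lmin long."""
    total, run = 0, 0
    for v in list(diag) + [0]:           # trailing 0 flushes the last run
        if v:
            run += 1
        else:
            if run >= lmin:
                total += run
            run = 0
    return total

def rqa(R, lmin=2):
    """Recurrence rate (%REC) and percent determinism (%DET)."""
    n = len(R)
    off = ~np.eye(n, dtype=bool)         # exclude the line of identity
    rec = int(R[off].sum())
    det = sum(diag_runs(np.diag(R, d), lmin) + diag_runs(np.diag(R, -d), lmin)
              for d in range(1, n))      # recurrent points on diagonal lines
    return rec / off.sum(), det / max(rec, 1)

R = recurrence_matrix(np.sin(np.linspace(0, 20 * np.pi, 400)))
rr, det = rqa(R)
print(f"%REC = {rr:.1%}, %DET = {det:.1%}")  # periodic signal -> %DET ~ 100%

# For evoked surface EMG, one would embed the EMG epoch recorded at each
# stimulation intensity and track how %DET changes with intensity.
```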

  11. Histological quantification of brain tissue inflammatory cell infiltration after focal cerebral infarction: a systematic review.

    PubMed

    Russek, Natanya S; Jensen, Matthew B

    2014-03-01

    Ischemic stroke is a leading cause of death and disability, and current treatments to limit tissue injury and improve recovery are limited. Cerebral infarction is accompanied by intense brain tissue inflammation involving many inflammatory cell types that may cause both negative and positive effects on outcomes. Many potential neuroprotective and neurorestorative treatments may affect, and be affected by, this inflammatory cell infiltration, so that accurate quantification of this tissue response is needed. We performed a systematic review of histological methods to quantify brain tissue inflammatory cell infiltration after cerebral infarction. We found reports of multiple techniques to quantify different inflammatory cell types. We found no direct comparison studies and conclude that more research is needed to optimize the assessment of this important stroke outcome.

  12. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  13. It is the Theory Which Decides What We Can Observe (Einstein)

    NASA Astrophysics Data System (ADS)

    Filk, Thomas

    In this chapter I will give examples of three types of contextuality: theory as context, a theory of measurement as context, and environmental and internal conditions as context. In particular, I will argue that, depending on which theory of measurements we attribute to Bohmian mechanics, this theory may be called a classical theory or a quantum theory. Furthermore, I will show that for neural networks one can define in a natural way two different theories of measurements, which can be compared with scanner-type measurements on the one hand and psychological experiments on the other. The latter theory of measurements for neural networks leads to non-commutativity and even quantum-like contextuality. It is shown that very simple neural networks can violate Bell-type inequalities.

  14. K-theoretic aspects of string theory dualities

    NASA Astrophysics Data System (ADS)

    Mendez-Diez, Stefan Milo

    String theory is a physical field theory in which point particles are replaced by 1-manifolds propagating in time, called strings. The 2-manifold representing the time evolution of a string is called the string worldsheet. Strings can be either closed (meaning their worldsheets are closed surfaces) or open (meaning their worldsheets have boundary). A D-brane is a submanifold of the spacetime manifold on which string endpoints are constrained to lie. There are five different string theories that have supersymmetry, and they are all related by various dualities. This dissertation reviews how D-branes are classified by K-theory. We then explore the K-theoretic aspects of a hypothesized duality between the type I theory compactified on a 4-torus and the type IIA theory compactified on a K3 surface, by looking at a certain blow-down of the singular limit of K3. The dissertation concludes by classifying D-branes on the type II orientifold T^n/Z_2 when the Z_2 action is multiplication by -1 and the H-flux is trivial. We find that classifying D-branes on the singular limit of K3, T^4/Z_2, by equivariant K-theory agrees with the classification of D-branes on a smooth K3 surface by ordinary K-theory.

  15. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    PubMed

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.

  16. Competitive Reporter Monitored Amplification (CMA) - Quantification of Molecular Targets by Real Time Monitoring of Competitive Reporter Hybridization

    PubMed Central

    Ullrich, Thomas; Ermantraut, Eugen; Schulz, Torsten; Steinmetzer, Katrin

    2012-01-01

    Background State of the art molecular diagnostic tests are based on the sensitive detection and quantification of nucleic acids. However, currently established diagnostic tests are characterized by elaborate and expensive technical solutions hindering the development of simple, affordable and compact point-of-care molecular tests. Methodology and Principal Findings The described competitive reporter monitored amplification allows the simultaneous amplification and quantification of multiple nucleic acid targets by polymerase chain reaction. Target quantification is accomplished by real-time detection of amplified nucleic acids utilizing a capture probe array and specific reporter probes. The reporter probes are fluorescently labeled oligonucleotides that are complementary to the respective capture probes on the array and to the respective sites of the target nucleic acids in solution. Capture probes and amplified target compete for reporter probes. Increasing amplicon concentration leads to decreased fluorescence signal at the respective capture probe position on the array which is measured after each cycle of amplification. In order to observe reporter probe hybridization in real-time without any additional washing steps, we have developed a mechanical fluorescence background displacement technique. Conclusions and Significance The system presented in this paper enables simultaneous detection and quantification of multiple targets. Moreover, the presented fluorescence background displacement technique provides a generic solution for real time monitoring of binding events of fluorescently labelled ligands to surface immobilized probes. With the model assay for the detection of human immunodeficiency virus type 1 and 2 (HIV 1/2), we have been able to observe the amplification kinetics of five targets simultaneously and accommodate two additional hybridization controls with a simple instrument set-up. The ability to accommodate multiple controls and targets into a

  17. New structural insights into the molecular deciphering of mycobacterial lipoglycan binding to C-type lectins: lipoarabinomannan glycoform characterization and quantification by capillary electrophoresis at the subnanomole level.

    PubMed

    Nigou, J; Vercellone, A; Puzo, G

    2000-06-23

    Lipoarabinomannans are key molecules of the mycobacterial envelopes involved in many steps of tuberculosis immunopathogenesis. Several of the biological activities of lipoarabinomannans are mediated by their ability to bind human C-type lectins, such as the macrophage mannose receptor, the mannose-binding protein and the surfactant proteins A and D. The lipoarabinomannan mannooligosaccharide caps have been demonstrated to be involved in the binding to the lectin carbohydrate recognition domains. We report an original analytical approach, based on capillary electrophoresis monitored by laser-induced fluorescence, allowing the absolute quantification, in nanomole quantities of lipoarabinomannan, of the number of mannooligosaccharide units per lipoarabinomannan molecule. Moreover, this analytical approach was successful for the glycosidic linkage determination of the mannooligosaccharide motifs and has been applied to the comparative analysis of parietal and cellular lipoarabinomannans of Mycobacterium bovis BCG and Mycobacterium tuberculosis H37Rv, H37Ra and Erdman strains. Significant differences were observed in the amounts of the various mannooligosaccharide units between lipoarabinomannans of different strains and between parietal and cellular lipoarabinomannans of the same strain. Nevertheless, no relationship was found between the number of mannooligosaccharide caps and the virulence of the corresponding strain. The results of the present study should help us to gain more understanding of the molecular basis of lipoarabinomannan discrimination in the process of binding to C-type lectins.

  18. Estimating phosphorus loss in runoff from manure and fertilizer for a phosphorus loss quantification tool.

    PubMed

    Vadas, P A; Good, L W; Moore, P A; Widman, N

    2009-01-01

    Nonpoint-source pollution of fresh waters by P is a concern because it contributes to accelerated eutrophication. Given the state of the science concerning agricultural P transport, a simple tool to quantify annual, field-scale P loss is a realistic goal. We developed new methods to predict annual dissolved P loss in runoff from surface-applied manures and fertilizers and validated the methods with data from 21 published field studies. We incorporated these manure and fertilizer P runoff loss methods into an annual, field-scale P loss quantification tool that estimates dissolved and particulate P loss in runoff from soil, manure, fertilizer, and eroded sediment. We validated the P loss tool using independent data from 28 studies that monitored P loss in runoff from a variety of agricultural land uses for at least 1 yr. Results demonstrated (i) that our new methods to estimate P loss from surface manure and fertilizer are an improvement over methods used in existing Indexes, and (ii) that it was possible to reliably quantify annual dissolved, sediment, and total P loss in runoff using relatively simple methods and readily available inputs. Thus, a P loss quantification tool that does not require greater degrees of complexity or input data than existing P Indexes could accurately predict P loss across a variety of management and fertilization practices, soil types, climates, and geographic locations. However, estimates of runoff and erosion are still needed that are accurate to a level appropriate for the intended use of the quantification tool.

  19. Information transduction capacity reduces the uncertainties in annotation-free isoform discovery and quantification

    PubMed Central

    Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong

    2017-01-01

    Abstract The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in the inference of complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of information theory. The experimental results obtained from the analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. Our algorithm is an open-source implementation of MaxInfo. PMID:28911101

  20. Improved LC-MS/MS method for the quantification of hepcidin-25 in clinical samples.

    PubMed

    Abbas, Ioana M; Hoffmann, Holger; Montes-Bayón, María; Weller, Michael G

    2018-06-01

    Mass spectrometry-based methods play a crucial role in the quantification of the main iron metabolism regulator hepcidin by singling out the bioactive 25-residue peptide from the other naturally occurring N-truncated isoforms (hepcidin-20, -22, -24), which seem to be inactive in iron homeostasis. However, several difficulties arise in the MS analysis of hepcidin due to the "sticky" character of the peptide and the lack of suitable standards. Here, we propose the use of amino- and fluoro-silanized autosampler vials to reduce hepcidin interaction with laboratory glassware surfaces, after testing several types of vials for the preparation of stock solutions and serum samples for isotope dilution liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS). Furthermore, we have investigated two sample preparation strategies and two chromatographic separation conditions with the aim of developing an LC-MS/MS method for the sensitive and reliable quantification of hepcidin-25 in serum samples. A chromatographic separation based on the usual acidic mobile phases was compared with a novel approach involving the separation of hepcidin-25 with solvents at high pH containing 0.1% ammonia. Both methods were applied to clinical samples in an intra-laboratory comparison of two LC-MS/MS methods using the same hepcidin-25 calibrators, with good correlation of the results. Finally, we recommend an LC-MS/MS-based quantification method with a dynamic range of 0.5-40 μg/L for the assessment of hepcidin-25 in human serum that uses TFA-based mobile phases and silanized glass vials. Graphical abstract: Structure of hepcidin-25 (Protein Data Bank, PDB ID 2KEF).

  1. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics.

    PubMed

    Möller, Birgit; Poeschl, Yvonne; Plötner, Romina; Bürstenbinder, Katharina

    2017-11-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets.

  2. Disease quantification on PET/CT images without object delineation

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.

    2017-03-01

    The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty in establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions, where establishing ground truth becomes questionable, and smaller deviations for larger lesions, where ground-truth setup is more reliable. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and to other functional forms of disease maps.
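
    A toy illustration of the recognition-without-delineation idea (not the paper's actual disease-map estimation, whose parameters are learned from training data): treat every voxel in a loosely placed region as carrying a fractional disease severity derived from its uptake, and sum, so no lesion boundary is ever drawn. All constants below are invented.

      import numpy as np

      rng = np.random.default_rng(0)
      pet = rng.normal(1.0, 0.1, size=(20, 20, 20))   # background activity
      pet[8:12, 8:12, 8:12] += 4.0                    # embedded lesion, no sharp edge

      roi = pet[5:15, 5:15, 5:15]   # recognition result: a loose bounding region
      background = 1.0              # assumed background uptake (training-derived)
      lesion_peak = 5.0             # assumed "pure disease" uptake level

      # Map uptake to a 0..1 disease severity per voxel, then sum (voxel units)
      severity = np.clip((roi - background) / (lesion_peak - background), 0.0, 1.0)
      print(f"estimated disease burden: {severity.sum():.1f} voxels")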

  3. Type II universal spacetimes

    NASA Astrophysics Data System (ADS)

    Hervik, S.; Málek, T.; Pravda, V.; Pravdová, A.

    2015-12-01

    We study type II universal metrics of the Lorentzian signature. These metrics simultaneously solve vacuum field equations of all theories of gravitation with the Lagrangian being a polynomial curvature invariant constructed from the metric, the Riemann tensor and its covariant derivatives of an arbitrary order. We provide examples of type II universal metrics for all composite number dimensions. On the other hand, we have no examples for prime number dimensions and we prove the non-existence of type II universal spacetimes in five dimensions. We also present type II vacuum solutions of selected classes of gravitational theories, such as Lovelock, quadratic and L({{Riemann}}) gravities.

  4. Enhanced gauge symmetry in type II and F-theory compactifications: Dynkin diagrams from polyhedra

    NASA Astrophysics Data System (ADS)

    Perevalov, Eugene; Skarke, Harald

    1997-02-01

    We explain the observation by Candelas and Font that the Dynkin diagrams of non-abelian gauge groups occurring in type IIA and F-theory can be read off from the polyhedron Δ* that provides the toric description of the Calabi-Yau manifold used for compactification. We show how the intersection pattern of toric divisors corresponding to the degeneration of elliptic fibers follows the ADE classification of singularities and the Kodaira classification of degenerations. We treat in detail the cases of elliptic K3 surfaces and K3 fibered threefolds where the fiber is again elliptic. We also explain how even the occurrence of monodromy and non-simply laced groups in the latter case is visible in the toric picture. These methods also work in the fourfold case.

  5. Quantification of fossil organic matter in contaminated sediments from an industrial watershed: validation of the quantitative multimolecular approach by radiocarbon analysis.

    PubMed

    Jeanneau, Laurent; Faure, Pierre

    2010-09-01

    The quantitative multimolecular approach (QMA), based on an exhaustive identification and quantification of molecules from the extractable organic matter (EOM), has recently been developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) the comparison between natural and anthropogenic inputs, (ii) the comparison between modern and fossil organic matter and (iii) the differentiation between several anthropogenic sources. However, QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent only a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses were performed on organic extracts and decarbonated sediments. This analysis allows (i) the differentiation between modern biomass (contemporary (14)C) and fossil organic matter ((14)C-free) and (ii) the calculation of the modern carbon percentage (PMC). At the confluence of the Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. This highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments. Copyright 2010 Elsevier B.V. All rights reserved.
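
    The scaling-up argument rests on simple two-end-member mixing: modern biomass carries the contemporary (14)C level while fossil organic matter is (14)C-free, so PMC translates directly into a fossil-carbon fraction. A minimal sketch (contemporary biomass is taken as PMC ≈ 100; post-bomb samples can slightly exceed this):

      def fossil_fraction(pmc_sample, pmc_modern=100.0):
          """Fraction of fossil (14C-free) organic carbon in a bulk sample,
          assuming a two-end-member mix of modern and 14C-free carbon."""
          return 1.0 - pmc_sample / pmc_modern

      print(fossil_fraction(35.0))  # PMC = 35 -> 65% fossil organic carbon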

  6. Systematic development of a theory-informed multifaceted behavioural intervention to increase physical activity of adults with type 2 diabetes in routine primary care: Movement as Medicine for Type 2 Diabetes.

    PubMed

    Avery, Leah; Charman, Sarah J; Taylor, Louise; Flynn, Darren; Mosely, Kylie; Speight, Jane; Lievesley, Matthew; Taylor, Roy; Sniehotta, Falko F; Trenell, Michael I

    2016-07-19

    Despite substantial evidence for physical activity (PA) as a management option for type 2 diabetes, there remains a lack of PA behavioural interventions suitable for delivery in primary care. This paper describes the systematic development of an evidence-informed PA behavioural intervention for use during routine primary care consultations. In accordance with the Medical Research Council Framework for the Development and Evaluation of Complex Interventions, a four-stage systematic development process was undertaken: (1) exploratory work involving interviews and workshop discussions identified training needs of healthcare professionals and support needs of adults with type 2 diabetes; (2) a systematic review with meta- and moderator analyses identified behaviour change techniques and optimal intervention intensity and duration; (3) usability testing identified strategies to increase implementation of the intervention in primary care and (4) an open pilot study in two primary care practices facilitated intervention optimisation. Healthcare professional training needs included knowledge about type, intensity and duration of PA sufficient to improve glycaemic control and acquisition of skills to promote PA behaviour change. Patients lacked knowledge about type 2 diabetes and skills to enable them to make sustainable changes to their level of PA. An accredited online training programme for healthcare professionals and a professional-delivered behavioural intervention for adults with type 2 diabetes were subsequently developed. This multifaceted intervention was informed by the theory of planned behaviour and social cognitive theory and consisted of 15 behaviour change techniques. Intervention intensity and duration were informed by a systematic review. Usability testing resolved technical problems with the online training intervention that facilitated use on practice IT systems. An open pilot study of the intervention with fidelity of delivery assessment informed

  7. Standardless quantification by parameter optimization in electron probe microanalysis

    NASA Astrophysics Data System (ADS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-11-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists of minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively.
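
    A minimal sketch of the quantification-by-spectrum-fitting idea, with a single Gaussian peak on a linear background standing in for the full physical model (characteristic lines, bremsstrahlung, detector effects) that POEMA actually optimizes:

      import numpy as np
      from scipy.optimize import least_squares

      def model(params, e):
          amp, center, width, b0, b1 = params
          return amp * np.exp(-0.5 * ((e - center) / width) ** 2) + b0 + b1 * e

      e = np.linspace(0.0, 10.0, 500)                  # energy axis (keV)
      true = model([120.0, 5.9, 0.08, 10.0, -0.5], e)  # synthetic "measurement"
      rng = np.random.default_rng(1)
      measured = rng.poisson(np.clip(true, 0, None)).astype(float)

      def residuals(params):
          # Quadratic differences between model and experiment are minimized
          return model(params, e) - measured

      fit = least_squares(residuals, x0=[100.0, 5.8, 0.1, 5.0, 0.0])
      print("fitted peak area ~", fit.x[0] * fit.x[2] * np.sqrt(2 * np.pi))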

  8. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases with different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.
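
    A hedged sketch of the GA-plus-kernel-machine idea: scikit-learn has no LS-SVM, so epsilon-SVR stands in for it here, and a deliberately minimal GA (truncation selection plus Gaussian mutation) tunes C and gamma by cross-validated error. The three synthetic features stand in for normalized amplitude, phase change, and correlation coefficient; all data are invented.

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      X = rng.uniform(0, 1, size=(80, 3))            # three damage-sensitive features
      y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.05, 80)

      def fitness(log_c, log_gamma):
          model = SVR(C=10 ** log_c, gamma=10 ** log_gamma)
          return cross_val_score(model, X, y, cv=5,
                                 scoring="neg_mean_squared_error").mean()

      pop = rng.uniform(-2, 3, size=(20, 2))         # individuals: (log10 C, log10 gamma)
      for generation in range(15):
          scores = np.array([fitness(c, g) for c, g in pop])
          parents = pop[np.argsort(scores)[::-1][:10]]          # keep the best half
          children = parents[rng.integers(0, 10, 10)] \
                     + rng.normal(0, 0.3, (10, 2))              # mutate copies
          pop = np.vstack([parents, children])

      best = pop[np.argmax([fitness(c, g) for c, g in pop])]
      print("best (log10 C, log10 gamma):", best)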

  9. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    PubMed Central

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-01-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases with different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003

  10. Consistency of flow quantifications in tridirectional phase-contrast MRI

    NASA Astrophysics Data System (ADS)

    Unterhinninghofen, R.; Ley, S.; Dillmann, R.

    2009-02-01

    Tridirectionally encoded phase-contrast MRI is a technique to non-invasively acquire time-resolved velocity vector fields of blood flow. These may not only be used to analyze pathological flow patterns, but also to quantify flow at arbitrary positions within the acquired volume. In this paper we examine the validity of this approach by analyzing the consistency of related quantifications instead of comparing them with an external reference measurement. Datasets of the thoracic aorta were acquired from 6 pigs, 1 healthy volunteer and 3 patients with artificial aortic valves. Using in-house software, an elliptical flow quantification plane was manually placed at 6 positions along the descending aorta and rotated to 5 different angles at each position. For each configuration, flow was computed from the original data and from data corrected for phase offsets. The results reveal that quantifications depend more on changes in position than on changes in angle. Phase offset correction considerably reduces this dependency. Overall consistency is good, with a maximum coefficient of variation of 9.9% and a mean coefficient of variation of 7.2%.
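
    The consistency metric here is the coefficient of variation of repeated flow measurements; a minimal computation with illustrative values:

      import numpy as np

      # Flow quantified for the same vessel at several plane positions/angles
      # should agree; the coefficient of variation (std/mean) summarizes the
      # spread. Values below are invented for illustration.
      flows = np.array([78.1, 75.4, 80.2, 77.0, 74.8, 79.5])  # ml/s at 6 positions

      cv = flows.std(ddof=1) / flows.mean()
      print(f"coefficient of variation: {100 * cv:.1f}%")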

  11. Theory, the Final Frontier? A Corpus-Based Analysis of the Role of Theory in Psychological Articles.

    PubMed

    Beller, Sieghard; Bender, Andrea

    2017-01-01

    Contemporary psychology regards itself as an empirical science, at least in most of its subfields. Theory building and development are often considered critical to the sciences, but the extent to which psychology can be cast in this way is under debate. According to those advocating a strong role of theory, studies should be designed to test hypotheses derived from theories (theory-driven) and ideally should yield findings that stimulate hypothesis formation and theory building (theory-generating). The alternative position values empirical findings over theories as the lasting legacy of science. To investigate which role theory actually plays in current research practice, we analyse references to theory in the complete set of 2,046 articles accepted for publication in Frontiers in Psychology in 2015. This sample of articles, while not representative in the strictest sense, covers a broad range of sub-disciplines, both basic and applied, and a broad range of article types, including research articles, reviews, hypothesis & theory, and commentaries. For the titles, keyword lists, and abstracts in this sample, we conducted a text search for terms related to empiricism and theory, assessed the frequency and scope of usage for six theory-related terms, and analyzed their distribution over different article types and subsections of the journal. The results indicate substantially lower frequencies of theoretical than empirical terms, with references to a specific (named) theory in less than 10% of the sample and references to any of even the most frequently mentioned theories in less than 0.5% of the sample. In conclusion, we discuss possible limitations of our study and the prospect of theoretical advancement.

  12. Theory, the Final Frontier? A Corpus-Based Analysis of the Role of Theory in Psychological Articles

    PubMed Central

    Beller, Sieghard; Bender, Andrea

    2017-01-01

    Contemporary psychology regards itself as an empirical science, at least in most of its subfields. Theory building and development are often considered critical to the sciences, but the extent to which psychology can be cast in this way is under debate. According to those advocating a strong role of theory, studies should be designed to test hypotheses derived from theories (theory-driven) and ideally should yield findings that stimulate hypothesis formation and theory building (theory-generating). The alternative position values empirical findings over theories as the lasting legacy of science. To investigate which role theory actually plays in current research practice, we analyse references to theory in the complete set of 2,046 articles accepted for publication in Frontiers in Psychology in 2015. This sample of articles, while not representative in the strictest sense, covers a broad range of sub-disciplines, both basic and applied, and a broad range of article types, including research articles, reviews, hypothesis & theory, and commentaries. For the titles, keyword lists, and abstracts in this sample, we conducted a text search for terms related to empiricism and theory, assessed the frequency and scope of usage for six theory-related terms, and analyzed their distribution over different article types and subsections of the journal. The results indicate substantially lower frequencies of theoretical than empirical terms, with references to a specific (named) theory in less than 10% of the sample and references to any of even the most frequently mentioned theories in less than 0.5% of the sample. In conclusion, we discuss possible limitations of our study and the prospect of theoretical advancement. PMID:28642728

  13. The Development of C. A. McMurry's Type Study: Emergence of a Unit Development Theory Embedding Teacher Training

    ERIC Educational Resources Information Center

    Fujimoto, Kazuhisa

    2014-01-01

    From the historical viewpoint, is it possible that curriculum and teacher education could have been integrated at the beginning of the era of curriculum studies? This paper focuses on the development of type study in the 1910s by C. A. McMurry (1857-1929) as a pioneering curriculum theory surveying the scope of teacher education. McMurry was a key…

  14. Theory of mind in schizophrenia: error types and associations with symptoms.

    PubMed

    Fretland, Ragnhild A; Andersson, Stein; Sundet, Kjetil; Andreassen, Ole A; Melle, Ingrid; Vaskinn, Anja

    2015-03-01

    Social cognition is an important determinant of functioning in schizophrenia. However, how social cognition relates to the clinical symptoms of schizophrenia is still unclear. The aim of this study was to explore the relationship between a social cognition domain, Theory of Mind (ToM), and the clinical symptoms of schizophrenia. Specifically, we investigated the associations between three ToM error types, 1) "overmentalizing", 2) "reduced ToM" and 3) "no ToM", and positive, negative and disorganized symptoms. Fifty-two participants with a diagnosis of schizophrenia or schizoaffective disorder were assessed with the Movie for the Assessment of Social Cognition (MASC), a video-based ToM measure. An empirically validated five-factor model of the Positive and Negative Syndrome Scale (PANSS) was used to assess clinical symptoms. There was a significant, small-to-moderate association between overmentalizing and positive symptoms (rho=.28, p=.04). Disorganized symptoms correlated at a trend level with "reduced ToM" (rho=.27, p=.05). There were no other significant correlations between ToM impairments and symptom levels. Positive/disorganized symptoms did not contribute significantly to explaining total ToM performance, whereas IQ did (B=.37, p=.01). Within the undermentalizing domain, participants made more "reduced ToM" errors than "no ToM" errors. Overmentalizing was associated with positive symptoms. The undermentalizing error types were unrelated to symptoms, although "reduced ToM" was somewhat associated with disorganization. The higher number of "reduced ToM" responses suggests that schizophrenia is characterized by accuracy problems rather than a fundamental lack of mental state concepts. The findings call for the use of more sensitive measures when investigating ToM in schizophrenia to avoid the "right/wrong ToM" dichotomy. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    EPA Science Inventory

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...
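
    For reference, the standard-curve arithmetic that absolute quantification relies on: Ct is linear in log10 of the starting copy number, and the slope also yields the amplification efficiency. Calibrator values below are illustrative:

      import numpy as np

      log_conc = np.log10([1e6, 1e5, 1e4, 1e3, 1e2])  # copies/reaction
      ct = np.array([15.1, 18.4, 21.8, 25.2, 28.6])   # measured Ct values

      slope, intercept = np.polyfit(log_conc, ct, 1)
      efficiency = 10 ** (-1.0 / slope) - 1.0         # ~1.0 means 100% efficiency

      def copies_from_ct(sample_ct):
          """Back-calculate starting copies from a sample Ct via the curve."""
          return 10 ** ((sample_ct - intercept) / slope)

      print(f"efficiency={100 * efficiency:.0f}%, "
            f"unknown ~ {copies_from_ct(23.0):.0f} copies")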

  16. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    PubMed

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system extracts and segments the renal tissue structures based on colour information and structural assumptions about the tissue structures. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A ground truth image dataset was manually prepared in consultation with an experienced pathologist for the validation of the segmentation algorithms. Experiments involving experienced pathologists demonstrated a good correlation in quantification results between the automated system and the pathologists' visual evaluation. Experiments investigating the variability among pathologists also showed the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification.
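
    After segmentation, the reported quantity reduces to an area ratio. A minimal sketch with an invented label convention:

      import numpy as np

      # Label codes below are assumptions for illustration only.
      BIOPSY, FIBROSIS = 1, 2
      seg = np.zeros((512, 512), dtype=np.uint8)
      seg[50:450, 50:450] = BIOPSY        # biopsy tissue region
      seg[100:250, 100:300] = FIBROSIS    # region classified as fibrosis

      biopsy_area = np.count_nonzero(seg > 0)       # fibrosis pixels count as biopsy too
      fibrosis_area = np.count_nonzero(seg == FIBROSIS)
      print(f"interstitial fibrosis: {100 * fibrosis_area / biopsy_area:.1f}%")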

  17. On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification

    PubMed Central

    Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.

    2014-01-01

    Purpose: To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods: The proton resonance frequency of water, unlike that of triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results: In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion: Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large fat quantification errors in phantom and ex vivo acquisitions. PMID:24123362
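
    A back-of-the-envelope illustration of the effect, assuming the commonly quoted water PRF temperature coefficient of about -0.01 ppm/°C and a fat-water shift of about -3.4 ppm at body temperature (both textbook approximations, not the paper's calibrated model):

      GAMMA_HZ_PER_T = 42.58e6   # proton gyromagnetic ratio (Hz/T)
      PRF_PPM_PER_C = -0.01      # approximate water PRF temperature coefficient

      def fat_water_shift_hz(b0_tesla, temp_c, ref_temp_c=37.0,
                             shift_ppm_at_ref=-3.4):
          """Apparent fat-water shift (Hz) after a temperature-induced
          shift of the water resonance; fat is assumed temperature independent."""
          ppm = shift_ppm_at_ref - PRF_PPM_PER_C * (temp_c - ref_temp_c)
          return ppm * 1e-6 * GAMMA_HZ_PER_T * b0_tesla

      print(fat_water_shift_hz(1.5, 37.0))  # ~ -217 Hz at body temperature
      print(fat_water_shift_hz(1.5, 10.0))  # larger offset in a cold ex vivo specimen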

  18. Constructor theory of probability

    PubMed Central

    2016-01-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914

  19. Inference and quantification of peptidoforms in large sample cohorts by SWATH-MS

    PubMed Central

    Röst, Hannes L; Ludwig, Christina; Buil, Alfonso; Bensimon, Ariel; Soste, Martin; Spector, Tim D; Dermitzakis, Emmanouil T; Collins, Ben C; Malmström, Lars; Aebersold, Ruedi

    2017-01-01

    The consistent detection and quantification of protein post-translational modifications (PTMs) across sample cohorts is an essential prerequisite for the functional analysis of biological processes. Data-independent acquisition (DIA), a bottom-up mass spectrometry-based proteomic strategy exemplified by SWATH-MS, provides complete precursor and fragment ion information for a sample and thus, in principle, the information needed to identify peptidoforms, the modified variants of a peptide. However, due to the convoluted structure of DIA data sets, the confident and systematic identification and quantification of peptidoforms has remained challenging. Here we present IPF (Inference of PeptidoForms), a fully automated algorithm that uses spectral libraries to query, validate and quantify peptidoforms in DIA data sets. The method was developed on data acquired by SWATH-MS and benchmarked using a synthetic phosphopeptide reference data set and phosphopeptide-enriched samples. The data indicate that IPF reduced false site-localization by more than 7-fold in comparison to previous approaches, while recovering 85.4% of the true signals. IPF was applied to detect and quantify peptidoforms carrying ten different types of PTMs in DIA data acquired from more than 200 samples of undepleted blood plasma of a human twin cohort. The data apportioned, for the first time, the contribution of heritable, environmental and longitudinal effects on the observed quantitative variability of specific modifications in blood plasma of a human population. PMID:28604659

  20. Developing a Model of Theory-to-Practice-to-Theory in Student Affairs: An Extended Case Analysis of Theories of Student Learning and Development

    ERIC Educational Resources Information Center

    Kimball, Ezekiel W.

    2012-01-01

    Recent literature suggests a problematic connection between theory and practice in higher education scholarship generally and the study of student learning and development specifically (e.g. Bensimon, 2007; Kezar, 2000; Love, 2012). Much of this disconnect stems from a lack of differentiation between various types of theory used in student affairs…

  1. Quantification of taurine in energy drinks using ¹H NMR.

    PubMed

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

    The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is associated with a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by (1)H NMR as an alternative to existing methods of quantification. Varying the pH revealed a well-separated taurine signal in (1)H NMR spectra, which was used for integration and quantification. Quantification was performed using external calibration (R(2)>0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed using both (1)H NMR and LC-UV/vis. The deviation between (1)H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level), and at worst 10.4%. Due to the close agreement with LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), (1)H NMR presents a suitable method to quantify taurine in energy drinks. Copyright © 2013 Elsevier B.V. All rights reserved.
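
    A minimal sketch of the two quantification routes the abstract mentions, with invented numbers: a least-squares external calibration line, and a PULCON-style single-point ratio, which reduces to simple proportionality when acquisition parameters match between reference and sample:

      import numpy as np

      std_conc = np.array([0.5, 1.0, 2.0, 4.0])          # g/L taurine standards
      std_integral = np.array([0.51, 1.02, 2.01, 4.05])  # normalized signal integrals

      # Route 1: external calibration line
      slope, intercept = np.polyfit(std_conc, std_integral, 1)
      sample_integral = 3.10
      print("external calibration:", (sample_integral - intercept) / slope, "g/L")

      # Route 2: PULCON-style ratio against one external reference, assuming
      # identical acquisition settings (pulse calibration, gain, etc.)
      ref_conc, ref_integral = 2.0, 2.01
      print("single-point ratio  :", ref_conc * sample_integral / ref_integral, "g/L")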

  2. AdS/CFT in string theory and M-theory

    NASA Astrophysics Data System (ADS)

    Gulotta, Daniel R.

    The AdS/CFT correspondence is a powerful tool that can help shed light on the relationship between geometry and field theory. The first part of this thesis will focus on the construction of theories dual to Type IIB string theory on AdS5 × Y5, where Y5 is a toric Sasaki-Einstein manifold. This thesis will introduce a consistency condition called "proper ordering" and demonstrate that it is equivalent to several other previously known consistency conditions. It will then give an efficient algorithm that produces a consistent field theory for any toric Sasaki-Einstein Y5. The second part of this thesis will examine the large-N limit of the Kapustin-Willett-Yaakov matrix model. This model computes the S3 partition function for a CFT dual to M-theory on AdS4 × Y7. One of the main results will be a formula that relates the distribution of eigenvalues in the matrix model to the distribution of holomorphic operators on the cone over Y7. A variety of examples are given to support this formula.

  3. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Final Technical Report: Mathematical Foundations for Uncertainty Quantification in Materials Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plechac, Petr; Vlachos, Dionisios G.

    We developed path-wise information theory-based and goal-oriented sensitivity analysis and parameter identification methods for complex high-dimensional dynamics, in particular of non-equilibrium extended molecular systems. The combination of these novel methodologies provided the first methods in the literature capable of handling UQ questions for stochastic complex systems with some or all of the following features: (a) multi-scale stochastic models such as (bio)chemical reaction networks with a very large number of parameters; (b) spatially distributed systems such as kinetic Monte Carlo or Langevin dynamics; (c) non-equilibrium processes typically associated with coupled physico-chemical mechanisms, driven boundary conditions, hybrid micro-macro systems, etc. A particular computational challenge arises in simulations of multi-scale reaction networks and molecular systems. Mathematical techniques were applied to in silico prediction of novel materials with emphasis on the effect of microstructure on model uncertainty quantification (UQ). We outline acceleration methods to make calculations of real chemistry feasible, followed by two complementary tasks on structure optimization and microstructure-induced UQ.

  5. Developmental Systems of Students' Personal Theories about Education

    ERIC Educational Resources Information Center

    Barger, Michael M.; Linnenbrink-Garcia, Lisa

    2017-01-01

    Children hold many personal theories about education: theories about themselves, knowledge, and the learning process. Personal theories help children predict what their actions will cause, and therefore relate to motivation, self-regulation, and achievement. Researchers typically examine how specific types of personal theories develop…

  6. Path-Goal Theory of Leadership

    DTIC Science & Technology

    1975-04-01

    PATH-GOAL THEORY OF LEADERSHIP. Robert J. House, et al., Washington University. Report AD-A009 513, prepared for the Office of Naval Research, April 1975. Keywords: Path-Goal Theory; Contingency Factors; Leadership Style. The paper reviews the path-goal

  7. Computer Model Inversion and Uncertainty Quantification in the Geosciences

    NASA Astrophysics Data System (ADS)

    White, Jeremy T.

    The subject of this dissertation is the use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for

  8. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part in molecular cloning, as well as in protein expression and purification. Parallel quantifications in yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable for single samples and with special features for protein expression screens. As major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented for the C-based macro-language of the widespread integrated development environment of IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. A theory of solar type 3 radio bursts

    NASA Technical Reports Server (NTRS)

    Goldstein, M. L.; Papadopoulos, K.; Smith, R. A.

    1979-01-01

    Energetic electrons propagating through the interplanetary medium are shown to excite the one dimensional oscillating two stream instability (OTSI). The OTSI is in turn stabilized by anomalous resistivity which completes the transfer of long wavelength Langmuir waves to short wavelengths, out of resonance with the electrons. The theory explains the small energy losses suffered by the electrons in propagating to 1 AU, the predominance of second harmonic radiation, and the observed correlation between radio and electron fluxes.

  10. Band-gap corrected density functional theory calculations for InAs/GaSb type II superlattices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianwei; Zhang, Yong

    2014-12-07

    We performed pseudopotential-based density functional theory (DFT) calculations for GaSb/InAs type II superlattices (T2SLs), with bandgap errors from the local density approximation mitigated by applying an empirical method to correct the bulk bandgaps. Specifically, this work (1) compared the calculated bandgaps with experimental data and non-self-consistent atomistic methods; (2) calculated the T2SL band structures with varying structural parameters; (3) investigated the interfacial effects associated with the no-common-atom heterostructure; and (4) studied the strain effect due to lattice mismatch between the two components. This work demonstrates the feasibility of applying the DFT method to more exotic heterostructures and defect problems related to this material system.

  11. On B-type Open-Closed Landau-Ginzburg Theories Defined on Calabi-Yau Stein Manifolds

    NASA Astrophysics Data System (ADS)

    Babalic, Elena Mirela; Doryn, Dmitry; Lazaroiu, Calin Iuliu; Tavakol, Mehdi

    2018-05-01

    We consider the bulk algebra and topological D-brane category arising from the differential model of the open-closed B-type topological Landau-Ginzburg theory defined by a pair (X,W), where X is a non-compact Calabi-Yau manifold and W is a complex-valued holomorphic function. When X is a Stein manifold (but not restricted to be a domain of holomorphy), we extract equivalent descriptions of the bulk algebra and of the category of topological D-branes which are constructed using only the analytic space associated to X. In particular, we show that the D-brane category is described by projective factorizations defined over the ring of holomorphic functions of X. We also discuss simplifications of the analytic models which arise when X is holomorphically parallelizable and illustrate these in a few classes of examples.

  12. Integrative analysis with ChIP-seq advances the limits of transcript quantification from RNA-seq.

    PubMed

    Liu, Peng; Sanalkumar, Rajendran; Bresnick, Emery H; Keleş, Sündüz; Dewey, Colin N

    2016-08-01

    RNA-seq is currently the technology of choice for global measurement of transcript abundances in cells. Despite its successes, isoform-level quantification remains difficult because short RNA-seq reads are often compatible with multiple alternatively spliced isoforms. Existing methods rely heavily on uniquely mapping reads, which are not available for numerous isoforms that lack regions of unique sequence. To improve quantification accuracy in such difficult cases, we developed a novel computational method, prior-enhanced RSEM (pRSEM), which uses a complementary data type in addition to RNA-seq data. We found that ChIP-seq data of RNA polymerase II and histone modifications were particularly informative in this approach. In qRT-PCR validations, pRSEM was shown to be superior to competing methods in estimating relative isoform abundances within or across conditions. Data-driven simulations suggested that pRSEM has a greatly decreased false-positive rate at the expense of a small increase in false-negative rate. In aggregate, our study demonstrates that pRSEM transforms existing capacity to precisely estimate transcript abundances, especially at the isoform level. © 2016 Liu et al.; Published by Cold Spring Harbor Laboratory Press.

  13. Multiscale recurrence quantification analysis of order recurrence plots

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian; Lin, Aijing

    2017-03-01

    In this paper, we propose a new method, multiscale recurrence quantification analysis (MSRQA), to analyze the structure of order recurrence plots. MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), MSRQA reveals richer and more recognizable information on the local characteristics of diverse systems, successfully describing their recurrence properties. Both synthetic series and stock market indexes exhibit recurrence properties at large time scales that differ markedly from those at a single time scale. Some systems present clearer recurrence patterns at large time scales. This demonstrates that the new approach is effective for distinguishing three similar stock market systems and for revealing some of their inherent differences.
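
    A hedged sketch of the multiscale ingredient: coarse-grain the series at each scale and recompute a recurrence measure. For brevity this uses plain threshold recurrences and the recurrence rate, whereas MSRQA proper is built on order patterns (order recurrence plots):

      import numpy as np

      def recurrence_rate(x, eps):
          # Fraction of point pairs closer than eps (includes the diagonal)
          dist = np.abs(x[:, None] - x[None, :])
          return (dist < eps).mean()

      def multiscale_rr(x, scales, eps):
          out = []
          for s in scales:
              n = len(x) // s
              coarse = x[: n * s].reshape(n, s).mean(axis=1)  # non-overlapping means
              out.append(recurrence_rate(coarse, eps))
          return out

      rng = np.random.default_rng(3)
      series = np.sin(0.2 * np.arange(1000)) + 0.3 * rng.normal(size=1000)
      print(multiscale_rr(series, scales=[1, 2, 5, 10], eps=0.2))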

  14. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics

    PubMed Central

    Möller, Birgit; Poeschl, Yvonne; Plötner, Romina; Bürstenbinder, Katharina

    2017-01-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. PMID:28931626

  15. [Quantification of pulmonary emphysema in multislice-CT using different software tools].

    PubMed

    Heussel, C P; Achenbach, T; Buschsieweke, C; Kuhnigk, J; Weinheimer, O; Hammer, G; Düber, C; Kauczor, H-U

    2006-10-01

    Thin-section MSCT datasets of the lung, comprising approximately 300 images, are difficult to evaluate manually. A computer-assisted pre-diagnosis can help with reporting. Furthermore, post-processing techniques, for instance for the quantification of emphysema on the basis of three-dimensional anatomical information, might be improved and the workflow further automated. The results of 4 programs (Pulmo, Volume, YACTA and PulmoFUNC) for the quantitative analysis of emphysema (lung and emphysema volume, mean lung density and emphysema index) were compared on 30 consecutive thin-section MSCT datasets covering different emphysema severity levels. The classification result of the YACTA program for different types of emphysema was also analyzed. Pulmo and Volume have median operating times of 105 and 59 minutes, respectively, owing to the need for extensive manual correction of the lung segmentation. The largely automated programs PulmoFUNC and YACTA have median runtimes of 26 and 16 minutes, respectively. The evaluation with Pulmo and Volume of 2 of the datasets resulted in implausible values, and PulmoFUNC crashed reproducibly with 2 other datasets. Only with YACTA could all image datasets be evaluated. The lung volume, emphysema volume, emphysema index and mean lung density determined by YACTA and PulmoFUNC are significantly larger than the corresponding values from Volume and Pulmo (differences: Volume: 119 cm(3)/65 cm(3)/1 %/17 HU, Pulmo: 60 cm(3)/96 cm(3)/1 %/37 HU). Classification of the emphysema type agreed with that of the radiologist in 26 panlobular, 22 paraseptal and 15 centrilobular emphysema cases. The substantial time required hampers the use of quantitative emphysema analysis in the clinical routine. The results of YACTA and PulmoFUNC are affected by the dedicated exclusion of the tracheobronchial system. These fully automatic tools enable not only fast

  16. Quantification of collagen contraction in three-dimensional cell culture.

    PubMed

    Kopanska, Katarzyna S; Bussonnier, Matthias; Geraldo, Sara; Simon, Anthony; Vignjevic, Danijela; Betz, Timo

    2015-01-01

    Many different cell types, including fibroblasts, smooth muscle cells, endothelial cells, and cancer cells, exert traction forces on the fibrous components of the extracellular matrix. This can be observed as matrix contraction, both macro- and microscopically, in three-dimensional (3D) tissue models such as collagen type I gels. The quantification of local contraction at the micron scale, including its directionality and speed, in correlation with other parameters such as cell invasion or local protein or gene expression, can provide useful information for studying wound healing, organism development, and cancer metastasis. In this article, we present a set of tools to quantify the flow dynamics of collagen contraction induced by cells migrating out of a multicellular cancer spheroid into a 3D collagen matrix. We adapted a pseudo-speckle technique that can be applied to bright-field and fluorescent microscopy time series. The image analysis presented here is based on in-house software developed in the Matlab (MathWorks) programming environment. The analysis program is freely available from GitHub following the link: http://dx.doi.org/10.5281/zenodo.10116. This tool provides an automated technique to measure collagen contraction that can be utilized in different 3D cellular systems. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Stable-isotope-labeled Histone Peptide Library for Histone Post-translational Modification and Variant Quantification by Mass Spectrometry *

    PubMed Central

    Lin, Shu; Wein, Samuel; Gonzales-Cope, Michelle; Otte, Gabriel L.; Yuan, Zuo-Fei; Afjehi-Sadat, Leila; Maile, Tobias; Berger, Shelley L.; Rush, John; Lill, Jennie R.; Arnott, David; Garcia, Benjamin A.

    2014-01-01

    To facilitate accurate histone variant and post-translational modification (PTM) quantification via mass spectrometry, we present a library of 93 synthetic peptides using Protein-Aqua™ technology. The library contains 55 peptides representing different modified forms from histone H3 peptides, 23 peptides representing H4 peptides, 5 peptides representing canonical H2A peptides, 8 peptides representing H2A.Z peptides, and peptides for both macroH2A and H2A.X. The PTMs on these peptides include lysine mono- (me1), di- (me2), and tri-methylation (me3); lysine acetylation; arginine me1; serine/threonine phosphorylation; and N-terminal acetylation. The library was subjected to chemical derivatization with propionic anhydride, a widely employed protocol for histone peptide quantification. Subsequently, the detection efficiencies were quantified using mass spectrometry extracted ion chromatograms. The library yields a wide spectrum of detection efficiencies, with more than 1700-fold difference between the peptides with the lowest and highest efficiencies. In this paper, we describe the impact of different modifications on peptide detection efficiencies and provide a resource to correct for detection biases among the 93 histone peptides. In brief, there is no correlation between detection efficiency and molecular weight, hydrophobicity, basicity, or modification type. The same types of modifications may have very different effects on detection efficiencies depending on their positions within a peptide. We also observed antagonistic effects between modifications. In a study of mouse trophoblast stem cells, we utilized the detection efficiencies of the peptide library to correct for histone PTM/variant quantification. For most histone peptides examined, the corrected data did not change the biological conclusions but did alter the relative abundance of these peptides. For a low-abundant histone H2A variant, macroH2A, the corrected data led to a different conclusion than the
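
    Using such a library boils down to dividing each observed signal by its peptide-specific detection efficiency before comparing abundances. A minimal sketch with invented numbers:

      import numpy as np

      raw_signal = np.array([1.0e6, 4.0e5, 2.5e5])  # XIC areas for three peptidoforms
      efficiency = np.array([1.00, 0.20, 0.50])     # relative detection efficiencies

      corrected = raw_signal / efficiency           # undo detection bias
      rel_abundance = corrected / corrected.sum()   # renormalize to fractions
      print(rel_abundance)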

  18. Theory-based self-management educational interventions on patients with type 2 diabetes: a systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Zhao, Fang-Fang; Suhonen, Riitta; Koskinen, Sanna; Leino-Kilpi, Helena

    2017-04-01

    To synthesize the effects of theory-based self-management educational interventions on patients with type 2 diabetes (T2DM) in randomized controlled trials. Type 2 diabetes is a common chronic disease causing complications that put a heavy burden on society and reduce the quality of life of patients. Good self-management of diabetes can prevent complications and improve the quality of life of T2DM patients. Systematic review with meta-analysis of randomized controlled trials following Cochrane methods. A literature search was carried out in the MEDLINE, EMBASE, CINAHL, PSYCINFO, and Web of Science databases (1980-April 2015). The risk of bias of the eligible studies was assessed independently by two authors using the Cochrane Collaboration's tool. Publication bias of the main outcomes was examined. Statistical heterogeneity was assessed, and a random-effects model was used for meta-analysis. Twenty studies with 5802 participants met the inclusion criteria. The interventions in the studies were based on one or more theories, mostly mid-range theories. The main outcomes pooled under a random-effects model showed significant improvements in HbA1c, self-efficacy, and diabetes knowledge, but not in BMI. As for quality of life, no conclusions can be drawn, as the pooled outcome reversed direction, with reduced heterogeneity, after one study was excluded. No significant publication bias was found in the main outcomes. For theory-based interventions to produce larger effects, patients should be more actively involved and the education team should be trained beyond the primary preparation for the self-management education program. © 2016 John Wiley & Sons Ltd.
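
    For readers unfamiliar with the pooling step, a minimal DerSimonian-Laird random-effects computation (the standard approach behind such pooled outcomes; all effect sizes and variances below are invented):

      import numpy as np

      effects = np.array([-0.45, -0.30, -0.60, -0.10, -0.35])   # per-study effects
      variances = np.array([0.020, 0.015, 0.040, 0.010, 0.025]) # their variances

      w = 1.0 / variances                          # fixed-effect weights
      fixed = (w * effects).sum() / w.sum()
      q = (w * (effects - fixed) ** 2).sum()       # Cochran's Q (heterogeneity)
      df = len(effects) - 1
      tau2 = max(0.0, (q - df) / (w.sum() - (w ** 2).sum() / w.sum()))

      w_star = 1.0 / (variances + tau2)            # random-effects weights
      pooled = (w_star * effects).sum() / w_star.sum()
      se = np.sqrt(1.0 / w_star.sum())
      print(f"pooled effect {pooled:.2f} "
            f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f}), "
            f"tau^2={tau2:.3f}")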

  19. Definition of a new thermal contrast and pulse correction for defect quantification in pulsed thermography

    NASA Astrophysics Data System (ADS)

    Benítez, Hernán D.; Ibarra-Castanedo, Clemente; Bendada, AbdelHakim; Maldague, Xavier; Loaiza, Humberto; Caicedo, Eduardo

    2008-01-01

    It is well known that thermographic non-destructive testing methods based on thermal contrast are strongly affected by non-uniform heating of the surface. Hence, the results obtained from these methods depend considerably on the chosen reference point. The differential absolute contrast (DAC) method was developed to eliminate the need to determine a reference point, by defining the thermal contrast with respect to an ideal sound area. Although very useful at early times, DAC accuracy decreases as the heat front approaches the sample's rear face. We propose a new DAC version that explicitly introduces the sample thickness using thermal quadrupole theory, and we show that the new DAC's range of validity extends to long times while preserving its validity at short times. This new contrast is used for defect quantification in composite, Plexiglas™ and aluminum samples.
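
    For context, a sketch of the classical DAC that the paper improves on: using the 1D semi-infinite solution, the sound-area response is extrapolated from a single early time t', giving DAC(t) = T(t) - sqrt(t'/t)·T(t'). The toy signal below is illustrative, and the thickness-corrected quadrupole version from the paper is not reproduced here:

      import numpy as np

      def dac(temperature, times, t_prime_index):
          """Classical DAC: subtract the sound-area response extrapolated
          from the early reference time t' via the semi-infinite model."""
          tp = times[t_prime_index]
          return temperature - np.sqrt(tp / times) * temperature[t_prime_index]

      times = np.linspace(0.05, 5.0, 100)      # s after the flash
      signal = 1.0 / np.sqrt(np.pi * times)    # toy semi-infinite cooling curve
      signal[40:] += 0.05                      # late deviation mimicking a defect

      contrast = dac(signal, times, t_prime_index=5)
      print(contrast[:3], contrast[-1])        # ~0 early (sound), ~0.05 late (defect)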

  20. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    PubMed

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

    Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the combination of selective extraction of the xanthophylls with analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.

  1. Model Uncertainty Quantification Methods In Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  2. Bayesian Methods for Effective Field Theories

    NASA Astrophysics Data System (ADS)

    Wesolowski, Sarah

    Microscopic predictions of the properties of atomic nuclei have reached a high level of precision in the past decade. This progress mandates improved uncertainty quantification (UQ) for a robust comparison of experiment with theory. With the uncertainty from many-body methods under control, calculations are now sensitive to the input inter-nucleon interactions. These interactions include parameters that must be fit to experiment, inducing both uncertainty from the fit and from missing physics in the operator structure of the Hamiltonian. Furthermore, the implementation of the inter-nucleon interactions is not unique, which presents the additional problem of assessing results using different interactions. Effective field theories (EFTs) take advantage of a separation of high- and low-energy scales in the problem to form a power-counting scheme that allows the organization of terms in the Hamiltonian based on their expected contribution to observable predictions. This scheme gives a natural framework for quantification of uncertainty due to missing physics. The free parameters of the EFT, called the low-energy constants (LECs), must be fit to data, but in a properly constructed EFT these constants will be natural-sized, i.e., of order unity. The constraints provided by the EFT, namely the size of the systematic uncertainty from truncation of the theory and the natural size of the LECs, are assumed information even before a calculation is performed or a fit is done. Bayesian statistical methods provide a framework for treating uncertainties that naturally incorporates prior information and puts stochastic and systematic uncertainties on an equal footing. For EFT UQ, Bayesian methods allow the relevant EFT properties to be incorporated quantitatively as prior probability distribution functions (pdfs). Following the logic of probability theory, observable quantities and underlying physical parameters such as the EFT breakdown scale may be expressed as pdfs that

  3. Real-time quantitative PCR for retrovirus-like particle quantification in CHO cell culture.

    PubMed

    de Wit, C; Fautz, C; Xu, Y

    2000-09-01

    Chinese hamster ovary (CHO) cells have been widely used to manufacture recombinant proteins intended for human therapeutic uses. Retrovirus-like particles, which are apparently defective and non-infectious, have been detected in all CHO cells by electron microscopy (EM). To assure viral safety of CHO cell-derived biologicals, quantification of retrovirus-like particles in production cell culture and demonstration of sufficient elimination of such retrovirus-like particles by the downstream purification process are required for product market registration worldwide. EM, with a detection limit of 1×10⁶ particles/ml, is the standard retrovirus-like particle quantification method. The whole process, which requires a large amount of sample (3-6 litres), is labour intensive, time consuming, expensive, and subject to significant assay variability. In this paper, a novel real-time quantitative PCR assay (TaqMan assay) has been developed for the quantification of retrovirus-like particles. Each retrovirus particle contains two copies of the viral genomic particle RNA (pRNA) molecule. Therefore, quantification of retrovirus particles can be achieved by quantifying the pRNA copy number, i.e. every two copies of retroviral pRNA are equivalent to one retrovirus-like particle. The TaqMan assay takes advantage of the 5'→3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 Sequence Detection System of PE Applied Biosystems (Foster City, CA, U.S.A.) for automated pRNA quantification through a dual-labelled fluorogenic probe. The TaqMan quantification technique is highly comparable to the EM analysis. In addition, it offers significant advantages over the EM analysis, such as a higher sensitivity of less than 600 particles/ml, greater accuracy and reliability, higher sample throughput, more flexibility and lower cost. Therefore, the TaqMan assay should be used as a substitute for EM analysis for retrovirus-like particle quantification in CHO cell
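
    The particle arithmetic in this record reduces to dividing the measured pRNA copy number by two. A minimal sketch, assuming a hypothetical linear qPCR standard curve whose slope and intercept are invented for illustration:

      # Sketch of the particle arithmetic described above: pRNA copies are
      # estimated from a qPCR standard curve (Ct vs. log10 copies -- slope and
      # intercept here are hypothetical), and each retrovirus-like particle
      # carries two copies of genomic pRNA.
      def prna_copies_from_ct(ct, slope=-3.32, intercept=38.0):
          """Invert a linear standard curve: Ct = slope*log10(copies) + intercept."""
          return 10 ** ((ct - intercept) / slope)

      def particles_per_ml(ct, ml_assayed=0.001):
          copies = prna_copies_from_ct(ct)   # pRNA copies in the assayed volume
          return (copies / 2) / ml_assayed   # 2 pRNA copies per particle

      print(f"{particles_per_ml(30.0):.3g} particles/ml")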

  4. Deconfinement and the Hagedorn transition in string theory.

    PubMed

    Chaudhuri, S

    2001-03-05

    We introduce a new definition of the thermal partition function in string theory. With this new definition, the thermal partition functions of all of the string theories obey thermal duality relations with self-dual Hagedorn temperature β_H² = 4π²α′. A β → β_H²/β transformation maps the type I theory into a new string theory (type I′) with thermal Dp-branes, spatial hypersurfaces supporting a p-dimensional finite temperature non-Abelian Higgs-gauge theory for p ≤ 9. We demonstrate a continuous phase transition in the behavior of the static heavy quark-antiquark potential for small separations r_*²

  5. The Schwarzian theory — origins

    NASA Astrophysics Data System (ADS)

    Mertens, Thomas G.

    2018-05-01

    In this paper we further study the 1d Schwarzian theory, the universal low-energy limit of Sachdev-Ye-Kitaev models, using the link with 2d Liouville theory. We provide a path-integral derivation of the structural link between both theories, and study the relation between 3d gravity, 2d Jackiw-Teitelboim gravity, 2d Liouville and the 1d Schwarzian. We then generalize the Schwarzian double-scaling limit to rational models, relevant for SYK-type models with internal symmetries. We identify the holographic gauge theory as a 2d BF theory and compute correlators of the holographically dual 1d particle-on-a-group action, decomposing these into diagrammatic building blocks, in a manner very similar to the Schwarzian theory.

  6. Matrix suppression as a guideline for reliable quantification of peptides by matrix-assisted laser desorption ionization.

    PubMed

    Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo

    2013-09-17

    We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and is noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification, rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high concentration analytes has also been developed.
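
    Read as pseudocode, the guideline amounts to computing the fractional suppression of the matrix ion signal and declining to quantify past 70%. The sketch below is one paraphrase of that rule, not the authors' code; the calibration slope and ion abundances are assumed values.

      # A minimal reading of the 70% rule: matrix suppression is the fractional
      # drop in matrix ion abundance caused by the analyte, and quantification
      # via the peptide-to-matrix abundance ratio is trusted only below 70%.
      def matrix_suppression(matrix_ions_blank, matrix_ions_sample):
          return 1.0 - matrix_ions_sample / matrix_ions_blank

      def quantify(peptide_ions, matrix_ions_sample, matrix_ions_blank,
                   calib_slope):
          """Concentration from the peptide/matrix ratio, or None when the
          anomalous-suppression regime (>70%) forbids reliable quantification."""
          s = matrix_suppression(matrix_ions_blank, matrix_ions_sample)
          if s > 0.70:
              return None
          return (peptide_ions / matrix_ions_sample) / calib_slope

      print(quantify(5.0e4, 2.0e5, 4.0e5, calib_slope=2.5e3))  # suppression 0.5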

  7. Detection and quantification of beef and pork materials in meat products by duplex droplet digital PCR.

    PubMed

    Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen

    2017-01-01

    Meat products often consist of meat from multiple animal species, and adulteration and mislabeling of food products can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also identified. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products.
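
    The record does not spell out the copy-number arithmetic, but droplet digital PCR conventionally recovers concentrations from the fraction of negative droplets via Poisson partitioning statistics. Below is a generic sketch of that calculation for a duplex beef/pork assay; the droplet volume and droplet counts are assumed, not values from this study.

      # Generic ddPCR arithmetic (standard Poisson partitioning statistics):
      # the mean copies per droplet follows from the fraction of negative
      # droplets, separately for each target of a duplex assay.
      import math

      DROPLET_VOLUME_UL = 0.85e-3   # ~0.85 nL per droplet (assumed value)

      def copies_per_ul(n_total, n_negative):
          lam = -math.log(n_negative / n_total)   # Poisson: P(0) = exp(-lam)
          return lam / DROPLET_VOLUME_UL

      beef = copies_per_ul(n_total=15000, n_negative=9000)
      pork = copies_per_ul(n_total=15000, n_negative=13500)
      print(f"beef: {beef:.0f} copies/uL, pork: {pork:.0f} copies/uL")
      print(f"beef fraction of total: {beef / (beef + pork):.2%}")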

  8. Mathematical and Computational Foundations of Recurrence Quantifications

    NASA Astrophysics Data System (ADS)

    Marwan, Norbert; Webber, Charles L.

    Real-world systems possess deterministic trajectories, phase singularities and noise. Dynamic trajectories have been studied in temporal and frequency domains, but these are linear approaches. Basic to the field of nonlinear dynamics is the representation of trajectories in phase space. A variety of nonlinear tools such as the Lyapunov exponent, Kolmogorov-Sinai entropy, correlation dimension, etc. have successfully characterized trajectories in phase space, provided the systems studied were stationary in time. Ubiquitous in nature, however, are systems that are nonlinear and nonstationary, existing in noisy environments all of which are assumption breaking to otherwise powerful linear tools. What has been unfolding over the last quarter of a century, however, is the timely discovery and practical demonstration that the recurrences of system trajectories in phase space can provide important clues to the system designs from which they derive. In this chapter we will introduce the basics of recurrence plots (RP) and their quantification analysis (RQA). We will begin by summarizing the concept of phase space reconstructions. Then we will provide the mathematical underpinnings of recurrence plots followed by the details of recurrence quantifications. Finally, we will discuss computational approaches that have been implemented to make recurrence strategies feasible and useful. As computers become faster and computer languages advance, younger generations of researchers will be stimulated and encouraged to capture nonlinear recurrence patterns and quantification in even better formats. This particular branch of nonlinear dynamics remains wide open for the definition of new recurrence variables and new applications untouched to date.
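
    The sketch below illustrates the standard definitions summarized in this chapter: a time-delay embedding, a thresholded distance matrix as the recurrence plot, and two common RQA measures (recurrence rate and determinism). It is a minimal illustration, not the chapter's reference implementation; the embedding parameters and threshold are arbitrary choices.

      # Bare-bones recurrence plot and two RQA measures.
      import numpy as np

      def embed(x, dim=3, tau=2):
          """Time-delay embedding of a scalar series into phase space."""
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

      def recurrence_plot(x, dim=3, tau=2, eps=0.2):
          v = embed(np.asarray(x, float), dim, tau)
          d = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=-1)
          return (d < eps).astype(int)

      def recurrence_rate(rp):
          return rp.mean()

      def determinism(rp, lmin=2):
          """Fraction of recurrent points on diagonal lines of length >= lmin
          (the main diagonal, trivially recurrent, is excluded)."""
          n = rp.shape[0]
          hist = {}
          for k in range(-(n - 1), n):
              if k == 0:
                  continue
              run = 0
              for p in np.diagonal(rp, k):
                  if p:
                      run += 1
                  elif run:
                      hist[run] = hist.get(run, 0) + 1
                      run = 0
              if run:
                  hist[run] = hist.get(run, 0) + 1
          total = sum(l * c for l, c in hist.items())
          det = sum(l * c for l, c in hist.items() if l >= lmin)
          return det / total if total else 0.0

      x = np.sin(np.linspace(0, 8 * np.pi, 200))
      rp = recurrence_plot(x)
      print(f"RR = {recurrence_rate(rp):.3f}, DET = {determinism(rp):.3f}")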

  9. Residual transglutaminase in collagen - effects, detection, quantification, and removal.

    PubMed

    Schloegl, W; Klein, A; Fürst, R; Leicht, U; Volkmer, E; Schieker, M; Jus, S; Guebitz, G M; Stachel, I; Meyer, M; Wiggenhorn, M; Friess, W

    2012-02-01

    In the present study, we developed an enzyme-linked immunosorbent assay (ELISA) for microbial transglutaminase (mTG) from Streptomyces mobaraensis to overcome the lack of a quantification method for mTG. We further performed a detailed follow-on-analysis of insoluble porcine collagen type I enzymatically modified with mTG primarily focusing on residuals of mTG. Repeated washing (4 ×) reduced mTG-levels in the washing fluids but did not quantitatively remove mTG from the material (p < 0.000001). Substantial amounts of up to 40% of the enzyme utilized in the crosslinking mixture remained associated with the modified collagen. Binding was non-covalent as could be demonstrated by Western blot analysis. Acidic and alkaline dialysis of mTG treated collagen material enabled complete removal the enzyme. Treatment with guanidinium chloride, urea, or sodium chloride was less effective in reducing the mTG content. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. Does theory influence the effectiveness of health behavior interventions? Meta-analysis.

    PubMed

    Prestwich, Andrew; Sniehotta, Falko F; Whittington, Craig; Dombrowski, Stephan U; Rogers, Lizzie; Michie, Susan

    2014-05-01

    To systematically investigate the extent and type of theory use in physical activity and dietary interventions, as well as associations between extent and type of theory use with intervention effectiveness. An in-depth analysis of studies included in two systematic reviews of physical activity and healthy eating interventions (k = 190). Extent and type of theory use was assessed using the Theory Coding Scheme (TCS) and intervention effectiveness was calculated using Hedges's g. Metaregressions assessed the relationships between these measures. Fifty-six percent of interventions reported a theory base. Of these, 90% did not report links between all of their behavior change techniques (BCTs) with specific theoretical constructs and 91% did not report links between all the specified constructs with BCTs. The associations between a composite score or specific items on the TCS and intervention effectiveness were inconsistent. Interventions based on Social Cognitive Theory or the Transtheoretical Model were similarly effective and no more effective than interventions not reporting a theory base. The coding of theory in these studies suggested that theory was not often used extensively in the development of interventions. Moreover, the relationships between type of theory used and the extent of theory use with effectiveness were generally weak. The findings suggest that attempts to apply the two theories commonly used in this review more extensively are unlikely to increase intervention effectiveness. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  11. Theory Interpretations in PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)

    2001-01-01

    The purpose of this task was to provide a mechanism for theory interpretations in a prototype verification system (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that a axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.

  12. A short course on measure and probability theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre

    2004-02-01

    This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.
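
    As a flavor of the Wiener polynomial chaos material the notes build toward, the toy sketch below projects a nonlinear function of a standard Gaussian onto probabilists' Hermite polynomials and reads the mean and variance off the coefficients. It is my own illustration under assumed parameters, not an excerpt from the notes.

      # Toy Wiener-Hermite polynomial chaos expansion: f(xi) with xi ~ N(0,1)
      # is projected onto He_k; orthogonality gives E[He_j He_k] = k! delta_jk.
      import numpy as np
      from numpy.polynomial.hermite_e import hermeval
      from math import factorial

      rng = np.random.default_rng(1)
      f = lambda xi: np.exp(0.3 * xi)          # quantity of interest

      # Projection by Monte Carlo: c_k = E[f(xi) He_k(xi)] / k!
      order = 5
      xi = rng.standard_normal(200_000)
      coeffs = np.array([np.mean(f(xi) * hermeval(xi, np.eye(order + 1)[k]))
                         / factorial(k) for k in range(order + 1)])

      mean = coeffs[0]                          # E[f] = c_0
      var = sum(factorial(k) * coeffs[k]**2 for k in range(1, order + 1))
      print(f"PCE mean {mean:.4f} (exact {np.exp(0.045):.4f}), var {var:.5f}")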

  13. A theory of cerebellar cortex and adaptive motor control based on two types of universal function approximation capability.

    PubMed

    Fujita, Masahiko

    2016-03-01

    Lesions of the cerebellum result in large errors in movements. The cerebellum adaptively controls the strength and timing of motor command signals depending on the internal and external environments of movements. The present theory describes how the cerebellar cortex can control signals for accurate and timed movements. A model network of the cerebellar Golgi and granule cells is shown to be equivalent to a multiple-input (from mossy fibers) hierarchical neural network with a single hidden layer of threshold units (granule cells) that receive a common recurrent inhibition (from a Golgi cell). The weighted sum of the hidden unit signals (Purkinje cell output) is theoretically analyzed regarding the capability of the network to perform two types of universal function approximation. The hidden units begin firing as the excitatory inputs exceed the recurrent inhibition. This simple threshold feature leads to the first approximation theory, and the network final output can be any continuous function of the multiple inputs. When the input is constant, this output becomes stationary. However, when the recurrent unit activity is triggered to decrease or the recurrent inhibition is triggered to increase through a certain mechanism (metabotropic modulation or extrasynaptic spillover), the network can generate any continuous signals for a prolonged period of change in the activity of recurrent signals, as the second approximation theory shows. By incorporating the cerebellar capability of two such types of approximations to a motor system, in which learning proceeds through repeated movement trials with accompanying corrections, accurate and timed responses for reaching the target can be adaptively acquired. Simple models of motor control can solve the motor error vs. sensory error problem, as well as the structural aspects of credit (or error) assignment problem. Two physiological experiments are proposed for examining the delay and trace conditioning of eyelid responses, as
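
    The first approximation property can be seen in miniature with a linear readout over randomly thresholded units, as in the sketch below. This is a toy illustration under assumed parameters, not the paper's cerebellar model: "granule cells" fire where the input exceeds a per-unit threshold, and a weighted sum (the "Purkinje cell" output) is fit to an arbitrary continuous target by least squares.

      # Weighted sum of threshold units approximating a continuous function.
      import numpy as np

      rng = np.random.default_rng(2)
      x = np.linspace(0, 1, 400)[:, None]            # mossy-fiber input
      target = np.sin(2 * np.pi * x[:, 0]) * x[:, 0] # function to approximate

      n_units = 60
      thresholds = rng.uniform(0, 1, n_units)        # per-unit effective thresholds
      hidden = (x > thresholds).astype(float)        # granule-cell firing (0/1)

      # Purkinje output = weighted sum of hidden units; fit weights linearly.
      w, *_ = np.linalg.lstsq(hidden, target, rcond=None)
      approx = hidden @ w
      print(f"max abs error: {np.abs(approx - target).max():.3f}")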

  14. A correction method for the axial maladjustment of transmission-type optical system based on aberration theory

    NASA Astrophysics Data System (ADS)

    Xu, Chunmei; Huang, Fu-yu; Yin, Jian-ling; Chen, Yu-dan; Mao, Shao-juan

    2016-10-01

    The influence of aberration on the misalignment of an optical system is considered fully, the deficiencies of the Gaussian optics correction method are pointed out, and a correction method for misaligned transmission-type optical systems is proposed based on aberration theory. The variation regularity of single-lens aberration caused by axial displacement is analyzed, and the aberration effect is defined. On this basis, by calculating the size of the lens adjustment induced by the image position error and the magnification error, the misalignment correction formula based on the constraints of the aberration is deduced mathematically. Taking a three-lens collimation system as an example, a test is carried out to validate this method, and its superiority is proved.

  15. Quantification of L-Citrulline and other physiologic amino acids in watermelon and selected cucurbits

    USDA-ARS?s Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...

  16. Motivational Profiles for Physical Activity Practice in Adults with Type 2 Diabetes: A Self-Determination Theory Perspective.

    PubMed

    Gourlan, Mathieu; Trouilloud, David; Boiché, Julie

    2016-01-01

    Drawing on Self-Determination Theory, this study explored the motivational profiles toward Physical Activity (PA) among adults with type 2 diabetes and the relationships between motivational profile, perceived competence and PA. Participants were 350 men and women (Mean age 62.77 years) who were interviewed on their motivations toward PA, perceived level of competence to practice, and PA practice. Cluster analyses reveal the existence of three distinct profiles: "High Combined" (ie, high scores on motivations ranging from intrinsic to external regulation, moderate level on amotivation), "Self-Determined" (ie, high scores on intrinsic, integrated, and identified regulations; low scores on other regulations), and "Moderate" (ie, moderate scores on all regulations). Participants with "High Combined" and "Self-Determined" profiles reported higher perceived competence and longer leisure-time PA practice in comparison to those with a "Moderate" profile. This study highlights the necessity of adopting a person-centered approach to better understand motivation toward PA among type 2 diabetics.

  17. Protein, enzyme and carbohydrate quantification using smartphone through colorimetric digitization technique.

    PubMed

    Dutta, Sibasish; Saikia, Gunjan Prasad; Sarma, Dhruva Jyoti; Gupta, Kuldeep; Das, Priyanka; Nath, Pabitra

    2017-05-01

    In this paper the utilization of a smartphone as a detection platform for colorimetric quantification of biological macromolecules has been demonstrated. Using the V-channel of HSV color space, the quantification of BSA protein, catalase enzyme and carbohydrate (using D-glucose) has been successfully investigated. A custom designed Android application has been developed for estimating the total concentration of biological macromolecules. The results have been compared with those of a standard spectrophotometer, which is generally used for colorimetric quantification in laboratory settings by measuring absorbance at a specific wavelength. The results obtained with the designed sensor are similar to the spectrophotometer data. The designed sensor is low-cost and robust, and we envision that it could promote diverse fields of bio-analytical investigation. Schematic illustration of the smartphone sensing mechanism for colorimetric analysis of biomolecular samples. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
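
    Since V in HSV is simply max(R, G, B), the core of such a sensor can be sketched in a few lines. This is an illustration, not the authors' Android code; the calibration slope and intercept below are hypothetical placeholders.

      # V-channel colorimetry sketch: map the mean HSV value of an RGB patch
      # to concentration through a hypothetical linear calibration.
      import numpy as np

      def v_channel_mean(rgb_patch):
          """rgb_patch: HxWx3 array with values in [0, 1]. V = max(R, G, B)."""
          return rgb_patch.max(axis=-1).mean()

      def concentration(rgb_patch, slope, intercept):
          """Hypothetical linear calibration: V = slope*conc + intercept."""
          return (v_channel_mean(rgb_patch) - intercept) / slope

      rng = np.random.default_rng(3)
      patch = np.clip(rng.normal(0.55, 0.02, (64, 64, 3)), 0, 1)
      print(f"~{concentration(patch, slope=-0.4, intercept=0.9):.2f} mg/mL")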

  18. Quantification of trace metals in water using complexation and filter concentration.

    PubMed

    Dolgin, Bella; Bulatov, Valery; Japarov, Julia; Elish, Eyal; Edri, Elad; Schechter, Israel

    2010-06-15

    Various metals undergo complexation with organic reagents, resulting in colored products. In practice, their molar absorptivities allow for quantification in the ppm range. However, a proper pre-concentration of the colored complex on paper filter lowers the quantification limit to the low ppb range. In this study, several pre-concentration techniques have been examined and compared: filtering the already complexed mixture, complexation on filter, and dipping of dye-covered filter in solution. The best quantification has been based on the ratio of filter reflectance at a certain wavelength to that at zero metal concentration. The studied complex formations (Ni ions with TAN and Cd ions with PAN) involve production of nanoparticle suspensions, which are associated with complicated kinetics. The kinetics of the complexation of Ni ions with TAN has been investigated and optimum timing could be found. Kinetic optimization in regard to some interferences has also been suggested.

  19. Charge Transfer Enhancement in the D-π-A Type Porphyrin Dyes: A Density Functional Theory (DFT) and Time-Dependent Density Functional Theory (TD-DFT) Study.

    PubMed

    Kang, Guo-Jun; Song, Chao; Ren, Xue-Feng

    2016-11-25

    The electronic geometries and optical properties of two D-π-A type zinc porphyrin dyes (NCH₃-YD2 and TPhe-YD) were systematically investigated by density functional theory (DFT) and time-dependent density functional theory (TD-DFT) to reveal the origin of the significantly altered charge transfer enhancement obtained by changing the electron donor of the famous porphyrin-based sensitizer YD2-o-C8. The molecular geometries and photophysical properties of the dyes before and after binding to the TiO₂ cluster were fully investigated. From the analyses of natural bond orbital (NBO), extended charge decomposition analysis (ECDA), and electron density variations (Δρ) between the excited state and ground state, it was found that the introduction of N(CH₃)₂ and 1,1,2-triphenylethene groups enhanced the intramolecular charge-transfer (ICT) character compared to YD2-o-C8. The absorption wavelengths and transition characters were significantly influenced by the N(CH₃)₂ and 1,1,2-triphenylethene groups. NCH₃-YD2, with N(CH₃)₂ groups in the donor part, is an effective way to improve the interactions between the dyes and the TiO₂ surface, the light harvesting efficiency (LHE), and the free energy change (ΔG_inject), and it is expected to be an efficient dye for use in dye-sensitized solar cells (DSSCs).

  20. Nitric Oxide Analyzer Quantification of Plant S-Nitrosothiols.

    PubMed

    Hussain, Adil; Yun, Byung-Wook; Loake, Gary J

    2018-01-01

    Nitric oxide (NO) is a small diatomic molecule that regulates multiple physiological processes in animals, plants, and microorganisms. In animals, it is involved in vasodilation and neurotransmission and is present in exhaled breath. In plants, it regulates both plant immune function and numerous developmental programs. The high reactivity and short half-life of NO and cross-reactivity of its various derivatives make its quantification difficult. Different methods based on colorimetric, fluorometric, and chemiluminescent detection of NO and its derivatives are available, but all of them have significant limitations. Here we describe a method for the chemiluminescence-based quantification of NO using ozone-chemiluminescence technology in plants. This approach provides a sensitive, robust, and flexible approach for determining the levels of NO and its signaling products, protein S-nitrosothiols.

  1. Cosmological constraints on Brans-Dicke theory.

    PubMed

    Avilez, A; Skordis, C

    2014-07-04

    We report strong cosmological constraints on the Brans-Dicke (BD) theory of gravity using cosmic microwave background data from Planck. We consider two types of models. First, the initial condition of the scalar field is fixed to give the same effective gravitational strength Geff today as the one measured on Earth, GN. In this case, the BD parameter ω is constrained to ω > 692 at the 99% confidence level, an order of magnitude improvement over previous constraints. In the second type, the initial condition for the scalar is a free parameter, leading to a somewhat stronger constraint of ω > 890, while Geff is constrained to 0.981 < Geff/GN < 1.285 at the 99% confidence level. Our constraints extend beyond the BD theory and are valid for any Horndeski theory, the most general second-order scalar-tensor theory, which approximates the BD theory on cosmological scales. In this sense, our constraints place strong limits on possible modifications of gravity that might explain cosmic acceleration.

  2. Parsing and Quantification of Raw Orbitrap Mass Spectrometer Data Using RawQuant.

    PubMed

    Kovalchik, Kevin A; Moggridge, Sophie; Chen, David D Y; Morin, Gregg B; Hughes, Christopher S

    2018-06-01

    Effective analysis of protein samples by mass spectrometry (MS) requires careful selection and optimization of a range of experimental parameters. As the output from the primary detection device, the "raw" MS data file can be used to gauge the success of a given sample analysis. However, the closed-source nature of the standard raw MS file can complicate effective parsing of the data contained within. To ease and increase the range of analyses possible, the RawQuant tool was developed to enable parsing of raw MS files derived from Thermo Orbitrap instruments to yield meta and scan data in an openly readable text format. RawQuant can be commanded to export user-friendly files containing MS1, MS2, and MS3 metadata as well as matrices of quantification values based on isobaric tagging approaches. In this study, the utility of RawQuant is demonstrated in several scenarios: (1) reanalysis of shotgun proteomics data for the identification of the human proteome, (2) reanalysis of experiments utilizing isobaric tagging for whole-proteome quantification, and (3) analysis of a novel bacterial proteome and synthetic peptide mixture for assessing quantification accuracy when using isobaric tags. Together, these analyses successfully demonstrate RawQuant for the efficient parsing and quantification of data from raw Thermo Orbitrap MS files acquired in a range of common proteomics experiments. In addition, the individual analyses using RawQuant highlight parametric considerations in the different experimental sets and suggest targetable areas to improve depth of coverage in identification-focused studies and quantification accuracy when using isobaric tags.

  3. Automated Detection of Stereotypical Motor Movements in Autism Spectrum Disorder Using Recurrence Quantification Analysis

    PubMed Central

    Großekathöfer, Ulf; Manyakov, Nikolay V.; Mihajlović, Vojkan; Pandina, Gahan; Skalkin, Andrew; Ness, Seth; Bangerter, Abigail; Goodwin, Matthew S.

    2017-01-01

    A number of recent studies using accelerometer features as input to machine learning classifiers show promising results for automatically detecting stereotypical motor movements (SMM) in individuals with Autism Spectrum Disorder (ASD). However, replicating these results across different types of accelerometers and their position on the body still remains a challenge. We introduce a new set of features in this domain based on recurrence plot and quantification analyses that are orientation invariant and able to capture non-linear dynamics of SMM. Applying these features to an existing published data set containing acceleration data, we achieve up to 9% average increase in accuracy compared to current state-of-the-art published results. Furthermore, we provide evidence that a single torso sensor can automatically detect multiple types of SMM in ASD, and that our approach allows recognition of SMM with high accuracy in individuals when using a person-independent classifier. PMID:28261082

  4. Automated Detection of Stereotypical Motor Movements in Autism Spectrum Disorder Using Recurrence Quantification Analysis.

    PubMed

    Großekathöfer, Ulf; Manyakov, Nikolay V; Mihajlović, Vojkan; Pandina, Gahan; Skalkin, Andrew; Ness, Seth; Bangerter, Abigail; Goodwin, Matthew S

    2017-01-01

    A number of recent studies using accelerometer features as input to machine learning classifiers show promising results for automatically detecting stereotypical motor movements (SMM) in individuals with Autism Spectrum Disorder (ASD). However, replicating these results across different types of accelerometers and their position on the body still remains a challenge. We introduce a new set of features in this domain based on recurrence plot and quantification analyses that are orientation invariant and able to capture non-linear dynamics of SMM. Applying these features to an existing published data set containing acceleration data, we achieve up to 9% average increase in accuracy compared to current state-of-the-art published results. Furthermore, we provide evidence that a single torso sensor can automatically detect multiple types of SMM in ASD, and that our approach allows recognition of SMM with high accuracy in individuals when using a person-independent classifier.

  5. Thermoelectric properties of n and p-type cubic and tetragonal XTiO3 (X = Ba,Pb): A density functional theory study

    NASA Astrophysics Data System (ADS)

    Rahman, Gul; Rahman, Altaf Ur

    2017-12-01

    Thermoelectric properties of cubic (C) and tetragonal (T) BaTiO3 (BTO) and PbTiO3 (PTO) are investigated using density functional theory together with semiclassical Boltzmann transport theory. Both electron- and hole-doped BTO and PTO are considered in the 300-500 K temperature range. We observed that C-BTO has a larger power factor (PF) when doped with holes, whereas C-PTO has a larger PF at n-type carrier concentrations. Comparing BTO and PTO, C-PTO has the larger figure of merit ZT. Tetragonal distortion reduces the Seebeck coefficient S in n-doped PTO, and the electronic structures revealed that this reduction in S is mainly caused by the increase in the optical band gaps (Γ-Γ and Γ-X).

  6. Uncertainty Quantification in Alchemical Free Energy Methods.

    PubMed

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-06-12

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation-an ensemble of independent MD simulations-which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.
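
    A skeleton of the ensemble recipe, under the assumption that each replica yields one free-energy estimate: report the ensemble mean together with a bootstrap confidence interval. The replica values below are illustrative numbers, not data from the paper.

      # Ensemble-based error estimation: treat each replica's free-energy
      # estimate as an independent draw and bootstrap the ensemble mean.
      import numpy as np

      rng = np.random.default_rng(4)
      replica_dG = np.array([-7.9, -8.3, -8.0, -8.6,
                             -7.7, -8.2, -8.1, -8.4])   # kcal/mol, illustrative

      boot = np.array([rng.choice(replica_dG, size=replica_dG.size).mean()
                       for _ in range(10_000)])
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"dG = {replica_dG.mean():.2f} kcal/mol, 95% CI [{lo:.2f}, {hi:.2f}]")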

  7. Slater-type geminals in explicitly-correlated perturbation theory: application to n-alkanols and analysis of errors and basis-set requirements.

    PubMed

    Höfener, Sebastian; Bischoff, Florian A; Glöss, Andreas; Klopper, Wim

    2008-06-21

    In recent years, Slater-type geminals (STGs) have been used with great success to expand the first-order wave function in an explicitly-correlated perturbation theory. The present work reports on this theory's implementation in the framework of the Turbomole suite of programs. A formalism is presented for evaluating all of the necessary molecular two-electron integrals by means of the Obara-Saika recurrence relations, which can be applied when the STG is expressed as a linear combination of a small number (n) of Gaussians (STG-nG geminal basis). In the Turbomole implementation of the theory, density fitting is employed and a complementary auxiliary basis set (CABS) is used for the resolution-of-the-identity (RI) approximation of explicitly-correlated theory. By virtue of this RI approximation, the calculation of molecular three- and four-electron integrals is avoided. An approximation is invoked to avoid the two-electron integrals over the commutator between the operators of kinetic energy and the STG. This approximation consists of computing commutators between matrices in place of operators. Integrals over commutators between operators would have occurred if the theory had been formulated and implemented as proposed originally. The new implementation in Turbomole was tested by performing a series of calculations on rotational conformers of the alkanols n-propanol through n-pentanol. Basis-set requirements concerning the orbital basis, the auxiliary basis set for density fitting and the CABS were investigated. Furthermore, various (constrained) optimizations of the amplitudes of the explicitly-correlated double excitations were studied. These amplitudes can be optimized in orbital-variant and orbital-invariant manners, or they can be kept fixed at the values governed by the rational generator approach, that is, by the electron cusp conditions. Electron-correlation effects beyond the level of second-order perturbation theory were accounted for by conventional

  8. Modeling qRT-PCR dynamics with application to cancer biomarker quantification.

    PubMed

    Chervoneva, Inna; Freydin, Boris; Hyslop, Terry; Waldman, Scott A

    2017-01-01

    Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is widely used for molecular diagnostics and evaluating prognosis in cancer. The utility of mRNA expression biomarkers relies heavily on the accuracy and precision of quantification, which is still challenging for low abundance transcripts. The critical step for quantification is accurate estimation of efficiency needed for computing a relative qRT-PCR expression. We propose a new approach to estimating qRT-PCR efficiency based on modeling dynamics of polymerase chain reaction amplification. In contrast, only models for fluorescence intensity as a function of polymerase chain reaction cycle have been used so far for quantification. The dynamics of qRT-PCR efficiency is modeled using an ordinary differential equation model, and the fitted ordinary differential equation model is used to obtain effective polymerase chain reaction efficiency estimates needed for efficiency-adjusted quantification. The proposed new qRT-PCR efficiency estimates were used to quantify GUCY2C (Guanylate Cyclase 2C) mRNA expression in the blood of colorectal cancer patients. Time to recurrence and GUCY2C expression ratios were analyzed in a joint model for survival and longitudinal outcomes. The joint model with GUCY2C quantified using the proposed polymerase chain reaction efficiency estimates provided clinically meaningful results for association between time to recurrence and longitudinal trends in GUCY2C expression.
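
    The paper's ODE model is not reproduced in the abstract, so the sketch below substitutes a simple logistic growth law for the fluorescence, fits it to a synthetic amplification curve, and derives a cycle-dependent efficiency as the ratio of successive fitted intensities. All parameters and the growth law itself are assumptions for illustration, not the authors' model.

      # Logistic ODE stand-in for qPCR amplification dynamics:
      # dF/dc = r*F*(1 - F/K); efficiency E(c) = F(c+1)/F(c).
      import numpy as np
      from scipy.integrate import odeint
      from scipy.optimize import curve_fit

      def logistic_F(c, F0, r, K):
          return odeint(lambda F, _c: r * F * (1 - F / K), F0, c)[:, 0]

      cycles = np.arange(40, dtype=float)
      rng = np.random.default_rng(5)
      data = logistic_F(cycles, 1e-3, 0.65, 1.0) + rng.normal(0, 0.005, 40)

      (F0, r, K), _ = curve_fit(logistic_F, cycles, data,
                                p0=(1e-3, 0.5, 1.0), maxfev=10_000)
      fit = logistic_F(cycles, F0, r, K)
      eff = fit[1:] / fit[:-1]        # per-cycle amplification efficiency
      print(f"r = {r:.3f}, early-cycle efficiency ~ {eff[:5].mean():.3f}")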

  9. Statistical analysis of 4 types of neck whiplash injuries based on classical meridian theory.

    PubMed

    Chen, Yemeng; Zhao, Yan; Xue, Xiaolin; Li, Hui; Wu, Xiuyan; Zhang, Qunce; Zheng, Xin; Wang, Tianfang

    2015-01-01

    As one component of the Chinese medicine meridian system, the meridian sinew (Jingjin, tendino-musculo) is specially described as being for acupuncture treatment of the musculoskeletal system because of its dynamic attributes and tender point correlations. In recent decades, the therapeutic importance of the sinew meridian has become revalued in clinical application. Based on this theory, the authors have established therapeutic strategies of acupuncture treatment in Whiplash-Associated Disorders (WAD) by categorizing four types of neck symptom presentations. The advantage of this new system is to make it much easier for the clinician to find effective acupuncture points. This study attempts to prove the significance of the proposed therapeutic strategies by analyzing data collected from a clinical survey of various WAD using non-supervised statistical methods, such as correlation analysis, factor analysis, and cluster analysis. The clinical survey data have successfully verified discrete characteristics of four neck syndromes, based upon the range of motion (ROM) and tender point location findings. A summary of the relationships among the symptoms of the four neck syndromes has shown the correlation coefficient as having a statistical significance (P < 0.01 or P < 0.05), especially with regard to ROM. Furthermore, factor and cluster analyses resulted in a total of 11 categories of general symptoms, which implies syndrome factors are more related to the Liver, as originally described in classical theory. The hypothesis of meridian sinew syndromes in WAD is clearly supported by the statistical analysis of the clinical trials. This new discovery should be beneficial in improving therapeutic outcomes.

  10. Instantons in string theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahlén, Olof, E-mail: olof.ahlen@aei.mpg.de

    2015-12-17

    These proceedings from the second Caesar Lattes meeting in Rio de Janeiro 2015 are a brief introduction to how automorphic forms appear in the low energy effective action of maximally supersymmetric string theory. The explicit example of the R^4-interaction of type IIB string theory in ten dimensions is discussed. Its Fourier expansion is interpreted in terms of perturbative and non-perturbative contributions to the four graviton amplitude.

  11. A refined methodology for modeling volume quantification performance in CT

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.

  12. Recurrence quantification as potential bio-markers for diagnosis of pre-cancer

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sabyasachi; Pratiher, Sawon; Barman, Ritwik; Pratiher, Souvik; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2017-03-01

    In this paper, spectroscopy signals are analyzed using recurrence plots (RP), and recurrence quantification analysis (RQA) parameters are extracted from the RP in order to classify tissues into normal and different precancerous grades. Three RQA parameters have been quantified in order to extract the important features in the spectroscopy data. These features have been fed to different classifiers for classification. Simulation results validate the efficacy of recurrence quantification as a potential bio-marker for the diagnosis of pre-cancer.

  13. Generic method for the absolute quantification of glutathione S-conjugates: Application to the conjugates of acetaminophen, clozapine and diclofenac.

    PubMed

    den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M

    2017-03-01

    Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates is often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometrical methods is extremely high, absolute quantification of GSH-conjugates is critically dependent on the availability of authentic references. Although 1H NMR is currently the method of choice for quantification of metabolites formed biosynthetically, its intrinsically low sensitivity can be a limiting factor in quantification of GSH-conjugates, which generally are formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC derivative, quantification of the glutamic acid OPA/NAC derivative appeared most suitable for quantification of GSH-conjugates. The novel method was used to quantify the concentrations of GSH-conjugates of diclofenac, clozapine and acetaminophen, and quantification was consistent with 1H NMR, but with a more than 100-fold lower detection limit for absolute quantification. Copyright © 2017. Published by Elsevier B.V.

  14. Using the theory of planned behavior to predict two types of snack food consumption among Midwestern upper elementary children: implications for practice.

    PubMed

    Branscum, Paul; Sharma, Manoj

    This study examined the extent to which constructs of the theory of planned behavior (TPB) can predict the consumption of two types of snack foods among elementary school children. A 15-item instrument tested for validity and reliability measuring TPB constructs was developed and administered to 167 children. Snack foods were evaluated using a modified 24-hour recall method. On average, children consumed 302 calories from snack foods per day. Stepwise multiple regression found that attitudes, subjective norms, and perceived control accounted for 44.7% of the variance for intentions. Concurrently, intentions accounted for 11.3% of the variance for calorically-dense snack food consumption and 8.9% of the variance for fruit and vegetable snack consumption. Results suggest that the theory of planned behavior is an efficacious theory for these two behaviors. Future interventions should consider using this theoretical framework and aim to enhance children's attitudes, perceived control, and subjective norms towards snack food consumption.

  15. Sulfur-based absolute quantification of proteins using isotope dilution inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon

    2015-10-01

    An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion for the first time. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, the size-exclusion chromatography method was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained a SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with an ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.
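
    The subtraction logic reduces to simple stoichiometry. In the sketch below, the molar mass and sulfur count of hGH (3 Met + 4 Cys) and all input numbers are assumptions for illustration, not certified values from the paper.

      # Sulfur-based protein quantification arithmetic: protein concentration
      # follows from (total sulfur - impurity sulfur) divided by the number of
      # sulfur atoms per molecule.
      M_HGH = 22_124.0        # g/mol, approximate molar mass of hGH (assumed)
      N_S_PER_HGH = 7         # S atoms per hGH: 3 Met + 4 Cys (assumed)
      M_S = 32.06             # g/mol, sulfur

      def hgh_mg_per_g(total_S_ug_per_g, impurity_S_ug_per_g):
          protein_S = total_S_ug_per_g - impurity_S_ug_per_g  # ug S per g solution
          mol_hgh_per_g = (protein_S * 1e-6 / M_S) / N_S_PER_HGH
          return mol_hgh_per_g * M_HGH * 1e3                  # mg hGH per g

      print(f"{hgh_mg_per_g(20.0, 1.5):.3f} mg/g")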

  16. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  17. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. The thesis work presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. As a result of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole body SAR, and is capable of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. Resulting from this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for the evaluation of RF/microwave emitting device safety. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), while having the advantage of being noninvasive, of providing millimeter resolution and high accuracy.

  18. Theoretical limitations of quantification for noncompetitive sandwich immunoassays.

    PubMed

    Woolley, Christine F; Hayes, Mark A; Mahanti, Prasun; Douglass Gilman, S; Taylor, Tom

    2015-11-01

    Immunoassays exploit the highly selective interaction between antibodies and antigens to provide a vital method for biomolecule detection at low concentrations. Developers and practitioners of immunoassays have long known that non-specific binding often restricts immunoassay limits of quantification (LOQs). Aside from non-specific binding, most efforts by analytical chemists to reduce the LOQ for these techniques have focused on improving the signal amplification methods and minimizing the limitations of the detection system. However, with detection technology now capable of sensing single-fluorescence molecules, this approach is unlikely to lead to dramatic improvements in the future. Here, fundamental interactions based on the law of mass action are analytically connected to signal generation, replacing the four- and five-parameter fittings commercially used to approximate sigmoidal immunoassay curves and allowing quantitative consideration of non-specific binding and statistical limitations in order to understand the ultimate detection capabilities of immunoassays. The restrictions imposed on limits of quantification by instrumental noise, non-specific binding, and counting statistics are discussed based on equilibrium relations for a sandwich immunoassay. Understanding the maximal capabilities of immunoassays for each of these regimes can greatly assist in the development and evaluation of immunoassay platforms. While many studies suggest that single molecule detection is possible through immunoassay techniques, here, it is demonstrated that the fundamental limit of quantification (precision of 10 % or better) for an immunoassay is approximately 131 molecules and this limit is based on fundamental and unavoidable statistical limitations.
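
    The counting-statistics floor can be reproduced from Poisson shot noise alone: with N captured molecules, the relative standard deviation is 1/sqrt(N), giving N = 100 for 10% precision; the paper's figure of approximately 131 molecules additionally folds in non-specific binding and instrumental noise. A one-line check under that Poisson-only assumption:

      # Poisson counting floor: smallest N with CV = 1/sqrt(N) <= rel_precision.
      import math

      def min_molecules(rel_precision):
          return math.ceil(1.0 / rel_precision**2)

      print(min_molecules(0.10))   # -> 100, same order as the ~131 quoted above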

  19. Relativistic theory of tidal Love numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binnington, Taylor; Poisson, Eric

    In Newtonian gravitational theory, a tidal Love number relates the mass multipole moment created by tidal forces on a spherical body to the applied tidal field. The Love number is dimensionless, and it encodes information about the body's internal structure. We present a relativistic theory of Love numbers, which applies to compact bodies with strong internal gravities; the theory extends and completes a recent work by Flanagan and Hinderer, which revealed that the tidal Love number of a neutron star can be measured by Earth-based gravitational-wave detectors. We consider a spherical body deformed by an external tidal field, and provide precise and meaningful definitions for electric-type and magnetic-type Love numbers; these are computed for polytropic equations of state. The theory applies to black holes as well, and we find that the relativistic Love numbers of a nonrotating black hole are all zero.

  20. A Mathematical Theory of System Information Flow

    DTIC Science & Technology

    2016-06-27

    AFRL-AFOSR-VA-TR-2016-0232, "A Mathematical Theory of System Information Flow"; Michael Mislove, Administrators of the Tulane Educational Fund. Final report, dates covered 27 Mar 2013 - 31 Mar 2016. The project studied the flow of information in systems using techniques from information theory, domain theory and other areas of mathematics and computer science. Over time, the focus shifted

  1. Media Effects: Theory and Research.

    PubMed

    Valkenburg, Patti M; Peter, Jochen; Walther, Joseph B

    2016-01-01

    This review analyzes trends and commonalities among prominent theories of media effects. On the basis of exemplary meta-analyses of media effects and bibliometric studies of well-cited theories, we identify and discuss five features of media effects theories as well as their empirical support. Each of these features specifies the conditions under which media may produce effects on certain types of individuals. Our review ends with a discussion of media effects in newer media environments. This includes theories of computer-mediated communication, the development of which appears to share a similar pattern of reformulation from unidirectional, receiver-oriented views, to theories that recognize the transactional nature of communication. We conclude by outlining challenges and promising avenues for future research.

  2. Personality and Type (but "Not" a Psychological Theory!)

    ERIC Educational Resources Information Center

    Holst-Larkin, Jane

    2006-01-01

    Word processing is part of every writer's set of competencies today, and as readers, their expectations of type have risen well beyond the old Courier font of typewriters. Yet only recently have writers had access to the thousands of different typefaces available today and had such power in making design choices. Type has been much studied and…

  3. Theory of flux cutting and flux transport at the critical current of a type-II superconducting cylindrical wire

    NASA Astrophysics Data System (ADS)

    Clem, John R.

    2011-06-01

    I introduce a critical-state theory incorporating both flux cutting and flux transport to calculate the magnetic-field and current-density distributions inside a type-II superconducting cylinder at its critical current in a longitudinal applied magnetic field. The theory is an extension of the elliptic critical-state model introduced by Romero-Salazar and Pérez-Rodríguez. The vortex dynamics depend in detail on two nonlinear effective resistivities for flux cutting (ρ∥) and flux flow (ρ⊥), and their ratio r=ρ∥/ρ⊥. When r<1, the low relative efficiency of flux cutting in reducing the magnitude of the internal magnetic-flux density leads to a paramagnetic longitudinal magnetic moment. As a model for understanding the experimentally observed interrelationship between the critical currents for flux cutting and depinning, I calculate the forces on a helical vortex arc stretched between two pinning centers when the vortex is subjected to a current density of arbitrary angle ϕ. Simultaneous initiation of flux cutting and flux transport occurs at the critical current density Jc(ϕ) that makes the vortex arc unstable.

  4. PSEA-Quant: a protein set enrichment analysis on label-free and label-based protein quantification data.

    PubMed

    Lavallée-Adam, Mathieu; Rauniyar, Navin; McClatchy, Daniel B; Yates, John R

    2014-12-05

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights.
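
    The core of any such enrichment test can be illustrated with a hypergeometric tail probability. The sketch below (Python) is a generic GO-term enrichment test, not PSEA-Quant's actual algorithm, which additionally models annotation bias and quantification reproducibility:

```python
from scipy.stats import hypergeom

def enrichment_pvalue(n_universe, n_annotated, n_selected, n_overlap):
    """P(X >= n_overlap) when drawing n_selected proteins from n_universe,
    of which n_annotated carry the GO term of interest."""
    return hypergeom.sf(n_overlap - 1, n_universe, n_annotated, n_selected)

# e.g. 40 of 200 abundant, reproducibly quantified proteins carry a term
# annotated on 300 of 5000 detected proteins:
print(enrichment_pvalue(5000, 300, 200, 40))  # small p-value => enriched
```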

  5. PSEA-Quant: A Protein Set Enrichment Analysis on Label-Free and Label-Based Protein Quantification Data

    PubMed Central

    2015-01-01

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights. PMID:25177766

  6. Aspects of some dualities in string theory

    NASA Astrophysics Data System (ADS)

    Kim, Bom Soo

    The AdS/CFT correspondence in string theory has changed the landscape of theoretical physics. Through this celebrated duality between gravity theory and field theory, one can analytically investigate strongly coupled gauge theories, such as Quantum Chromodynamics (QCD), in terms of a weakly coupled string theory such as supergravity, and vice versa. In the first part of this thesis we used this duality to construct a new type of nonlocal field theory, called Puff Field Theory, in terms of D3 branes in type IIB string theory with a geometric twist. In addition to the strong-weak duality of AdS/CFT, there also exists a weak-weak duality, called Twistor String Theory. The twistor technique has been used successfully to calculate SYM scattering amplitudes in an elegant fashion. Yet progress on the string theory side was hindered by a non-unitary conformal gravity. We extend twistor string theory by introducing mass terms in the second part of the thesis. A chiral mass term is identified as a vacuum expectation value of a conformal supergravity field and is tied to the breaking of the conformal symmetry of gravity. As a prime candidate for a quantum theory of gravity, string theory has had many promising successes, such as counting the number of microstates in supersymmetric black hole thermodynamics and resolving timelike and null singularities, to name a few. Yet fundamental string and M-theory formulations are not yet available. Various string theories without gravity, such as Non-Commutative Open String (NCOS) and Open Membrane (OM) theories, are a very nice playground for investigating the fundamental structure of string and M-theory without the complication of gravity. In the last part of the thesis, simpler Non-Relativistic String Theories are constructed and investigated. One important motivation for those theories is the connection between Non-Relativistic String Theories and Non-critical String Theories through the bosonization of betagamma

  7. Topological BF Theories

    NASA Astrophysics Data System (ADS)

    Sǎraru, Silviu-Constantin

    Topological field theories originate in the papers of Schwarz and Witten. Initially, Schwarz showed that one of the topological invariants, namely the Ray-Singer torsion, can be represented as the partition function of a certain quantum field theory. Subsequently, Witten constructed a framework for understanding Morse theory in terms of supersymmetric quantum mechanics. These two constructions represent the prototypes of all topological field theories. The model used by Witten has been applied to classical index theorems and, moreover, suggested some generalizations that led to new mathematical results on holomorphic Morse inequalities. Starting with these results, further developments in the domain of topological field theories have been achieved. The Becchi-Rouet-Stora-Tyutin (BRST) symmetry allowed for a new definition of topological field theories as theories whose BRST-invariant Hamiltonian is also BRST-exact. An important class of topological theories of Schwarz type is the class of BF models. This type of model describes three-dimensional quantum gravity and is useful in the study of four-dimensional quantum gravity in the Ashtekar-Rovelli-Smolin formulation. Two-dimensional BF models are related to the Poisson sigma models behind various two-dimensional gravities. Poisson sigma models, including their relationship to two-dimensional gravity and their classical solutions, have been studied intensively in the literature. In this thesis we approach the problem of constructing some classes of interacting BF models in the context of the BRST formalism. To this end, we use the method of deforming the BRST charge and the BRST-invariant Hamiltonian. Both methods rely on specific techniques of local BRST cohomology. The main hypotheses under which we construct the above-mentioned interactions are: space-time locality, Poincare invariance, smoothness of deformations in the coupling constant and the preservation of the number of derivatives on

  8. WE-AB-204-05: Harmonizing PET/CT Quantification in Multicenter Studies: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marques da Silva, A; Fischer, A

    2015-06-15

    Purpose: To present the implementation of a strategy to harmonize FDG PET/CT quantification (SUV) performed with different scanner models and manufacturers. Methods: The strategy was based on Boellaard (2011) and the EARL FDG-PET/CT accreditation program, which propose quality control measurements for harmonizing scanner performance. A NEMA IEC Body phantom study was performed using four different devices: PHP-1 (Gemini TF Base, Philips); PHP-2 (Gemini GXL, Philips); GEH (Discovery 600, General Electric); SMS (Biograph Hi-Rez 16, Siemens). The SUV Recovery Coefficient (RC) was calculated using the clinical protocol and other clinically relevant reconstruction parameters. The most appropriate reconstruction parameters (MARP) for SUV harmonization in each scanner are those which achieve the EARL harmonizing standards; they were identified using the lowest root mean square errors (RMSE). To evaluate the strategy's effectiveness, the Maximum Differences (MD) between the clinical and MARP RC values were calculated. Results: The reconstruction parameters that obtained the lowest RMSE are: FBP 5mm (PHP-1); LOR-RAMLA 2i0.008l (PHP-2); VuePointHD 2i32s10mm (GEH); and FORE+OSEM 4i8s6mm (SMS). Thus, to ensure that quantitative PET image measurements are interchangeable between these sites, images must be reconstructed with the above-mentioned parameters. However, a decoupling between the best image for qualitative PET/CT analysis and the best image for quantification studies was observed. The MD showed that the strategy was effective in reducing the variability of SUV quantification for small structures (<17mm). Conclusion: The harmonization strategy for SUV quantification implemented with these devices was effective in reducing the variability of small-structure quantification, minimizing inter-scanner and inter-institution differences in quantification. However, it is essential that, in addition to the harmonization of quantification, the standardization of

  9. Advancing agricultural greenhouse gas quantification*

    NASA Astrophysics Data System (ADS)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    Agricultural Research Service 2011), which aim to improve consistency of field measurement and data collection for soil carbon sequestration and soil nitrous oxide fluxes. Often these national-level activity data and emissions factors are the basis for regional and smaller-scale applications. Such data are used for model-based estimates of changes in GHGs at a project or regional level (Olander et al 2011). To complement national data for regional-, landscape-, or field-level applications, new data are often collected through farmer knowledge or records and field sampling. Ideally such data could be collected in a standardized manner, perhaps through some type of crowd-sourcing model, to improve regional- and national-level data as well as the consistency of locally collected data. Data can also be collected by companies working with agricultural suppliers and in-country networks, within efforts aimed at understanding firm and product (supply-chain) sustainability and risks (FAO 2009). Such data may feed into various certification processes or reporting requirements from buyers. Unfortunately, such data are likely proprietary. A new process is needed to aggregate and share private data in a way that would not be a competitive concern, so that such data could complement or supplement national data and add value. A number of papers in this focus issue discuss issues surrounding quantification methods and systems at large scales, global and national levels, while others explore landscape- and field-scale approaches. A few explore the intersection of top-down and bottom-up data measurement and modeling approaches. 5. The agricultural greenhouse gas quantification project and ERL focus issue Important land management decisions are often made with poor or few data, especially in developing countries. Current systems for quantifying GHG emissions are inadequate in most low-income countries, due to a lack of funding, human resources, and infrastructure. Most non-Annex 1 countries

  10. Absolute quantification by droplet digital PCR versus analog real-time PCR

    PubMed Central

    Hindson, Christopher M; Chevillet, John R; Briggs, Hilary A; Gallichotte, Emily N; Ruf, Ingrid K; Hindson, Benjamin J; Vessella, Robert L; Tewari, Muneesh

    2014-01-01

    Nanoliter-sized droplet technology paired with digital PCR (ddPCR) holds promise for highly precise, absolute nucleic acid quantification. Our comparison of microRNA quantification by ddPCR and real-time PCR revealed greater precision (coefficients of variation decreased by 37–86%) and improved day-to-day reproducibility (by a factor of seven) of ddPCR but with comparable sensitivity. When we applied ddPCR to serum microRNA biomarker analysis, this translated to superior diagnostic performance for identifying individuals with cancer. PMID:23995387
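
    The "absolute" in absolute quantification comes from Poisson statistics over the droplet partitions. A minimal sketch (Python; the 0.85 nL droplet volume is a nominal value we assume, not one given in the abstract):

```python
import numpy as np

def ddpcr_copies_per_ul(n_positive, n_total, droplet_vol_nl=0.85):
    """Absolute target concentration from droplet counts. A droplet is
    negative with probability exp(-lam), so lam = -ln(fraction negative)
    is the mean copies per droplet; divide by droplet volume for copies/uL."""
    frac_negative = 1.0 - n_positive / n_total
    lam = -np.log(frac_negative)
    return lam / (droplet_vol_nl * 1e-3)

print(ddpcr_copies_per_ul(n_positive=6000, n_total=20000))  # ~420 copies/uL
```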

  11. Carbon Nanotubes Released from an Epoxy-Based Nanocomposite: Quantification and Particle Toxicity.

    PubMed

    Schlagenhauf, Lukas; Buerki-Thurnherr, Tina; Kuo, Yu-Ying; Wichser, Adrian; Nüesch, Frank; Wick, Peter; Wang, Jing

    2015-09-01

    Studies combining both the quantification of free nanoparticle release and the toxicological investigations of the released particles from actual nanoproducts in a real-life exposure scenario are urgently needed, yet very rare. Here, a new measurement method was established to quantify the amount of free-standing and protruding multiwalled carbon nanotubes (MWCNTs) in the respirable fraction of particles abraded from a MWCNT-epoxy nanocomposite. The quantification approach involves the prelabeling of MWCNTs with lead ions, nanocomposite production, abrasion and collection of the inhalable particle fraction, and quantification of free-standing and protruding MWCNTs by measuring the concentration of released lead ions. In vitro toxicity studies for genotoxicity, reactive oxygen species formation, and cell viability were performed using A549 human alveolar epithelial cells and THP-1 monocyte-derived macrophages. The quantification experiment revealed that in the respirable fraction of the abraded particles, approximately 4000 ppm of the MWCNTs were released as exposed MWCNTs (which could contact lung cells upon inhalation) and approximately 40 ppm as free-standing MWCNTs in the worst-case scenario. The release of exposed MWCNTs was lower for nanocomposites containing agglomerated MWCNTs. The toxicity tests revealed that the abraded particles did not induce any acute cytotoxic effects.

  12. A Bayes network approach to uncertainty quantification in hierarchically developed computational models

    DOE PAGES

    Urbina, Angel; Mahadevan, Sankaran; Paez, Thomas L.

    2012-03-01

    Here, performance assessment of complex systems is ideally accomplished through system-level testing, but because such tests are expensive, they are seldom performed. On the other hand, for economic reasons, data from tests on individual components that are parts of complex systems are more readily available. The lack of system-level data leads to a need to build computational models of systems and use them for performance prediction in lieu of experiments. Because of their complexity, models are sometimes built in a hierarchical manner, starting with simple components, progressing to collections of components, and finally, to the full system. Quantification of uncertainty in the predicted response of a system model is required in order to establish confidence in the representation of actual system behavior. This paper proposes a framework for the complex, but very practical, problem of quantification of uncertainty in system-level model predictions. It is based on Bayes networks and uses the available data at multiple levels of complexity (i.e., components, subsystems, etc.). Because epistemic sources of uncertainty were shown to be secondary in this application, only aleatory uncertainty is included in the present uncertainty quantification. An example showing application of the techniques to uncertainty quantification of measures of response of a real, complex aerospace system is included.

  13. Mass Spectrometric Quantification of N-Linked Glycans by Reference to Exogenous Standards.

    PubMed

    Mehta, Nickita; Porterfield, Mindy; Struwe, Weston B; Heiss, Christian; Azadi, Parastoo; Rudd, Pauline M; Tiemeyer, Michael; Aoki, Kazuhiro

    2016-09-02

    Environmental and metabolic processes shape the profile of glycoprotein glycans expressed by cells, whether in culture, developing tissues, or mature organisms. Quantitative characterization of glycomic changes associated with these conditions has been achieved historically by reductive coupling of oligosaccharides to various fluorophores following release from glycoprotein and subsequent HPLC or capillary electrophoretic separation. Such labeling-based approaches provide a robust means of quantifying glycan amount based on fluorescence yield. Mass spectrometry, on the other hand, has generally been limited to relative quantification in which the contribution of the signal intensity for an individual glycan is expressed as a percent of the signal intensity summed over the total profile. Relative quantification has been valuable for highlighting changes in glycan expression between samples; sensitivity is high, and structural information can be derived by fragmentation. We have investigated whether MS-based glycomics is amenable to absolute quantification by referencing signal intensities to well-characterized oligosaccharide standards. We report the qualification of a set of N-linked oligosaccharide standards by NMR, HPLC, and MS. We also demonstrate the dynamic range, sensitivity, and recovery from complex biological matrices for these standards in their permethylated form. Our results indicate that absolute quantification for MS-based glycomic analysis is reproducible and robust utilizing currently available glycan standards.

  14. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    PubMed Central

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variability caused by upstream sample handling or incomplete trypsin digestion still needs to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464

  15. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical variations associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm can provide information about emphysema from CT. We therefore propose a new, non-density-based measure of diaphragm curvature intended to support robust quantification. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of lung height and diaphragm width to diaphragm height as curvature estimates, with the emphysema index as a comparison. Pearson correlation coefficients showed a strong trend for several of the proposed diaphragm curvature measures to have higher correlations, of up to r=0.57, with DLCO% and VA than the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation with the proposed measures, indicating that the proposed measures evaluate different aspects of the disease.
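
    The proposed curvature estimates are simple ratios, which makes the analysis easy to reproduce. A minimal sketch (Python; the numbers are invented for illustration, not the study's data):

```python
import numpy as np

def curvature_ratios(lung_height, diaphragm_width, diaphragm_height):
    """Non-density curvature estimates: flatter diaphragms (more advanced
    emphysema) yield larger ratios relative to diaphragm height."""
    return (lung_height / diaphragm_height,
            diaphragm_width / diaphragm_height)

# Correlate a curvature measure with a pulmonary-function value (invented
# numbers for five scans; the study analyzed 24 whole-lung scans):
measure = np.array([3.1, 4.0, 2.5, 5.2, 3.8])
dlco_percent = np.array([85.0, 70.0, 95.0, 55.0, 72.0])
print(np.corrcoef(measure, dlco_percent)[0, 1])  # Pearson r
```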

  16. Visual Hemispheric Specialization: A Computational Theory. Technical Report #7.

    ERIC Educational Resources Information Center

    Kosslyn, Stephen M.

    Visual recognition, navigation, tracking, and imagery are posited to involve some of the same types of representations and processes. The first part of this paper develops a theory of some of the shared types of representations and processing modules. The theory is developed in light of neurophysiological and neuroanatomical data from non-human…

  17. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    NASA Astrophysics Data System (ADS)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single- and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS carries a comparatively low financial and spatial footprint compared with common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method, which ultimately complicates existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, because miRNAs commonly exist at relatively low concentrations, amplification methods (e.g. PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  18. A superstring field theory for supergravity

    NASA Astrophysics Data System (ADS)

    Reid-Edwards, R. A.; Riccombeni, D. A.

    2017-09-01

    A covariant closed superstring field theory, equivalent to classical ten-dimensional Type II supergravity, is presented. The defining conformal field theory is the ambitwistor string worldsheet theory of Mason and Skinner. This theory is known to reproduce the scattering amplitudes of Cachazo, He and Yuan, in which the scattering equations play an important role, and the string field theory naturally incorporates these results. We investigate the operator formalism description of the ambitwistor string and propose an action for the string field theory of the bosonic and supersymmetric theories. The correct linearised gauge symmetries and spacetime actions are explicitly reproduced and evidence is given that the action is correct to all orders. The focus is on the Neveu-Schwarz sector and the explicit description of tree-level perturbation theory about flat spacetime. Application of the string field theory to general supergravity backgrounds and the inclusion of the Ramond sector are briefly discussed.

  19. Mathematical Techniques for Nonlinear System Theory.

    DTIC Science & Technology

    1978-01-01

    Interim report from the University of Florida Center for Mathematical System Theory, Gainesville, FL. The surviving front matter cites related work by E. D. Sontag, including "Linear systems over commutative rings: a survey", Ricerche di Automatica, 7: 1-34 (1976), and a companion paper in Mathematical System Theory, 9: 327-344.

  20. A phase quantification method based on EBSD data for a continuously cooled microalloyed steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, H.; Wynne, B.P.; Palmiere, E.J.

    2017-01-15

    Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of the different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, with different phases discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, for EBSD-based phase quantification methods, besides morphological characteristics, other parameters derived from the orientation information can also be used for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite in terms of grain-averaged misorientation angles, aspect ratios, high-angle grain boundary fractions and grain sizes were analysed and used to develop the identification criteria for each phase. Comparing the results obtained by this EBSD-based method with point counting, it was found that the EBSD-based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. - Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD-based method.

  1. Comparison of high-resolution ultrasonic resonator technology and Raman spectroscopy as novel process analytical tools for drug quantification in self-emulsifying drug delivery systems.

    PubMed

    Stillhart, Cordula; Kuentz, Martin

    2012-02-05

    Self-emulsifying drug delivery systems (SEDDS) are complex mixtures in which drug quantification can become a challenging task. Thus, a general need exists for novel analytical methods, and a particular interest lies in techniques with the potential for process monitoring. This article compares Raman spectroscopy with high-resolution ultrasonic resonator technology (URT) for drug quantification in SEDDS. The model drugs fenofibrate, indomethacin, and probucol were quantitatively assayed in different self-emulsifying formulations. We measured ultrasound velocity and attenuation in the bulk formulation containing drug at different concentrations. The formulations were also studied by Raman spectroscopy. We used both an in-line immersion probe for the bulk formulation and a multi-fiber sensor for measuring through hard-gelatin capsules filled with SEDDS. Each method was assessed by calculating the relative standard error of prediction (RSEP) as well as the limit of quantification (LOQ) and the mean recovery. Raman spectroscopy led to excellent calibration models for the bulk formulation as well as the capsules. The RSEP depended on the SEDDS type, with values of 1.5-3.8%, while the LOQ was between 0.04 and 0.35% (w/w) for drug quantification in the bulk. Similarly, the analysis of the capsules led to RSEP of 1.9-6.5% and LOQ of 0.01-0.41% (w/w). On the other hand, ultrasound attenuation resulted in RSEP of 2.3-4.4% and LOQ of 0.1-0.6% (w/w). Moreover, ultrasound velocity provided an interesting analytical response in cases where the drug strongly affected the density or compressibility of the SEDDS. We conclude that ultrasonic resonator technology and Raman spectroscopy constitute suitable methods for drug quantification in SEDDS, which is promising for their use as process analytical technologies.
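
    The figures of merit used here are easy to compute once predicted and nominal drug loads are available. A minimal sketch (Python), assuming one common definition of RSEP since the abstract does not spell out its exact formula:

```python
import numpy as np

def rsep_percent(predicted, reference):
    """Relative standard error of prediction: RMSE over the mean reference
    value, in percent (one common definition among several)."""
    rmse = np.sqrt(np.mean((np.asarray(predicted) - np.asarray(reference)) ** 2))
    return 100.0 * rmse / np.mean(reference)

def mean_recovery_percent(predicted, reference):
    """Mean of predicted/nominal ratios, in percent."""
    return 100.0 * np.mean(np.asarray(predicted) / np.asarray(reference))

pred = [1.02, 2.05, 2.95, 4.10]     # assayed drug loads, % (w/w), invented
nominal = [1.00, 2.00, 3.00, 4.00]  # nominal drug loads, % (w/w), invented
print(rsep_percent(pred, nominal), mean_recovery_percent(pred, nominal))
```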

  2. Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.

    PubMed

    Huang, Chia-Chia; Pan, Tzu-Ming

    2005-05-18

    The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with a light-upon-extension (LUX) primer was developed in this study. Event-specific primers were designed targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. A standard reference plasmid was then constructed that carried both targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirements of the labeling rules in the European Union and Taiwan.
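
    The conversion from DNA mass to genome copies behind figures like "0.05 ng = 20.5 copies" is a one-line calculation. The sketch below (Python) reproduces the right order of magnitude; the 2.4 pg-per-copy genome weight is our assumption for soybean, not a value given in the abstract:

```python
def copies_from_mass(mass_ng, pg_per_genome_copy):
    """Genome copies contained in a DNA mass."""
    return mass_ng * 1e3 / pg_per_genome_copy  # ng -> pg, then divide

# Assuming ~2.4 pg per soybean genome copy (our assumption), 0.05 ng of
# genomic DNA corresponds to roughly 21 copies, close to the stated 20.5:
print(copies_from_mass(0.05, 2.4))
```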

  3. Ion permeation and glutamate residues linked by Poisson-Nernst-Planck theory in L-type calcium channels.

    PubMed Central

    Nonner, W; Eisenberg, B

    1998-01-01

    L-type Ca channels contain a cluster of four charged glutamate residues (EEEE locus), which seem essential for high Ca specificity. To understand how this highly charged structure might produce the currents and selectivity observed in this channel, a theory is needed that relates charge to current. We use an extended Poisson-Nernst-Planck (PNP2) theory to compute (mean) Coulombic interactions and thus to examine the role of the mean-field electrostatic interactions in producing current and selectivity. The pore was modeled as a central cylinder with tapered atria; the cylinder (i.e., "pore proper") contained a uniform volume density of fixed charge equivalent to that of one to four carboxyl groups. The pore proper was assigned ion-specific, but spatially uniform, diffusion coefficients and excess chemical potentials. Thus electrostatic selection by valency was computed self-consistently, and selection by other features was also allowed. The five external parameters needed for a system of four ionic species (Na, Ca, Cl, and H) were determined analytically from published measurements of three limiting conductances and two critical ion concentrations, while treating the pore as a macroscopic ion-exchange system in equilibrium with a uniform bath solution. The extended PNP equations were solved with these parameters, and the predictions were compared to currents measured in a variety of solutions over a range of transmembrane voltages. The extended PNP theory accurately predicted current-voltage relations, anomalous mole fraction effects in the observed current, saturation effects of varied Ca and Na concentrations, and block by protons. Pore geometry, dielectric permittivity, and the number of carboxyl groups had only weak effects. The successful prediction of Ca fluxes in this paper demonstrates that ad hoc electrostatic parameters, multiple discrete binding sites, and logistic assumptions of single-file movement are all unnecessary for the prediction of permeation in
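
    For reference, the coupled equations behind any PNP-type treatment take the standard textbook form (our transcription, not taken from the paper): the Poisson equation for the electrostatic potential and a drift-diffusion (Nernst-Planck) flux for each ionic species.

```latex
% Poisson equation: permittivity \varepsilon, potential \phi, valence z_i,
% concentration c_i, Faraday constant F, fixed charge density \rho_{fixed}:
\nabla \cdot \left( \varepsilon \nabla \phi \right)
  = -\Big( F \sum_i z_i c_i + \rho_{\mathrm{fixed}} \Big)

% Nernst-Planck flux for species i, with diffusion coefficient D_i:
\mathbf{J}_i = -D_i \left( \nabla c_i + \frac{z_i F}{R T}\, c_i \nabla \phi \right)
```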

  4. Quantification of peptides from immunoglobulin constant and variable regions by LC-MRM MS for assessment of multiple myeloma patients.

    PubMed

    Remily-Wood, Elizabeth R; Benson, Kaaron; Baz, Rachid C; Chen, Y Ann; Hussein, Mohamad; Hartley-Brown, Monique A; Sprung, Robert W; Perez, Brianna; Liu, Richard Z; Yoder, Sean J; Teer, Jamie K; Eschrich, Steven A; Koomen, John M

    2014-10-01

    Quantitative MS assays for Igs are compared with existing clinical methods in samples from patients with plasma cell dyscrasias, for example, multiple myeloma (MM). Using LC-MS/MS data, Ig constant region peptides and transitions were selected for LC-MRM MS. Quantitative assays were used to assess Igs in serum from 83 patients. RNA sequencing and peptide-based LC-MRM are used to define peptides for quantification of the disease-specific Ig. LC-MRM assays quantify serum levels of Igs and their isoforms (IgG1-4, IgA1-2, IgM, IgD, and IgE, as well as kappa (κ) and lambda (λ) light chains). LC-MRM quantification has been applied to single samples from a patient cohort and a longitudinal study of an IgE patient undergoing treatment, to enable comparison with existing clinical methods. Proof-of-concept data for defining and monitoring variable region peptides are provided using the H929 MM cell line and two MM patients. LC-MRM assays targeting constant region peptides determine the type and isoform of the involved Ig and quantify its expression; the LC-MRM approach has improved sensitivity compared with the current clinical method, but slightly higher inter-assay variability. Detection of variable region peptides is a promising way to improve Ig quantification, which could produce a dramatic increase in sensitivity over existing methods, and could further complement current clinical techniques.

  5. GC-MS quantification of suspected volatile allergens in fragrances. 2. Data treatment strategies and method performances.

    PubMed

    Bassereau, Maud; Chaintreau, Alain; Duperrex, Stéphanie; Joulain, Daniel; Leijs, Hans; Loesing, Gerd; Owen, Neil; Sherlock, Alan; Schippa, Christine; Thorel, Pierre-Jean; Vey, Matthias

    2007-01-10

    The performance of GC-MS determination of suspected allergens in fragrance concentrates has been investigated. The limit of quantification was experimentally determined (10 mg/L), and the variability was investigated for three different data treatment strategies: (1) two columns and three quantification ions; (2) two columns and one quantification ion; and (3) one column and three quantification ions. The first strategy best minimizes the risk of determination bias due to coelutions. This risk was evaluated by calculating the probability of coeluting a suspected allergen with perfume constituents exhibiting ions in common. For hydroxycitronellal, when using a two-column strategy, this may statistically occur more than once every 36 analyses for one ion in common, or once every 144 analyses for three ions in common.

  6. In vivo behavior of NTBI revealed by automated quantification system.

    PubMed

    Ito, Satoshi; Ikuta, Katsuya; Kato, Daisuke; Lynda, Addo; Shibusa, Kotoe; Niizeki, Noriyasu; Toki, Yasumichi; Hatayama, Mayumi; Yamamoto, Masayo; Shindo, Motohiro; Iizuka, Naomi; Kohgo, Yutaka; Fujiya, Mikihiro

    2016-08-01

    Non-Tf-bound iron (NTBI), which appears in serum in iron overload, is thought to contribute to organ damage; the monitoring of serum NTBI levels may therefore be clinically useful in iron-overloaded patients. However, NTBI quantification methods remain complex, limiting their use in clinical practice. To overcome the technical difficulties often encountered, we recently developed a novel automated NTBI quantification system capable of measuring large numbers of samples. In the present study, we investigated the in vivo behavior of NTBI in human and animal serum using this newly established automated system. Average NTBI in healthy volunteers was 0.44 ± 0.076 μM (median 0.45 μM, range 0.28-0.66 μM), with no significant difference between sexes. Additionally, serum NTBI rapidly increased after iron loading, followed by a sudden disappearance. NTBI levels also decreased in inflammation. The results indicate that NTBI is a unique marker of iron metabolism, unlike other markers of iron metabolism, such as serum ferritin. Our new automated NTBI quantification method may help to reveal the clinical significance of NTBI and contribute to our understanding of iron overload.

  7. Scaling of Theory-of-Mind Tasks

    ERIC Educational Resources Information Center

    Wellman, Henry M.; Liu, David

    2004-01-01

    Two studies address the sequence of understandings evident in preschoolers' developing theory of mind. The first, preliminary study provides a meta-analysis of research comparing different types of mental state understandings (e.g., desires vs. beliefs, ignorance vs. false belief). The second, primary study tests a theory-of-mind scale for…

  8. A succession of theories: purging redundancy from disturbance theory.

    PubMed

    Pulsford, Stephanie A; Lindenmayer, David B; Driscoll, Don A

    2016-02-01

    The topics of succession and post-disturbance ecosystem recovery have a long and convoluted history. There is extensive redundancy within this body of theory, which has resulted in confusion, and the links among theories have not been adequately drawn. This review aims to distil the unique ideas from the array of theory related to ecosystem change in response to disturbance. This will help to reduce redundancy, and improve communication and understanding between researchers. We first outline the broad range of concepts that have developed over the past century to describe community change in response to disturbance. The body of work spans overlapping succession concepts presented by Clements in 1916, Egler in 1954, and Connell and Slatyer in 1977. Other theories describing community change include state and transition models, biological legacy theory, and the application of functional traits to predict responses to disturbance. Second, we identify areas of overlap of these theories, in addition to highlighting the conceptual and taxonomic limitations of each. In aligning each of these theories with one another, the limited scope and relative inflexibility of some theories becomes apparent, and redundancy becomes explicit. We identify a set of unique concepts to describe the range of mechanisms driving ecosystem responses to disturbance. We present a schematic model of our proposed synthesis which brings together the range of unique mechanisms that were identified in our review. The model describes five main mechanisms of transition away from a post-disturbance community: (i) pulse events with rapid state shifts; (ii) stochastic community drift; (iii) facilitation; (iv) competition; and (v) the influence of the initial composition of a post-disturbance community. In addition, stabilising processes such as biological legacies, inhibition or continuing disturbance may prevent a transition between community types. Integrating these six mechanisms with the functional

  9. Optimization and Verification of Droplet Digital PCR Event-Specific Methods for the Quantification of GM Maize DAS1507 and NK603.

    PubMed

    Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir

    2018-05-01

    In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining in popularity. Among many other applications, this technique can be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR) by overcoming problems related to PCR inhibition and the requirement for certified reference materials to be used as calibrants. In theory, validated qPCR methods can be easily transferred to the dPCR platform; however, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods were verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification" (2011), Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). The ddPCR methods performed as well as or better than the qPCR methods. The optimized ddPCR methods confirm their suitability for GMO determination in food and feed.

  10. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    PubMed

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracies. In an evaluation on a real data set, the transcript expression estimated by Strawberry had the highest correlation with NanoString probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
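
    The latent class model for read assignment can be illustrated with a minimal EM loop. The sketch below (Python) is our illustrative reduction, not Strawberry's actual C++ implementation; it ignores splicing-graph structure, transcript lengths, and bias correction:

```python
import numpy as np

def em_abundance(compat, n_iter=100):
    """Toy EM for assigning ambiguous reads to transcripts.
    compat[r][t] = 1 if read r is compatible with transcript t."""
    compat = np.asarray(compat, dtype=float)
    n_reads, n_tx = compat.shape
    theta = np.full(n_tx, 1.0 / n_tx)            # initial abundances
    for _ in range(n_iter):
        w = compat * theta                       # E-step: fractional
        w /= w.sum(axis=1, keepdims=True)        # read-to-transcript weights
        theta = w.sum(axis=0) / n_reads          # M-step: new abundances
    return theta

# Read 0 fits transcript 0 only, read 1 fits both, read 2 fits transcript 1:
print(em_abundance([[1, 0], [1, 1], [0, 1]]))    # -> [0.5, 0.5]
```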

  11. Literacy and Language Education: The Quantification of Learning

    ERIC Educational Resources Information Center

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  12. How to Make Data a Blessing to Parametric Uncertainty Quantification and Reduction?

    NASA Astrophysics Data System (ADS)

    Ye, M.; Shi, X.; Curtis, G. P.; Kohler, M.; Wu, J.

    2013-12-01

    From a Bayesian point of view, the probabilities of model parameters and predictions are conditioned on the data used for parameter inference and prediction analysis. It is critical to use appropriate data for quantifying parametric uncertainty and its propagation to model predictions. However, data are always limited and imperfect. When a dataset cannot properly constrain the model parameters, it may lead to inaccurate uncertainty quantification. While in this case data appear to be a curse to uncertainty quantification, a comprehensive modeling analysis may help understand the cause and characteristics of parametric uncertainty and thus turn data into a blessing. In this study, we illustrate the impacts of data on uncertainty quantification and reduction using the example of a surface complexation model (SCM) developed to simulate uranyl (U(VI)) adsorption. The model includes two adsorption sites, referred to as strong and weak sites. The amount of uranium adsorption on these sites determines both the mean arrival time and the long tail of the breakthrough curves. There is one reaction on the weak site but two reactions on the strong site. The unknown parameters include the fractions of the total surface site density of the two sites and the surface complex formation constants of the three reactions. A total of seven experiments were conducted with different geochemical conditions to estimate these parameters. The experiments with low initial concentration of U(VI) result in a large amount of parametric uncertainty. A modeling analysis shows that this is because the experiments cannot distinguish the relative adsorption affinity of the strong and weak sites for uranium. Therefore, experiments with high initial concentration of U(VI) are needed, because in those experiments the strong site is nearly saturated and the weak site can be determined. The experiments with high initial concentration of U(VI) are a blessing to uncertainty quantification, and the experiments with low initial

  13. Lignin-Derived Thioacidolysis Dimers: Reevaluation, New Products, Authentication, and Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue, Fengxia; Lu, Fachuang; Regner, Matt

    2017-01-26

    Lignin structural studies play an essential role both in understanding the development of plant cell walls and in valorizing lignocellulosics as renewable biomaterials. Dimeric products released by selectively cleaving β–aryl ether linkages between lignin units reflect the distribution of recalcitrant lignin units, but have been neither absolutely defined nor quantitatively determined. In this work, 12 guaiacyl-type thioacidolysis dimers were identified and quantified using newly synthesized standards. One product previously attributed to deriving from β–1-coupled units was established as resulting from β–5 units, correcting an analytical quandary. Another longstanding dilemma, that no β–β dimers were recognized in thioacidolysis products from gymnosperms, was resolved with the discovery of two such authenticated compounds. Finally, individual GC response factors for each standard compound allowed rigorous quantification of dimeric products released from softwood lignins, affording insight into the various interunit-linkage distributions in lignins and thereby guiding the valorization of lignocellulosics.

  14. Lignin‐Derived Thioacidolysis Dimers: Reevaluation, New Products, Authentication, and Quantification

    PubMed Central

    Yue, Fengxia; Regner, Matt; Sun, Runcang

    2017-01-01

    Lignin structural studies play an essential role both in understanding the development of plant cell walls and for valorizing lignocellulosics as renewable biomaterials. Dimeric products released by selectively cleaving β–aryl ether linkages between lignin units reflect the distribution of recalcitrant lignin units, but have been neither absolutely defined nor quantitatively determined. Here, 12 guaiacyl‐type thioacidolysis dimers were identified and quantified using newly synthesized standards. One product previously attributed to deriving from β–1‐coupled units was established as resulting from β–5 units, correcting an analytical quandary. Another longstanding dilemma, that no β–β dimers were recognized in thioacidolysis products from gymnosperms, was resolved with the discovery of two such authenticated compounds. Individual GC response factors for each standard compound allowed rigorous quantification of dimeric products released from softwood lignins, affording insight into the various interunit‐linkage distributions in lignins and thereby guiding the valorization of lignocellulosics. PMID:28125766

  15. Lignin-Derived Thioacidolysis Dimers: Reevaluation, New Products, Authentication, and Quantification.

    PubMed

    Yue, Fengxia; Lu, Fachuang; Regner, Matt; Sun, Runcang; Ralph, John

    2017-03-09

    Lignin structural studies play an essential role both in understanding the development of plant cell walls and for valorizing lignocellulosics as renewable biomaterials. Dimeric products released by selectively cleaving β-aryl ether linkages between lignin units reflect the distribution of recalcitrant lignin units, but have been neither absolutely defined nor quantitatively determined. Here, 12 guaiacyl-type thioacidolysis dimers were identified and quantified using newly synthesized standards. One product previously attributed to deriving from β-1-coupled units was established as resulting from β-5 units, correcting an analytical quandary. Another longstanding dilemma, that no β-β dimers were recognized in thioacidolysis products from gymnosperms, was resolved with the discovery of two such authenticated compounds. Individual GC response factors for each standard compound allowed rigorous quantification of dimeric products released from softwood lignins, affording insight into the various interunit-linkage distributions in lignins and thereby guiding the valorization of lignocellulosics.

  16. Psychological Type and Counselling.

    ERIC Educational Resources Information Center

    Bayne, Rowan

    1995-01-01

    Psychological type as a theory of four major personality characteristics is applied to empathy and choice of strategy. Discusses how versatile counselors should be and of how, specifically, type can be used in counseling. The problem of losing weight is used to illustrate how strategies may be matched with clients of different types or…

  17. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
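
    The quantification step described here, estimating the fraction of a bounded input domain that satisfies a path condition, can be sketched with plain hit-or-miss Monte Carlo. In the Python below, the constraint is a toy example of ours; the paper's approach additionally uses interval constraint propagation to focus sampling on the regions containing solutions:

```python
import random

def satisfies(x, y):
    """Stand-in for a path condition over two bounded float inputs
    (a toy constraint of ours, not one from the paper)."""
    return x * x + y * y <= 1.0 and x + y > 0.5

def mc_fraction(n_samples=100_000, lo=-1.0, hi=1.0, seed=7):
    """Hit-or-miss Monte Carlo estimate of the fraction of the input
    domain satisfying the constraint, i.e. the target-event probability
    under uniform inputs."""
    rng = random.Random(seed)
    hits = sum(satisfies(rng.uniform(lo, hi), rng.uniform(lo, hi))
               for _ in range(n_samples))
    return hits / n_samples

print(mc_fraction())
```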

  18. Quantification of Triacylglycerol Molecular Species in Edible Fats and Oils by Gas Chromatography-Flame Ionization Detector Using Correction Factors.

    PubMed

    Yoshinaga, Kazuaki; Obi, Junji; Nagai, Toshiharu; Iioka, Hiroyuki; Yoshida, Akihiko; Beppu, Fumiaki; Gotoh, Naohiro

    2017-03-01

    In the present study, the resolution parameters and correction factors (CFs) of triacylglycerol (TAG) standards were estimated by gas chromatography with flame ionization detection (GC-FID) to achieve precise quantification of the TAG composition in edible fats and oils. Forty-seven TAG standards comprising capric acid, lauric acid, myristic acid, pentadecanoic acid, palmitic acid, palmitoleic acid, stearic acid, oleic acid, linoleic acid, and/or linolenic acid were analyzed, and the CFs of these TAGs were obtained against tripentadecanoyl glycerol as the internal standard. The capillary column was Ultra ALLOY+-65 (30 m × 0.25 mm i.d., 0.10 μm film thickness), and the column temperature was programmed to rise from 250°C to 360°C at 4°C/min and then held for 25 min. The limit of detection (LOD) and limit of quantification (LOQ) values of the TAG standards were below 0.10 mg and 0.32 mg per 100 mg fat and oil, respectively, except for LnLnLn, whose LOD and LOQ values were 0.55 mg and 1.84 mg per 100 mg fat and oil. The CFs of the TAG standards decreased with increasing total acyl carbon number and degree of desaturation of the TAG molecules. Also, there were no remarkable differences in the CFs between TAG positional isomers such as 1-palmitoyl-2-oleoyl-3-stearoyl-rac-glycerol, 1-stearoyl-2-palmitoyl-3-oleoyl-rac-glycerol, and 1-palmitoyl-2-stearoyl-3-oleoyl-rac-glycerol, which cannot be separated by GC-FID. Furthermore, this method was able to predict the CFs of heterogeneous (AAB- and ABC-type) TAGs from the CFs of homogeneous (AAA-, BBB-, and CCC-type) TAGs. In addition, the TAG composition in cocoa butter, palm oil, and canola oil was determined using the CFs, and the results were found to be in good agreement with those reported in the literature. Therefore, the GC-FID method using CFs can be successfully used for the quantification of TAG molecular species in natural fats and oils.
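
    The CF-based quantification itself is a standard internal-standard calculation. A minimal sketch (Python), assuming the generic internal-standard formula; the paper's exact expression may differ:

```python
def tag_amount_mg(peak_area, is_area, is_amount_mg, cf):
    """Internal-standard quantification with a correction factor:
    amount = CF * (analyte peak area / IS peak area) * IS amount."""
    return cf * (peak_area / is_area) * is_amount_mg

# Hypothetical areas for one TAG peak against the tripentadecanoyl
# glycerol internal standard (all numbers invented for illustration):
print(tag_amount_mg(peak_area=8.2e5, is_area=1.0e6, is_amount_mg=1.0, cf=1.15))
```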

  19. Quantification of prebiotics in commercial infant formulas.

    PubMed

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics.

  20. Improving microstructural quantification in FIB/SEM nanotomography.

    PubMed

    Taillon, Joshua A; Pellegrinelli, Christopher; Huang, Yi-Lin; Wachsman, Eric D; Salamanca-Riba, Lourdes G

    2018-01-01

    FIB/SEM nanotomography (FIB-nt) is a powerful technique for the determination and quantification of three-dimensional microstructure in subsurface features. Often, the microstructure of a sample is the ultimate determinant of the overall performance of a system, and a detailed understanding of its properties is crucial in advancing the materials engineering of a resulting device. While the FIB-nt technique has developed significantly in the 15 years since its introduction, advanced nanotomographic analysis is still far from routine, and a number of challenges remain in data acquisition and post-processing. In this work, we present a number of techniques to improve the quality of the acquired data, together with easy-to-implement methods to obtain "advanced" microstructural quantifications. The techniques are applied to a solid oxide fuel cell cathode of interest to the electrochemistry community, but the methodologies are easily adaptable to a wide range of material systems. Finally, results from an analyzed sample are presented as a practical example of how these techniques can be implemented. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Self-digitization microfluidic chip for absolute quantification of mRNA in single cells.

    PubMed

    Thompson, Alison M; Gansen, Alexander; Paguirigan, Amy L; Kreutz, Jason E; Radich, Jerald P; Chiu, Daniel T

    2014-12-16

    Quantification of mRNA in single cells provides direct insight into how intercellular heterogeneity plays a role in disease progression and outcomes. Quantitative polymerase chain reaction (qPCR), the current gold standard for evaluating gene expression, is insufficient for providing absolute measurement of single-cell mRNA transcript abundance. Challenges include difficulties in handling small sample volumes and the high variability in measurements. Microfluidic digital PCR provides far better sensitivity for minute quantities of genetic material, but the typical format of this assay does not allow counting of the absolute number of mRNA transcripts in samples taken from single cells. Furthermore, a large fraction of the sample is often lost during sample handling in microfluidic digital PCR. Here, we report the absolute quantification of single-cell mRNA transcripts by digital, one-step reverse transcription PCR in a simple microfluidic array device called the self-digitization (SD) chip. By performing the reverse transcription step in digitized volumes, we find that the assay exhibits a linear signal across a wide range of total RNA concentrations and agrees well with standard curve qPCR. The SD chip is found to digitize a high percentage (86.7%) of the sample for single-cell experiments. Moreover, quantification of transferrin receptor mRNA in single cells agrees well with single-molecule fluorescence in situ hybridization experiments. The SD platform for absolute quantification of single-cell mRNA can be optimized for other genes and may be useful as an independent control method for the validation of mRNA quantification techniques.
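
    The absolute counting that the SD chip performs rests on Poisson statistics: once the sample is partitioned, the fraction p of positive partitions gives the mean template count per partition as λ = -ln(1 - p). A minimal sketch follows; the partition count, positive count, and partition volume are hypothetical, and only the 86.7% digitization figure comes from the record above.

        import math

        def dpcr_quantify(n_positive, n_partitions, partition_vol_nl, digitized_fraction=1.0):
            """Poisson-corrected absolute quantification from a digital PCR readout."""
            p = n_positive / n_partitions
            lam = -math.log(1.0 - p)                  # mean templates per partition
            copies = lam * n_partitions               # templates in the digitized volume
            conc_per_ul = lam / (partition_vol_nl * 1e-3)  # templates per microliter
            return copies / digitized_fraction, conc_per_ul

        # Hypothetical readout: 300 of 1020 partitions positive, 5 nL partitions;
        # the 86.7% digitization efficiency is the figure reported above.
        copies, conc = dpcr_quantify(300, 1020, partition_vol_nl=5.0, digitized_fraction=0.867)
        print(copies, conc)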

  2. Quantification of α-tubulin isotypes by sandwich ELISA with signal amplification through biotinyl-tyramide or immuno-PCR.

    PubMed

    Dráberová, Eduarda; Stegurová, Lucie; Sulimenko, Vadym; Hájková, Zuzana; Dráber, Petr; Dráber, Pavel

    2013-09-30

    Microtubules formed by αβ-tubulin dimers represent cellular structures that are indispensable for the maintenance of cell morphology and for the generation of cell motility. Microtubules in intact cells are in highly regulated equilibrium with cellular pools of soluble tubulin dimers. Sensitive, reproducible and rapid assays are necessary to monitor tubulin changes in cytosolic pools after treatment with anti-mitotic drugs, during the cell cycle, or during activation and differentiation events. Here we describe new assays for α-tubulin quantification. The assays are based on sandwich ELISA, and the signal is amplified with biotinyl-tyramide or immuno-PCR. The matched monoclonal antibody pair recognizes phylogenetically highly conserved epitopes localized outside the C-terminal isotype-defining region. This makes it possible to detect α-tubulin isotypes in different cell types of various species. Biotinyl-tyramide amplification and immuno-PCR amplification enable detection of tubulin at concentrations of 2.5 ng/ml and 0.086 ng/ml, respectively. Immuno-PCR detection shows enhanced sensitivity and a wider dynamic range when compared to ELISA with biotinyl-tyramide detection. Our results on taxol-treated and activated bone marrow-derived mast cells demonstrate that the assays allow sensitive quantification of tubulin in complex biological fluids. © 2013.

  3. Quick, sensitive and specific detection and evaluation of quantification of minor variants by high-throughput sequencing.

    PubMed

    Leung, Ross Ka-Kit; Dong, Zhi Qiang; Sa, Fei; Chong, Cheong Meng; Lei, Si Wan; Tsui, Stephen Kwok-Wing; Lee, Simon Ming-Yuen

    2014-02-01

    Minor variants have significant implications in quasispecies evolution, early cancer detection and non-invasive fetal genotyping, but their accurate detection by next-generation sequencing (NGS) is hampered by sequencing errors. We generated sequencing data from mixtures at predetermined ratios in order to provide insight into sequencing errors and variations that can arise, for which simulation cannot be performed. The information also enables better parameterization of depth of coverage, read quality and heterogeneity, library preparation techniques, and technical repeatability for mathematical modeling, theory development and simulation experimental design. We devised minor variant authentication rules that achieved 100% accuracy in both testing and validation experiments. The rules are free from tedious inspection of alignment accuracy, sequencing read quality or errors introduced by homopolymers. The authentication process only requires minor variants to: (1) have a minimum depth of coverage larger than 30; (2) be reported by (a) four or more variant callers, or (b) DiBayes or LoFreq, plus SNVer (or BWA when no results are returned by SNVer), and with the interassay coefficient of variation (CV) no larger than 0.1. Quantification accuracy undermined by sequencing errors could be overcome neither by ultra-deep sequencing nor by recruiting more variant callers to reach a consensus, such that consistent underestimation and overestimation (i.e. low CV) were observed. To accommodate stochastic error and adjust the observed ratio within a specified accuracy, we present a proof of concept for the use of a double calibration curve for quantification, which provides an important reference towards potential industrial-scale fabrication of calibrants for NGS.
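
    The authentication rules quoted above are mechanical enough to express directly in code. The sketch below implements them as stated, with a hypothetical per-variant record layout; we read the CV condition as attaching to rule (2b), which the abstract's phrasing leaves slightly ambiguous.

        # Hedged sketch of the minor-variant authentication rules quoted above.
        # The record layout (dict fields) is hypothetical; the logic follows the text:
        # (1) depth of coverage > 30, and
        # (2a) reported by >= 4 callers, or
        # (2b) reported by DiBayes or LoFreq, plus SNVer (or BWA if SNVer returned
        #      no result), with inter-assay CV <= 0.1.

        def authenticate(variant):
            if variant["depth"] <= 30:            # rule (1): minimum depth of coverage
                return False
            callers = set(variant["callers"])
            rule_a = len(callers) >= 4
            snver_ok = "SNVer" in callers or (variant["snver_no_result"] and "BWA" in callers)
            rule_b = bool(callers & {"DiBayes", "LoFreq"}) and snver_ok
            return rule_a or (rule_b and variant["interassay_cv"] <= 0.1)

        v = {"depth": 120, "callers": ["LoFreq", "SNVer"],
             "snver_no_result": False, "interassay_cv": 0.04}
        print(authenticate(v))  # True, via rule (2b)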

  4. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    PubMed

    Rutledge, Robert G

    2011-03-02

    Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.

  5. Relative quantification in seed GMO analysis: state of art and bottlenecks.

    PubMed

    Chaouachi, Maher; Bérard, Aurélie; Saïd, Khaled

    2013-06-01

    Reliable quantitative methods are needed to comply with current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) and GMO-derived food and feed products with a GMO content of 0.9 % or more. EU Commission Recommendation 2004/787/EC on technical guidance for sampling and detection, meant as a helpful tool for the practical implementation of EC Regulation 1830/2003, states that "the results of quantitative analysis should be expressed as the number of target DNA sequences per target taxon specific sequences calculated in terms of haploid genomes". This has led to an intense debate on the type of calibrator best suited for GMO quantification. The main question addressed in this review is whether reference materials and calibrators should be matrix based or whether pure DNA analytes should be used for relative quantification in GMO analysis. The state of the art, including the advantages and drawbacks of using DNA plasmids (compared to genomic DNA reference materials) as calibrators, is described in detail. In addition, the influence of the genetic structure of seeds on real-time PCR quantification results obtained for seed lots is discussed. The specific composition of a seed kernel, the mode of inheritance, and the ploidy level mean that there is discordance between a GMO % expressed as haploid genome equivalents and a GMO % based on numbers of seeds. This means that a threshold fixed as a percentage of seeds cannot be used as such for real-time PCR. All critical points that affect the expression of the GMO content in seeds are discussed in this paper.
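
    The discordance between a seed-count percentage and a haploid-genome-equivalent percentage can be illustrated with a toy calculation: a hemizygous GM seed carries the transgene on only one of its two haploid genome copies, so DNA-based quantification reports roughly half the seed-based value. The sketch below assumes a simplified diploid seed model; as the review notes, real seed lots are further complicated by triploid endosperm and the mode of inheritance.

        # Toy model: GMO% by seed count vs. by haploid genome equivalents (hge).
        # Simplified diploid seeds; a hemizygous seed carries the transgene on
        # one of its two haploid genome copies.

        def gmo_percent_hge(n_gm_seeds, n_seeds, transgene_per_seed=1, ploidy=2):
            gm_copies = n_gm_seeds * transgene_per_seed
            taxon_copies = n_seeds * ploidy   # 1 taxon-specific target per haploid genome
            return 100.0 * gm_copies / taxon_copies

        n_seeds, n_gm = 10_000, 100                # 1.0 % of seeds are GM
        print(100.0 * n_gm / n_seeds)              # seed-based: 1.0 %
        print(gmo_percent_hge(n_gm, n_seeds))      # DNA-based: 0.5 % for hemizygous seeds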

  6. N = 1 Lagrangians for generalized Argyres-Douglas theories

    NASA Astrophysics Data System (ADS)

    Agarwal, Prarit; Sciarappa, Antonio; Song, Jaewon

    2017-10-01

    We find N = 1 Lagrangian gauge theories that flow to generalized Argyres-Douglas theories with N = 2 supersymmetry. We find that certain SU quiver gauge theories flow to generalized Argyres-Douglas theories of type (A_{k-1}, A_{mk-1}) and (I_{m,km}, S). We also find quiver gauge theories of SO/Sp gauge groups flowing to the (A_{2m-1}, D_{2mk+1}), (A_{2m}, D_{2m(k-1)+k}) and D_{m(2k+2)}[m] theories.

  7. Non-perturbative String Theory from Water Waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iyer, Ramakrishnan; Johnson, Clifford V.; /Southern California U.

    2012-06-14

    We use a combination of a 't Hooft limit and numerical methods to find non-perturbative solutions of exactly solvable string theories, showing that perturbative solutions in different asymptotic regimes are connected by smooth interpolating functions. Our earlier perturbative work showed that a large class of minimal string theories arise as special limits of a Painlevé IV hierarchy of string equations that can be derived by a similarity reduction of the dispersive water wave hierarchy of differential equations. The hierarchy of string equations contains new perturbative solutions, some of which were conjectured to be the type IIA and IIB string theories coupled to (4, 4k − 2) superconformal minimal models of type (A, D). Our present paper shows that these new theories have smooth non-perturbative extensions. We also find evidence for putative new string theories that were not apparent in the perturbative analysis.

  8. Identification and quantification of virulence factors of enterotoxigenic Escherichia coli by high-resolution melting curve quantitative PCR.

    PubMed

    Wang, Weilan; Zijlstra, Ruurd T; Gänzle, Michael G

    2017-05-15

    Diagnosis of enterotoxigenic E. coli (ETEC) associated diarrhea is complicated by the diversity of E. coli virulence factors. This study developed a multiplex quantitative PCR assay based on high-resolution melting curve analysis (HRM-qPCR) to identify and quantify genes encoding five ETEC fimbriae related to diarrhea in swine, i.e., K99, F41, F18, F6 and K88. Five fimbriae expressed by ETEC were amplified in multiple HRM-qPCR reactions to allow simultaneous identification and quantification of the five target genes. The assay was calibrated to allow quantification of the most abundant target gene, and validated by analysis of 30 samples obtained from piglets with diarrhea and healthy controls, and by comparison to standard qPCR detection. The five amplicons, with melting temperatures (Tm) ranging from 74.7 ± 0.06 to 80.5 ± 0.15 °C, were well separated by HRM-qPCR. The area of amplicons under the melting peak correlated linearly with the proportion of the template in the calibration mixture if the proportion exceeded 4.8% for K88 or <1% for all other amplicons. The suitability of the method was evaluated using 30 samples from weaned pigs aged 6-7 weeks; 14 of these animals suffered from diarrhea in consequence of poor sanitary conditions. Genes encoding fimbriae and enterotoxins were quantified by HRM-qPCR and/or qPCR. The multiplex HRM-qPCR allowed accurate analysis when the total gene copy number of targets was more than 1 × 10⁵/g wet feces, and the HRM curves were able to simultaneously distinguish fimbriae genes in the fecal samples. The relative quantification of the most abundant F18 based on melting peak area was highly correlated (P < 0.001; r² = 0.956) with the individual qPCR result, but the correlation for less abundant fimbriae was much lower. The multiplex HRM assay identifies ETEC virulence factors specifically and efficiently. It correctly indicated the predominant fimbriae type and additionally provides information on the presence/absence of

  9. Digital Quantification of Proteins and mRNA in Single Mammalian Cells.

    PubMed

    Albayrak, Cem; Jordi, Christian A; Zechner, Christoph; Lin, Jing; Bichsel, Colette A; Khammash, Mustafa; Tay, Savaş

    2016-03-17

    Absolute quantification of macromolecules in single cells is critical for understanding and modeling biological systems that feature cellular heterogeneity. Here we show extremely sensitive and absolute quantification of both proteins and mRNA in single mammalian cells by a very practical workflow that combines proximity ligation assay (PLA) and digital PCR. This digital PLA method has femtomolar sensitivity, which enables the quantification of very small protein concentration changes over its entire 3-log dynamic range, a quality necessary for accounting for single-cell heterogeneity. We counted both endogenous (CD147) and exogenously expressed (GFP-p65) proteins from hundreds of single cells and determined the correlation between CD147 mRNA and the protein it encodes. Using our data, a stochastic two-state model of the central dogma was constructed and verified using joint mRNA/protein distributions, allowing us to estimate transcription burst sizes and extrinsic noise strength and calculate the transcription and translation rate constants in single mammalian cells. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Recurrence quantification analysis of human postural fluctuations in older fallers and non-fallers.

    PubMed

    Ramdani, Sofiane; Tallon, Guillaume; Bernard, Pierre Louis; Blain, Hubert

    2013-08-01

    We investigate the dynamics of postural sway data in older adult fallers and non-fallers. Center of pressure (COP) signals were recorded during quiet standing in 28 older adults. The subjects were divided into two groups: with and without a history of falls. COP time series were analyzed using recurrence quantification analysis (RQA) in both the anteroposterior and mediolateral (ML) directions. Classical stabilometric variables (path length and range) were also computed. The results showed that the RQA outputs quantifying the predictability of COP fluctuations and the Shannon entropy of the recurrence plot diagonal line length distribution were significantly higher in fallers, but only for the ML direction. In addition, the range of ML COP signals was also significantly higher in fallers. This result is in accordance with some findings in the literature and could be interpreted as an increased hip strategy in fallers. The RQA results seem coherent with the theory of loss of complexity with aging and disease. Our results suggest that RQA is a promising approach for the investigation of COP fluctuations in a frail population.
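
    Recurrence quantification analysis starts from a recurrence plot: time-delay-embedded state vectors are compared pairwise, and a recurrence is recorded whenever two states fall within a radius ε of each other. Determinism, one predictability measure of the kind reported above, is the fraction of recurrent points lying on diagonal lines of at least a minimum length. A compact sketch follows, with embedding parameters and the test signal chosen arbitrarily for illustration.

        import numpy as np

        def recurrence_plot(x, dim=3, tau=2, eps=0.3):
            """Binary recurrence matrix of a 1-D signal after time-delay embedding."""
            n = len(x) - (dim - 1) * tau
            emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
            dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            return dists < eps

        def determinism(rp, lmin=2):
            """Fraction of recurrent points on diagonals of length >= lmin.

            For simplicity the main diagonal is included here; RQA software
            normally excludes this line of identity."""
            n = rp.shape[0]
            on_lines = 0
            for k in range(-(n - 1), n):
                diag = np.r_[0, np.diagonal(rp, k).astype(int), 0]
                starts = np.flatnonzero(np.diff(diag) == 1)
                ends = np.flatnonzero(np.diff(diag) == -1)
                on_lines += sum(l for l in (ends - starts) if l >= lmin)
            return on_lines / rp.sum()

        rng = np.random.default_rng(0)
        signal = np.sin(np.linspace(0, 20 * np.pi, 400)) + 0.1 * rng.standard_normal(400)
        rp = recurrence_plot(signal)
        print(f"recurrence rate {rp.mean():.3f}, determinism {determinism(rp):.3f}")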

  11. Music Participation: Theory, Research, and Policy.

    ERIC Educational Resources Information Center

    Gates, J. Terry

    1991-01-01

    Bases a music participation theory on findings in music education, ethnomusicology, and sociology of leisure. Posits six types of music participants: professionals, apprentices, amateurs, hobbyists, recreationists, and dabblers. Distinguishes each type by theoretical variations in cost-benefit relationships as perceived by participants. Discusses…

  12. Game Engagement Theory and Adult Learning

    ERIC Educational Resources Information Center

    Whitton, Nicola

    2011-01-01

    One of the benefits of computer game-based learning is the ability of certain types of game to engage and motivate learners. However, theories of learning and engagement, particularly in the sphere of higher education, typically fail to consider gaming engagement theory. In this article, the author examines the principles of engagement from games…

  13. Metering error quantification under voltage and current waveform distortion

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

    With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because these errors degrade metering accuracy and fairness, the combined energy-metering error is an important subject of study. In this paper, after comparing theoretical metering values with recorded values under different meter modes for linear and nonlinear loads, a quantification method for the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a quantification method for the metering accuracy error is also proposed. From an analysis of the mode error and accuracy error, a comprehensive error analysis method is presented that is suitable for new energy sources and nonlinear loads. The proposed method has been validated by simulation.

  14. Ultrasensitive multiplex optical quantification of bacteria in large samples of biofluids

    PubMed Central

    Pazos-Perez, Nicolas; Pazos, Elena; Catala, Carme; Mir-Simon, Bernat; Gómez-de Pedro, Sara; Sagales, Juan; Villanueva, Carlos; Vila, Jordi; Soriano, Alex; García de Abajo, F. Javier; Alvarez-Puebla, Ramon A.

    2016-01-01

    Efficient treatments in bacterial infections require the fast and accurate recognition of pathogens, with concentrations as low as one per milliliter in the case of septicemia. Detecting and quantifying bacteria in such low concentrations is challenging and typically demands cultures of large samples of blood (~1 milliliter) extending over 24–72 hours. This delay seriously compromises the health of patients. Here we demonstrate a fast microorganism optical detection system for the exhaustive identification and quantification of pathogens in volumes of biofluids with clinical relevance (~1 milliliter) in minutes. We drive each type of bacteria to accumulate antibody-functionalized SERS-labelled silver nanoparticles. Particle aggregation on the bacteria membranes renders dense arrays of inter-particle gaps in which the Raman signal is amplified by several orders of magnitude relative to the dispersed particles. This enables a multiplex identification of the microorganisms through the molecule-specific spectral fingerprints. PMID:27364357

  15. Protection motivation theory and the prediction of physical activity among adults with type 1 or type 2 diabetes in a large population sample.

    PubMed

    Plotnikoff, Ronald C; Lippke, Sonia; Trinh, Linda; Courneya, Kerry S; Birkett, Nick; Sigal, Ronald J

    2010-09-01

    To investigate the utility of the protection motivation theory (PMT) for explaining physical activity (PA) in an adult population with type 1 diabetes (T1D) and type 2 diabetes (T2D). Cross-sectional and 6-month longitudinal analysis using PMT. Two thousand three hundred and eleven individuals with T1D (N=697) and T2D (N=1,614) completed self-report PMT constructs of vulnerability, severity, response efficacy, self-efficacy, and intention, and PA behaviour at baseline and 6-month follow-up. Multi-group structural equation modelling was conducted to: (1) test the fit of the PMT structure; (2) determine the similarities and differences in the PMT structure between the two types of diabetes; and (3) examine the explained variance and compare the strength of association of the PMT constructs in predicting PA intention and behaviour. The findings provide evidence for the utility of the PMT in both diabetes samples (χ²/df = 1.27-4.08, RMSEA = .02-.05). Self-efficacy was a stronger predictor of intention (β = 0.64-0.68) than response efficacy (β = 0.14-0.16) in individuals with T1D or T2D. Severity was significantly related to intention (β = 0.06) in T2D individuals only, whereas vulnerability was not significantly related to intention or PA behaviour. Self-efficacy (β = 0.20-0.28) and intention (β = 0.12-0.30) were significantly associated with PA behaviour. Promotion of PA behaviour should primarily target self-efficacy to form intentions and to change behaviour. In addition, for individuals with T2D, severity information should be incorporated into PA intervention materials in this population.

  16. Goals, intentions and mental states: challenges for theories of autism.

    PubMed

    Hamilton, Antonia F de C

    2009-08-01

    The ability to understand the goals and intentions behind other people's actions is central to many social interactions. Given the profound social difficulties seen in autism, we might expect goal understanding to be impaired in these individuals. Two influential theories, the 'broken mirror' theory and the mentalising theory, can both predict this result. However, a review of the current data provides little empirical support for goal understanding difficulties; several studies demonstrate normal performance by autistic children on tasks requiring the understanding of goals or intentions. I suggest that this conclusion forces us to reject the basic broken mirror theory and to re-evaluate the breadth of the mentalising theory. More subtle theories which distinguish between different types of mirroring and different types of mentalising may be able to account for the present data, and further research is required to test and refine these theories.

  17. Psychological Type and the Matching of Cognitive Styles.

    ERIC Educational Resources Information Center

    Bargar, Robert R.; Hoover, Randy L.

    1984-01-01

    Carl Jung's theory of psychological type is explored and related to education in this article. A model of the interaction between teacher, student, subject matter, and instructional alternatives is examined, and the educational implications are discussed. This theory is used to illustrate how psychological type influences teaching and learning…

  18. Size determination and quantification of engineered cerium oxide nanoparticles by flow field-flow fractionation coupled to inductively coupled plasma mass spectrometry.

    PubMed

    Sánchez-García, L; Bolea, E; Laborda, F; Cubel, C; Ferrer, P; Gianolio, D; da Silva, I; Castillo, J R

    2016-03-18

    Facing the lack of studies on the characterization and quantification of cerium oxide nanoparticles (CeO2 NPs), whose consumption and release are greatly increasing, this work proposes a method for their sizing and quantification by Flow Field-flow Fractionation (FFFF) coupled to Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). Two modalities of FFFF (Asymmetric Flow- and Hollow Fiber-Flow Field Flow Fractionation, AF4 and HF5, respectively) are compared, and their advantages and limitations discussed. Experimental conditions (carrier composition, pH, ionic strength, crossflow and carrier flow rates) are studied in detail in terms of NP separation, recovery, and repeatability. Size characterization of CeO2 NPs was addressed by different approaches. In the absence of feasible size standards of CeO2 NPs, suspensions of Ag, Au, and SiO2 NPs of known size were investigated. Ag and Au NPs failed to show a behavior comparable to that of the CeO2 NPs, whereas the use of SiO2 NPs provided size estimations in agreement with those predicted by theory. The latter approach was thus used for characterizing the size of CeO2 NPs in a commercial suspension. Results were in adequate concordance with those achieved by transmission electron microscopy, X-ray diffraction and dynamic light scattering. The quantification of CeO2 NPs in the commercial suspension by AF4-ICP-MS required the use of CeO2 NP standards, since the use of ionic cerium resulted in low recoveries (99 ± 9% vs. 73 ± 7%, respectively). A limit of detection of 0.9 μg L⁻¹ CeO2, corresponding to a number concentration of 1.8 × 10¹² L⁻¹ for NPs of 5 nm, was achieved for an injection volume of 100 μL. Copyright © 2016 Elsevier B.V. All rights reserved.
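
    The conversion behind that detection limit, from mass concentration to particle-number concentration, needs only the particle diameter and a density for CeO2; assuming a bulk ceria density of about 7.22 g/cm³ (our assumption, not stated in the record) reproduces the figure quoted above.

        import math

        def number_concentration(mass_conc_ug_per_L, diameter_nm, density_g_cm3):
            """Particles per liter for monodisperse spheres at a given mass concentration."""
            d_cm = diameter_nm * 1e-7
            particle_mass_g = density_g_cm3 * math.pi / 6.0 * d_cm**3
            return (mass_conc_ug_per_L * 1e-6) / particle_mass_g

        # 0.9 ug/L of CeO2 as 5 nm spheres, assuming bulk ceria density ~7.22 g/cm^3
        print(f"{number_concentration(0.9, 5.0, 7.22):.2e} particles per liter")
        # -> ~1.9e12 per liter, matching the ~1.8e12 L^-1 quoted above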

  19. Species identification and quantification in meat and meat products using droplet digital PCR (ddPCR).

    PubMed

    Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J

    2015-04-15

    Species fraud and product mislabelling in processed food, albeit not a direct health issue, often result in consumer distrust. Therefore, methods for quantification of undeclared species are needed. Targeting mitochondrial DNA, e.g. the CYTB gene, for species quantification is unsuitable due to a fivefold inter-tissue variation in mtDNA content per cell, resulting in either an underestimation (-70%) or overestimation (+160%) of species DNA contents. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing a limit of quantification (LOQ) and limit of detection (LOD) in different meat products of 0.01% and 0.001%, respectively. The specificity was verified in 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Interferometric Computation Beyond Quantum Theory

    NASA Astrophysics Data System (ADS)

    Garner, Andrew J. P.

    2018-03-01

    There are quantum solutions for computational problems that make use of interference at some stage in the algorithm. These stages can be mapped into the physical setting of a single particle travelling through a many-armed interferometer. There has been recent foundational interest in theories beyond quantum theory. Here, we present a generalized formulation of computation in the context of a many-armed interferometer, and explore how theories can differ from quantum theory and still perform distributed calculations in this set-up. We shall see that quaternionic quantum theory proves a suitable candidate, whereas box-world does not. We also find that a classical hidden variable model first presented by Spekkens (Phys Rev A 75(3): 032110, 2007) can also be used for this type of computation due to the epistemic restriction placed on the hidden variable.

  1. Good quantification practices of flavours and fragrances by mass spectrometry.

    PubMed

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanding list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analyte methods. In this article, we present the experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  2. A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification

    NASA Astrophysics Data System (ADS)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.

    MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance, but not without drawbacks already discussed by the authors. On the other hand, preliminary applications of Genetic Algorithms (GAs) have already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic and adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals corrupted with noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
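
    As a concrete illustration of the approach described, the sketch below evolves a small population of bounded parameter vectors under tournament selection, blend crossover, and Gaussian mutation to fit a pseudo-Voigt peak to a noisy synthetic signal. The line-shape model, bounds, and GA settings are illustrative assumptions, not the authors' actual algorithm or adaptive fitness function.

        import numpy as np

        rng = np.random.default_rng(1)

        def pseudo_voigt(x, amp, center, width, eta):
            """Pseudo-Voigt: eta * Lorentzian + (1 - eta) * Gaussian."""
            lor = amp / (1.0 + ((x - center) / width) ** 2)
            gau = amp * np.exp(-0.5 * ((x - center) / width) ** 2)
            return eta * lor + (1.0 - eta) * gau

        x = np.linspace(-5, 5, 200)
        data = pseudo_voigt(x, 1.0, 0.3, 0.8, 0.4) + 0.05 * rng.standard_normal(x.size)

        bounds = np.array([[0.1, 2.0], [-2.0, 2.0], [0.1, 2.0], [0.0, 1.0]])

        def fitness(p):   # negative mean squared residual (higher is better)
            return -np.mean((pseudo_voigt(x, *p) - data) ** 2)

        pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 4))
        for _ in range(100):
            scores = np.array([fitness(p) for p in pop])
            new = [pop[scores.argmax()].copy()]            # elitism
            while len(new) < len(pop):
                parents = []
                for _ in range(2):                         # tournament selection
                    i, j = rng.integers(0, len(pop), 2)
                    parents.append(pop[i] if scores[i] > scores[j] else pop[j])
                w = rng.uniform(size=4)                    # blend crossover
                child = w * parents[0] + (1.0 - w) * parents[1]
                child += 0.05 * rng.standard_normal(4)     # Gaussian mutation
                new.append(np.clip(child, bounds[:, 0], bounds[:, 1]))  # constraints
            pop = np.array(new)

        best = max(pop, key=fitness)
        print("estimated (amp, center, width, eta):", np.round(best, 3))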

  3. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided an unprecedented sensitivity toward GBNs and allowed us to enhance conventional toxicology research by providing a direct correlation between uptake of GBNs at the single-cell level and cell viability status.

  4. Lowering the quantification limit of the Qubit™ RNA HS assay using RNA spike-in.

    PubMed

    Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev

    2015-05-06

    RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even of the much more sensitive fluorescence-based RNA quantification assays such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL, as well as the RNA specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with original concentrations as low as 55.6 pg/μL, compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
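
    The reported gain follows from dilution arithmetic: what the fluorometer sees is the in-assay concentration, i.e. the original concentration scaled by sample volume over total assay volume. Assuming the standard ~200 μL Qubit working volume with 20 μL and 18 μL sample inputs (volumes inferred for illustration; the abstract does not state them), the quoted limits and sample consumptions are mutually consistent:

        # Dilution arithmetic linking in-assay limits to original concentrations.
        # Volumes are assumptions: 200 uL assay volume, 20 uL sample input for the
        # standard assay, 18 uL sample (plus spike-in) for the modified assay.

        def min_original_conc(assay_limit_pg_uL, sample_uL, assay_uL=200.0):
            """Lowest measurable original concentration for a given in-assay limit."""
            return assay_limit_pg_uL * assay_uL / sample_uL

        print(min_original_conc(25, 20))           # standard: 250.0 pg/uL (5 ng consumed)
        print(round(min_original_conc(5, 18), 1))  # spike-in: 55.6 pg/uL (~1 ng consumed)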

  5. Sensitive Targeted Quantification of ERK Phosphorylation Dynamics and Stoichiometry in Human Cells without Affinity Enrichment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Gao, Yuqian; Gaffrey, Matthew J.

    2014-12-17

    Mass spectrometry-based targeted quantification is a promising technology for site-specific quantification of posttranslational modifications (PTMs). However, a major constraint of most targeted MS approaches is the limited sensitivity for quantifying low-abundance PTMs, requiring the use of affinity reagents to enrich specific PTMs. Herein, we demonstrate the direct site-specific quantification of ERK phosphorylation isoforms (pT, pY, pTpY) and their relative stoichiometries using a highly sensitive targeted MS approach termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM). PRISM provides effective enrichment of target peptides within a given fraction from a complex biological matrix with minimal sample losses, followed by selected reaction monitoring (SRM) quantification. The PRISM-SRM approach enabled direct quantification of ERK phosphorylation in human mammary epithelial cells (HMEC) from as little as 25 µg of tryptic peptides from whole cell lysates. Compared to immobilized metal-ion affinity chromatography, PRISM provided a >10-fold improvement in signal intensities, presumably due to the better peptide recovery of PRISM when handling small samples. This approach was applied to quantify ERK phosphorylation dynamics in HMEC treated with different doses of EGF at both peak activation (10 min) and steady state (2 h). At 10 min, the maximal ERK activation was observed at the 0.3 ng/mL dose, whereas the maximal steady-state level of ERK activation at 2 h was at the 3 ng/mL dose, corresponding to 1200 and 9000 occupied receptors, respectively. At 10 min, the maximally activated pTpY isoform represented ~40% of total ERK, falling to less than 10% at 2 h. The time course and dose-response profiles of individual phosphorylated ERK isoforms indicated that the singly phosphorylated pT-ERK never increases significantly, while the increase of pY-ERK parallels that of pTpY-ERK. These data support a processive, rather than

  6. Using virtual reality to assess theory of mind subprocesses and error types in early and chronic schizophrenia.

    PubMed

    Canty, Allana L; Neumann, David L; Shum, David H K

    2017-12-01

    Individuals with schizophrenia often demonstrate theory of mind (ToM) impairment relative to healthy adults. However, the exact nature of this impairment (first- vs. second-order ToM and cognitive vs. affective ToM) and the extent to which ToM abilities deteriorate with illness chronicity are unclear. Furthermore, little is known about the relationships between clinical symptoms and ToM error types (overmentalising, reduced mentalising and no ToM) in early and chronic schizophrenia. This study examined the nature and types of ToM impairment in individuals with early (n = 26) and chronic schizophrenia (n = 32) using a novel virtual reality task. Clinical participants and demographically matched controls were administered the Virtual Assessment of Mentalising Ability, which provides indices of first- and second-order cognitive and affective ToM, and quantifies three different types of mentalising errors (viz., overmentalising, reduced mentalising, and no ToM). Individuals with early schizophrenia performed significantly worse than healthy controls on first-order affective and second-order cognitive and affective ToM, but significantly better than individuals with chronic schizophrenia on all ToM subscales. Whereas a lack of mental state concept was associated with negative symptoms, overmentalising was associated with positive symptoms. These findings suggest that ToM abilities selectively deteriorate with illness chronicity and that error types are related to these individuals' presenting symptomatology. An implication of the findings is that social-cognitive interventions for schizophrenia need to consider the nature, time course and symptomatology of the presenting patient.

  7. A Java Program for LRE-Based Real-Time qPCR that Enables Large-Scale Absolute Quantification

    PubMed Central

    Rutledge, Robert G.

    2011-01-01

    Background Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Findings Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. Conclusions The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples. PMID:21407812

  8. A novel approach for the automated segmentation and volume quantification of cardiac fats on computed tomography.

    PubMed

    Rodrigues, É O; Morais, F F C; Morais, N A O S; Conci, L S; Neto, L V; Conci, A

    2016-01-01

    The deposits of fat surrounding the heart are correlated with several health risk factors such as atherosclerosis, carotid stiffness, coronary artery calcification, atrial fibrillation and many others. These deposits vary independently of obesity, which reinforces the case for their direct segmentation and subsequent quantification. However, manual segmentation of these fats has not been widely deployed in clinical practice due to the required human workload and the consequent high cost of physicians and technicians. In this work, we propose a unified method for the autonomous segmentation and quantification of two types of cardiac fat. The segmented fats are termed epicardial and mediastinal, and are separated from each other by the pericardium. Much effort was devoted to achieving minimal user intervention. The proposed methodology mainly comprises registration and classification algorithms to perform the desired segmentation. We compare the performance of several classification algorithms on this task, including neural networks, probabilistic models and decision tree algorithms. Experimental results of the proposed methodology show that the mean accuracy for both epicardial and mediastinal fats is 98.5% (99.5% if the features are normalized), with a mean true positive rate of 98.0%. On average, the Dice similarity index was 97.6%. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
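
    The Dice similarity index reported above is a standard overlap measure between a predicted segmentation mask and the ground truth: twice the intersection size divided by the sum of the two mask sizes. A minimal sketch for binary masks, with toy arrays standing in for fat segmentations:

        import numpy as np

        def dice(pred, truth):
            """Dice similarity index for two binary masks: 2|A & B| / (|A| + |B|)."""
            pred, truth = pred.astype(bool), truth.astype(bool)
            denom = pred.sum() + truth.sum()
            return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

        # Toy 2-D masks standing in for epicardial-fat segmentations
        truth = np.zeros((8, 8), dtype=bool); truth[2:6, 2:6] = True
        pred = np.zeros((8, 8), dtype=bool); pred[3:7, 2:6] = True
        print(f"Dice = {dice(pred, truth):.3f}")  # 0.750 for this toy overlap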

  9. Theory of flux cutting and flux transport at the critical current of a type-II superconducting cylindrical wire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clem, John R

    2011-02-17

    I introduce a critical-state theory incorporating both flux cutting and flux transport to calculate the magnetic-field and current-density distributions inside a type-II superconducting cylinder at its critical current in a longitudinal applied magnetic field. The theory is an extension of the elliptic critical-state model introduced by Romero-Salazar and Pérez-Rodríguez. The vortex dynamics depend in detail on two nonlinear effective resistivities for flux cutting (ρ∥) and flux flow (ρ⊥), and their ratio r = ρ∥/ρ⊥. When r < 1, the low relative efficiency of flux cutting in reducing the magnitude of the internal magnetic-flux density leads to a paramagnetic longitudinal magnetic moment. As a model for understanding the experimentally observed interrelationship between the critical currents for flux cutting and depinning, I calculate the forces on a helical vortex arc stretched between two pinning centers when the vortex is subjected to a current density of arbitrary angle Φ. Simultaneous initiation of flux cutting and flux transport occurs at the critical current density J_c(Φ) that makes the vortex arc unstable.

  10. Theory of flux cutting and flux transport at the critical current of a type-II superconducting cylindrical wire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clem, John R.

    2011-02-17

    I introduce a critical-state theory incorporating both flux cutting and flux transport to calculate the magnetic-field and current-density distributions inside a type-II superconducting cylinder at its critical current in a longitudinal applied magnetic field. The theory is an extension of the elliptic critical-state model introduced by Romero-Salazar and Pérez-Rodríguez. The vortex dynamics depend in detail on two nonlinear effective resistivities for flux cutting (ρ∥) and flux flow (ρ⊥), and their ratio r = ρ∥/ρ⊥. When r < 1, the low relative efficiency of flux cutting in reducing the magnitude of the internal magnetic-flux density leads to a paramagnetic longitudinal magnetic moment. As a model for understanding the experimentally observed interrelationship between the critical currents for flux cutting and depinning, I calculate the forces on a helical vortex arc stretched between two pinning centers when the vortex is subjected to a current density of arbitrary angle Φ. Simultaneous initiation of flux cutting and flux transport occurs at the critical current density J_c(Φ) that makes the vortex arc unstable.

  11. Perspectives on Rhetorical History: Aristotle's Rhetorical Theory.

    ERIC Educational Resources Information Center

    Markham, Reed

    The most important historical theory of persuasion is Aristotelian Rhetorical Theory. Aristotle's work, "The Rhetoric," is divided into three books, each of which discuss principles relevant to persuasion. Book One establishes the philosophical position of rhetoric to logic; establishes the purposes of rhetoric; discusses three types of…

  12. A quantitative witness for Greenberger-Horne-Zeilinger entanglement.

    PubMed

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.

  13. A quantitative witness for Greenberger-Horne-Zeilinger entanglement

    PubMed Central

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger–type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties. PMID:23267431

  14. Quantification of sterol lipids in plants by quadrupole time-of-flight mass spectrometry

    PubMed Central

    Wewer, Vera; Dombrink, Isabel; vom Dorp, Katharina; Dörmann, Peter

    2011-01-01

    Glycerolipids, sphingolipids, and sterol lipids constitute the major lipid classes in plants. Sterol lipids are composed of free and conjugated sterols, i.e., sterol esters, sterol glycosides, and acylated sterol glycosides. Sterol lipids play crucial roles during adaptation to abiotic stresses and plant-pathogen interactions. Presently, no comprehensive method for sterol lipid quantification in plants is available. We used nanospray ionization quadrupole-time-of-flight mass spectrometry (Q-TOF MS) to resolve and identify the molecular species of all four sterol lipid classes from Arabidopsis thaliana. Free sterols were derivatized with chlorobetainyl chloride. Sterol esters, sterol glycosides, and acylated sterol glycosides were ionized as ammonium adducts. Quantification of molecular species was achieved in the positive mode after fragmentation in the presence of internal standards. The amounts of sterol lipids quantified by Q-TOF MS/MS were validated by comparison with results obtained with TLC/GC. Quantification of sterol lipids from leaves and roots of phosphate-deprived A. thaliana plants revealed changes in the amounts and molecular species composition. The Q-TOF method is far more sensitive than GC or HPLC. Therefore, Q-TOF MS/MS provides a comprehensive strategy for sterol lipid quantification that can be adapted to other tandem mass spectrometers. PMID:21382968

  15. The effectiveness of theory- and model-based lifestyle interventions on HbA1c among patients with type 2 diabetes: a systematic review and meta-analysis.

    PubMed

    Doshmangir, P; Jahangiry, L; Farhangi, M A; Doshmangir, L; Faraji, L

    2018-02-01

    The prevalence of type 2 diabetes is rising rapidly around the world. A number of systematic reviews have provided evidence for the effectiveness of lifestyle interventions in diabetic patients. The effectiveness of theory- and model-based education and lifestyle interventions for diabetic patients is unclear. This systematic review and meta-analysis aimed to evaluate and quantify the impact of theory-based lifestyle interventions on type 2 diabetes. A literature search of electronic resources including PubMed, Scopus, and the Cochrane collaboration was performed to identify papers published between January 2002 and July 2016. The PICO (participants, intervention, comparison, and outcomes) elements were used for the selection of studies to meet the inclusion and exclusion criteria. Mean differences and standard deviations of hemoglobin A1c (HbA1c [mmol/mol]) levels in baseline and follow-up measures of studies in intervention and control groups were considered for data synthesis. A random-effects model was used for estimating pooled effect sizes. To investigate the source of heterogeneity, predefined subgroup analyses were performed using trial duration, baseline HbA1c (mmol/mol) level, and the age of participants. Meta-regression was performed to examine the contribution of trial duration, baseline HbA1c (mmol/mol) level, and the age of participants to the mean differences in HbA1c (mmol/mol) level. The significance level was set at P < 0.05. Eighteen studies with 2384 participants met the inclusion criteria. The pooled main outcome from the random-effects model showed a significant improvement in HbA1c of -5.35 mmol/mol (95% confidence interval = -6.3, -4.40; P < 0.001), with evidence of heterogeneity across studies. The findings of this meta-analysis suggest that theory- and model-based lifestyle interventions have positive effects on HbA1c (mmol/mol) indices in patients with type 2 diabetes. Health education theories have been applied as a useful tool for
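
    The pooling step described here, a random-effects model over per-study mean differences, is commonly implemented with the DerSimonian-Laird estimator: an inverse-variance fixed-effect pool is computed first, the between-study variance τ² is estimated from Cochran's Q, and the weights are recomputed as 1/(vᵢ + τ²). A sketch with made-up study data (not the 18 trials analyzed above):

        import numpy as np

        def dersimonian_laird(effects, variances):
            """Random-effects pooled estimate (DerSimonian-Laird) with a 95% CI."""
            y, v = np.asarray(effects, float), np.asarray(variances, float)
            w = 1.0 / v                              # fixed-effect (inverse-variance) weights
            mu_fe = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - mu_fe) ** 2)         # Cochran's Q heterogeneity statistic
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
            w_re = 1.0 / (v + tau2)                  # random-effects weights
            mu = np.sum(w_re * y) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            return mu, mu - 1.96 * se, mu + 1.96 * se, tau2

        # Hypothetical HbA1c mean differences (mmol/mol) and variances from 5 trials
        effects = [-6.1, -4.8, -5.9, -3.7, -6.5]
        variances = [0.8, 1.2, 0.6, 1.5, 0.9]
        print(dersimonian_laird(effects, variances))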

  16. Factors Influencing Physical Activity Behavior among Iranian Women with Type 2 Diabetes Using the Extended Theory of Reasoned Action.

    PubMed

    Didarloo, Alireza; Shojaeizadeh, Davoud; Ardebili, Hassan Eftekhar; Niknami, Shamsaddin; Hajizadeh, Ebrahim; Alizadeh, Mohammad

    2011-10-01

    Findings of most studies indicate that the only way to control diabetes and prevent its debilitating effects is through the continuous performance of self-care behaviors. Physical activity is a non-pharmacological method of diabetes treatment and, because of its positive effects on diabetic patients, is being increasingly considered by researchers and practitioners. This study aimed at determining factors influencing physical activity among diabetic women in Iran, using the extended theory of reasoned action. A sample of 352 women with type 2 diabetes, referred to a Diabetes Clinic in Khoy, Iran, participated in the study. Appropriate instruments were designed to measure the desired variables (knowledge of diabetes, personal beliefs, subjective norms, perceived self-efficacy, behavioral intention and physical activity behavior). The reliability and validity of the instruments were examined and approved. Statistical analyses of the study were conducted by inferential statistical techniques (independent t-test, correlations and regressions) using the SPSS package. The findings of this investigation indicated that among the constructs of the model, self-efficacy was the strongest predictor of intentions among women with type 2 diabetes and both directly and indirectly affected physical activity. In addition to self-efficacy, diabetic patients' physical activity also was influenced by other variables of the model and sociodemographic factors. Our findings suggest that the high ability of the theory of reasoned action extended by self-efficacy in forecasting and explaining physical activity can be a base for educational intervention. Educational interventions based on the proposed model are necessary for improving diabetics' physical activity behavior and controlling disease.

  17. Cognitive adaptation theory as a predictor of adjustment to emerging adulthood for youth with and without type 1 diabetes.

    PubMed

    Helgeson, Vicki S; Reynolds, Kerry A; Siminerio, Linda M; Becker, Dorothy J; Escobar, Oscar

    2014-12-01

    The purpose of the study was to determine whether resilience, defined by cognitive adaptation theory, predicted emerging adulthood outcomes among youth with and without type 1 diabetes. Youth with (n=118) and without type 1 diabetes (n=122), who were part of a previous longitudinal study during adolescence, completed on-line questionnaires during their senior year of high school and one and two years later. They were average age 18, 53% female, and 93% white. Questionnaires assessed cognitive adaptation theory (CAT) indicators (self-esteem, mastery, optimism) and psychological, relationship, behavioral, vocational, and, for those with diabetes, diabetes outcomes. The CAT index at baseline predicted reduced psychological distress, enhanced psychological well-being, increased friend support, reduced friend conflict, the presence of romantic relationships, reduced likelihood of romantic breakups, higher GPA, higher work satisfaction, and lower work stress during the transition to emerging adulthood. Among those with diabetes, the CAT index predicted better self-care behavior and revealed a marginal relation to better glycemic control. Analyses controlled for baseline levels when appropriate. Findings were stronger one year than two years post high school graduation, and findings were stronger for those with than without diabetes. Youth with diabetes also scored lower on the CAT index than youth without diabetes. These findings suggest that the implications of CAT include not only psychological health but also relationship, vocational, and diabetes outcomes. Those who score lower on CAT indicators should be identified as children so that interventions designed to enhance resilience can be implemented. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Cognitive Adaptation Theory as a Predictor of Adjustment to Emerging Adulthood for Youth with and without Type 1 Diabetes

    PubMed Central

    Helgeson, Vicki S.; Reynolds, Kerry A.; Siminerio, Linda M.; Becker, Dorothy J.; Escobar, Oscar

    2014-01-01

    Objective: The purpose of the study was to determine whether resilience, defined by cognitive adaptation theory, predicted emerging adulthood outcomes among youth with and without type 1 diabetes. Methods: Youth with (n = 118) and without type 1 diabetes (n = 122), who were part of a previous longitudinal study during adolescence, completed on-line questionnaires during their senior year of high school and one and two years later. They were average age 18, 53% female, and 93% white. Questionnaires assessed cognitive adaptation theory (CAT) indicators (self-esteem, mastery, optimism) and psychological, relationship, behavioral, vocational, and, for those with diabetes, diabetes outcomes. Results: The CAT index at baseline predicted reduced psychological distress, enhanced psychological well-being, increased friend support, reduced friend conflict, the presence of romantic relationships, reduced likelihood of romantic breakups, higher GPA, higher work satisfaction, and lower work stress during the transition to emerging adulthood. Among those with diabetes, the CAT index predicted better self-care behavior and revealed a marginal relation to better glycemic control. Analyses controlled for baseline levels when appropriate. Findings were stronger one year than two years post high school graduation, and findings were stronger for those with than without diabetes. Youth with diabetes also scored lower on the CAT index than youth without diabetes. Conclusions: These findings suggest that the implications of CAT include not only psychological health but also relationship, vocational, and diabetes outcomes. Those who score lower on CAT indicators should be identified as children so that interventions designed to enhance resilience can be implemented. PMID:25294781

  19. Simultaneous digital quantification and fluorescence-based size characterization of massively parallel sequencing libraries.

    PubMed

    Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H

    2013-08-01

    Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
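
    As an illustration of the quantification principle involved (not the authors' code), the sketch below shows the standard Poisson correction used in droplet digital PCR to convert the fraction of positive droplets into an absolute target concentration, together with a simple least-squares fit of the fluorescence-size relation the authors exploit. The function names and the default droplet volume are assumptions for the example.

        import numpy as np

        def ddpcr_concentration(n_positive, n_total, droplet_volume_ul=0.00085):
            """Estimate target concentration (copies/uL) from droplet counts.

            Uses the standard Poisson correction: the mean number of targets
            per droplet is lambda = -ln(1 - p), where p is the fraction of
            positive droplets. The default droplet volume (~0.85 nL) is an
            assumption; use the instrument's calibrated value in practice.
            """
            p = n_positive / n_total
            lam = -np.log(1.0 - p)           # mean copies per droplet
            return lam / droplet_volume_ul   # copies per microliter

        def fit_fluorescence_size(fluorescence, amplicon_size):
            """Illustrative least-squares fit of the linear droplet-fluorescence
            vs. amplicon-size relation reported by the authors."""
            slope, intercept = np.polyfit(amplicon_size, fluorescence, 1)
            return slope, intercept

        # Example: 6,000 positive droplets out of 18,000 total.
        print(f"{ddpcr_concentration(6000, 18000):.0f} copies/uL")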

  20. Analysis of some types of intermediate orbits used in the theory of artificial Earth satellite motion for the purposes of geodesy.

    NASA Astrophysics Data System (ADS)

    Kotseva, V. I.

    A survey, analysis, and comparison of 15 types of intermediate orbits used in satellite motion theories for the purposes of both geodesy and geodynamics have been made. The paper continues investigations directed toward the practical realization of both analytical and semi-analytical methods for satellite orbit determination. It is shown that the intermediate orbit proposed and elaborated by Aksenov, Grebenikov, and Demin has several good qualities and advantages over all the other intermediate orbits.

  1. Multiconfiguration Pair-Density Functional Theory Outperforms Kohn-Sham Density Functional Theory and Multireference Perturbation Theory for Ground-State and Excited-State Charge Transfer.

    PubMed

    Ghosh, Soumen; Sonnenberger, Andrew L; Hoyer, Chad E; Truhlar, Donald G; Gagliardi, Laura

    2015-08-11

    The correct description of charge transfer in ground and excited states is very important for molecular interactions, photochemistry, electrochemistry, and charge transport, but it is very challenging for Kohn-Sham (KS) density functional theory (DFT). KS-DFT exchange-correlation functionals without nonlocal exchange fail to describe both ground- and excited-state charge transfer properly. We have recently proposed a theory called multiconfiguration pair-density functional theory (MC-PDFT), which is based on a combination of multiconfiguration wave function theory with a new type of density functional called an on-top density functional. Here we have used MC-PDFT to study challenging ground- and excited-state charge-transfer processes by using on-top density functionals obtained by translating KS exchange-correlation functionals. For ground-state charge transfer, MC-PDFT performs better than either the PBE exchange-correlation functional or CASPT2 wave function theory. For excited-state charge transfer, MC-PDFT (unlike KS-DFT) shows qualitatively correct behavior at long-range with great improvement in predicted excitation energies.

  2. Dualities and Topological Field Theories from Twisted Geometries

    NASA Astrophysics Data System (ADS)

    Markov, Ruza

    I will present three studies of string theory on twisted geometries. In the first calculation included in this dissertation we use gauge/gravity duality to study the Coulomb branch of an unusual type of nonlocal field theory, called Puff Field Theory. On the gravity side, this theory is given in terms of D3-branes in type IIB string theory with a geometric twist, while the field theory description, available in the IR limit, is a deformation of Yang-Mills gauge theory by an order-seven operator, which we compute here. In the rest of this dissertation we explore N = 4 super Yang-Mills (SYM) theory compactified on a circle with S-duality and R-symmetry twists that preserve N = 6 supersymmetry in 2 + 1D. It was shown that the abelian theory on a flat manifold gives Chern-Simons theory in the low-energy limit, and here we are interested in the non-abelian counterpart. To that end, we introduce external static supersymmetric quark and anti-quark sources into the theory and calculate the Witten Index of the resulting Hilbert space of ground states on a two-torus. Using these results we compute the action of simple Wilson loops on the Hilbert space of ground states without sources. In some cases we find disagreement between our results for the Wilson loop eigenvalues and previous conjectures about a connection with Chern-Simons theory. The last result discussed in this dissertation demonstrates a connection between gravitational Chern-Simons theory and N = 4 four-dimensional SYM theory compactified on a circle twisted by S-duality, where the remaining three-manifold is not flat, starting from the explicit geometric realization of S-duality in terms of the (2, 0) theory.

  3. ODE/IM correspondence and the Argyres-Douglas theory

    NASA Astrophysics Data System (ADS)

    Ito, Katsushi; Shu, Hongfei

    2017-08-01

    We study the quantum spectral curve of the Argyres-Douglas theories in the Nekrasov-Shatashvili limit of the Omega-background. Using the ODE/IM correspondence we investigate the quantum integrable model corresponding to the quantum spectral curve. We show that the models for the A_{2N}-type theories are the non-unitary coset models (A_1)_1 × (A_1)_L / (A_1)_{L+1} at the fractional level L = 2/(2N+1) − 2, which appear in the study of the 4d/2d correspondence of N = 2 superconformal field theories. Based on the WKB analysis, we clarify the relation between the Y-functions and the quantum periods and study the exact Bohr-Sommerfeld quantization condition for the quantum periods. We also discuss the quantum spectral curves for the D- and E-type theories.
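
    For orientation (this is a schematic reminder, not the paper's exact equation), the leading-order WKB version of the Bohr-Sommerfeld condition on a period of the spectral curve reads:

        \oint_\gamma p(x)\, dx \;=\; 2\pi\hbar \left( n + \tfrac{1}{2} \right),
        \qquad n = 0, 1, 2, \ldots

    In the exact condition studied in such works, the classical period on the left is replaced by the all-orders quantum period, whose relation to the Y-functions is what the paper clarifies.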

  4. Quantization of higher abelian gauge theory in generalized differential cohomology

    NASA Astrophysics Data System (ADS)

    Szabo, R.

    We review and elaborate on some aspects of the quantization of certain classes of higher abelian gauge theories using techniques of generalized differential cohomology. Particular emphasis is placed on the examples of generalized Maxwell theory and Cheeger-Simons cohomology, and of Ramond-Ramond fields in Type II superstring theory and differential K-theory.

  5. Introducing Nonlinear Pricing into Consumer Choice Theory.

    ERIC Educational Resources Information Center

    DeSalvo, Joseph S.; Huq, Mobinul

    2002-01-01

    Describes and contrasts nonlinear and linear pricing in consumer choice theory. Discusses the types of nonlinear pricing: block-declining tariff, two-part tariff, three-part tariff, and quality discounts or premia. States that understanding nonlinear pricing enhances student comprehension of consumer choice theory. Suggests teaching the concept in…

  6. Analyzing the management and disturbance in European forest based on self-thinning theory

    NASA Astrophysics Data System (ADS)

    Yan, Y.; Gielen, B.; Schelhaas, M.; Mohren, F.; Luyssaert, S.; Janssens, I. A.

    2012-04-01

    There is increasing awareness that natural and anthropogenic disturbance in forests affects the exchange of CO2, H2O and energy between the ecosystem and the atmosphere. Consequently, quantification of land use and disturbance intensity is one of the next steps needed to improve our understanding of the carbon cycle, its interactions with the atmosphere and its main drivers at local as well as global level. The conventional NPP-based approaches to quantify the intensity of land management are limited because they lack a sound ecological basis. Here we apply a new way of characterising the degree of management and disturbance in forests using self-thinning theory and observations of diameter at breast height and stand density. We used plot-level information on dominant tree species, diameter at breast height, stand density and soil type from the French national forest inventory from 2005 to 2010. Stand density and diameter at breast height were used to parameterize the intercept of the self-thinning relationship and combined with the theoretical slope to obtain an upper boundary for stand productivity given its density. Subsequently, we tested the sensitivity of the self-thinning relationship to tree species, soil type, climate and other environmental characteristics. We could find statistical differences in the self-thinning relationship between species and soil types, mainly due to the large uncertainty of the parameter estimates. Deviation from the theoretical self-thinning line, defined as DBH = αN^(-3/4), was used as a proxy for disturbance, allowing us to make spatially explicit maps of forest disturbance over France. The same framework was used to quantify the density-DBH trajectory of even-aged stand management of beech and oak over France. These trajectories will be used as a driver of forest management in the land surface model ORCHIDEE.
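
    A minimal sketch (assuming a numpy-style workflow, not the actual inventory-processing code) of the two steps described: fitting the intercept of the self-thinning line with the theoretical slope fixed at -3/4, and using the deviation from that line as a disturbance proxy:

        import numpy as np

        def fit_selfthinning_intercept(dbh_cm, stems_per_ha):
            """Fit log10(alpha) in DBH = alpha * N^(-3/4), with the slope
            fixed at the theoretical -3/4 (a one-parameter intercept fit)."""
            return np.mean(np.log10(dbh_cm) + 0.75 * np.log10(stems_per_ha))

        def disturbance_proxy(dbh_cm, stems_per_ha, log_alpha):
            """Deviation (in log10 units) below the self-thinning line;
            larger values indicate stronger management/disturbance."""
            boundary = log_alpha - 0.75 * np.log10(stems_per_ha)
            return boundary - np.log10(dbh_cm)

    Note that fitting the mean gives a central line; for the upper boundary the abstract refers to, a high quantile of the intercept distribution (e.g., the 95th percentile) is commonly used instead.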

  7. Localized 2D COSY sequences: Method and experimental evaluation for a whole metabolite quantification approach

    NASA Astrophysics Data System (ADS)

    Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène

    2015-11-01

    Two-dimensional spectroscopy offers the possibility to unambiguously distinguish metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole-metabolite quantification approach, the quantification performance of correlation spectroscopy is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable the computation of the lower bound for error variance - generally known as the Cramér-Rao bounds (CRBs), a standard of precision - on the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of a factor of two per unit of acquisition time compared to JPRESS. A rapid analysis might suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
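
    The Cramér-Rao machinery referred to here is generic; under an i.i.d. Gaussian-noise assumption it reduces to the following sketch, where the Jacobian of the parameterized 2D MRS model (a placeholder in this example) is the only model-specific input:

        import numpy as np

        def cramer_rao_bounds(jacobian, noise_sigma):
            """Lower bounds on the standard deviations of unbiased parameter
            estimates, assuming i.i.d. Gaussian noise of std noise_sigma.

            jacobian: (n_points, n_params) matrix of partial derivatives of
            the model signal with respect to each fitted parameter.
            """
            fisher = jacobian.T @ jacobian / noise_sigma**2   # Fisher information
            crb_variances = np.diag(np.linalg.inv(fisher))
            return np.sqrt(crb_variances)

    Relative CRBs, as discussed in the abstract, follow by dividing each bound by the corresponding estimated concentration.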

  8. Proposed experiment to test fundamentally binary theories

    NASA Astrophysics Data System (ADS)

    Kleinmann, Matthias; Vértesi, Tamás; Cabello, Adán

    2017-09-01

    Fundamentally binary theories are nonsignaling theories in which measurements of many outcomes are constructed by selecting from binary measurements. They constitute a sensible alternative to quantum theory and have never been directly falsified by any experiment. Here we show that fundamentally binary theories are experimentally testable with current technology. For that, we identify a feasible Bell-type experiment on pairs of entangled qutrits. In addition, we prove that, for any n, quantum n-ary correlations are not fundamentally (n-1)-ary. For that, we introduce a family of inequalities that hold for fundamentally (n-1)-ary theories but are violated by quantum n-ary correlations.

  9. Hydrostatic figure of the earth: Theory and results

    NASA Technical Reports Server (NTRS)

    Khan, M. A.

    1973-01-01

    The complete development of the mathematical theory of hydrostatic equilibrium for the earth is recounted. Modifications of the first order theory are given along with the subsequent extension to the second order. In addition, the equations are presented which resulted from a revision of the second order theory to suit the new applications and data types of the post-artificial earth satellite era.

  10. Career Development Theory and Its Application. Career Knowledge Series

    ERIC Educational Resources Information Center

    National Career Development Association, 2015

    2015-01-01

    Covers career development theory, models, and techniques and how to apply them; understand the steps in the career development process and why career choice and development theory is important as well as limitations. Presents the assumptions that underlie four different types of theories; trait and factor, learning, developmental, and transition…

  11. G-theory: The generator of M-theory and supersymmetry

    NASA Astrophysics Data System (ADS)

    Sepehri, Alireza; Pincak, Richard

    2018-04-01

    In string theory, with ten dimensions, all Dp-branes are constructed from D0-branes whose action has two-dimensional brackets of Lie 2-algebra. Likewise, in M-theory, with 11 dimensions, all Mp-branes are built from M0-branes whose action contains three-dimensional brackets of Lie 3-algebra. In these theories the reason for the difference between bosons and fermions is unclear, and in M-theory especially there is no stable object, such as a stable M3-brane, on which our universe could be formed, so M-theory cannot help us explain cosmological events. For this reason, we construct G-theory, with M dimensions, whose branes are formed from G0-branes with N-dimensional brackets. In this theory, we assume that at the beginning there is nothing. Then two energies, which differ only in their signs, emerge and produce 2M degrees of freedom. Each two degrees of freedom create a new dimension, and thus M dimensions emerge. M-N of these degrees of freedom are removed by symmetrically compactifying half of the M-N dimensions to produce a Lie N-algebra. In fact, each dimension produces a degree of freedom; consequently, by compactifying M-N of the M dimensions, N dimensions and N degrees of freedom emerge, and these N degrees of freedom produce the Lie N-algebra. During this compactification, some dimensions acquire an extra factor of i and differ from the other dimensions; these are known as time coordinates. Through this compactification, two types of branes, Gp- and anti-Gp-branes, are produced, and the rank of the tensor fields that live on them runs from zero to the dimension of the brane. The number of time coordinates produced by negative energy in anti-Gp-branes is more sensible than the number of time coordinates in Gp-branes. These branes are compactified anti-symmetrically; fermionic superpartners of the bosonic fields then emerge and supersymmetry is born. Some of the gauge fields play the role of the graviton and gravitino and produce supergravity. The question may arise as to what the physical reason

  12. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  14. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    NASA Astrophysics Data System (ADS)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  15. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE PAGES

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
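
    As a self-contained illustration of the first step named in these records (global sensitivity analysis), the sketch below estimates first-order Sobol indices with a pick-and-freeze Monte Carlo scheme on a toy model; it is not the authors' implementation, and the model and sample sizes are arbitrary:

        import numpy as np

        def first_order_sobol(model, n_params, n_samples=4096, rng=None):
            """Pick-and-freeze Monte Carlo estimate of first-order Sobol
            indices S_i = Var(E[Y|X_i]) / Var(Y), inputs uniform on [0, 1]."""
            rng = np.random.default_rng(rng)
            A = rng.random((n_samples, n_params))
            B = rng.random((n_samples, n_params))
            yA, yB = model(A), model(B)
            var_y = np.var(np.concatenate([yA, yB]), ddof=1)
            indices = np.empty(n_params)
            for i in range(n_params):
                ABi = A.copy()
                ABi[:, i] = B[:, i]           # freeze all inputs except X_i
                yABi = model(ABi)
                indices[i] = np.mean(yB * (yABi - yA)) / var_y
            return indices

        # Toy model: Y depends strongly on x0, weakly on x1, not at all on x2.
        model = lambda x: 5.0 * x[:, 0] + 0.5 * x[:, 1]
        print(first_order_sobol(model, n_params=3))

    Inputs with small indices can then be frozen at nominal values, reducing the stochastic dimension as described.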

  16. Understanding adolescent type 1 diabetes self-management as an adaptive process: A grounded theory approach.

    PubMed

    Chilton, Roy; Pires-Yfantouda, Renata

    2015-01-01

    To develop a conceptual understanding of the process of adapting to the self-management of type 1 diabetes during adolescence. Participants were recruited from a National Health Service paediatric diabetes service within the south-west of England which runs six countywide diabetes clinics. Thirteen interviews were conducted using a social constructivist grounded theory approach. The findings illustrate how self-management can be understood in terms of a continuum-based framework, ranging from difficulties with, to successful self-management. Adaptation within the continuum can further be understood by specific transitional phases and process mechanisms, providing further depth to individuals' experiences of adaptation. This investigation provides a conceptual understanding of the complex issues adolescents encounter while adapting to and integrating a diabetes self-management regime into their lives. It provides an invaluable framework for exploring psychological mechanisms and contextualising them within a self-management continuum. Implications for healthcare professionals are discussed and further research proposes whether the model could be applicable to other chronic illnesses.

  17. Colorimetric Quantification and in Situ Detection of Collagen

    ERIC Educational Resources Information Center

    Esteban, Francisco J.; del Moral, Maria L.; Sanchez-Lopez, Ana M.; Blanco, Santos; Jimenez, Ana; Hernandez, Raquel; Pedrosa, Juan A.; Peinado, Maria A.

    2005-01-01

    A simple multidisciplinary and inexpensive laboratory exercise is proposed, in which the undergraduate student may correlate biochemical and anatomical findings. The entire practical session can be completed in one 2.5-3 hour laboratory period, and consists of the quantification of collagen and total protein content from tissue sections--without…

  18. Synthesis of robust nonlinear autopilots using differential game theory

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.

    1991-01-01

    A synthesis technique for handling unmodeled disturbances in nonlinear control law synthesis was advanced using differential game theory. Two types of modeling inaccuracies can be included in the formulation. The first is a bias-type error, while the second is the scale-factor-type error in the control variables. The disturbances were assumed to satisfy an integral inequality constraint. Additionally, it was assumed that they act in such a way as to maximize a quadratic performance index. Expressions for optimal control and worst-case disturbance were then obtained using optimal control theory.
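
    Schematically (hedging, since the paper's exact performance index is not reproduced in this record), such a worst-case design can be posed as a zero-sum differential game over a quadratic index:

        \min_{u}\,\max_{w}\; J \;=\; \int_{0}^{t_f} \left( x^{\top} Q x
            + u^{\top} R u - \gamma^{2}\, w^{\top} w \right) dt,
        \qquad \dot{x} = f(x) + g(x)\,u + k(x)\,w,

    with the disturbance w subject to an integral inequality constraint of the form \int_{0}^{t_f} w^{\top} w \, dt \le W, which the weight \gamma^{2} encodes; necessary conditions for the optimal control u and the worst-case disturbance w then follow from optimal control theory.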

  19. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra

    PubMed Central

    2011-01-01

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234

  20. Digital games for type 1 and type 2 diabetes: underpinning theory with three illustrative examples.

    PubMed

    Kamel Boulos, Maged N; Gammon, Shauna; Dixon, Mavis C; MacRury, Sandra M; Fergusson, Michael J; Miranda Rodrigues, Francisco; Mourinho Baptista, Telmo; Yang, Stephen P

    2015-03-18

    Digital games are an important class of eHealth interventions in diabetes, made possible by the Internet and a good range of affordable mobile devices (eg, mobile phones and tablets) available to consumers these days. Gamifying disease management can help children, adolescents, and adults with diabetes to better cope with their lifelong condition. Gamification and social in-game components are used to motivate players/patients and positively change their behavior and lifestyle. In this paper, we start by presenting the main challenges facing people with diabetes-children/adolescents and adults-from a clinical perspective, followed by three short illustrative examples of mobile and desktop game apps and platforms designed by Ayogo Health, Inc. (Vancouver, BC, Canada) for type 1 diabetes (one example) and type 2 diabetes (two examples). The games target different age groups with different needs-children with type 1 diabetes versus adults with type 2 diabetes. The paper is not meant to be an exhaustive review of all digital game offerings available for people with type 1 and type 2 diabetes, but rather to serve as a taster of a few of the game genres on offer today for both types of diabetes, with a brief discussion of (1) some of the underpinning psychological mechanisms of gamified digital interventions and platforms as self-management adherence tools, and more, in diabetes, and (2) some of the hypothesized potential benefits that might be gained from their routine use by people with diabetes. More research evidence from full-scale evaluation studies is needed and expected in the near future that will quantify, qualify, and establish the evidence base concerning this gamification potential, such as what works in each age group/patient type, what does not, and under which settings and criteria.

  1. Digital Games for Type 1 and Type 2 Diabetes: Underpinning Theory With Three Illustrative Examples

    PubMed Central

    Gammon, Shauna; Dixon, Mavis C; MacRury, Sandra M; Fergusson, Michael J; Miranda Rodrigues, Francisco; Mourinho Baptista, Telmo; Yang, Stephen P

    2015-01-01

    Digital games are an important class of eHealth interventions in diabetes, made possible by the Internet and a good range of affordable mobile devices (eg, mobile phones and tablets) available to consumers these days. Gamifying disease management can help children, adolescents, and adults with diabetes to better cope with their lifelong condition. Gamification and social in-game components are used to motivate players/patients and positively change their behavior and lifestyle. In this paper, we start by presenting the main challenges facing people with diabetes—children/adolescents and adults—from a clinical perspective, followed by three short illustrative examples of mobile and desktop game apps and platforms designed by Ayogo Health, Inc. (Vancouver, BC, Canada) for type 1 diabetes (one example) and type 2 diabetes (two examples). The games target different age groups with different needs—children with type 1 diabetes versus adults with type 2 diabetes. The paper is not meant to be an exhaustive review of all digital game offerings available for people with type 1 and type 2 diabetes, but rather to serve as a taster of a few of the game genres on offer today for both types of diabetes, with a brief discussion of (1) some of the underpinning psychological mechanisms of gamified digital interventions and platforms as self-management adherence tools, and more, in diabetes, and (2) some of the hypothesized potential benefits that might be gained from their routine use by people with diabetes. More research evidence from full-scale evaluation studies is needed and expected in the near future that will quantify, qualify, and establish the evidence base concerning this gamification potential, such as what works in each age group/patient type, what does not, and under which settings and criteria. PMID:25791276

  2. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE PAGES

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. Furthermore, the minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
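
    A minimal sketch of the three sensitivity measures named (Pearson, Spearman, and partial correlation coefficients) for sampled inputs X against a figure of merit y such as MDNBR; the regression-residual formulation of partial correlation used here is one common choice, and the function is illustrative rather than the toolkit described:

        import numpy as np
        from scipy import stats

        def sensitivity_correlations(X, y):
            """Pearson, Spearman, and partial correlation of each input
            column of X (n_samples, n_inputs) with the output y."""
            results = []
            for i in range(X.shape[1]):
                pear = stats.pearsonr(X[:, i], y)[0]
                spear = stats.spearmanr(X[:, i], y)[0]
                # Partial correlation: correlate the residuals after
                # regressing both X_i and y on the remaining inputs.
                others = np.delete(X, i, axis=1)
                A = np.column_stack([others, np.ones(len(y))])
                rx = X[:, i] - A @ np.linalg.lstsq(A, X[:, i], rcond=None)[0]
                ry = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
                results.append((pear, spear, stats.pearsonr(rx, ry)[0]))
            return results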

  3. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE

    2017-01-01

    Background and aims Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis. The dynamic range covered was 10^5. 86% of these were classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793

  4. Current position of high-resolution MS for drug quantification in clinical & forensic toxicology.

    PubMed

    Meyer, Markus R; Helfer, Andreas G; Maurer, Hans H

    2014-08-01

    This paper reviews high-resolution MS approaches published from January 2011 until March 2014 for the quantification of drugs (of abuse) and/or their metabolites in biosamples using LC-MS with time-of-flight or Orbitrap™ mass analyzers. Corresponding approaches are discussed including sample preparation and mass spectral settings. The advantages and limitations of high-resolution MS for drug quantification, as well as the demand for a certain resolution or a specific mass accuracy are also explored.

  5. Evaluation of quantification methods for real-time PCR minor groove binding hybridization probe assays.

    PubMed

    Durtschi, Jacob D; Stevenson, Jeffery; Hymas, Weston; Voelkerding, Karl V

    2007-02-01

    Real-time PCR data analysis for quantification has been the subject of many studies aimed at the identification of new and improved quantification methods. Several analysis methods have been proposed as superior alternatives to the common variations of the threshold crossing method. Notably, sigmoidal and exponential curve fit methods have been proposed. However, these studies have primarily analyzed real-time PCR with intercalating dyes such as SYBR Green. Clinical real-time PCR assays, in contrast, often employ fluorescent probes whose real-time amplification fluorescence curves differ from those of intercalating dyes. In the current study, we compared four analysis methods related to recent literature: two versions of the threshold crossing method, a second derivative maximum method, and a sigmoidal curve fit method. These methods were applied to a clinically relevant real-time human herpes virus type 6 (HHV6) PCR assay that used a minor groove binding (MGB) Eclipse hybridization probe as well as an Epstein-Barr virus (EBV) PCR assay that used an MGB Pleiades hybridization probe. We found that the crossing threshold method yielded more precise results when analyzing the HHV6 assay, which was characterized by lower signal/noise and less developed amplification curve plateaus. In contrast, the EBV assay, characterized by greater signal/noise and amplification curves with plateau regions similar to those observed with intercalating dyes, gave results with statistically similar precision by all four analysis methods.
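
    To make the compared families of methods concrete, here is an illustrative sketch (not the study's code) of a fixed-threshold crossing-point calculation and a four-parameter sigmoidal curve fit of an amplification curve:

        import numpy as np
        from scipy.optimize import curve_fit

        def ct_threshold(cycles, fluorescence, threshold):
            """Ct by linear interpolation at the first threshold crossing
            (assumes the curve starts below the threshold)."""
            i = np.nonzero(fluorescence >= threshold)[0][0]
            f0, f1 = fluorescence[i - 1], fluorescence[i]
            return cycles[i - 1] + (threshold - f0) / (f1 - f0)

        def sigmoid(x, fmax, fb, x50, k):
            """Four-parameter logistic amplification model."""
            return fb + fmax / (1.0 + np.exp(-(x - x50) / k))

        def fit_sigmoid(cycles, fluorescence):
            p0 = [fluorescence.max(), fluorescence.min(),
                  np.median(cycles), 1.0]
            params, _ = curve_fit(sigmoid, cycles, fluorescence, p0=p0)
            return params   # x50 serves as the quantification point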

  6. Quantification of HCV RNA in Clinical Specimens by Branched DNA (bDNA) Technology.

    PubMed

    Wilber, J C; Urdea, M S

    1999-01-01

    The diagnosis and monitoring of hepatitis C virus (HCV) infection have been aided by the development of HCV RNA quantification assays. A direct measure of viral load, HCV RNA quantification has the advantage of providing information on viral kinetics and provides unique insight into the disease process. Branched DNA (bDNA) signal amplification technology provides a novel approach for the direct quantification of HCV RNA in patient specimens. The bDNA assay measures HCV RNA at physiological levels by boosting the reporter signal, rather than by replicating target sequences as the means of detection, and thus avoids the errors inherent in the extraction and amplification of target sequences. Inherently quantitative and nonradioactive, the bDNA assay is amenable to routine use in a clinical research setting, and has been used by several groups to explore the natural history, pathogenesis, and treatment of HCV infection.

  7. Hemispheric specialization in quantification processes.

    PubMed

    Pasini, M; Tessari, A

    2001-01-01

    Three experiments were carried out to study hemispheric specialization for subitizing (the rapid enumeration of small patterns) and counting (the serial quantification process based on some formal principles). The experiments consist of numerosity identification of dot patterns presented in one visual field, with a tachistoscopic technique, or eye movements monitored through glasses, and comparison between centrally presented dot patterns and lateralized tachistoscopically presented digits. Our experiments show left visual field advantage in the identification and comparison tasks in the subitizing range, whereas right visual field advantage has been found in the comparison task for the counting range.

  8. Quantification of Fluorine Content in AFFF Concentrates

    DTIC Science & Technology

    2017-09-29

    and quantitative integrations, a 100 ppm spectral window (FIDRes 0.215 Hz) was scanned using the following acquisition parameters: acquisition time ...

  9. Neutron-Encoded Protein Quantification by Peptide Carbamylation

    NASA Astrophysics Data System (ADS)

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.

  10. Graduate Courses in Argumentation Theory.

    ERIC Educational Resources Information Center

    Benoit, William L.; Follert, Vincent F.

    1986-01-01

    Reports results of a survey of graduate courses in argumentation theory. Includes data on types of courses, theorists, historical and basic concepts in argument, everyday argument, resources (books and articles), etc. (PD)

  11. Analytical Validation of Quantitative Real-Time PCR Methods for Quantification of Trypanosoma cruzi DNA in Blood Samples from Chagas Disease Patients

    PubMed Central

    Ramírez, Juan Carlos; Cura, Carolina Inés; Moreira, Otacilio da Cruz; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Guedes, Paulo Marcos da Matta; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Galvão, Lúcia Maria da Cunha; da Câmara, Antonia Cláudia Jácome; Espinoza, Bertha; de Noya, Belkisyole Alarcón; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G.

    2015-01-01

    An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. PMID:26320872

  12. Development of a Framework for Model-Based Analysis, Uncertainty Quantification, and Robust Control Design of Nonlinear Smart Composite Systems

    DTIC Science & Technology

    2015-06-04

    control, vibration and noise control, health monitoring, and energy harvesting. However, these advantages come at the cost of rate-dependent hysteresis... configuration used for energy harvesting. Uncertainty quantification is pursued in two steps: (i) determination of densities... Crews and R.C. Smith, "Quantification of parameter and model uncertainty for shape memory alloy bending actuators," Journal of Intelligent Material

  13. Uncertainty quantification for complex systems with very high dimensional response using Grassmann manifold variations

    NASA Astrophysics Data System (ADS)

    Giovanis, D. G.; Shields, M. D.

    2018-07-01

    This paper addresses uncertainty quantification (UQ) for problems where scalar (or low-dimensional vector) response quantities are insufficient and, instead, full-field (very high-dimensional) responses are of interest. To do so, an adaptive stochastic simulation-based methodology is introduced that refines the probability space based on Grassmann manifold variations. The proposed method has a multi-element character discretizing the probability space into simplex elements using a Delaunay triangulation. For every simplex, the high-dimensional solutions corresponding to its vertices (sample points) are projected onto the Grassmann manifold. The pairwise distances between these points are calculated using appropriately defined metrics and the elements with large total distance are sub-sampled and refined. As a result, regions of the probability space that produce significant changes in the full-field solution are accurately resolved. An added benefit is that an approximation of the solution within each element can be obtained by interpolation on the Grassmann manifold. The method is applied to study the probability of shear band formation in a bulk metallic glass using the shear transformation zone theory.
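
    The key geometric ingredients, projection of a full-field solution onto the Grassmann manifold and pairwise distances from principal angles, can be sketched as follows (the geodesic metric chosen here is one of the several "appropriately defined metrics" the abstract alludes to):

        import numpy as np

        def grassmann_point(snapshot_matrix, rank):
            """Project a high-dimensional solution onto the Grassmann
            manifold G(rank, n) via a thin SVD; returns an orthonormal
            basis for the dominant subspace."""
            U, _, _ = np.linalg.svd(snapshot_matrix, full_matrices=False)
            return U[:, :rank]

        def grassmann_distance(Ua, Ub):
            """Geodesic distance from the principal angles between the two
            subspaces: d = ||theta||_2, where cos(theta) are the singular
            values of Ua^T Ub."""
            sigma = np.linalg.svd(Ua.T @ Ub, compute_uv=False)
            theta = np.arccos(np.clip(sigma, -1.0, 1.0))
            return np.linalg.norm(theta)

    Simplex elements whose vertices are far apart under this distance are the ones flagged for sub-sampling and refinement.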

  14. Career preference theory: A grounded theory describing the effects of undergraduate career preferences on student persistence in engineering

    NASA Astrophysics Data System (ADS)

    Dettinger, Karen Marie

    This study used grounded theory in a case study at a large public research university to develop a theory about how the culture in engineering education affects students with varying interests and backgrounds. According to Career Preference Theory, the engineering education system has evolved to meet the needs of one type of student, the Physical Scientist. While this educational process serves to develop the next generation of engineering faculty members, the majority of engineering undergraduates go on to work as practicing engineers, and are far removed from working as physical scientists. According to Career Preference Theory, students with a history of success in mathematics and sciences, and a focus on career, enter engineering. These students, who actually have a wide range of interests and values, each begin seeking an identity as a practicing engineer. Career Preference Theory is developed around a concept, Career Identity Type, that describes five different types of engineering students: Pragmatic, Physical Scientist, "Social" Scientist, Designer, and Educator. According to the theory, each student must develop an identity within the engineering education system if they are to persist in engineering. However, the current undergraduate engineering education system has evolved in such a way that it meets only the needs of the Physical Scientist. Pragmatic students are also likely to succeed because they tend to be extremely goal-focused and maintain a focus on the rewards they will receive once they graduate with an engineering degree. However, "Social" Scientists, who value interpersonal relationships and giving back to society; Designers, who value integrating ideas across disciplines to create aesthetically pleasing and useful products; and Educators, who have a strong desire to give back to society by working with young people, must make some connection between these values and a future engineering career if they are to persist in engineering. According

  15. Metal Stable Isotope Tagging: Renaissance of Radioimmunoassay for Multiplex and Absolute Quantification of Biomolecules.

    PubMed

    Liu, Rui; Zhang, Shixi; Wei, Chao; Xing, Zhi; Zhang, Sichun; Zhang, Xinrong

    2016-05-17

    The unambiguous quantification of biomolecules is of great significance in fundamental biological research as well as practical clinical diagnosis. Due to the lack of a detectable moiety, the direct and highly sensitive quantification of biomolecules is often a "mission impossible". Consequently, tagging strategies to introduce detectable moieties for labeling target biomolecules were invented, which had a long and significant impact on studies of biomolecules in the past decades. For instance, immunoassays have been developed with radioisotope tagging by Yalow and Berson in the late 1950s. The later languishment of this technology can be almost exclusively ascribed to the use of radioactive isotopes, which led to the development of nonradioactive tagging strategy-based assays such as enzyme-linked immunosorbent assay, fluorescent immunoassay, and chemiluminescent and electrochemiluminescent immunoassay. Despite great success, these strategies suffered from drawbacks such as limited spectral window capacity for multiplex detection and inability to provide absolute quantification of biomolecules. After recalling the sequences of tagging strategies, an apparent question is why not use stable isotopes from the start? A reasonable explanation is the lack of reliable means for accurate and precise quantification of stable isotopes at that time. The situation has changed greatly at present, since several atomic mass spectrometric measures for metal stable isotopes have been developed. Among the newly developed techniques, inductively coupled plasma mass spectrometry is an ideal technique to determine metal stable isotope-tagged biomolecules, for its high sensitivity, wide dynamic linear range, and more importantly multiplex and absolute quantification ability. Since the first published report by our group, metal stable isotope tagging has become a revolutionary technique and gained great success in biomolecule quantification. An exciting research highlight in this area

  16. Nondestructive quantification of analyte diffusion in cornea and sclera using optical coherence tomography.

    PubMed

    Ghosn, Mohamad G; Tuchin, Valery V; Larin, Kirill V

    2007-06-01

    Noninvasive functional imaging, monitoring, and quantification of analyte transport in epithelial ocular tissues are extremely important for therapy and diagnostics of many eye diseases. In this study the authors investigated the capability of optical coherence tomography (OCT) for noninvasive monitoring and quantification of diffusion of different analytes in the sclera and cornea of rabbit eyes. A portable time-domain OCT system with a wavelength of 1310 ± 15 nm, output power of 3.5 mW, and resolution of 25 μm was used in this study. Diffusion of different analytes was monitored and quantified in rabbit cornea and sclera of whole eyeballs. Diffusion of water, metronidazole (0.5%), dexamethasone (0.2%), ciprofloxacin (0.3%), mannitol (20%), and glucose solution (20%) was examined, and their permeability coefficients were calculated by using OCT signal slope and depth-resolved amplitude methods. Permeability coefficients were calculated as a function of time and tissue depth. For instance, mannitol was found to have a permeability coefficient of (8.99 ± 1.43) × 10^-6 cm/s in cornea and (6.18 ± 1.08) × 10^-6 cm/s in sclera. The permeability coefficient of drugs with small concentrations (where water was the major solvent) was found to be in the range of that of water in the same tissue type, whereas permeability coefficients of more highly concentrated solutions varied significantly. Results suggest that the OCT technique might be a powerful tool for noninvasive diffusion studies of different analytes in ocular tissues. However, additional methods of OCT signal acquisition and processing are required to study the diffusion of agents at small concentrations.
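
    One common form of the OCT signal slope estimate computes the permeability coefficient as the monitored region thickness divided by the time the agent takes to diffuse through it. The sketch below is a simplification under that convention (the diffusion interval is taken from the extrema of the slope time course rather than the full monotonic-change criterion, and is only an assumption for illustration):

        import numpy as np

        def permeability_octss(slope_timecourse, times, z_region_cm):
            """P = z_region / t_region (cm/s), where t_region approximates
            the interval over which the depth-resolved OCT signal slope
            changes as the agent diffuses through the monitored region."""
            t_start = times[int(np.argmin(slope_timecourse))]
            t_end = times[int(np.argmax(slope_timecourse))]
            t_region = abs(t_end - t_start)
            return z_region_cm / t_region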

  17. Proteomic Identification and Quantification of S-glutathionylation in Mouse Macrophages Using Resin-Assisted Enrichment and Isobaric Labeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Dian; Gaffrey, Matthew J.; Guo, Jia

    2014-02-11

    Protein S-glutathionylation (SSG) is an important regulatory posttranslational modification of protein cysteine (Cys) thiol redox switches, yet the role of specific cysteine residues as targets of modification is poorly understood. We report a novel quantitative mass spectrometry (MS)-based proteomic method for site-specific identification and quantification of S-glutathionylation across different conditions. Briefly, this approach consists of initial blocking of free thiols by alkylation, selective reduction of glutathionylated thiols and enrichment using thiol affinity resins, followed by on-resin tryptic digestion and isobaric labeling with iTRAQ (isobaric tags for relative and absolute quantitation) for MS-based identification and quantification. The overall approach was validatedmore » by application to RAW 264.7 mouse macrophages treated with different doses of diamide to induce glutathionylation. A total of 1071 Cys-sites from 690 proteins were identified in response to diamide treatment, with ~90% of the sites displaying >2-fold increases in SSG-modification compared to controls.. This approach was extended to identify potential SSG modified Cys-sites in response to H2O2, an endogenous oxidant produced by activated macrophages and many pathophysiological stimuli. The results revealed 364 Cys-sites from 265 proteins that were sensitive to S-glutathionylation in response to H2O2 treatment. These proteins covered a range of molecular types and molecular functions with free radical scavenging, and cell death and survival included as the most significantly enriched functional categories. Overall the results demonstrate that our approach is effective for site-specific identification and quantification of S-glutathionylated proteins. The analytical strategy also provides a unique approach to determining the major pathways and cell processes most susceptible to glutathionylation at a proteome-wide scale.« less

  18. Methods for quantification of soil-transmitted helminths in environmental media: current techniques and recent advances

    PubMed Central

    Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.

    2015-01-01

    Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788

  19. Microscopic theory of vortex interaction in two-band superconductors and type-1.5 superconductivity

    NASA Astrophysics Data System (ADS)

    Silaev, Mihail; Babaev, Egor

    2011-03-01

    In the framework of self-consistent microscopic theory we study the structure and interaction of vortices in a two-gap superconductor, taking into account the interband Josephson coupling. The asymptotic behavior of the order parameter densities and magnetic field is studied analytically within the microscopic theory at low temperature. At higher temperatures, results consistent with Ginzburg-Landau theory are obtained. It is shown that under quite general conditions and in a wide temperature range (in particular outside the validity of the Ginzburg-Landau theory) there can exist an additional characteristic length scale of order parameter density variation which exceeds the London penetration length of the magnetic field, due to the multi-component nature of the superconducting state. Such behavior of the order parameter density variation leads to an attractive long-range and repulsive short-range interaction between vortices. Supported by NSF CAREER Award DMR-0955902, Knut and Alice Wallenberg Foundation through the Royal Swedish Academy of Sciences and Swedish Research Council, ''Dynasty'' foundation and Russian Foundation for Basic Research.

  20. Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification

    DOT National Transportation Integrated Search

    2011-04-29

    For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for quantification of AUT systems, advancing and quantifying AUT systems' image-capture capabilities, quantifying the performance of multiple AUT...

  1. The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.

    PubMed

    Frégeau, Chantal J; Laurin, Nancy

    2015-05-01

    The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. This kit was determined to be highly sensitive with a limit of quantification and limit of detection of 0.0049 ng/μL and 0.0003 ng/μL, respectively, for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios expressed in percentages derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions regarding when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors compared to the human and male DNA targets included in the Investigator® Quantiplex HYres kit, serving as a good quality assessor of DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.
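
    For readers wanting to compute such limits on their own assays, one common calibration-based convention (ICH-style; the kit validation may define its limits differently) is sketched below:

        import numpy as np

        def lod_loq_from_calibration(conc, response):
            """One common convention: LOD = 3.3*s/slope, LOQ = 10*s/slope,
            where s is the residual standard deviation of a linear
            calibration fit. Illustrative only; not the kit's protocol."""
            slope, intercept = np.polyfit(conc, response, 1)
            residuals = response - (slope * conc + intercept)
            s = np.std(residuals, ddof=2)
            return 3.3 * s / slope, 10.0 * s / slope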

  2. "Utilizing" signal detection theory.

    PubMed

    Lynn, Spencer K; Barrett, Lisa Feldman

    2014-09-01

    What do inferring what a person is thinking or feeling, judging a defendant's guilt, and navigating a dimly lit room have in common? They involve perceptual uncertainty (e.g., a scowling face might indicate anger or concentration, for which different responses are appropriate) and behavioral risk (e.g., a cost to making the wrong response). Signal detection theory describes these types of decisions. In this tutorial, we show how incorporating the economic concept of utility allows signal detection theory to serve as a model of optimal decision making, going beyond its common use as an analytic method. This utility approach to signal detection theory clarifies otherwise enigmatic influences of perceptual uncertainty on measures of decision-making performance (accuracy and optimality) and on behavior (an inverse relationship between bias magnitude and sensitivity optimizes utility). A "utilized" signal detection theory offers the possibility of expanding the phenomena that can be understood within a decision-making framework. © The Author(s) 2014.
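
    Because the utility framing above is fully quantitative, a minimal sketch may help; it assumes the standard equal-variance Gaussian model, and the payoffs, d', and signal base rate below are hypothetical values chosen only to show how costly false alarms push the optimal criterion in the conservative direction.

        import numpy as np
        from scipy.stats import norm

        def expected_utility(criterion, d_prime, p_signal, u):
            """Expected utility of a yes/no criterion under equal-variance Gaussian SDT."""
            p_hit = 1.0 - norm.cdf(criterion, loc=d_prime)  # signal trials answered "yes"
            p_fa = 1.0 - norm.cdf(criterion, loc=0.0)       # noise trials answered "yes"
            return (p_signal * (p_hit * u["hit"] + (1.0 - p_hit) * u["miss"])
                    + (1.0 - p_signal) * (p_fa * u["fa"] + (1.0 - p_fa) * u["cr"]))

        u = {"hit": 1.0, "miss": -1.0, "fa": -3.0, "cr": 0.5}  # hypothetical payoffs
        criteria = np.linspace(-3.0, 4.0, 701)
        eu = [expected_utility(c, d_prime=1.5, p_signal=0.5, u=u) for c in criteria]
        print(f"optimal criterion ~= {criteria[int(np.argmax(eu))]:.2f}")

    With symmetric payoffs the optimum sits midway between the noise and signal distributions; raising the false-alarm cost moves it rightward, which is the bias-sensitivity trade-off the authors connect to utility.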

  3. Can time-dependent density functional theory predict intersystem crossing in organic chromophores? A case study on benzo(bis)-X-diazole based donor-acceptor-donor type molecules.

    PubMed

    Tam, Teck Lip Dexter; Lin, Ting Ting; Chua, Ming Hui

    2017-06-21

    Here we utilized new diagnostic tools in time-dependent density functional theory to explain the trend of intersystem crossing in benzo(bis)-X-diazole based donor-acceptor-donor type molecules. These molecules display a wide range of fluorescence quantum yields and triplet yields, making them excellent candidates for testing the validity of these diagnostic tools. We believe that these tools are cost-effective and can be applied to structurally similar organic chromophores to predict/explain the trends of intersystem crossing, and thus fluorescence quantum yields and triplet yields, without the use of complex and expensive multireference configuration interaction or multireference perturbation theory methods.

  4. A Leonard-Sanders-Budiansky-Koiter-Type Nonlinear Shell Theory with a Hierarchy of Transverse-Shearing Deformations

    NASA Technical Reports Server (NTRS)

    Nemeth, Michael P.

    2013-01-01

    A detailed exposition on a refined nonlinear shell theory suitable for nonlinear buckling analyses of laminated-composite shell structures is presented. This shell theory includes the classical nonlinear shell theory attributed to Leonard, Sanders, Koiter, and Budiansky as an explicit proper subset. This approach is used in order to leverage the existing experience base and to make the theory attractive to industry. In addition, the formalism of general tensors is avoided in order to expose the details needed to fully understand and use the theory. The shell theory is based on "small" strains and "moderate" rotations, and no shell-thinness approximations are used. As a result, the strain-displacement relations are exact within the presumptions of "small" strains and "moderate" rotations. The effects of transverse-shearing deformations are included in the theory by using analyst-defined functions to describe the through-the-thickness distributions of transverse-shearing strains. Constitutive equations for laminated-composite shells are derived without using any shell-thinness approximations, and simplified forms and special cases are presented.
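
    The role of the analyst-defined through-the-thickness functions can be illustrated with the generic displacement expansion used in this family of shear-deformation theories (a schematic form, not the paper's exact kinematics):

        u_\alpha(x^1, x^2, \xi_3) \;=\; u^0_\alpha(x^1, x^2) \;+\; \xi_3\, \varphi_\alpha(x^1, x^2) \;+\; F(\xi_3)\, \gamma^0_\alpha(x^1, x^2),

    where F(\xi_3) is the analyst-defined distribution of transverse shear through the thickness. Choosing, for example, F(\xi_3) = \xi_3\,(1 - 4\xi_3^2/3h^2) yields the familiar parabolic transverse-shear profile that vanishes on the faces of a shell of thickness h, while suppressing the transverse-shearing terms recovers classical shell kinematics as a subset, consistent with the abstract.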

  5. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation window setups. For consistent evaluation we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
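
    To make the benchmarking idea concrete, here is a toy sketch in the spirit of such metrics (a hypothetical illustration, not LFQbench's actual code): for hybrid samples mixed at a known ratio, accuracy is the systematic deviation of the observed per-protein log-ratios from the expected value, and precision is their spread.

        import numpy as np

        def ratio_metrics(intensity_a, intensity_b, expected_ratio):
            """Bias and spread of observed per-protein log2(A/B) ratios
            relative to the known mixing ratio of a hybrid sample."""
            log_obs = np.log2(intensity_a / intensity_b)
            bias = np.median(log_obs) - np.log2(expected_ratio)  # accuracy
            spread = np.std(log_obs, ddof=1)                     # precision
            return bias, spread

        rng = np.random.default_rng(0)
        a = rng.lognormal(10.0, 1.0, size=500)             # hypothetical protein intensities
        b = (a / 2.0) * rng.lognormal(0.0, 0.2, size=500)  # spiked at 2:1, with noise
        bias, spread = ratio_metrics(a, b, expected_ratio=2.0)
        print(f"bias = {bias:+.3f} log2 units, spread = {spread:.3f}")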

  6. Kinetic Theories for Biofilms (Preprint)

    DTIC Science & Technology

    2011-01-01

    ... binary complex fluids to develop a set of hydrodynamic models for the two-phase mixture of biofilms and solvent (water). It is aimed to model ... kinetics along with the intrinsic molecular elasticity of the EPS network strand, modeled as an elastic dumbbell. This theory is valid in both the biofilm ...

  7. Experimental validation of 2D uncertainty quantification for DIC.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reu, Phillip L.

    Because digital image correlation (DIC) has become such an important and standard tool in the toolbox of experimental mechanicists, a complete uncertainty quantification of the method is needed. It should be remembered that each DIC setup and series of images will have a unique uncertainty based on the calibration quality and the image and speckle quality of the analyzed images. Any pretest work done with a calibrated DIC stereo-rig to quantify the errors using known shapes and translations, while useful, does not necessarily reveal the uncertainty of a later test. This is particularly true with high-speed applications, where actual test images are often less than ideal. Work on the mathematical underpinnings of DIC uncertainty quantification has previously been completed and published; this paper presents the corresponding experimental work used to check the validity of the uncertainty equations.
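
    The Monte Carlo flavor of such a validation can be shown in a toy one-dimensional version (illustrative only, not the paper's stereo-DIC procedure): apply a known subpixel shift to a synthetic speckle signal, add sensor noise, and observe how tightly correlation-based matching recovers the displacement.

        import numpy as np

        def estimate_shift(ref, defm):
            """Subpixel shift between two 1D speckle signals via cross-correlation
            with parabolic interpolation of the correlation peak."""
            c = np.correlate(defm - defm.mean(), ref - ref.mean(), mode="full")
            k = int(np.argmax(c))
            y0, y1, y2 = c[k - 1], c[k], c[k + 1]
            frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)  # parabola vertex offset
            return (k - (len(ref) - 1)) + frac

        rng = np.random.default_rng(1)
        n, true_shift, noise = 256, 3.3, 0.02
        ref = np.convolve(rng.standard_normal(n), np.ones(5) / 5, mode="same")
        freqs = np.fft.fftfreq(n)  # exact subpixel translation via the Fourier shift theorem
        defm = np.fft.ifft(np.fft.fft(ref) * np.exp(-2j * np.pi * freqs * true_shift)).real

        est = [estimate_shift(ref + noise * rng.standard_normal(n),
                              defm + noise * rng.standard_normal(n)) for _ in range(200)]
        print(f"bias = {np.mean(est) - true_shift:+.4f} px, std = {np.std(est):.4f} px")

    Repeating such trials across noise levels and speckle patterns maps out exactly the setup-specific uncertainty the abstract warns is not transferable between tests.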

  8. Type Theory, Computation and Interactive Theorem Proving

    DTIC Science & Technology

    2015-09-01

    ... postdoc Cody Roux, to develop new methods of verifying real-valued inequalities automatically. They developed a prototype implementation in Python [8] ... He has developed new heuristic, geometric methods of verifying real-valued inequalities; a Python-based implementation has performed surprisingly ... express complex mathematical and computational assertions. In this project, Avigad and Harper developed type-theoretic algorithms and formalisms that ...

  9. The role of PET quantification in cardiovascular imaging.

    PubMed

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. The myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18FDG) and rest perfusion imaging. The myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of the coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18F-flurpiridaz, may allow further improvements in the disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18FDG and 18F-sodium fluoride tracers in carotids, aorta and coronary arteries
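
    For readers unfamiliar with the kinetic step, the sketch below fits a one-tissue-compartment model to a simulated myocardial time-activity curve; the input function, units, and parameter values are all hypothetical, and clinical tools additionally apply tracer-specific extraction corrections (e.g., for 82Rb) to convert the fitted uptake rate K1 into flow.

        import numpy as np
        from scipy.optimize import curve_fit

        t = np.linspace(0.0, 120.0, 121)   # frame mid-times in seconds (hypothetical)
        dt = t[1] - t[0]
        c_a = 50.0 * t * np.exp(-t / 8.0)  # hypothetical arterial input function

        def one_tissue(t, K1, k2):
            """One-tissue compartment model: C_t(t) = K1 * [exp(-k2 t) (*) C_a](t)."""
            return K1 * dt * np.convolve(c_a, np.exp(-k2 * t))[: len(t)]

        rng = np.random.default_rng(2)
        true_K1, true_k2 = 0.9, 0.12       # illustrative values only
        tac = one_tissue(t, true_K1, true_k2) + rng.normal(0.0, 2.0, len(t))
        (K1, k2), _ = curve_fit(one_tissue, t, tac, p0=(0.5, 0.1), bounds=(0.0, 5.0))
        print(f"fitted K1 = {K1:.3f}, k2 = {k2:.3f} (true: {true_K1}, {true_k2})")

    Running the same fit on stress and rest acquisitions and taking the ratio of the resulting flow estimates gives the coronary flow reserve mentioned above.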

  10. Factors Influencing Physical Activity Behavior among Iranian Women with Type 2 Diabetes Using the Extended Theory of Reasoned Action

    PubMed Central

    Didarloo, Alireza; Ardebili, Hassan Eftekhar; Niknami, Shamsaddin; Hajizadeh, Ebrahim; Alizadeh, Mohammad

    2011-01-01

    Background: Findings of most studies indicate that the only way to control diabetes and prevent its debilitating effects is through the continuous performance of self-care behaviors. Physical activity is a non-pharmacological method of diabetes treatment and, because of its positive effects on diabetic patients, is being increasingly considered by researchers and practitioners. This study aimed at determining factors influencing physical activity among diabetic women in Iran, using the extended theory of reasoned action. Methods: A sample of 352 women with type 2 diabetes, referring to a Diabetes Clinic in Khoy, Iran, participated in the study. Appropriate instruments were designed to measure the desired variables (knowledge of diabetes, personal beliefs, subjective norms, perceived self-efficacy, behavioral intention and physical activity behavior). The reliability and validity of the instruments were examined and approved. Statistical analyses were conducted with inferential techniques (independent t-tests, correlations and regressions) using the SPSS package. Results: The findings of this investigation indicated that, among the constructs of the model, self-efficacy was the strongest predictor of intention among women with type 2 diabetes, and it affected physical activity both directly and indirectly. In addition to self-efficacy, diabetic patients' physical activity was also influenced by other variables of the model and by sociodemographic factors. Conclusion: Our findings suggest that the strong ability of the theory of reasoned action, extended by self-efficacy, to forecast and explain physical activity can be a basis for educational intervention. Educational interventions based on the proposed model are necessary for improving diabetic patients' physical activity behavior and controlling the disease. PMID:22111043

  11. The Psychology of Career Theory--A New Perspective?

    ERIC Educational Resources Information Center

    Woodd, Maureen

    2000-01-01

    New perspectives on human behavior have invalidated some assumptions of career theories such as personality type, career stages, and life-cycle models. Other theories, such as Driver's Objective Career Patterns, Schein's Temporal Development Model, and Nicholson's Transition Cycle, are compatible with current psychological understanding. (SK)

  12. Four human Plasmodium species quantification using droplet digital PCR.

    PubMed

    Srisutham, Suttipat; Saralamba, Naowarat; Malleret, Benoit; Rénia, Laurent; Dondorp, Arjen M; Imwong, Mallika

    2017-01-01

    Droplet digital polymerase chain reaction (ddPCR) is a partitioning PCR based on water-oil emulsion droplet technology. It is a highly sensitive method for detecting and delineating minor alleles from complex backgrounds and provides absolute quantification of DNA targets. The ddPCR technology has been applied for detection of many pathogens. Here a sensitive assay utilizing ddPCR for detection and quantification of Plasmodium species was investigated. The assay was developed for two levels of detection: genus-specific, for all Plasmodium species, and specific detection of individual Plasmodium species. The ddPCR assay was developed based on primers and probes specific to the Plasmodium genus 18S rRNA gene. Using ddPCR for ultra-sensitive P. falciparum assessment, the lower limit of detection from concentrated DNA obtained from a high-volume (1 mL) blood sample was 11 parasites/mL. For species identification, in particular for samples with mixed infections, a duplex reaction was developed for detection and quantification of P. falciparum/P. vivax and P. malariae/P. ovale. Amplification of each Plasmodium species in the duplex reaction showed sensitivity equal to singleplex single-species detection. The duplex ddPCR assay had higher sensitivity for identifying minor species in 32 subpatent parasitaemia samples from Cambodia, and performed better than real-time PCR. The ddPCR assay shows high sensitivity for assessing very low parasitaemia of all human Plasmodium species, providing a useful research tool for studying the role of the asymptomatic parasite reservoir in transmission in regions aiming for malaria elimination.
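
    The "absolute quantification" in ddPCR comes from Poisson statistics over the droplet partitions, which a few lines can demonstrate; the ~0.85 nL droplet volume below is the nominal value for common commercial systems and should be treated as an assumption.

        import numpy as np

        def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_ul=0.00085):
            """Absolute target concentration from droplet counts.
            Poisson: mean copies per droplet = -ln(fraction of negative droplets)."""
            lam = -np.log(1.0 - n_positive / n_total)
            return lam / droplet_volume_ul

        # Example: 6,000 positive droplets among 18,000 accepted droplets
        print(f"{ddpcr_copies_per_ul(6000, 18000):.0f} copies/uL")

    Because the fraction of negative droplets, not fluorescence intensity, carries the signal, no standard curve is needed, which is what makes the quantification absolute.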

  13. Fractonic line excitations: An inroad from three-dimensional elasticity theory

    NASA Astrophysics Data System (ADS)

    Pai, Shriya; Pretko, Michael

    2018-06-01

    We demonstrate the existence of a fundamentally new type of excitation, fractonic lines, which are linelike excitations with the restricted mobility properties of fractons. These excitations, described using an amalgamation of higher-form gauge theories with symmetric tensor gauge theories, see direct physical realization as the topological lattice defects of ordinary three-dimensional quantum crystals. Starting with the more familiar elasticity theory, we show how the theory maps onto a rank-4 tensor gauge theory, with phonons corresponding to gapless gauge modes and disclination defects corresponding to linelike charges. We derive flux conservation laws which lock these linelike excitations in place, analogous to the higher-moment charge conservation laws of fracton theories. This way of encoding the mobility restrictions of lattice defects could shed light on melting transitions in three dimensions. This new type of extended object may also be a useful tool in the search for improved quantum error-correcting codes in three dimensions.

  14. Concept analysis and the building blocks of theory: misconceptions regarding theory development.

    PubMed

    Bergdahl, Elisabeth; Berterö, Carina M

    2016-10-01

    The purpose of this article is to discuss attempts to justify concept analysis as a way to construct theory - a notion often advocated in nursing. The notion that concepts are the building blocks or threads from which theory is constructed is often repeated; it can be found in many articles and well-known textbooks. However, this notion is seldom explained or defended, and it has been questioned by several authors - although most of these authors seem to agree to some degree that concepts are essential components from which theory is built. Discussion paper. Literature was reviewed to synthesize and debate current knowledge. Our point is that theory is not built by concept analysis or clarification, and we show that this notion rests on some serious misunderstandings. We argue that concept analysis is not part of sound scientific method and should be abandoned. The current methods of concept analysis in nursing have no foundation in the philosophy of science or the philosophy of language. The type of concept analysis performed in nursing is not a way to 'construct' theory; rather, theories are formed by a creative endeavour to propose a solution to a scientific and/or practical problem. The bottom line is that the current style and form of concept analysis in nursing should be abandoned in favour of methods in line with modern theory of science. © 2016 John Wiley & Sons Ltd.

  15. Whole-Body Computed Tomography-Based Body Mass and Body Fat Quantification: A Comparison to Hydrostatic Weighing and Air Displacement Plethysmography.

    PubMed

    Gibby, Jacob T; Njeru, Dennis K; Cvetko, Steve T; Heiny, Eric L; Creer, Andrew R; Gibby, Wendell A

    We correlate and evaluate the accuracy of accepted anthropometric methods of percent body fat (%BF) quantification, namely, hydrostatic weighing (HW) and air displacement plethysmography (ADP), against 2 automatic adipose tissue quantification methods using computed tomography (CT). Twenty volunteer subjects (14 men, 6 women) received head-to-toe CT scans. Hydrostatic weighing and ADP were obtained from 17 and 12 subjects, respectively. The CT data underwent conversion using 2 separate algorithms, namely, the Schneider method and the Beam method, to convert Hounsfield units to their respective tissue densities. The overall mass and %BF of both methods were compared with HW and ADP. When comparing ADP to CT data using the Schneider method and the Beam method, correlations were r = 0.9806 and 0.9804, respectively. Paired t tests indicated there were no statistically significant biases. Additionally, observed average differences in %BF between ADP and the Schneider method and the Beam method were 0.38% and 0.77%, respectively. The %BF measured from ADP, the Schneider method, and the Beam method all had significantly higher mean differences when compared with HW (3.05%, 2.32%, and 1.94%, respectively). We have shown that total body mass correlates remarkably well with both the Schneider method and the Beam method of mass quantification. Furthermore, %BF calculated with the Schneider method and Beam method CT algorithms correlates remarkably well with ADP. The application of these CT algorithms has utility in further research to accurately stratify risk factors with periorgan, visceral, and subcutaneous types of adipose tissue, and has the potential for significant clinical application.
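
    The general shape of such CT-based quantification can be sketched in a few lines; this is a deliberately crude simplification (a fixed HU window for adipose tissue and a linear HU-to-density proxy), not the Schneider or Beam conversion used in the study.

        import numpy as np

        def percent_body_fat(hu_volume, voxel_volume_ml, fat_range=(-190, -30)):
            """Rough %BF from a whole-body CT volume: segment adipose voxels by a
            standard HU window, convert HU to density, take fat/total mass."""
            hu = hu_volume[hu_volume > -500]        # exclude surrounding air
            density = 1.0 + hu / 1000.0             # g/mL, crude linear model
            mass = density * voxel_volume_ml        # per-voxel mass in grams
            fat = (hu >= fat_range[0]) & (hu <= fat_range[1])
            return 100.0 * mass[fat].sum() / mass.sum()

        # Synthetic demo: lean tissue (~40 HU) with an adipose layer (~-100 HU)
        vol = np.full((50, 50, 50), 40.0)
        vol[:10] = -100.0
        print(f"{percent_body_fat(vol, voxel_volume_ml=0.001):.1f} %BF")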

  16. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
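
    Step 4, the ensemble quantification, reduces to weighted statistics over the alternative models' hazard curves; the sketch below uses hypothetical numbers purely to show the mechanics, with the weighted mean as the central estimate and cross-member percentiles expressing the epistemic spread.

        import numpy as np

        # Rows: alternative model formulations (epistemic); columns: tsunami
        # intensity thresholds. Entries: annual probability of exceedance.
        rng = np.random.default_rng(3)
        thresholds = np.array([0.5, 1.0, 2.0, 5.0])  # wave height in m (hypothetical)
        centre = np.log([1e-2, 3e-3, 5e-4, 2e-5])
        curves = np.exp(rng.normal(centre, 0.5, size=(100, 4)))
        weights = np.full(100, 1.0 / 100)            # model credibility weights

        mean_curve = weights @ curves                # central hazard estimate
        p16, p84 = np.percentile(curves, [16, 84], axis=0)
        for h, m, lo, hi in zip(thresholds, mean_curve, p16, p84):
            print(f"> {h:.1f} m: mean {m:.2e}, 16-84% band [{lo:.2e}, {hi:.2e}]")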

  17. Higher derivative couplings in theories with sixteen supersymmetries

    DOE PAGES

    Lin, Ying-Hsuan; Shao, Shu-Heng; Yin, Xi; ...

    2015-12-15

    We give simple arguments for new non-renormalization theorems on higher derivative couplings of gauge theories to supergravity, with sixteen supersymmetries, by considerations of brane-bulk superamplitudes. This leads to some exact results on the effective coupling of D3-branes in type IIB string theory. As a result, we also derive exact results on higher dimensional operators in the torus compactification of the six dimensional (0, 2) superconformal theory.

  18. Theory of a general class of dissipative processes.

    NASA Technical Reports Server (NTRS)

    Hale, J. K.; Lasalle, J. P.; Slemrod, M.

    1972-01-01

    Development of a theory of periodic processes of sufficient generality to be applied to systems defined by partial differential equations (distributed parameter systems) and by functional differential equations of retarded and neutral type (hereditary systems), as well as to systems arising in the theory of elasticity. In particular, an attempt is made to develop a meaningful general theory of dissipative periodic systems with a wide range of applications.

  19. Quantification of color vision using a tablet display.

    PubMed

    Chacon, Alicia; Rabin, Jeff; Yu, Dennis; Johnston, Shawn; Bradshaw, Timothy

    2015-01-01

    Accurate color vision is essential for optimal performance in aviation and space environments using nonredundant color coding to convey critical information. Most color tests detect color vision deficiency (CVD) but fail to diagnose type or severity of CVD, which are important to link performance to occupational demands. The computer-based Cone Contrast Test (CCT) diagnoses type and severity of CVD. It is displayed on a netbook computer for clinical application, but a more portable version may prove useful for deployments, space and aviation cockpits, as well as accident and sports medicine settings. Our purpose was to determine if the CCT can be conducted on a tablet display (Windows 8, Microsoft, Seattle, WA) using touch-screen response input. The CCT presents colored letters visible only to red (R), green (G), and blue (B) sensitive retinal cones to determine the lowest R, G, and B cone contrast visible to the observer. The CCT was measured in 16 color vision normals (CVN) and 16 CVDs using the standard netbook computer and a Windows 8 tablet display calibrated to produce equal color contrasts. Both displays showed 100% specificity for confirming CVN and 100% sensitivity for detecting CVD. In CVNs there was no difference between scores on netbook vs. tablet displays. G cone CVDs showed slightly lower G cone CCT scores on the tablet. CVD can be diagnosed with a tablet display. Ease-of-use, portability, and complete computer capabilities make tablets ideal for multiple settings, including aviation, space, military deployments, accidents and rescue missions, and sports vision.
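
    Threshold estimation of this kind is typically adaptive; the toy staircase below illustrates the general mechanics with a simulated observer (a generic psychophysics sketch, not the CCT's actual procedure; the threshold and slope values are hypothetical).

        import numpy as np

        rng = np.random.default_rng(4)
        true_threshold = 0.05  # hypothetical cone-contrast threshold

        def observer_correct(contrast):
            """Simulated observer: letter-naming accuracy rises with log contrast."""
            p = 1.0 / (1.0 + np.exp(-np.log(contrast / true_threshold) / 0.15))
            return rng.random() < p

        contrast, step, last_dir, reversal_points = 0.5, 2.0, None, []
        while len(reversal_points) < 8:
            direction = -1 if observer_correct(contrast) else +1  # down if correct
            if last_dir is not None and direction != last_dir:
                reversal_points.append(contrast)
                step = max(np.sqrt(step), 1.05)  # shrink the step after each reversal
            contrast = contrast * step if direction > 0 else contrast / step
            last_dir = direction

        print(f"estimated threshold ~= {np.exp(np.mean(np.log(reversal_points))):.3f}")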

  20. Novel Techniques for Quantification of Correlation Between Primary Liquid Jet Breakup and Downstream Spray Characteristics

    DTIC Science & Technology

    2016-05-08

    (Distribution unlimited; report AFRL-AFOSR-JP-TR-2016-0084, performance period to 17 Apr 2016.) From the introduction: Several liquid-fuelled combustion systems, such as liquid propellant rocket engines and gas turbines...