Sample records for accurate quantitative description

  1. QUANTITATION OF MENSTRUAL BLOOD LOSS: A RADIOACTIVE METHOD UTILIZING A COUNTING DOME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tauxe, W.N.

    A description has been given of a simple, accurate technique for the quantitation of menstrual blood loss, involving the determination of a three-dimensional isosensitivity curve and the fashioning of a lucite dome with cover to fit these specifications. Ten normal subjects lost no more than 50 ml each per menstrual period. (auth)

  2. Optimal dimensionality reduction of complex dynamics: the chess game as diffusion on a free-energy landscape.

    PubMed

    Krivov, Sergei V

    2011-07-01

    Dimensionality reduction is ubiquitous in the analysis of complex dynamics. The conventional dimensionality reduction techniques, however, focus on reproducing the underlying configuration space, rather than the dynamics itself. The constructed low-dimensional space does not provide a complete and accurate description of the dynamics. Here I describe how to perform dimensionality reduction while preserving the essential properties of the dynamics. The approach is illustrated by analyzing the chess game--the archetype of complex dynamics. A variable that provides complete and accurate description of chess dynamics is constructed. The winning probability is predicted by describing the game as a random walk on the free-energy landscape associated with the variable. The approach suggests a possible way of obtaining a simple yet accurate description of many important complex phenomena. The analysis of the chess game shows that the approach can quantitatively describe the dynamics of processes where human decision-making plays a central role, e.g., financial and social dynamics.

  3. Optimal dimensionality reduction of complex dynamics: The chess game as diffusion on a free-energy landscape

    NASA Astrophysics Data System (ADS)

    Krivov, Sergei V.

    2011-07-01

    Dimensionality reduction is ubiquitous in the analysis of complex dynamics. The conventional dimensionality reduction techniques, however, focus on reproducing the underlying configuration space, rather than the dynamics itself. The constructed low-dimensional space does not provide a complete and accurate description of the dynamics. Here I describe how to perform dimensionality reduction while preserving the essential properties of the dynamics. The approach is illustrated by analyzing the chess game—the archetype of complex dynamics. A variable that provides complete and accurate description of chess dynamics is constructed. The winning probability is predicted by describing the game as a random walk on the free-energy landscape associated with the variable. The approach suggests a possible way of obtaining a simple yet accurate description of many important complex phenomena. The analysis of the chess game shows that the approach can quantitatively describe the dynamics of processes where human decision-making plays a central role, e.g., financial and social dynamics.
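
    The construction sketched above reduces to two generic steps that can be illustrated with toy data (Python; a synthetic random-walk coordinate stands in for the optimized chess variable, and all numbers are placeholders): estimate the free-energy profile of a one-dimensional coordinate from its empirical distribution, F(q) = -ln P(q), and obtain the probability of reaching one boundary before the other from the standard splitting-probability integral for diffusion on that profile.

      import numpy as np

      # Toy 1D trajectory of a reaction coordinate q(t); in the paper this role is
      # played by an optimally constructed game-state variable, not a random walk.
      rng = np.random.default_rng(0)
      q = np.tanh(np.cumsum(rng.normal(0.0, 0.05, 200_000)))  # keep q in (-1, 1)

      # Free-energy profile (in units of kT) from the empirical distribution.
      hist, edges = np.histogram(q, bins=60, range=(-1, 1), density=True)
      centers = 0.5 * (edges[:-1] + edges[1:])
      keep = hist > 0
      centers, F = centers[keep], -np.log(hist[keep])   # F(q) = -ln P(q) + const

      def splitting_probability(q0):
          """Probability of reaching the right boundary before the left one,
          for overdamped diffusion on F(q) with a uniform diffusion coefficient."""
          w = np.cumsum(np.exp(F))                      # discretized integral of exp(F)
          return w[np.searchsorted(centers, q0)] / w[-1]

      print(splitting_probability(0.2))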

  4. The attentional drift-diffusion model extends to simple purchasing decisions.

    PubMed

    Krajbich, Ian; Lu, Dingchao; Camerer, Colin; Rangel, Antonio

    2012-01-01

    How do we make simple purchasing decisions (e.g., whether or not to buy a product at a given price)? Previous work has shown that the attentional drift-diffusion model (aDDM) can provide accurate quantitative descriptions of the psychometric data for binary and trinary value-based choices, and of how the choice process is guided by visual attention. Here we extend the aDDM to the case of purchasing decisions, and test it using an eye-tracking experiment. We find that the model also provides a reasonably accurate quantitative description of the relationship between choice, reaction time, and visual fixations using parameters that are very similar to those that best fit the previous data. The only critical difference is that the choice biases induced by the fixations are about half as big in purchasing decisions as in binary choices. This suggests that a similar computational process is used to make binary choices, trinary choices, and simple purchasing decisions.

  5. The Attentional Drift-Diffusion Model Extends to Simple Purchasing Decisions

    PubMed Central

    Krajbich, Ian; Lu, Dingchao; Camerer, Colin; Rangel, Antonio

    2012-01-01

    How do we make simple purchasing decisions (e.g., whether or not to buy a product at a given price)? Previous work has shown that the attentional drift-diffusion model (aDDM) can provide accurate quantitative descriptions of the psychometric data for binary and trinary value-based choices, and of how the choice process is guided by visual attention. Here we extend the aDDM to the case of purchasing decisions, and test it using an eye-tracking experiment. We find that the model also provides a reasonably accurate quantitative description of the relationship between choice, reaction time, and visual fixations using parameters that are very similar to those that best fit the previous data. The only critical difference is that the choice biases induced by the fixations are about half as big in purchasing decisions as in binary choices. This suggests that a similar computational process is used to make binary choices, trinary choices, and simple purchasing decisions. PMID:22707945
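
    A minimal simulation of a single attention-weighted drift-diffusion trial, in the spirit of the aDDM summarized above (Python; the parameter values, the random-fixation model, and the two-item setup are illustrative assumptions, not the authors' fitted model):

      import numpy as np

      def simulate_addm_trial(v_left, v_right, d=0.0002, theta=0.5, sigma=0.02,
                              barrier=1.0, dt=1.0, rng=None):
          """One binary-choice trial: the fixated item enters the drift with full
          weight, the unfixated item is discounted by theta, and evidence
          accumulates until it hits +/- barrier.  Illustrative parameters only."""
          rng = rng or np.random.default_rng()
          rdv, t = 0.0, 0.0                              # relative decision value, time (ms)
          looking_left = rng.random() < 0.5              # random first fixation
          next_switch = rng.exponential(300.0)           # toy fixation durations
          while abs(rdv) < barrier:
              if t >= next_switch:                       # alternate fixation target
                  looking_left = not looking_left
                  next_switch = t + rng.exponential(300.0)
              if looking_left:
                  drift = d * (v_left - theta * v_right)
              else:
                  drift = -d * (v_right - theta * v_left)
              rdv += drift * dt + sigma * np.sqrt(dt) * rng.normal()
              t += dt
          return ("left" if rdv > 0 else "right"), t     # choice and reaction time

      choice, rt = simulate_addm_trial(v_left=3, v_right=2)
      print(choice, rt)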

  6. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutherland, B.M.; Bennett, P.V.; Sutherland, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to approximately a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.

  7. PBPK Models, BBDR Models, and Virtual Tissues: How Will They Contribute to the Use of Toxicity Pathways in Risk Assessment?

    EPA Science Inventory

    Accuracy in risk assessment, which is desirable in order to ensure protection of the public health while avoiding over-regulation of economically-important substances, requires quantitatively accurate, in vivo descriptions of dose-response and time-course behaviors. This level of...

  8. A novel approach to teach the generation of bioelectrical potentials from a descriptive and quantitative perspective.

    PubMed

    Rodriguez-Falces, Javier

    2013-12-01

    In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials that results from the propagation of the excitation source along the muscle fiber, are difficult to describe and conceptualize. In addition, most traditional approaches aimed at describing extracellular potentials consist of complex mathematical machinery that gives no chance for physical interpretation. The aim of the present study is to present a new method to teach the formation of extracellular potentials around a muscle fiber from both a descriptive and quantitative perspective. The implementation of this method was tested through a written exam and a satisfaction survey. The new method enhanced the ability of students to visualize the generation of bioelectrical potentials. In addition, the new approach improved students' understanding of how changes in the fiber-to-electrode distance and in the shape of the excitation source are translated into changes in the extracellular potential. The survey results show that combining general principles of electrical fields with accurate graphic imagery gives students an intuitive, yet quantitative, feel for electrophysiological signals and enhances their motivation to continue their studies in the biomedical engineering field.

  9. A Novel Approach to Anharmonicity for a Wealth of Applications in Nonlinear Science Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanusse, Patrick

    2011-04-19

    We present a new theory of the anharmonicity of the nonlinear oscillations exhibited by many physical systems. New physical quantities are introduced that describe the departure from linear or harmonic behavior, up to extremely anharmonic situations. To solve the nonlinear phase equation, the key notion of our theory that controls the anharmonic behavior, a new and fascinating nonlinear trigonometry is designed. These results provide a general and accurate yet compact description of such signals, far better than the Fourier description both quantitatively and qualitatively, and will benefit many application fields.

  10. Total Protein Content Determination of Microalgal Biomass by Elemental Nitrogen Analysis and a Dedicated Nitrogen-to-Protein Conversion Factor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laurens, Lieve M; Olstad-Thompson, Jessica L; Templeton, David W

    Accurately determining protein content is important in the valorization of algal biomass in food, feed, and fuel markets, where these values are used for component balance calculations. Conversion of elemental nitrogen to protein is a well-accepted and widely practiced method, but depends on developing an applicable nitrogen-to-protein conversion factor. The methodology reported here covers the quantitative assessment of the total nitrogen content of algal biomass and a description of the methodology that underpins the accurate de novo calculation of a dedicated nitrogen-to-protein conversion factor.
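
    The conversion itself is a one-line calculation; a toy example (Python; both numbers below are placeholders for illustration, not values from the report; 6.25 is the classic Kjeldahl factor, while dedicated algal factors are typically lower):

      # Total protein estimated from elemental nitrogen via a conversion factor.
      nitrogen_wt_pct = 7.2        # hypothetical elemental N content of the biomass (wt%)
      conversion_factor = 4.78     # hypothetical dedicated algal N-to-protein factor
      protein_wt_pct = nitrogen_wt_pct * conversion_factor
      print(f"Estimated total protein: {protein_wt_pct:.1f} wt%")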

  11. Method for a quantitative investigation of the frozen flow hypothesis

    PubMed

    Schock; Spillar

    2000-09-01

    We present a technique to test the frozen flow hypothesis quantitatively, using data from wave-front sensors such as those found in adaptive optics systems. Detailed treatments of the theoretical background of the method and of the error analysis are presented. Analyzing data from the 1.5-m and 3.5-m telescopes at the Starfire Optical Range, we find that the frozen flow hypothesis is an accurate description of the temporal development of atmospheric turbulence on time scales of the order of 1-10 ms but that significant deviations from the frozen flow behavior are present for longer time scales.

  12. Light-propagation management in coupled waveguide arrays: Quantitative experimental and theoretical assessment from band structures to functional patterns

    NASA Astrophysics Data System (ADS)

    Moison, Jean-Marie; Belabas, Nadia; Levenson, Juan Ariel; Minot, Christophe

    2012-09-01

    We assess the band structure of arrays of coupled optical waveguides both by ab initio calculations and by experiments, with an excellent quantitative agreement without any adjustable physical parameter. The band structures we obtain can deviate strongly from the expectations of the standard coupled mode theory approximation, but we describe them efficiently by a few parameters within an extended coupled mode theory. We also demonstrate that this description is in turn a firm and simple basis for accurate beam management in functional patterns of coupled waveguides, in full accordance with their design.
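
    The standard coupled-mode description that the extended theory above builds on can be sketched in a few lines (Python; nearest-neighbour coupling only, with illustrative parameters): in this approximation the modal amplitudes obey i dA_n/dz + C (A_{n-1} + A_{n+1}) = 0, and single-waveguide excitation produces the familiar discrete-diffraction pattern.

      import numpy as np
      from scipy.linalg import expm

      N, C, L = 41, 1.0, 5.0                       # number of guides, coupling (1/mm), length (mm)
      H = C * (np.eye(N, k=1) + np.eye(N, k=-1))   # nearest-neighbour coupling matrix
      A0 = np.zeros(N, dtype=complex)
      A0[N // 2] = 1.0                             # excite the central waveguide only
      A = expm(1j * H * L) @ A0                    # i dA/dz = -H A  =>  A(L) = exp(iHL) A(0)
      power = np.abs(A) ** 2                       # discrete-diffraction output pattern
      print(power.round(3))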

  13. Water Lone Pair Delocalization in Classical and Quantum Descriptions of the Hydration of Model Ions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Remsing, Richard C.; Duignan, Timothy T.; Baer, Marcel D.

    Understanding the nature of ionic hydration at a fundamental level has eluded scientists despite intense interest for nearly a century. In particular, the microscopic origins of the asymmetry of ion solvation thermodynamics with respect to the sign of the ionic charge remains a mystery. Here, we determine the response of accurate quantum mechanical water models to strong nanoscale solvation forces arising from excluded volumes and ionic electrostatic fields. This is compared to the predictions of two important limiting classes of classical models of water with fixed point charges, differing in their treatment of "lone-pair" electrons. Using the quantum water model as our standard of accuracy, we find that a single fixed classical treatment of lone pair electrons cannot accurately describe solvation of both apolar and cationic solutes, underlining the need for a more flexible description of local electronic effects in solvation processes. However, we explicitly show that all water models studied respond to weak long-ranged electrostatic perturbations in a manner that follows macroscopic dielectric continuum models, as would be expected. We emphasize the importance of these findings in the context of realistic ion models, using density functional theory and empirical models, and discuss the implications of our results for quantitatively accurate reduced descriptions of solvation in dielectric media.

  14. Multialternative drift-diffusion model predicts the relationship between visual fixations and choice in value-based decisions.

    PubMed

    Krajbich, Ian; Rangel, Antonio

    2011-08-16

    How do we make decisions when confronted with several alternatives (e.g., on a supermarket shelf)? Previous work has shown that accumulator models, such as the drift-diffusion model, can provide accurate descriptions of the psychometric data for binary value-based choices, and that the choice process is guided by visual attention. However, the computational processes used to make choices in more complicated situations involving three or more options are unknown. We propose a model of trinary value-based choice that generalizes what is known about binary choice, and test it using an eye-tracking experiment. We find that the model provides a quantitatively accurate description of the relationship between choice, reaction time, and visual fixation data using the same parameters that were estimated in previous work on binary choice. Our findings suggest that the brain uses similar computational processes to make binary and trinary choices.

  15. In silico method for modelling metabolism and gene product expression at genome scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lerman, Joshua A.; Hyduke, Daniel R.; Latif, Haythem

    2012-07-03

    Transcription and translation use raw materials and energy generated metabolically to create the macromolecular machinery responsible for all cellular functions, including metabolism. A biochemically accurate model of molecular biology and metabolism will facilitate comprehensive and quantitative computations of an organism's molecular constitution as a function of genetic and environmental parameters. Here we formulate a model of metabolism and macromolecular expression. Prototyping it using the simple microorganism Thermotoga maritima, we show our model accurately simulates variations in cellular composition and gene expression. Moreover, through in silico comparative transcriptomics, the model allows the discovery of new regulons and improvement of the genome and transcription unit annotations. Our method presents a framework for investigating molecular biology and cellular physiology in silico and may allow quantitative interpretation of multi-omics data sets in the context of an integrated biochemical description of an organism.

  16. Molecular hydrodynamics: Vortex formation and sound wave propagation

    DOE PAGES

    Han, Kyeong Hwan; Kim, Changho; Talkner, Peter; ...

    2018-01-14

    In the present study, quantitative feasibility tests of the hydrodynamic description of a two-dimensional fluid at the molecular level are performed, both with respect to length and time scales. Using high-resolution fluid velocity data obtained from extensive molecular dynamics simulations, we computed the transverse and longitudinal components of the velocity field by the Helmholtz decomposition and compared them with those obtained from the linearized Navier-Stokes (LNS) equations with time-dependent transport coefficients. By investigating the vortex dynamics and the sound wave propagation in terms of these field components, we confirm the validity of the LNS description for times comparable to or larger than several mean collision times. The LNS description still reproduces the transverse velocity field accurately at smaller times, but it fails to predict characteristic patterns of molecular origin visible in the longitudinal velocity field. Based on these observations, we validate the main assumptions of the mode-coupling approach. The assumption that the velocity autocorrelation function can be expressed in terms of the fluid velocity field and the tagged particle distribution is found to be remarkably accurate even for times comparable to or smaller than the mean collision time. This suggests that the hydrodynamic-mode description remains valid down to the molecular scale.

  17. Molecular hydrodynamics: Vortex formation and sound wave propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Kyeong Hwan; Kim, Changho; Talkner, Peter

    In the present study, quantitative feasibility tests of the hydrodynamic description of a two-dimensional fluid at the molecular level are performed, both with respect to length and time scales. Using high-resolution fluid velocity data obtained from extensive molecular dynamics simulations, we computed the transverse and longitudinal components of the velocity field by the Helmholtz decomposition and compared them with those obtained from the linearized Navier-Stokes (LNS) equations with time-dependent transport coefficients. By investigating the vortex dynamics and the sound wave propagation in terms of these field components, we confirm the validity of the LNS description for times comparable to or larger than several mean collision times. The LNS description still reproduces the transverse velocity field accurately at smaller times, but it fails to predict characteristic patterns of molecular origin visible in the longitudinal velocity field. Based on these observations, we validate the main assumptions of the mode-coupling approach. The assumption that the velocity autocorrelation function can be expressed in terms of the fluid velocity field and the tagged particle distribution is found to be remarkably accurate even for times comparable to or smaller than the mean collision time. This suggests that the hydrodynamic-mode description remains valid down to the molecular scale.
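
    The Helmholtz decomposition used in the two records above, splitting a periodic 2D velocity field into longitudinal (compressive) and transverse (solenoidal) parts, is conveniently done in Fourier space; a generic sketch (Python; a synthetic periodic field stands in for the MD velocity data):

      import numpy as np

      def helmholtz_decompose(ux, uy, box):
          """Split a periodic 2D velocity field (square box) into its
          longitudinal and transverse parts via a spectral projection."""
          ny, nx = ux.shape
          kx = 2 * np.pi * np.fft.fftfreq(nx, d=box / nx)
          ky = 2 * np.pi * np.fft.fftfreq(ny, d=box / ny)
          KX, KY = np.meshgrid(kx, ky)
          k2 = KX**2 + KY**2
          k2[0, 0] = 1.0                               # avoid 0/0 for the k = 0 mode
          uxh, uyh = np.fft.fft2(ux), np.fft.fft2(uy)
          proj = (KX * uxh + KY * uyh) / k2            # (k . u_hat) / k^2
          ux_long = np.fft.ifft2(KX * proj).real       # longitudinal (compressive) part
          uy_long = np.fft.ifft2(KY * proj).real
          return (ux_long, uy_long), (ux - ux_long, uy - uy_long)

      # Synthetic test field: a pure shear (transverse) plus a compressive component.
      n, box = 64, 2 * np.pi
      x = np.linspace(0, box, n, endpoint=False)
      X, Y = np.meshgrid(x, x)
      ux, uy = np.sin(Y) + 0.3 * np.sin(X), 0.3 * np.sin(Y)
      (ux_l, uy_l), (ux_t, uy_t) = helmholtz_decompose(ux, uy, box)
      print(np.allclose(ux_l, 0.3 * np.sin(X)))        # compressive part recovered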

  18. Identifying nursing interventions associated with the accuracy used nursing diagnoses for patients with liver cirrhosis

    PubMed Central

    Gimenes, Fernanda Raphael Escobar; Motta, Ana Paula Gobbo; da Silva, Patrícia Costa dos Santos; Gobbo, Ana Flora Fogaça; Atila, Elisabeth; de Carvalho, Emilia Campos

    2017-01-01

    Objective: to identify the nursing interventions associated with the most accurate and frequently used NANDA International, Inc. (NANDA-I) nursing diagnoses for patients with liver cirrhosis. Method: this is a descriptive, quantitative, cross-sectional study. Results: a total of 12 nursing diagnoses were evaluated, seven of which showed high accuracy (IVC ≥ 0.8); 70 interventions were identified and 23 (32.86%) were common to more than one diagnosis. Conclusion: in general, nurses often perform nursing interventions suggested in the NIC for the seven highly accurate nursing diagnoses identified in this study to care for patients with liver cirrhosis. Accurate and valid nursing diagnoses guide the selection of appropriate interventions that nurses can perform to enhance patient safety and thus improve patient health outcomes.

  19. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations.

    PubMed

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-15

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.
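
    In the Boltzmann limit the generalized Planck law gives I_PL(E) proportional to A(E) E^2 exp(-(E - Dmu)/kT), so the slope of ln[I_PL/(A E^2)] versus E in the high-energy tail yields the carrier temperature and the intercept the chemical potential; a sketch (Python; a synthetic spectrum with flat absorptivity A(E) = 1, which is precisely the simplification the abstract cautions against for real measurements):

      import numpy as np

      kB = 8.617e-5                                    # Boltzmann constant (eV/K)

      # Synthetic hot-carrier PL spectrum: T = 450 K, chemical potential 1.10 eV.
      E = np.linspace(1.42, 1.60, 200)                 # photon energy (eV)
      T_true, dmu_true = 450.0, 1.10
      I_pl = E**2 * np.exp(-(E - dmu_true) / (kB * T_true))
      I_pl *= 1 + 0.01 * np.random.default_rng(0).normal(size=E.size)  # small noise

      # Linear fit of ln(I/E^2) = dmu/(kB*T) - E/(kB*T) over the high-energy tail.
      slope, intercept = np.polyfit(E, np.log(I_pl / E**2), 1)
      T_fit = -1.0 / (kB * slope)
      dmu_fit = intercept * kB * T_fit
      print(f"fitted T = {T_fit:.0f} K, fitted chemical potential = {dmu_fit:.3f} eV")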

  20. Noisy Oscillations in the Actin Cytoskeleton of Chemotactic Amoeba.

    PubMed

    Negrete, Jose; Pumir, Alain; Hsu, Hsin-Fang; Westendorf, Christian; Tarantola, Marco; Beta, Carsten; Bodenschatz, Eberhard

    2016-09-30

    Biological systems with their complex biochemical networks are known to be intrinsically noisy. Here we investigate the dynamics of actin polymerization of amoeboid cells, which are close to the onset of oscillations. We show that the large phenotypic variability in the polymerization dynamics can be accurately captured by a generic nonlinear oscillator model in the presence of noise. We determine the relative role of the noise with a single dimensionless, experimentally accessible parameter, thus providing a quantitative description of the variability in a population of cells. Our approach, which rests on a generic description of a system close to a Hopf bifurcation and includes the effect of noise, can characterize the dynamics of a large class of noisy systems close to an oscillatory instability.

  21. Noisy Oscillations in the Actin Cytoskeleton of Chemotactic Amoeba

    NASA Astrophysics Data System (ADS)

    Negrete, Jose; Pumir, Alain; Hsu, Hsin-Fang; Westendorf, Christian; Tarantola, Marco; Beta, Carsten; Bodenschatz, Eberhard

    2016-09-01

    Biological systems with their complex biochemical networks are known to be intrinsically noisy. Here we investigate the dynamics of actin polymerization of amoeboid cells, which are close to the onset of oscillations. We show that the large phenotypic variability in the polymerization dynamics can be accurately captured by a generic nonlinear oscillator model in the presence of noise. We determine the relative role of the noise with a single dimensionless, experimentally accessible parameter, thus providing a quantitative description of the variability in a population of cells. Our approach, which rests on a generic description of a system close to a Hopf bifurcation and includes the effect of noise, can characterize the dynamics of a large class of noisy systems close to an oscillatory instability.
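
    A generic noisy oscillator close to a Hopf bifurcation, the class of model invoked in the two records above, can be simulated directly from the normal form dz = [(mu + i*omega) z - |z|^2 z] dt + sigma dW; a sketch (Python; illustrative parameters, not fits to the actin-polymerization data):

      import numpy as np

      def simulate_noisy_hopf(mu=0.1, omega=2 * np.pi, sigma=0.3,
                              dt=1e-3, n_steps=50_000, seed=1):
          """Euler-Maruyama integration of the Hopf normal form with additive noise.
          z is the complex oscillation amplitude; mu < 0 is below onset, mu > 0 above."""
          rng = np.random.default_rng(seed)
          z = np.zeros(n_steps, dtype=complex)
          for i in range(1, n_steps):
              drift = (mu + 1j * omega) * z[i - 1] - np.abs(z[i - 1]) ** 2 * z[i - 1]
              noise = sigma * np.sqrt(dt) * (rng.normal() + 1j * rng.normal())
              z[i] = z[i - 1] + drift * dt + noise
          return z

      z = simulate_noisy_hopf()
      # Crude summary statistics of the amplitude; in the paper a single dimensionless
      # noise parameter is obtained by fitting the model to the measured dynamics.
      print(np.mean(np.abs(z) ** 2), np.std(np.abs(z)))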

  22. Accurate atomistic first-principles calculations of electronic stopping

    DOE PAGES

    Schleife, André; Kanai, Yosuke; Correa, Alfredo A.

    2015-01-20

    In this paper, we show that atomistic first-principles calculations based on real-time propagation within time-dependent density functional theory are capable of accurately describing electronic stopping of light projectile atoms in metal hosts over a wide range of projectile velocities. In particular, we employ a plane-wave pseudopotential scheme to solve time-dependent Kohn-Sham equations for representative systems of H and He projectiles in crystalline aluminum. This approach to simulate nonadiabatic electron-ion interaction provides an accurate framework that allows for quantitative comparison with experiment without introducing ad hoc parameters such as effective charges, or assumptions about the dielectric function. Finally, our work clearly shows that this atomistic first-principles description of electronic stopping is able to disentangle contributions due to tightly bound semicore electrons and geometric aspects of the stopping geometry (channeling versus off-channeling) in a wide range of projectile velocities.

  23. Toward canonical ensemble distribution from self-guided Langevin dynamics simulation

    NASA Astrophysics Data System (ADS)

    Wu, Xiongwu; Brooks, Bernard R.

    2011-04-01

    This work derives a quantitative description of the conformational distribution in self-guided Langevin dynamics (SGLD) simulations. SGLD simulations employ guiding forces calculated from local average momentums to enhance low-frequency motion. This enhancement in low-frequency motion dramatically accelerates conformational search efficiency, but also induces certain perturbations in conformational distribution. Through the local averaging, we separate properties of molecular systems into low-frequency and high-frequency portions. The guiding force effect on the conformational distribution is quantitatively described using these low-frequency and high-frequency properties. This quantitative relation provides a way to convert between a canonical ensemble and a self-guided ensemble. Using example systems, we demonstrated how to utilize the relation to obtain canonical ensemble properties and conformational distributions from SGLD simulations. This development makes SGLD not only an efficient approach for conformational searching, but also an accurate means for conformational sampling.
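
    The guiding-force construction is easy to illustrate on a one-dimensional toy system (Python; a sketch only: real SGLD acts on full molecular systems, and the conversion back to canonical-ensemble averages described above is omitted here):

      import numpy as np

      def sgld_1d(force, n_steps=20_000, dt=0.01, gamma=1.0, kT=1.0, mass=1.0,
                  guide_lambda=0.2, avg_time=1.0, seed=0):
          """Langevin dynamics with a guiding force proportional to the local
          (exponentially averaged) momentum; guide_lambda = 0 recovers ordinary
          Langevin dynamics.  Toy construction, not the authors' implementation."""
          rng = np.random.default_rng(seed)
          x, p, p_avg = 0.0, 0.0, 0.0
          decay = dt / avg_time                         # weight of the running average
          traj = np.empty(n_steps)
          for i in range(n_steps):
              p_avg = (1 - decay) * p_avg + decay * p   # local average momentum
              f = force(x) + guide_lambda * gamma * p_avg   # add the guiding force
              p += (f - gamma * p) * dt + np.sqrt(2 * gamma * mass * kT * dt) * rng.normal()
              x += (p / mass) * dt
              traj[i] = x
          return traj

      # Double-well potential U(x) = (x^2 - 1)^2, force = -dU/dx.
      traj = sgld_1d(lambda x: -4 * x * (x * x - 1))
      print(np.mean(traj > 0))                          # fraction of time in the right well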

  24. A quantitative description for efficient financial markets

    NASA Astrophysics Data System (ADS)

    Immonen, Eero

    2015-09-01

    In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no one single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no one single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.

  25. Quasiparticle Level Alignment for Photocatalytic Interfaces.

    PubMed

    Migani, Annapaola; Mowbray, Duncan J; Zhao, Jin; Petek, Hrvoje; Rubio, Angel

    2014-05-13

    Electronic level alignment at the interface between an adsorbed molecular layer and a semiconducting substrate determines the activity and efficiency of many photocatalytic materials. Standard density functional theory (DFT)-based methods have proven unable to provide a quantitative description of this level alignment. This requires a proper treatment of the anisotropic screening, necessitating the use of quasiparticle (QP) techniques. However, the computational complexity of QP algorithms has meant a quantitative description of interfacial levels has remained elusive. We provide a systematic study of a prototypical interface, bare and methanol-covered rutile TiO2(110) surfaces, to determine the type of many-body theory required to obtain an accurate description of the level alignment. This is accomplished via a direct comparison with metastable impact electron spectroscopy (MIES), ultraviolet photoelectron spectroscopy (UPS), and two-photon photoemission (2PP) spectroscopy. We consider GGA DFT, hybrid DFT, and G0W0, scQPGW1, scQPGW0, and scQPGW QP calculations. Our results demonstrate that G0W0, or our recently introduced scQPGW1 approach, are required to obtain the correct alignment of both the highest occupied and lowest unoccupied interfacial molecular levels (HOMO/LUMO). These calculations set a new standard in the interpretation of electronic structure probe experiments of complex organic molecule/semiconductor interfaces.

  26. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    PubMed

    Gizak, Agnieszka; Rakus, Dariusz

    2016-01-11

    Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modification changes within selected proteins. A quantitative proteomics approach gives the possibility of quantitative characterization of the entire proteome of a biological system, in the context of the titer of proteins as well as their post-translational modifications. This not only enables more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  27. Quantitative measurement and analysis for detection and treatment planning of developmental dysplasia of the hip

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Lu, Hongbing; Chen, Hanyong; Zhao, Li; Shi, Zhengxing; Liang, Zhengrong

    2009-02-01

    Developmental dysplasia of the hip is a congenital hip joint malformation affecting the proximal femurs and acetabulum, which may be subluxatable, dislocatable, or dislocated. Conventionally, physicians have made diagnoses and planned treatments based only on findings from two-dimensional (2D) images, manually calculating clinical parameters. However, the anatomical complexity of the disease and the limitations of current standard procedures make accurate diagnosis quite difficult. In this study, we developed a system that provides quantitative measurement of 3D clinical indexes based on computed tomography (CT) images. To extract bone structure from surrounding tissues more accurately, the system first segments the bone using a knowledge-based fuzzy clustering method, which is formulated by modifying the objective function of the standard fuzzy c-means algorithm with an additive adaptation penalty. The second part of the system automatically calculates the clinical indexes, which are extended from 2D to 3D for accurate description of the spatial relationship between the femurs and the acetabulum. To evaluate the system's performance, an experimental study of 22 patients with unilaterally or bilaterally affected hips was performed. The 3D acetabulum index (AI) results provided automatically by the system were validated by comparison with 2D results measured manually by surgeons. The correlation between the two results was found to be 0.622 (p<0.01).
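
    The clustering step is a variant of standard fuzzy c-means; a minimal sketch of the unmodified algorithm (Python; toy 1D intensities rather than CT data, and without the knowledge-based adaptation penalty the authors add):

      import numpy as np

      def fuzzy_c_means(data, n_clusters=2, m=2.0, n_iter=100, seed=0):
          """Standard fuzzy c-means on a 1D feature (e.g. voxel intensity).
          Returns cluster centers and the membership matrix (n_points x n_clusters)."""
          rng = np.random.default_rng(seed)
          u = rng.random((data.size, n_clusters))
          u /= u.sum(axis=1, keepdims=True)             # memberships sum to one
          for _ in range(n_iter):
              um = u ** m
              centers = (um.T @ data) / um.sum(axis=0)  # membership-weighted means
              dist = np.abs(data[:, None] - centers[None, :]) + 1e-12
              inv = dist ** (-2.0 / (m - 1.0))
              u = inv / inv.sum(axis=1, keepdims=True)  # update memberships
          return centers, u

      # Toy "soft tissue vs. bone" intensities: two overlapping populations.
      rng = np.random.default_rng(1)
      intensities = np.concatenate([rng.normal(60, 10, 500), rng.normal(200, 25, 300)])
      centers, u = fuzzy_c_means(intensities)
      print(np.sort(centers).round(1))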

  28. Quantitative Surface Chirality Detection with Sum Frequency Generation Vibrational Spectroscopy: Twin Polarization Angle Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Feng; Xu, Yanyan; Guo, Yuan

    2009-12-27

    Here we report a novel twin polarization angle (TPA) approach to quantitative chirality detection with surface sum-frequency generation vibrational spectroscopy (SFG-VS). Generally, the achiral contribution dominates the surface SFG-VS signal, and the pure chiral signal is usually two or three orders of magnitude smaller. Therefore, it has been difficult to make quantitative detection and analysis of the chiral contributions to the surface SFG-VS signal. In the TPA method, by varying together the polarization angles of the incoming visible light and the sum frequency signal at fixed s or p polarization of the incoming infrared beam, the polarization-dependent SFG signal can give not only a direct signature of the chiral contribution in the total SFG-VS signal, but also an accurate measurement of the chiral and achiral components of the surface SFG signal. The general description of the TPA method is presented, and an experimental test of the TPA approach is presented for SFG-VS from the S- and R-limonene chiral liquid surfaces. The most accurate degree of chiral excess values thus obtained for the 2878 cm⁻¹ spectral peak of the S- and R-limonene liquid surfaces are (23.7±0.4)% and (-25.4±1.3)%, respectively.

  29. Research Participants' Understanding of and Reactions to Certificates of Confidentiality.

    PubMed

    Beskow, Laura M; Check, Devon K; Ammarell, Natalie

    2014-01-01

    Certificates of Confidentiality are intended to facilitate participation in critical public health research by protecting against forced disclosure of identifying data in legal proceedings, but little is known about the effect of Certificate descriptions in consent forms. To gain preliminary insights, we conducted qualitative interviews with 50 HIV-positive individuals in Durham, North Carolina to explore their subjective understanding of Certificate descriptions and whether their reactions differed based on receiving a standard versus simplified description. Most interviewees were neither reassured nor alarmed by Certificate information, and most said it would not influence their willingness to participate or provide truthful information. However, compared with those receiving the simplified description, more who read the standard description said it raised new concerns, that their likelihood of participating would be lower, and that they might be less forthcoming. Most interviewees said they found the Certificate description clear, but standard-group participants often found particular words and phrases confusing, while simplified-group participants more often questioned the information's substance. Valid informed consent requires comprehension and voluntariness. Our findings highlight the importance of developing consent descriptions of Certificates and other confidentiality protections that are simple and accurate. These qualitative results provide rich detail to inform a larger, quantitative study that would permit further rigorous comparisons.

  30. Research Participants’ Understanding of and Reactions to Certificates of Confidentiality

    PubMed Central

    Check, Devon K.; Ammarell, Natalie

    2013-01-01

    Background: Certificates of Confidentiality are intended to facilitate participation in critical public health research by protecting against forced disclosure of identifying data in legal proceedings, but little is known about the effect of Certificate descriptions in consent forms. Methods: To gain preliminary insights, we conducted qualitative interviews with 50 HIV-positive individuals in Durham, North Carolina to explore their subjective understanding of Certificate descriptions and whether their reactions differed based on receiving a standard versus simplified description. Results: Most interviewees were neither reassured nor alarmed by Certificate information, and most said it would not influence their willingness to participate or provide truthful information. However, compared with those receiving the simplified description, more who read the standard description said it raised new concerns, that their likelihood of participating would be lower, and that they might be less forthcoming. Most interviewees said they found the Certificate description clear, but standard-group participants often found particular words and phrases confusing, while simplified-group participants more often questioned the information’s substance. Conclusions: Valid informed consent requires comprehension and voluntariness. Our findings highlight the importance of developing consent descriptions of Certificates and other confidentiality protections that are simple and accurate. These qualitative results provide rich detail to inform a larger, quantitative study that would permit further rigorous comparisons. PMID:24563806

  31. Kinetics of Fast Atoms in the Terrestrial Atmosphere

    NASA Technical Reports Server (NTRS)

    Kharchenko, Vasili A.; Dalgarno, A.; Mellott, Mary (Technical Monitor)

    2002-01-01

    This report summarizes our investigations performed under NASA Grant NAG5-8058. The three-year research supported by the Geospace Sciences SR&T program (Ionospheric, Thermospheric, and Mesospheric Physics) has been designed to investigate fluxes of energetic oxygen and nitrogen atoms in the terrestrial thermosphere. Fast atoms are produced due to absorption of the solar radiation and due to coupling between the ionosphere and the neutral thermospheric gas. We have investigated the impact of hot oxygen and nitrogen atoms on the thermal balance, chemistry and radiation properties of the terrestrial thermosphere. Our calculations have been focused on the accurate quantitative description of the thermalization of O and N energetic atoms in collisions with atom and molecules of the ambient neutral gas. Upward fluxes of oxygen and nitrogen atoms, the rate of atmospheric heating by hot oxygen atoms, and the energy input into translational and rotational-vibrational degrees of atmospheric molecules have been evaluated. Altitude profiles of hot oxygen and nitrogen atoms have been analyzed and compared with available observational data. Energetic oxygen atoms in the terrestrial atmosphere have been investigated for decades, but insufficient information on the kinetics of fast atmospheric atoms has been a main obstacle for the interpretation of observational data and modeling of the hot geocorona. The recent development of accurate computational methods of the collisional kinetics is seen as an important step in the quantitative description of hot atoms in the thermosphere. Modeling of relaxation processes in the terrestrial atmosphere has incorporated data of recent observations, and theoretical predictions have been tested by new laboratory measurements.

  32. Computable visually observed phenotype ontological framework for plants

    PubMed Central

    2011-01-01

    Background: The ability to search for and precisely compare similar phenotypic appearances within and across species has vast potential in plant science and genetic research. The difficulty in doing so lies in the fact that many visual phenotypic data, especially visually observed phenotypes that often times cannot be directly measured quantitatively, are in the form of text annotations, and these descriptions are plagued by semantic ambiguity, heterogeneity, and low granularity. Though several bio-ontologies have been developed to standardize phenotypic (and genotypic) information and permit comparisons across species, these semantic issues persist and prevent precise analysis and retrieval of information. A framework suitable for the modeling and analysis of precise computable representations of such phenotypic appearances is needed. Results: We have developed a new framework called the Computable Visually Observed Phenotype Ontological Framework for plants. This work provides a novel quantitative view of descriptions of plant phenotypes that leverages existing bio-ontologies and utilizes a computational approach to capture and represent domain knowledge in a machine-interpretable form. This is accomplished by means of a robust and accurate semantic mapping module that automatically maps high-level semantics to low-level measurements computed from phenotype imagery. The framework was applied to two different plant species with semantic rules mined and an ontology constructed. Rule quality was evaluated and showed high quality rules for most semantics. This framework also facilitates automatic annotation of phenotype images and can be adopted by different plant communities to aid in their research. Conclusions: The Computable Visually Observed Phenotype Ontological Framework for plants has been developed for more efficient and accurate management of visually observed phenotypes, which play a significant role in plant genomics research. The uniqueness of this framework is its ability to bridge the knowledge of informaticians and plant science researchers by translating descriptions of visually observed phenotypes into standardized, machine-understandable representations, thus enabling the development of advanced information retrieval and phenotype annotation analysis tools for the plant science community. PMID:21702966

  33. A method for three-dimensional quantitative observation of the microstructure of biological samples

    NASA Astrophysics Data System (ADS)

    Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying

    2009-07-01

    Contemporary biology has entered the era of cell biology and molecular biology, and researchers now seek to study the mechanisms of biological phenomena at the microscopic level. Accurate description of the microstructure of biological samples is an urgent need in many biomedical experiments. This paper introduces a method for three-dimensional quantitative observation of the microstructure of living biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy, notable for its low optical damage, high resolution, deep penetration depth, and suitability for three-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and their original shapes were then obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. Thus the 3D, quantitatively depicted microstructure of the samples was finally derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with good results.
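
    The final quantification step, extracting object volumes and positions from a segmented 3D stack, reduces to labelling connected voxels and counting them; a sketch (Python; a synthetic binary stack stands in for segmented TPLSM data, and the voxel size is an assumed value):

      import numpy as np
      from scipy import ndimage

      # Synthetic segmented stack: two "objects" in a 64^3 binary volume.
      stack = np.zeros((64, 64, 64), dtype=bool)
      stack[10:20, 10:20, 10:20] = True
      stack[40:52, 40:48, 40:46] = True

      voxel_size = (0.5, 0.2, 0.2)                 # z, y, x voxel size in micrometres (assumed)
      voxel_volume = np.prod(voxel_size)           # um^3 per voxel
      labels, n_objects = ndimage.label(stack)     # connected-component labelling

      for obj in range(1, n_objects + 1):
          n_vox = np.sum(labels == obj)
          centroid = ndimage.center_of_mass(stack, labels, obj)
          print(f"object {obj}: volume = {n_vox * voxel_volume:.2f} um^3, "
                f"centroid (z, y, x) = {tuple(round(c, 1) for c in centroid)}")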

  34. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, the Western blot has been widely used in molecular biology labs. It is a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. Western blot quantification constitutes a critical step in obtaining accurate and reproducible results. Because of the technical knowledge required for densitometry analysis, together with limited resource availability, standard office scanners are often used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate, and reproducible approximation that can be used where resources are limited.
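
    As a generic illustration of lane densitometry with background subtraction (Python; a grey-scale morphological opening stands in for the paper's specific background-estimation method, and the lane profile is synthetic):

      import numpy as np
      from scipy import ndimage

      def band_intensity(lane, band_slice, structure_size=51):
          """Integrated band signal after subtracting a smooth background estimate.
          `lane` is a 1D densitometric profile (inverted so bands are peaks);
          the grey-opening background is a stand-in for the published method."""
          background = ndimage.grey_opening(lane, size=structure_size)
          return (lane - background)[band_slice].sum()

      # Synthetic lane: sloping background plus two Gaussian "bands".
      x = np.arange(500)
      lane = 0.05 * x + 40.0
      lane += 120 * np.exp(-((x - 150) ** 2) / (2 * 8**2))
      lane += 60 * np.exp(-((x - 330) ** 2) / (2 * 10**2))

      print(band_intensity(lane, slice(120, 180)), band_intensity(lane, slice(300, 360)))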

  35. The use of immunohistochemistry for biomarker assessment--can it compete with other technologies?

    PubMed

    Dunstan, Robert W; Wharton, Keith A; Quigley, Catherine; Lowe, Amanda

    2011-10-01

    A morphology-based assay such as immunohistochemistry (IHC) should be a highly effective means to define the expression of a target molecule of interest, especially if the target is a protein. However, over the past decade, IHC as a platform for biomarkers has been challenged by more quantitative molecular assays with reference standards but that lack morphologic context. For IHC to be considered a "top-tier" biomarker assay, it must provide truly quantitative data on par with non-morphologic assays, which means it needs to be run with reference standards. However, creating such standards for IHC will require optimizing all aspects of tissue collection, fixation, section thickness, morphologic criteria for assessment, staining processes, digitization of images, and image analysis. This will also require anatomic pathology to evolve from a discipline that is descriptive to one that is quantitative. A major step in this transformation will be replacing traditional ocular microscopes with computer monitors and whole slide images, for without digitization, there can be no accurate quantitation; without quantitation, there can be no standardization; and without standardization, the value of morphology-based IHC assays will not be realized.

  36. Quantifying and predicting Drosophila larvae crawling phenotypes

    NASA Astrophysics Data System (ADS)

    Günther, Maximilian N.; Nettesheim, Guilherme; Shubeita, George T.

    2016-06-01

    The fruit fly Drosophila melanogaster is a widely used model for cell biology, development, disease, and neuroscience. The fly’s power as a genetic model for disease and neuroscience can be augmented by a quantitative description of its behavior. Here we show that we can accurately account for the complex and unique crawling patterns exhibited by individual Drosophila larvae using a small set of four parameters obtained from the trajectories of a few crawling larvae. The values of these parameters change for larvae from different genetic mutants, as we demonstrate for fly models of Alzheimer’s disease and the Fragile X syndrome, allowing applications such as genetic or drug screens. Using the quantitative model of larval crawling developed here we use the mutant-specific parameters to robustly simulate larval crawling, which allows estimating the feasibility of laborious experimental assays and aids in their design.

  37. Fast and accurate modeling of nonlinear pulse propagation in graded-index multimode fibers.

    PubMed

    Conforti, Matteo; Mas Arabi, Carlos; Mussot, Arnaud; Kudlinski, Alexandre

    2017-10-01

    We develop a model for the description of nonlinear pulse propagation in multimode optical fibers with a parabolic refractive index profile. It consists of a 1+1D generalized nonlinear Schrödinger equation with a periodic nonlinear coefficient, which can be solved in an extremely fast and efficient way. The model is able to quantitatively reproduce recently observed phenomena like geometric parametric instability and broadband dispersive wave emission. We envisage that our equation will represent a valuable tool for the study of spatiotemporal nonlinear dynamics in the growing field of multimode fiber optics.
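
    The model has the generic form of a 1+1D nonlinear Schrödinger equation with a z-periodic nonlinear coefficient; a bare-bones split-step Fourier sketch of that class of equation (Python; all parameter values are placeholders, not those of a real graded-index multimode fiber):

      import numpy as np

      # Split-step Fourier for dA/dz = -i(beta2/2) d2A/dT2 + i*gamma(z)|A|^2 A.
      nt, t_max = 1024, 20.0                       # time grid (ps), illustrative
      t = np.linspace(-t_max, t_max, nt, endpoint=False)
      w = 2 * np.pi * np.fft.fftfreq(nt, d=t[1] - t[0])

      beta2 = -0.02                                # ps^2/m (assumed)
      gamma0, period = 2e-3, 5e-4                  # W^-1 m^-1 and oscillation period in m (assumed)
      gamma = lambda z: gamma0 * (1 + 0.5 * np.cos(2 * np.pi * z / period))

      A = 300.0 * np.exp(-t**2 / 2)                # input pulse envelope, sqrt(W)
      dz, n_steps = 1e-5, 2000                     # 10-um steps, 2 cm total length
      for n in range(n_steps):
          A = np.fft.ifft(np.fft.fft(A) * np.exp(0.5j * beta2 * w**2 * dz))  # dispersion
          A *= np.exp(1j * gamma(n * dz) * np.abs(A) ** 2 * dz)              # Kerr step

      spectrum = np.abs(np.fft.fftshift(np.fft.fft(A))) ** 2
      print(spectrum.max() / spectrum.mean())      # crude measure of spectral reshaping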

  38. [Development and validation of the Family Vulnerability Index to Disability and Dependence (FVI-DD)].

    PubMed

    Amendola, Fernanda; Alvarenga, Márcia Regina Martins; Latorre, Maria do Rosário Dias de Oliveira; Oliveira, Maria Amélia de Campos

    2014-02-01

    This exploratory, descriptive, cross-sectional, and quantitative study aimed to develop and validate an index of family vulnerability to disability and dependence (FVI-DD). The index was adapted from the Family Development Index, with the addition of social and health indicators of disability and dependence. The instrument was applied to 248 families in the city of São Paulo, followed by exploratory factor analysis. Factor validation was performed using the concurrent and discriminant validity of the Lawton scale and the Katz Index. The significance level adopted for the study was p < 0.05. The final vulnerability index comprised 50 questions classified into seven factors contemplating social and health dimensions, and it exhibited good internal consistency (Cronbach's alpha = 0.82). The FVI-DD was validated against both the Lawton scale and the Katz Index. We conclude that the FVI-DD can accurately and reliably assess family vulnerability to disability and dependence.

  39. Exchange Coupling Interactions from the Density Matrix Renormalization Group and N-Electron Valence Perturbation Theory: Application to a Biomimetic Mixed-Valence Manganese Complex.

    PubMed

    Roemelt, Michael; Krewald, Vera; Pantazis, Dimitrios A

    2018-01-09

    The accurate description of magnetic level energetics in oligonuclear exchange-coupled transition-metal complexes remains a formidable challenge for quantum chemistry. The density matrix renormalization group (DMRG) brings such systems for the first time easily within reach of multireference wave function methods by enabling the use of unprecedentedly large active spaces. But does this guarantee systematic improvement in predictive ability and, if so, under which conditions? We identify operational parameters in the use of DMRG using as a test system an experimentally characterized mixed-valence bis-μ-oxo/μ-acetato Mn(III,IV) dimer, a model for the oxygen-evolving complex of photosystem II. A complete active space of all metal 3d and bridge 2p orbitals proved to be the smallest meaningful starting point; this is readily accessible with DMRG and greatly improves on the unrealistic metal-only configuration interaction or complete active space self-consistent field (CASSCF) values. Orbital optimization is critical for stabilizing the antiferromagnetic state, while a state-averaged approach over all spin states involved is required to avoid artificial deviations from isotropic behavior that are associated with state-specific calculations. Selective inclusion of localized orbital subspaces enables probing the relative contributions of different ligands and distinct superexchange pathways. Overall, however, full-valence DMRG-CASSCF calculations fall short of providing a quantitative description of the exchange coupling owing to insufficient recovery of dynamic correlation. Quantitatively accurate results can be achieved through a DMRG implementation of second order N-electron valence perturbation theory (NEVPT2) in conjunction with a full-valence metal and ligand active space. Perspectives for future applications of DMRG-CASSCF/NEVPT2 to exchange coupling in oligonuclear clusters are discussed.

  40. Quantitative 3D determination of self-assembled structures on nanoparticles using small angle neutron scattering.

    PubMed

    Luo, Zhi; Marson, Domenico; Ong, Quy K; Loiudice, Anna; Kohlbrecher, Joachim; Radulescu, Aurel; Krause-Heuer, Anwen; Darwish, Tamim; Balog, Sandor; Buonsanti, Raffaella; Svergun, Dmitri I; Posocco, Paola; Stellacci, Francesco

    2018-04-09

    The ligand shell (LS) determines a number of nanoparticles' properties. Nanoparticles' cores can be accurately characterized; yet the structure of the LS, when composed of mixture of molecules, can be described only qualitatively (e.g., patchy, Janus, and random). Here we show that quantitative description of the LS' morphology of monodisperse nanoparticles can be obtained using small-angle neutron scattering (SANS), measured at multiple contrasts, achieved by either ligand or solvent deuteration. Three-dimensional models of the nanoparticles' core and LS are generated using an ab initio reconstruction method. Characteristic length scales extracted from the models are compared with simulations. We also characterize the evolution of the LS upon thermal annealing, and investigate the LS morphology of mixed-ligand copper and silver nanoparticles as well as gold nanoparticles coated with ternary mixtures. Our results suggest that SANS combined with multiphase modeling is a versatile approach for the characterization of nanoparticles' LS.

  41. A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images

    PubMed Central

    Robb, R.A.; Heffeman, P.B.; Camp, J.J.; Hanson, D.P.

    1986-01-01

    The capability to extract objective and quantitatively accurate information from 3-D radiographic biomedical images has not kept pace with the capabilities to produce the images themselves. This is rather an ironic paradox, since on the one hand the new 3-D and 4-D imaging capabilities promise significant potential for providing greater specificity and sensitivity (i.e., precise objective discrimination and accurate quantitative measurement of body tissue characteristics and function) in clinical diagnostic and basic investigative imaging procedures than ever possible before, but on the other hand, the momentous advances in computer and associated electronic imaging technology which have made these 3-D imaging capabilities possible have not been concomitantly developed for full exploitation of these capabilities. Therefore, we have developed a powerful new microcomputer-based system which permits detailed investigations and evaluation of 3-D and 4-D (dynamic 3-D) biomedical images. The system comprises a special workstation to which all the information in a large 3-D image data base is accessible for rapid display, manipulation, and measurement. The system provides important capabilities for simultaneously representing and analyzing both structural and functional data and their relationships in various organs of the body. This paper provides a detailed description of this system, as well as some of the rationale, background, theoretical concepts, and practical considerations related to system implementation.

  42. Numerical simulation of magmatic hydrothermal systems

    USGS Publications Warehouse

    Ingebritsen, S.E.; Geiger, S.; Hurwitz, S.; Driesner, T.

    2010-01-01

    The dynamic behavior of magmatic hydrothermal systems entails coupled and nonlinear multiphase flow, heat and solute transport, and deformation in highly heterogeneous media. Thus, quantitative analysis of these systems depends mainly on numerical solution of coupled partial differential equations and complementary equations of state (EOS). The past 2 decades have seen steady growth of computational power and the development of numerical models that have eliminated or minimized the need for various simplifying assumptions. Considerable heuristic insight has been gained from process-oriented numerical modeling. Recent modeling efforts employing relatively complete EOS and accurate transport calculations have revealed dynamic behavior that was damped by linearized, less accurate models, including fluid property control of hydrothermal plume temperatures and three-dimensional geometries. Other recent modeling results have further elucidated the controlling role of permeability structure and revealed the potential for significant hydrothermally driven deformation. Key areas for future research include incorporation of accurate EOS for the complete H2O-NaCl-CO2 system, more realistic treatment of material heterogeneity in space and time, realistic description of large-scale relative permeability behavior, and intercode benchmarking comparisons.

  3. A model for the Pockels effect in distorted liquid crystal blue phases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castles, F., E-mail: flynn.castles@materials.ox.ac.uk

    2015-09-07

    Recent experiments have found that a mechanically distorted blue phase can exhibit a primary linear electro-optic (Pockels) effect [F. Castles et al., Nat. Mater. 13, 817 (2014)]. Here, it is shown that flexoelectricity can account for the experimental results and a model, which is based on continuum theory but takes into account the sub-unit-cell structure, is proposed. The model provides a quantitative description of the effect accurate to the nearest order of magnitude and predicts that the Pockels coefficient(s) in an optimally distorted blue phase may be two orders of magnitude larger than in lithium niobate.

  4. A computational approach to predicting ligand selectivity for the size-based separation of trivalent lanthanides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanov, Alexander S.; Bryantsev, Vyacheslav S.

    An accurate description of solvation effects for trivalent lanthanide ions is a main stumbling block to the qualitative prediction of selectivity trends along the lanthanide series. In this work, we propose a simple model to describe the differential effect of solvation in the competitive binding of a ligand by lanthanide ions by including weakly co-ordinated counterions in the complexes of more than a +1 charge. The success of the approach to quantitatively reproduce selectivities obtained from aqueous phase complexation studies demonstrates its potential for the design and screening of new ligands for efficient size-based separation.

  5. A computational approach to predicting ligand selectivity for the size-based separation of trivalent lanthanides

    DOE PAGES

    Ivanov, Alexander S.; Bryantsev, Vyacheslav S.

    2016-06-20

    An accurate description of solvation effects for trivalent lanthanide ions is a main stumbling block to the qualitative prediction of selectivity trends along the lanthanide series. In this work, we propose a simple model to describe the differential effect of solvation in the competitive binding of a ligand by lanthanide ions by including weakly co-ordinated counterions in the complexes of more than a +1 charge. The success of the approach to quantitatively reproduce selectivities obtained from aqueous phase complexation studies demonstrates its potential for the design and screening of new ligands for efficient size-based separation.

  6. TFBSshape: a motif database for DNA shape features of transcription factor binding sites.

    PubMed

    Yang, Lin; Zhou, Tianyin; Dror, Iris; Mathelier, Anthony; Wasserman, Wyeth W; Gordân, Raluca; Rohs, Remo

    2014-01-01

    Transcription factor binding sites (TFBSs) are most commonly characterized by the nucleotide preferences at each position of the DNA target. Whereas these sequence motifs are quite accurate descriptions of DNA binding specificities of transcription factors (TFs), proteins recognize DNA as a three-dimensional object. DNA structural features refine the description of TF binding specificities and provide mechanistic insights into protein-DNA recognition. Existing motif databases contain extensive nucleotide sequences identified in binding experiments based on their selection by a TF. To utilize DNA shape information when analysing the DNA binding specificities of TFs, we developed a new tool, the TFBSshape database (available at http://rohslab.cmb.usc.edu/TFBSshape/), for calculating DNA structural features from nucleotide sequences provided by motif databases. The TFBSshape database can be used to generate heat maps and quantitative data for DNA structural features (i.e., minor groove width, roll, propeller twist and helix twist) for 739 TF datasets from 23 different species derived from the motif databases JASPAR and UniPROBE. As demonstrated for the basic helix-loop-helix and homeodomain TF families, our TFBSshape database can be used to compare, qualitatively and quantitatively, the DNA binding specificities of closely related TFs and, thus, uncover differential DNA binding specificities that are not apparent from nucleotide sequence alone.
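
    The DNA shape features served by such databases are typically derived by sliding a pentamer window along the sequence and looking up a precomputed value for the central base. The sketch below illustrates only that lookup idea; the table entries are placeholders, not the TFBSshape/DNAshape tables.

```python
# Hedged sketch of the pentamer-lookup idea behind DNA shape features; the table
# entries below are placeholders, NOT the TFBSshape/DNAshape tables.

MGW_TABLE = {        # hypothetical minor groove width (MGW) values, in Angstroms
    "AAAAA": 3.4,
    "AATTT": 3.0,
    "GCGCG": 5.6,
}
DEFAULT_MGW = 5.0    # placeholder for pentamers missing from this toy table

def mgw_profile(sequence):
    """Assign an MGW value to the central base of every 5-mer window."""
    seq = sequence.upper()
    return [MGW_TABLE.get(seq[i:i + 5], DEFAULT_MGW) for i in range(len(seq) - 4)]

print(mgw_profile("GCAAAAATTTGC"))
```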

  7. TFBSshape: a motif database for DNA shape features of transcription factor binding sites

    PubMed Central

    Yang, Lin; Zhou, Tianyin; Dror, Iris; Mathelier, Anthony; Wasserman, Wyeth W.; Gordân, Raluca; Rohs, Remo

    2014-01-01

    Transcription factor binding sites (TFBSs) are most commonly characterized by the nucleotide preferences at each position of the DNA target. Whereas these sequence motifs are quite accurate descriptions of DNA binding specificities of transcription factors (TFs), proteins recognize DNA as a three-dimensional object. DNA structural features refine the description of TF binding specificities and provide mechanistic insights into protein–DNA recognition. Existing motif databases contain extensive nucleotide sequences identified in binding experiments based on their selection by a TF. To utilize DNA shape information when analysing the DNA binding specificities of TFs, we developed a new tool, the TFBSshape database (available at http://rohslab.cmb.usc.edu/TFBSshape/), for calculating DNA structural features from nucleotide sequences provided by motif databases. The TFBSshape database can be used to generate heat maps and quantitative data for DNA structural features (i.e., minor groove width, roll, propeller twist and helix twist) for 739 TF datasets from 23 different species derived from the motif databases JASPAR and UniPROBE. As demonstrated for the basic helix-loop-helix and homeodomain TF families, our TFBSshape database can be used to compare, qualitatively and quantitatively, the DNA binding specificities of closely related TFs and, thus, uncover differential DNA binding specificities that are not apparent from nucleotide sequence alone. PMID:24214955

  8. Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion.

    PubMed

    Fröhlich, Fabian; Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J; Grima, Ramon; Hasenauer, Jan

    2016-07-01

    Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity.
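
    As a minimal illustration of the moment-equation idea (a generic sketch, not the authors' toolbox), the code below writes the mean and variance equations for a simple birth-death process and fits the two rate constants to noisy synthetic moment data by least squares.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

# Hedged sketch: moment equations for a birth-death process, X -> X+1 at rate k
# and X -> X-1 at rate g*X, fitted to synthetic mean/variance data.
# dm/dt = k - g*m ;  dv/dt = k + g*m - 2*g*v   (mean m, variance v)

def moments(y, t, k, g):
    m, v = y
    return [k - g * m, k + g * m - 2 * g * v]

t = np.linspace(0.0, 10.0, 50)
k_true, g_true = 5.0, 0.5
data = odeint(moments, [0.0, 0.0], t, args=(k_true, g_true))
data += np.random.normal(scale=0.2, size=data.shape)   # mimic measurement noise

def residuals(theta):
    pred = odeint(moments, [0.0, 0.0], t, args=tuple(theta))
    return (pred - data).ravel()

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=(1e-6, np.inf))
print("estimated (k, g):", fit.x)
```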

  9. Inference for Stochastic Chemical Kinetics Using Moment Equations and System Size Expansion

    PubMed Central

    Thomas, Philipp; Kazeroonian, Atefeh; Theis, Fabian J.; Grima, Ramon; Hasenauer, Jan

    2016-01-01

    Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization methods and uncertainty analysis methods for MA and SSE. Efficiency and reliability of the methods are assessed using simulation examples as well as by an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RRE. Furthermore, the simulation examples revealed that the resulting estimates are more reliable for an intermediate volume regime. In this regime the estimation error is reduced and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity. PMID:27447730

  10. Sequencing sit-to-stand and upright posture for mobility limitation assessment: determination of the timing of the task phases from force platform data.

    PubMed

    Mazzà, Claudia; Zok, Mounir; Della Croce, Ugo

    2005-06-01

    The identification of quantitative tools to assess an individual's mobility limitation is a complex and challenging task. Several motor tasks have been designated as potential indicators of mobility limitation. In this study, a multiple motor task obtained by sequencing sit-to-stand and upright posture was used. Algorithms based on data obtained exclusively from a single force platform were developed to detect the timing of the motor task phases (sit-to-stand, preparation to the upright posture and upright posture). To test these algorithms, an experimental protocol inducing predictable changes in the acquired signals was designed. Twenty-two young, able-bodied subjects performed the task in four different conditions: self-selected natural and high speed with feet kept together, and self-selected natural and high speed with feet pelvis-width apart. The proposed algorithms effectively detected the timing of the task phases, the duration of which was sensitive to the four different experimental conditions. As expected, the duration of the sit-to-stand was sensitive to the speed of the task and not to the foot position, while the duration of the preparation to the upright posture was sensitive to foot position but not to speed. In addition to providing a simple and effective description of the execution of the motor task, the correct timing of the studied multiple task could facilitate the accurate determination of variables descriptive of the single isolated phases, allowing for a more thorough description of the motor task and therefore could contribute to the development of effective quantitative functional evaluation tests.
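
    As an illustration of this kind of event detection (a generic sketch, not the paper's validated algorithms), the code below flags the sit-to-stand onset and the start of quiet upright posture from a vertical ground reaction force trace using baseline-noise thresholds; the threshold factors are assumptions.

```python
import numpy as np

# Generic sketch (not the paper's validated algorithms): detect the sit-to-stand
# onset and the start of quiet upright posture from the vertical ground reaction
# force (vGRF) of a single force platform. Threshold factors are assumptions.

def detect_phases(vgrf, fs, quiet_win=0.5, k_onset=5.0, k_settle=2.0):
    """Return (onset_time_s, upright_time_s); upright_time_s is None if not found."""
    n0 = int(quiet_win * fs)
    baseline, noise = vgrf[:n0].mean(), vgrf[:n0].std()
    above = np.abs(vgrf - baseline) > k_onset * noise
    onset = int(np.argmax(above))                 # first large deviation from baseline
    body_weight = vgrf[-n0:].mean()               # final quiet standing level
    for i in range(onset, len(vgrf) - n0):
        window = vgrf[i:i + n0]
        if np.abs(window - body_weight).max() < k_settle * noise + 0.02 * body_weight:
            return onset / fs, i / fs             # quiet upright posture reached
    return onset / fs, None
```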

  11. Density functional theory for field emission from carbon nano-structures.

    PubMed

    Li, Zhibing

    2015-12-01

    Electron field emission is understood as a quantum mechanical many-body problem in which an electronic quasi-particle of the emitter is converted into an electron in vacuum. Fundamental concepts of field emission, such as the field enhancement factor, work function, edge barrier and emission current density, will be investigated, using carbon nanotubes and graphene as examples. A multi-scale algorithm based on density functional theory is introduced. We will argue that such a first-principles approach is necessary and appropriate for field emission of nano-structures, not only for a more accurate quantitative description, but, more importantly, for deeper insight into field emission. Copyright © 2015 The Author. Published by Elsevier B.V. All rights reserved.
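
    For orientation, the elementary Fowler-Nordheim expression is the textbook starting point that such first-principles treatments aim to go beyond; the sketch below evaluates it with the standard constants (field in V/m, work function in eV), as a point of reference rather than as the paper's method.

```python
import math

# Reference sketch: elementary Fowler-Nordheim estimate of emission current
# density. Constants are the standard elementary-FN values; field F in V/m,
# work function phi in eV, current density J in A/m^2.
A_FN = 1.54e-6   # A eV V^-2
B_FN = 6.83e9    # eV^(-3/2) V m^-1

def fowler_nordheim_J(F, phi):
    return (A_FN * F**2 / phi) * math.exp(-B_FN * phi**1.5 / F)

# Example: assumed local field of 5 V/nm at an apex, work function ~4.6 eV.
print(f"J ~ {fowler_nordheim_J(5e9, 4.6):.2e} A/m^2")
```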

  12. Capturing Functional Independence Measure (FIM®) Ratings.

    PubMed

    Torres, Audrey

    The aim of the study was to identify interventions to capture admission functional independence measure (FIM®) ratings on the day of admission to an inpatient rehabilitation facility. A quantitative evidence-based practice quality improvement study utilizing descriptive statistics. Admission FIM® ratings from patients discharged in June 2012 (retrospective review) were compared to admission FIM® ratings from patients discharged in June 2014 (prospective review). The logic model was utilized to determine the project inputs, outputs, and outcomes. Interventions to capture admission FIM® ratings on the day of admission are essential to accurately predict the patient's burden of care, length of stay, and reimbursement. Waiting until Day 2 or Day 3 after admission to capture the admission FIM® assessment resulted in inflated admission FIM® ratings and suboptimal quality outcomes. Interventions to capture admission FIM® ratings on the day of admission were successful at improving the quality of care, length of stay efficiency, and accurately recording admission FIM® ratings to determine the patient's burden of care.

  13. Accurate description of charged excitations in molecular solids from embedded many-body perturbation theory

    NASA Astrophysics Data System (ADS)

    Li, Jing; D'Avino, Gabriele; Duchemin, Ivan; Beljonne, David; Blase, Xavier

    2018-01-01

    We present a novel hybrid quantum/classical approach to the calculation of charged excitations in molecular solids based on the many-body Green's function GW formalism. Molecules described at the GW level are embedded into the crystalline environment modeled with an accurate classical polarizable scheme. This allows the calculation of electron addition and removal energies in the bulk and at crystal surfaces where charged excitations are probed in photoelectron experiments. By considering the paradigmatic case of pentacene and perfluoropentacene crystals, we discuss the different contributions from intermolecular interactions to electronic energy levels, distinguishing between polarization, which is accounted for combining quantum and classical polarizabilities, and crystal field effects, that can impact energy levels by up to ±0.6 eV. After introducing band dispersion, we achieve quantitative agreement (within 0.2 eV) on the ionization potential and electron affinity measured at pentacene and perfluoropentacene crystal surfaces characterized by standing molecules.

  14. Extended Tersoff potential for boron nitride: Energetics and elastic properties of pristine and defective h -BN

    NASA Astrophysics Data System (ADS)

    Los, J. H.; Kroes, J. M. H.; Albe, K.; Gordillo, R. M.; Katsnelson, M. I.; Fasolino, A.

    2017-11-01

    We present an extended Tersoff potential for boron nitride (BN-ExTeP) for application in large scale atomistic simulations. BN-ExTeP accurately describes the main low energy B, N, and BN structures and yields quantitatively correct trends in the bonding as a function of coordination. The proposed extension of the bond order, added to improve the dependence of bonding on the chemical environment, leads to an accurate description of point defects in hexagonal BN (h-BN) and cubic BN (c-BN). We have implemented this potential in the molecular dynamics LAMMPS code and used it to determine some basic properties of pristine 2D h-BN and the elastic properties of defective h-BN as a function of defect density at zero temperature. Our results show that there is a strong correlation between the size of the static corrugation induced by the defects and the weakening of the in-plane elastic moduli.

  15. Rigorous numerical modeling of scattering-type scanning near-field optical microscopy and spectroscopy

    NASA Astrophysics Data System (ADS)

    Chen, Xinzhong; Lo, Chiu Fan Bowen; Zheng, William; Hu, Hai; Dai, Qing; Liu, Mengkun

    2017-11-01

    Over the last decade, scattering-type scanning near-field optical microscopy and spectroscopy have been widely used in nano-photonics and material research due to their fine spatial resolution and broad spectral range. A number of simplified analytical models have been proposed to quantitatively understand the tip-scattered near-field signal. However, a rigorous interpretation of the experimental results is still lacking at this stage. Numerical modeling, on the other hand, is mostly done by simulating the local electric field slightly above the sample surface, which only qualitatively represents the near-field signal rendered by the tip-sample interaction. In this work, we performed a more comprehensive numerical simulation which is based on realistic experimental parameters and signal extraction procedures. By directly comparing to the experiments as well as other simulation efforts, our methods offer a more accurate quantitative description of the near-field signal, paving the way for future studies of complex systems at the nanoscale.
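
    A central step in both s-SNOM experiments and realistic simulations of them is demodulating the scattered signal at harmonics of the tip tapping frequency to suppress background. The sketch below illustrates that generic step on a toy signal; it is not the specific extraction procedure of this paper, and all parameter values are assumed.

```python
import numpy as np

# Hedged sketch of harmonic demodulation, the generic background-suppression
# step in s-SNOM signal extraction (not this paper's specific procedure).

fs, f_tip = 200_000.0, 250.0                       # sampling rate and tapping frequency [Hz]
t = np.arange(0, 0.2, 1 / fs)
z = 25e-9 * (1 + np.cos(2 * np.pi * f_tip * t))    # tip height, 50 nm tapping amplitude

# Toy scattered signal: a strongly nonlinear near-field term plus a linear background.
signal = np.exp(-z / 20e-9) + 0.5 * (1 - z / 50e-9)

def demodulate(sig, n):
    """Complex lock-in amplitude of sig at the n-th harmonic of the tapping frequency."""
    ref = np.exp(-1j * 2 * np.pi * n * f_tip * t)
    return 2 * np.mean(sig * ref)

for n in (1, 2, 3):
    print(f"S_{n} = {abs(demodulate(signal, n)):.3e}")   # higher harmonics favor the near-field term
```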

  16. Energy level alignment at molecule-metal interfaces from an optimally tuned range-separated hybrid functional

    DOE PAGES

    Liu, Zhen-Fei; Egger, David A.; Refaely-Abramson, Sivan; ...

    2017-02-21

    The alignment of the frontier orbital energies of an adsorbed molecule with the substrate Fermi level at metal-organic interfaces is a fundamental observable of significant practical importance in nanoscience and beyond. Typical density functional theory calculations, especially those using local and semi-local functionals, often underestimate level alignment leading to inaccurate electronic structure and charge transport properties. Here, we develop a new fully self-consistent predictive scheme to accurately compute level alignment at certain classes of complex heterogeneous molecule-metal interfaces based on optimally tuned range-separated hybrid functionals. Starting from a highly accurate description of the gas-phase electronic structure, our method by construction captures important nonlocal surface polarization effects via tuning of the long-range screened exchange in a range-separated hybrid in a non-empirical and system-specific manner. We implement this functional in a plane-wave code and apply it to several physisorbed and chemisorbed molecule-metal interface systems. Our results are in quantitative agreement with experiments for both the level alignment and the work function changes. This approach constitutes a new practical scheme for accurate and efficient calculations of the electronic structure of molecule-metal interfaces.

  17. Energy level alignment at molecule-metal interfaces from an optimally tuned range-separated hybrid functional

    NASA Astrophysics Data System (ADS)

    Liu, Zhen-Fei; Egger, David A.; Refaely-Abramson, Sivan; Kronik, Leeor; Neaton, Jeffrey B.

    2017-03-01

    The alignment of the frontier orbital energies of an adsorbed molecule with the substrate Fermi level at metal-organic interfaces is a fundamental observable of significant practical importance in nanoscience and beyond. Typical density functional theory calculations, especially those using local and semi-local functionals, often underestimate level alignment leading to inaccurate electronic structure and charge transport properties. In this work, we develop a new fully self-consistent predictive scheme to accurately compute level alignment at certain classes of complex heterogeneous molecule-metal interfaces based on optimally tuned range-separated hybrid functionals. Starting from a highly accurate description of the gas-phase electronic structure, our method by construction captures important nonlocal surface polarization effects via tuning of the long-range screened exchange in a range-separated hybrid in a non-empirical and system-specific manner. We implement this functional in a plane-wave code and apply it to several physisorbed and chemisorbed molecule-metal interface systems. Our results are in quantitative agreement with experiments for both the level alignment and the work function changes. Our approach constitutes a new practical scheme for accurate and efficient calculations of the electronic structure of molecule-metal interfaces.

  18. Metabolite profiling of soy sauce using gas chromatography with time-of-flight mass spectrometry and analysis of correlation with quantitative descriptive analysis.

    PubMed

    Yamamoto, Shinya; Bamba, Takeshi; Sano, Atsushi; Kodama, Yukako; Imamura, Miho; Obata, Akio; Fukusaki, Eiichiro

    2012-08-01

    Soy sauces, produced from different ingredients and brewing processes, have variations in components and quality. Therefore, it is extremely important to comprehend the relationship between components and the sensory attributes of soy sauces. The current study sought to perform metabolite profiling in order to devise a method of assessing the attributes of soy sauces. Quantitative descriptive analysis (QDA) data for 24 soy sauce samples were obtained from well-selected sensory panelists. Metabolite profiles primarily concerning low-molecular-weight hydrophilic components were based on gas chromatography with time-of-flight mass spectrometry (GC/TOFMS). QDA data for soy sauces were accurately predicted by projection to latent structure (PLS), with metabolite profiles serving as explanatory variables and the QDA data set serving as the response variable. Moreover, analysis of correlation between matrices of metabolite profiles and QDA data indicated contributing compounds that were highly correlated with QDA data. In particular, sugars were indicated to be important components of the tastes of soy sauces. This new approach which combines metabolite profiling with QDA is applicable to analysis of sensory attributes of food as a result of the complex interaction between its components. This approach is effective for identifying important compounds that contribute to these attributes. Copyright © 2012 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
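
    The projection-to-latent-structure step can be illustrated with a generic PLS regression on synthetic data of the same shape (24 samples by many metabolite peaks); this is a sketch of the statistical idea only, not the soy sauce dataset or the authors' exact workflow.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hedged sketch of the PLS step: predict a sensory (QDA) attribute from
# metabolite profiles. The data below are synthetic placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(24, 150))        # 24 samples x 150 metabolite peak areas
true_w = rng.normal(size=(150, 1))
y = X @ true_w + rng.normal(scale=0.5, size=(24, 1))   # one QDA attribute

pls = PLSRegression(n_components=3)
pls.fit(X, y)
print("R^2 on training data:", round(pls.score(X, y), 3))
```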

  19. Reliable enumeration of malaria parasites in thick blood films using digital image analysis.

    PubMed

    Frean, John A

    2009-09-23

    Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. The requirements of a digital microscope camera, personal computer and good quality staining of slides are potentially reasonably easy to meet.
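
    The core of such a particle-counting approach is thresholding, labeling connected components, and filtering them by a size range derived from the parasite size distribution. The sketch below illustrates that idea with generic open-source tools; it is not the study's actual procedure, and the threshold and size limits are assumptions.

```python
import numpy as np
from scipy import ndimage

# Illustrative sketch (not the study's procedure): count dark-stained objects in
# a thick-film image after thresholding, keeping only particles whose area falls
# within an assumed parasite size range.

def count_parasites(gray, threshold, min_area, max_area):
    """gray: 2D array of pixel intensities; returns the number of accepted particles."""
    mask = gray < threshold                      # stained objects are darker
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.sum((sizes >= min_area) & (sizes <= max_area)))

# Example with a synthetic image containing two dark blobs of ~25 px each
img = np.full((100, 100), 200.0)
img[10:15, 10:15] = 50.0
img[40:45, 60:65] = 50.0
print(count_parasites(img, threshold=100, min_area=10, max_area=100))   # -> 2
```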

  20. Recommended procedures and techniques for the petrographic description of bituminous coals

    USGS Publications Warehouse

    Chao, E.C.T.; Minkin, J.A.; Thompson, C.L.

    1982-01-01

    Modern coal petrology requires rapid and precise description of great numbers of coal core or bench samples in order to acquire the information required to understand and predict vertical and lateral variation of coal quality for correlation with coal-bed thickness, depositional environment, suitability for technological uses, etc. Procedures for coal description vary in accordance with the objectives of the description. To achieve our aim of acquiring the maximum amount of quantitative information within the shortest period of time, we have adopted a combined megascopic-microscopic procedure. Megascopic analysis is used to identify the distinctive lithologies present, and microscopic analysis is required only to describe representative examples of the mixed lithologies observed. This procedure greatly decreases the number of microscopic analyses needed for adequate description of a sample. For quantitative megascopic description of coal microlithotypes, microlithotype assemblages, and lithotypes, we use (V) for vitrite or vitrain, (E) for liptite, (I) for inertite or fusain, (M) for mineral layers or lenses other than iron sulfide, (S) for iron sulfide, and (X1), (X2), etc. for mixed lithologies. Microscopic description is expressed in terms of V representing the vitrinite maceral group, E the exinite group, I the inertinite group, and M mineral components. Volume percentages are expressed as subscripts. Thus (V)20(V80E10I5M5)80 indicates a lithotype or assemblage of microlithotypes consisting of 20 vol. % vitrite and 80% of a mixed lithology having a modal maceral composition V80E10I5M5. This bulk composition can alternatively be recalculated and described as V84E8I4M4. To generate these quantitative data rapidly and accurately, we utilize an automated image analysis system (AIAS). Plots of VEIM data on easily constructed ternary diagrams provide readily comprehended illustrations of the range of modal composition of the lithologic units making up a given coal bed. The use of bulk-specific-gravity determinations is also recommended for identification and characterization of the distinctive lithologic units. The availability of an AIAS also enhances the capability to acquire textural information. Ranges of size of maceral and mineral grains can be quickly and precisely determined by use of an AIAS. We assume that shape characteristics of coal particles can also be readily evaluated by automated image analysis, although this evaluation has not yet been attempted in our laboratory. Definitive data on the particulate mineral content of coal constitute another important segment of petrographic description. Characterization of mineral content may be accomplished by optical identification, electron microprobe analysis, X-ray diffraction, and scanning and transmission electron microscopy. Individual mineral grains in place in polished blocks or polished thin sections, or separated from the coal matrix by sink-float methods are studied by analytical techniques appropriate to the conditions of sampling. Finally, whenever possible, identification of the probable genus or plant species from which a given coal component is derived will add valuable information and meaning to the petrographic description. © 1982.
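
    The recalculation from lithotype fractions to a bulk V/E/I/M composition is simple weighted arithmetic, as the short sketch below shows for the worked example in the abstract.

```python
# Worked sketch of the recalculation described above: combine megascopically
# estimated lithology fractions with the modal maceral composition of the
# mixed lithology to obtain a bulk V/E/I/M composition.

def bulk_composition(lithologies):
    """lithologies: list of (volume_fraction, {'V': %, 'E': %, 'I': %, 'M': %})."""
    bulk = {"V": 0.0, "E": 0.0, "I": 0.0, "M": 0.0}
    for frac, modal in lithologies:
        for key in bulk:
            bulk[key] += frac * modal.get(key, 0.0)
    return bulk

# (V)20(V80E10I5M5)80: 20% pure vitrite plus 80% of the mixed lithology
sample = [
    (0.20, {"V": 100}),
    (0.80, {"V": 80, "E": 10, "I": 5, "M": 5}),
]
print(bulk_composition(sample))   # {'V': 84.0, 'E': 8.0, 'I': 4.0, 'M': 4.0}
```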

  1. Deformation in Metallic Glass: Connecting Atoms to Continua

    NASA Astrophysics Data System (ADS)

    Hinkle, Adam R.; Falk, Michael L.; Rycroft, Chris H.; Shields, Michael D.

    Metallic glasses like other amorphous solids experience strain localization as the primary mode of failure. However, the development of continuum constitutive laws which provide a quantitative description of disorder and mechanical deformation remains an open challenge. Recent progress has shown the necessity of accurately capturing fluctuations in material structure, in particular the statistical changes in potential energy of the atomic constituents during the non-equilibrium process of applied shear. Here we directly cross-compare molecular dynamics shear simulations of a ZrCu glass with continuum shear transformation zone (STZ) theory representations. We present preliminary results for a methodology to coarse-grain detailed molecular dynamics data with the goal of initializing a continuum representation in the STZ theory. NSF Grants Awards 1107838, 1408685, and 0801471.

  2. Light-Cone and Diffusive Propagation of Correlations in a Many-Body Dissipative System.

    PubMed

    Bernier, Jean-Sébastien; Tan, Ryan; Bonnes, Lars; Guo, Chu; Poletti, Dario; Kollath, Corinna

    2018-01-12

    We analyze the propagation of correlations after a sudden interaction change in a strongly interacting quantum system in contact with an environment. In particular, we consider an interaction quench in the Bose-Hubbard model, deep within the Mott-insulating phase, under the effect of dephasing. We observe that dissipation effectively speeds up the propagation of single-particle correlations while reducing their coherence. In contrast, for two-point density correlations, the initial ballistic propagation regime gives way to diffusion at intermediate times. Numerical simulations, based on a time-dependent matrix product state algorithm, are supplemented by a quantitatively accurate fermionic quasiparticle approach providing an intuitive description of the initial dynamics in terms of holon and doublon excitations.

  3. A general method for bead-enhanced quantitation by flow cytometry

    PubMed Central

    Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.

    2009-01-01

    Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that addition to samples of known quantities of polystyrene fluorescence standardization beads permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
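
    The underlying arithmetic of bead-based absolute counting is a simple ratio: cells per volume follow from the cell-to-bead event ratio and the known bead spike. The sketch below shows the generic formula; the numbers are made up, not data from the study.

```python
# Minimal sketch of the bead-based absolute count arithmetic underlying
# bead-enhanced flow cytometry (generic formula, not the authors' protocol).

def absolute_concentration(cell_events, bead_events, beads_added, sample_volume_ul):
    """Cells per microliter, given counted cell and bead events and the known
    number of beads spiked into the sample volume."""
    beads_per_ul = beads_added / sample_volume_ul
    return cell_events / bead_events * beads_per_ul

# Example (made-up numbers): 12,000 CD4+ events, 4,000 bead events,
# 50,000 beads spiked into 100 uL of sample.
print(absolute_concentration(12_000, 4_000, 50_000, 100))   # 1500.0 cells/uL
```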

  4. Quantification of mitral valve morphology with three-dimensional echocardiography--can measurement lead to better management?

    PubMed

    Lee, Alex Pui-Wai; Fang, Fang; Jin, Chun-Na; Kam, Kevin Ka-Ho; Tsui, Gary K W; Wong, Kenneth K Y; Looi, Jen-Li; Wong, Randolph H L; Wan, Song; Sun, Jing Ping; Underwood, Malcolm J; Yu, Cheuk-Man

    2014-01-01

    The mitral valve (MV) has complex 3-dimensional (3D) morphology and motion. Advances in real-time 3D echocardiography (RT3DE) have revolutionized clinical imaging of the MV by providing clinicians with realistic visualization of the valve. Thus far, RT3DE of the MV structure and dynamics has adopted an approach that depends largely on subjective and qualitative interpretation of the 3D images of the valve, rather than objective and reproducible measurement. RT3DE combined with image-processing computer techniques provides precise segmentation and reliable quantification of the complex 3D morphology and rapid motion of the MV. This new approach to imaging may provide additional quantitative descriptions that are useful in diagnostic and therapeutic decision-making. Quantitative analysis of the MV using RT3DE has increased our understanding of the pathologic mechanism of degenerative, ischemic, functional, and rheumatic MV disease. Most recently, 3D morphologic quantification has entered into clinical use to provide more accurate diagnosis of MV disease and for planning surgery and transcatheter interventions. Current limitations of this quantitative approach to MV imaging include labor-intensiveness during image segmentation and lack of a clear definition of the clinical significance of many of the morphologic parameters. This review summarizes the current development and applications of quantitative analysis of the MV morphology using RT3DE.

  5. Management Approaches to Stomal and Peristomal Complications: A Narrative Descriptive Study.

    PubMed

    Beitz, Janice M; Colwell, Janice C

    2016-01-01

    The purpose of this study was to identify optimal interventions for selected complications based on WOC nurse experts' judgment/expertise. A cross-sectional quantitative descriptive design with qualitative, narrative-type components was used for this study. Following validation rating of appropriateness of interventions and quantitative rankings of first-, second-, and third-line approaches, participants provided substantive handwritten narrative comments about listed interventions. Comments were organized and prioritized using frequency count. Narrative comments reflected the quantitative rankings of efficacy of approaches. Clinicians offered further specific suggestions regarding product use and progression of care for selected complications. Narrative analysis using descriptive quantitative frequency count supported the rankings of most preferred treatments of selected stomal and peristomal complications. Findings add to the previous research on prioritized approaches and evidence-based practice in ostomy care.

  6. Quantitating Organoleptic Volatile Phenols in Smoke-Exposed Vitis vinifera Berries.

    PubMed

    Noestheden, Matthew; Thiessen, Katelyn; Dennis, Eric G; Tiet, Ben; Zandberg, Wesley F

    2017-09-27

    Accurate methods for quantitating volatile phenols (i.e., guaiacol, syringol, 4-ethylphenol, etc.) in smoke-exposed Vitis vinifera berries prior to fermentation are needed to predict the likelihood of perceptible smoke taint following vinification. Reported here is a complete, cross-validated analytical workflow to accurately quantitate free and glycosidically bound volatile phenols in smoke-exposed berries using liquid-liquid extraction, acid-mediated hydrolysis, and gas chromatography-tandem mass spectrometry. The reported workflow addresses critical gaps in existing methods for volatile phenols that impact quantitative accuracy, most notably the effect of injection port temperature and the variability in acid-mediated hydrolytic procedures currently used. Addressing these deficiencies will help the wine industry make accurate, informed decisions when producing wines from smoke-exposed berries.

  7. Semiconductor color-center structure and excitation spectra: Equation-of-motion coupled-cluster description of vacancy and transition-metal defect photoluminescence

    NASA Astrophysics Data System (ADS)

    Lutz, Jesse J.; Duan, Xiaofeng F.; Burggraf, Larry W.

    2018-03-01

    Valence excitation spectra are computed for deep-center silicon-vacancy defects in 3C, 4H, and 6H silicon carbide (SiC), and comparisons are made with literature photoluminescence measurements. Optimizations of nuclear geometries surrounding the defect centers are performed within a Gaussian basis-set framework using many-body perturbation theory or density functional theory (DFT) methods, with computational expenses minimized by a QM/MM technique called SIMOMM. Vertical excitation energies are subsequently obtained by applying excitation-energy, electron-attached, and ionized equation-of-motion coupled-cluster (EOMCC) methods, where appropriate, as well as time-dependent (TD) DFT, to small models including only a few atoms adjacent to the defect center. We consider the relative quality of various EOMCC and TD-DFT methods for (i) energy-ordering potential ground states differing incrementally in charge and multiplicity, (ii) accurately reproducing experimentally measured photoluminescence peaks, and (iii) energy-ordering defects of different types occurring within a given polytype. The extensibility of this approach to transition-metal defects is also tested by applying it to silicon-substituted chromium defects in SiC and comparing with measurements. It is demonstrated that, when used in conjunction with SIMOMM-optimized geometries, EOMCC-based methods can provide a reliable prediction of the ground-state charge and multiplicity, while also giving a quantitative description of the photoluminescence spectra, accurate to within 0.1 eV of measurement for all cases considered.

  8. Variety of geologic silhouette shapes distinguishable by multiple rotations method of quantitative shape analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, D.G.; Parks, J.M.

    1984-04-01

    Silhouette shapes are two-dimensional projections of three-dimensional objects such as sand grains, gravel, and fossils. Within-the-margin markings such as chamber boundaries, sutures, or ribs are ignored. Comparisons between populations of objects from similar and different origins (i.e., environments, species or genera, growth series, etc.) are aided by quantifying the shapes. The Multiple Rotations Method (MRM) uses a variation of "eigenshapes", which is capable of distinguishing most of the subtle variations that the "trained eye" can detect. With a video-digitizer and microcomputer, MRM is fast, more accurate, and more objective than the human eye. The resulting shape descriptors comprise 5 or 6 numbers per object that can be stored and retrieved to compare with similar descriptions of other objects. The original-shape outlines can be reconstituted sufficiently for gross recognition from these few numerical descriptors. Thus, a semi-automated data-retrieval system becomes feasible, with silhouette-shape descriptions as one of several recognition criteria. MRM consists of four "rotations": rotation about a center to a comparable orientation; a principal-components rotation to reduce the many original shape descriptors to a few; a VARIMAX orthogonal-factor rotation to achieve simple structure; and a rotation to achieve factor scores on individual objects. A variety of subtly different shapes includes sand grains from several locations, ages, and environments, and fossils of several types. This variety illustrates the feasibility of quantitative comparisons by MRM.
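
    The eigenshape portion of such an analysis can be illustrated with a principal-component decomposition of resampled outlines, as in the hedged sketch below; this is a generic illustration, not the published four-rotation procedure, and the example outlines are synthetic.

```python
import numpy as np

# Hedged sketch of the eigenshape idea (illustrative only): outlines are
# resampled, centered as a population, and reduced to a few principal-component
# scores that serve as numerical shape descriptors.

def shape_descriptors(outlines, n_components=6):
    """outlines: array of shape (n_shapes, n_points, 2) of resampled x,y outlines.
    Returns (scores, components): a few numbers per object plus the eigenshapes."""
    flat = outlines.reshape(len(outlines), -1)
    flat = flat - flat.mean(axis=0)                 # center the population
    u, s, vt = np.linalg.svd(flat, full_matrices=False)
    scores = u[:, :n_components] * s[:n_components]
    return scores, vt[:n_components]

# Example: 30 noisy elliptical outlines sampled at 64 points each
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
rng = np.random.default_rng(1)
outlines = np.stack([
    np.column_stack((np.cos(theta) * rng.uniform(1, 2), np.sin(theta)))
    + rng.normal(scale=0.02, size=(64, 2))
    for _ in range(30)
])
scores, eigenshapes = shape_descriptors(outlines)
print(scores.shape)   # (30, 6): six descriptors per object
```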

  9. Beyond CCT: The spectral index system as a tool for the objective, quantitative characterization of lamps

    NASA Astrophysics Data System (ADS)

    Galadí-Enríquez, D.

    2018-02-01

    Correlated color temperature (CCT) is a semi-quantitative system that roughly describes the spectra of lamps. This parameter gives the temperature (measured in kelvins) of the black body that would show the hue most similar to that of the light emitted by the lamp. Modern lamps for indoor and outdoor lighting display many spectral energy distributions, most of them extremely different from those of black bodies, which makes CCT far from a perfect descriptor from the physical point of view. The spectral index system presented in this work provides an accurate, objective, quantitative procedure to characterize the spectral properties of lamps, with just a few numbers. The system is an adaptation to lighting technology of the classical procedures of multi-band astronomical photometry with wide and intermediate-band filters. We describe the basic concepts and we apply the system to a representative set of lamps of many kinds. The results lead to interesting, sometimes surprising conclusions. The spectral index system is extremely easy to implement from the spectral data that are routinely measured at laboratories. Thus, including this kind of computation in the standard protocols for the certification of lamps will be straightforward, and will enrich the technical description of lighting devices.
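
    In the spirit of synthetic multi-band photometry, a spectral index can be computed by integrating a lamp's spectral power distribution through band filters and taking a magnitude-like ratio. The sketch below uses hypothetical rectangular bands and a toy spectrum; it is not the band system defined in the paper.

```python
import numpy as np

# Hedged sketch of a spectral-index computation in the spirit of multi-band
# synthetic photometry. Bands and the example spectrum are placeholders.

wl = np.arange(380, 781, 1.0)                       # wavelength grid [nm]
spd = np.exp(-0.5 * ((wl - 600) / 60.0) ** 2)       # toy lamp spectrum (warm-ish)

def band_flux(wl, spd, lo, hi):
    """Mean spectral power inside a rectangular band from lo to hi nm."""
    sel = (wl >= lo) & (wl <= hi)
    return np.trapz(spd[sel], wl[sel]) / (hi - lo)

# A 'blue minus red' index in magnitude-like units (hypothetical bands)
blue = band_flux(wl, spd, 400, 500)
red = band_flux(wl, spd, 600, 700)
index_br = -2.5 * np.log10(blue / red)
print(f"B-R spectral index: {index_br:.2f} (larger = redder lamp)")
```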

  10. From in silica to in silico: retention thermodynamics at solid-liquid interfaces.

    PubMed

    El Hage, Krystel; Bemish, Raymond J; Meuwly, Markus

    2018-06-28

    The dynamics of solvated molecules at the solid/liquid interface is essential for a molecular-level understanding of the solution thermodynamics in reversed phase liquid chromatography (RPLC). The heterogeneous nature of the systems and the competing intermolecular interactions make solute retention in RPLC a surprisingly challenging problem which benefits greatly from modelling at atomistic resolution. However, the underlying computational model needs to be sufficiently accurate to provide a realistic description of the energetics and dynamics of the systems, especially for solution-phase simulations. Here, the retention thermodynamics and the retention mechanism of a range of benzene derivatives in C18 stationary-phase chains in contact with water/methanol mixtures are studied using point charge (PC) and multipole (MTP) electrostatic models. The results demonstrate that free energy simulations with a faithful MTP representation of the computational model provide quantitative and molecular level insight into the thermodynamics of adsorption/desorption in chromatographic systems while a conventional PC representation fails in doing so. This provides a rational basis to develop more quantitative and validated models for the optimization of separation systems.

  11. Topex/Poseidon: A United States/France mission. Oceanography from space: The oceans and climate

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The TOPEX/POSEIDON space mission, sponsored by NASA and France's space agency, the Centre National d'Etudes Spatiales (CNES), will give new observations of the Earth from space to gain a quantitative understanding of the role of ocean currents in climate change. Rising atmospheric concentrations of carbon dioxide and other 'greenhouse gases' produced as a result of human activities could generate a global warming, followed by an associated rise in sea level. The satellite will use radar altimetry to measure sea-surface height and will be tracked by three independent systems to yield accurate topographic maps over the dimensions of entire ocean basins. The satellite data, together with the Tropical Ocean and Global Atmosphere (TOGA) program and the World Ocean Circulation Experiment (WOCE) measurements, will be analyzed by an international scientific team. By merging the satellite observations with TOGA and WOCE findings, the scientists will establish the extensive data base needed for the quantitative description and computer modeling of ocean circulation. The ocean models will eventually be coupled with atmospheric models to lay the foundation for predictions of global climate change.

  12. Integrated experimental and theoretical approach for the structural characterization of Hg2+ aqueous solutions

    NASA Astrophysics Data System (ADS)

    D'Angelo, Paola; Migliorati, Valentina; Mancini, Giordano; Barone, Vincenzo; Chillemi, Giovanni

    2008-02-01

    The structural and dynamic properties of the solvated Hg2+ ion in aqueous solution have been investigated by a combined experimental-theoretical approach employing x-ray absorption spectroscopy and molecular dynamics (MD) simulations. This method allows one to perform a quantitative analysis of the x-ray absorption near-edge structure (XANES) spectra of ionic solutions using a proper description of the thermal and structural fluctuations. XANES spectra have been computed starting from the MD trajectory, without carrying out any minimization in the structural parameter space. The XANES experimental data are accurately reproduced by a first-shell heptacoordinated cluster only if the second hydration shell is included in the calculations. These results confirm at the same time the existence of a sevenfold first hydration shell for the Hg2+ ion in aqueous solution and the reliability of the potentials used in the MD simulations. The combination of MD and XANES is found to be very helpful to get important new insights into the quantitative estimation of structural properties of disordered systems.

  13. Leadership Styles at Middle- and Early-College Programs: A Quantitative Descriptive Correlational Study

    ERIC Educational Resources Information Center

    Berksteiner, Earl J.

    2013-01-01

    The purpose of this quantitative descriptive correlational study was to determine if associations existed between middle- and early-college (MEC) principals' leadership styles, teacher motivation, and teacher satisfaction. MEC programs were programs designed to assist high school students who were not served well in a traditional setting (Middle…

  14. A Novel Approach to Teach the Generation of Bioelectrical Potentials from a Descriptive and Quantitative Perspective

    ERIC Educational Resources Information Center

    Rodriguez-Falces, Javier

    2013-01-01

    In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials that results from the propagation of the excitation source along the muscle fiber, are…

  15. Sequentially Simulated Outcomes: Kind Experience versus Nontransparent Description

    ERIC Educational Resources Information Center

    Hogarth, Robin M.; Soyer, Emre

    2011-01-01

    Recently, researchers have investigated differences in decision making based on description and experience. We address the issue of when experience-based judgments of probability are more accurate than are those based on description. If description is well understood ("transparent") and experience is misleading ("wicked"), it…

  16. Specific material recognition by small peptides mediated by the interfacial solvent structure.

    PubMed

    Schneider, Julian; Ciacchi, Lucio Colombi

    2012-02-01

    We present evidence that specific material recognition by small peptides is governed by local solvent density variations at solid/liquid interfaces, sensed by the side-chain residues with atomic-scale precision. In particular, we unveil the origin of the selectivity of the binding motif RKLPDA for Ti over Si using a combination of metadynamics and steered molecular dynamics simulations, obtaining adsorption free energies and adhesion forces in quantitative agreement with corresponding experiments. For an accurate description, we employ realistic models of the natively oxidized surfaces which go beyond the commonly used perfect crystal surfaces. These results have profound implications for nanotechnology and materials science applications, offering a previously missing structure-function relationship for the rational design of materials-selective peptide sequences. © 2011 American Chemical Society

  17. Systematic uncertainties in long-baseline neutrino-oscillation experiments

    NASA Astrophysics Data System (ADS)

    Ankowski, Artur M.; Mariani, Camillo

    2017-05-01

    Future neutrino-oscillation experiments are expected to bring definite answers to the questions of neutrino-mass hierarchy and violation of charge-parity symmetry in the lepton-sector. To realize this ambitious program it is necessary to ensure a significant reduction of uncertainties, particularly those related to neutrino-energy reconstruction. In this paper, we discuss different sources of systematic uncertainties, paying special attention to those arising from nuclear effects and detector response. By analyzing nuclear effects we show the importance of developing accurate theoretical models, capable of providing a quantitative description of neutrino cross sections, together with the relevance of their implementation in Monte Carlo generators and extensive testing against lepton-scattering data. We also point out the fundamental role of efforts aiming to determine detector responses in test-beam exposures.

  18. Transforming Verbal Counts in Reports of Qualitative Descriptive Studies Into Numbers

    PubMed Central

    Chang, YunKyung; Voils, Corrine I.; Sandelowski, Margarete; Hasselblad, Vic; Crandell, Jamie L.

    2009-01-01

    Reports of qualitative studies typically do not offer much information on the numbers of respondents linked to any one finding. This information may be especially useful in reports of basic, or minimally interpretive, qualitative descriptive studies focused on surveying a range of experiences in a target domain, and its lack may limit the ability to synthesize the results of such studies with quantitative results in systematic reviews. Accordingly, the authors illustrate strategies for deriving plausible ranges of respondents expressing a finding in a set of reports of basic qualitative descriptive studies on antiretroviral adherence and suggest how the results might be used. These strategies have limitations and are never appropriate for use with findings from interpretive qualitative studies. Yet they offer a temporary workaround for preserving and maximizing the value of information from basic qualitative descriptive studies for systematic reviews. They show also why quantitizing is never simply quantitative. PMID:19448052

  19. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers an interesting look into how medieval scholars viewed the subjects that we study.Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  20. Local structure in LaMnO3 and CaMnO3 perovskites: A quantitative structural refinement of Mn K-edge XANES data

    NASA Astrophysics Data System (ADS)

    Monesi, C.; Meneghini, C.; Bardelli, F.; Benfatto, M.; Mobilio, S.; Manju, U.; Sarma, D. D.

    2005-11-01

    Hole-doped perovskites such as La1-xCaxMnO3 present special magnetic and magnetotransport properties, and it is commonly accepted that the local atomic structure around Mn ions plays a crucial role in determining these peculiar features. Therefore experimental techniques directly probing the local atomic structure, like x-ray absorption spectroscopy (XAS), have been widely exploited to deeply understand the physics of these compounds. Quantitative XAS analysis usually concerns the extended region [extended x-ray absorption fine structure (EXAFS)] of the absorption spectra. The near-edge region [x-ray absorption near-edge spectroscopy (XANES)] of XAS spectra can provide detailed complementary information on the electronic structure and local atomic topology around the absorber. However, the complexity of the XANES analysis usually prevents a quantitative understanding of the data. This work exploits the recently developed MXAN code to achieve a quantitative structural refinement of the Mn K-edge XANES of LaMnO3 and CaMnO3 compounds; they are the end compounds of the doped manganite series La1-xCaxMnO3. The results derived from the EXAFS and XANES analyses are in good agreement, demonstrating that a quantitative picture of the local structure can be obtained from XANES in these crystalline compounds. Moreover, the quantitative XANES analysis provides topological information not directly achievable from EXAFS data analysis. This work demonstrates that combining the analysis of extended and near-edge regions of Mn K-edge XAS spectra could provide a complete and accurate description of Mn local atomic environment in these compounds.

  1. Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.

    ERIC Educational Resources Information Center

    Moffat, A. J.; And Others

    Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…

  2. Descriptive and numeric estimation of risk for psychotic disorders among affected individuals and relatives: Implications for clinical practice

    PubMed Central

    Austin, Jehannine C.; Hippman, Catriona; Honer, William G.

    2013-01-01

    Studies show that individuals with psychotic illnesses and their families want information about psychosis risks for other relatives. However, deriving accurate numeric probabilities for psychosis risk is challenging, and people have difficulty interpreting probabilistic information, thus some have suggested that clinicians should use risk descriptors, such as ‘moderate’ or ‘quite high’, rather than numbers. Little is known about how individuals with psychosis and their family members use quantitative and qualitative descriptors of risk in the specific context of chance for an individual to develop psychosis. We explored numeric and descriptive estimations of psychosis risk among individuals with psychotic disorders and unaffected first-degree relatives. In an online survey, respondents numerically and descriptively estimated risk for an individual to develop psychosis in scenarios where they had: A) no affected family members; and B) an affected sibling. 219 affected individuals and 211 first-degree relatives participated. Affected individuals estimated significantly higher risks than relatives. Participants attributed all descriptors between “very low” and “very high” to probabilities of 1%, 10%, 25% and 50%+. For a given numeric probability, different risk descriptors were attributed in different scenarios. Clinically, brief interventions around risk (using either probabilities or descriptors alone) are vulnerable to miscommunication and potentially profoundly negative consequences; interventions around risk are best suited to in-depth discussion. PMID:22421074

  3. Descriptive and numeric estimation of risk for psychotic disorders among affected individuals and relatives: implications for clinical practice.

    PubMed

    Austin, Jehannine C; Hippman, Catriona; Honer, William G

    2012-03-30

    Studies show that individuals with psychotic illnesses and their families want information about psychosis risks for other relatives. However, deriving accurate numeric probabilities for psychosis risk is challenging, and people have difficulty interpreting probabilistic information; thus, some have suggested that clinicians should use risk descriptors, such as "moderate" or "quite high", rather than numbers. Little is known about how individuals with psychosis and their family members use quantitative and qualitative descriptors of risk in the specific context of chance for an individual to develop psychosis. We explored numeric and descriptive estimations of psychosis risk among individuals with psychotic disorders and unaffected first-degree relatives. In an online survey, respondents numerically and descriptively estimated risk for an individual to develop psychosis in scenarios where they had: A) no affected family members; and B) an affected sibling. Participants comprised 219 affected individuals and 211 first-degree relatives. Affected individuals estimated significantly higher risks than relatives. Participants attributed all descriptors between "very low" and "very high" to probabilities of 1%, 10%, 25% and 50%+. For a given numeric probability, different risk descriptors were attributed in different scenarios. Clinically, brief interventions around risk (using either probabilities or descriptors alone) are vulnerable to miscommunication and potentially negative consequences; interventions around risk are best suited to in-depth discussion. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Global Seabed Materials and Habitats Mapped: The Computational Methods

    NASA Astrophysics Data System (ADS)

    Jenkins, C. J.

    2016-02-01

    What the seabed is made of has proven difficult to map on the scale of whole ocean-basins. Direct sampling and observation can be augmented with proxy-parameter methods such as acoustics. Both avenues are essential to obtain enough detail and coverage, and also to validate the mapping methods. We focus on the direct observations, such as samplings, photo and video, probes, diver and sub reports, and surveyed features. These are often in word-descriptive form: over 85% of the records for site materials are in this form, whether as sample/view descriptions or classifications, or described parameters such as consolidation, color, odor, structures and components. Descriptions are absolutely necessary for unusual materials and for processes - in other words, for research. The dbSEABED project not only holds the largest collection of seafloor materials data worldwide, but also uses advanced computational methods to obtain the best possible coverages and detail. Included in those techniques are linguistic text analysis (e.g., Natural Language Processing, NLP), fuzzy set theory (FST), and machine learning (ML, e.g., Random Forest). These techniques allow efficient and accurate import of huge datasets, thereby optimizing the data that exists. They merge quantitative and qualitative types of data for rich parameter sets, and extrapolate where the data are sparse for best map production. The dbSEABED data resources are now very widely used worldwide in oceanographic research, environmental management, the geosciences, engineering and survey.
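
    To make the word-to-data pipeline concrete, the following minimal Python sketch shows one way a word-based seabed description could be turned into fuzzy memberships for a few material classes. The lexicon, class names and membership values are invented for illustration and are not dbSEABED's actual vocabulary or code.

```python
# Minimal sketch (not dbSEABED's actual code): turning a word-based seabed
# description into fuzzy memberships for a few material classes.
# The lexicon and membership values below are illustrative assumptions.
LEXICON = {
    "mud":    {"mud": 1.0},
    "muddy":  {"mud": 0.6},
    "sand":   {"sand": 1.0},
    "sandy":  {"sand": 0.6},
    "gravel": {"gravel": 1.0},
    "shelly": {"carbonate": 0.5},
}

def fuzzy_memberships(description: str) -> dict:
    """Accumulate fuzzy memberships from the words of a description."""
    scores = {}
    for word in description.lower().replace(",", " ").split():
        for material, mu in LEXICON.get(word, {}).items():
            scores[material] = max(scores.get(material, 0.0), mu)
    total = sum(scores.values()) or 1.0
    return {m: round(v / total, 2) for m, v in scores.items()}  # normalise

print(fuzzy_memberships("muddy sand, shelly"))
# {'mud': 0.29, 'sand': 0.48, 'carbonate': 0.24}
```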

  5. Reporting guidelines for primary research: Saying what you did.

    PubMed

    O'Connor, Annette

    2010-12-01

    Reporting guidelines aim to facilitate publication of a full and accurate description of research conducted. The motivations for a full and accurate description of research are to enable reproduction of the study, assessment of bias, extraction of data from the study, and to fulfill an ethical obligation to maximize the utility of research findings. Many reporting guidelines exist and most are based on a specific study design such as randomized controlled trials (CONSORT statement) and observational studies (STROBE statement). The REFLECT statement focuses on randomized controlled trials in livestock and food safety studies. The REFLECT statement has increased emphasis on conveying information about animal housing, group level allocation and challenge studies. Guidelines can be used by authors, reviewers and editors to provide readers with a full and accurate description of the work conducted. Copyright © 2010 Elsevier B.V. All rights reserved.

  6. Quantitative Near-field Microscopy of Heterogeneous and Correlated Electron Oxides

    NASA Astrophysics Data System (ADS)

    McLeod, Alexander Swinton

    Scanning near-field optical microscopy (SNOM) is a novel scanning probe microscopy technique capable of circumventing the conventional diffraction limit of light, affording unparalleled optical resolution (down to 10 nanometers) even for radiation in the infrared and terahertz energy regimes, with light wavelengths exceeding 10 micrometers. However, although this technique has been developed and employed for more than a decade to qualitatively impressive effect, researchers have lacked a practically quantitative grasp of its capabilities, and its application scope has so far remained restricted by implementations limited to ambient atmospheric conditions. The two-fold objective of this dissertation work has been to address both these shortcomings. The first half of the dissertation presents a realistic, semi-analytic, and benchmarked theoretical description of probe-sample near-field interactions that form the basis of SNOM. Owing its name to the efficient nano-focusing of light at a sharp metallic apex, the "lightning rod model" of probe-sample near-field interactions is mathematically developed from a flexible and realistic scattering formalism. Powerful and practical applications are demonstrated through the accurate prediction of spectroscopic near-field optical contrasts, as well as the "inversion" of these spectroscopic contrasts into a quantitative description of material optical properties. Thus enabled, this thesis work proceeds to present quantitative applications of infrared near-field spectroscopy to investigate nano-resolved chemical compositions in a diverse host of samples, including technologically relevant lithium ion battery materials, astrophysical planetary materials, and invaluable returned extraterrestrial samples. The second half of the dissertation presents the design, construction, and demonstration of a sophisticated low-temperature scanning near-field infrared microscope. This instrument operates in an ultra-high vacuum environment suitable for the investigation of nano-scale physics in correlated electron matter at cryogenic temperatures, thus vastly expanding the scope of applications for infrared SNOM. Performance of the microscope is demonstrated through quantitative exploration of the canonical insulator-metal transition occurring in the correlated electron insulator V2O3. The methodology established for this investigation provides a model for ongoing and future nano-optical studies of phase transitions and phase coexistence in correlated electron oxides.

  7. Soft Biometrics; Human Identification Using Comparative Descriptions.

    PubMed

    Reid, Daniel A; Nixon, Mark S; Stevenage, Sarah V

    2014-06-01

    Soft biometrics are a new form of biometric identification which use physical or behavioral traits that can be naturally described by humans. Unlike other biometric approaches, this allows identification based solely on verbal descriptions, bridging the semantic gap between biometrics and human description. To permit soft biometric identification, the description must be accurate, yet conventional human descriptions comprising absolute labels and estimations are often unreliable. A novel method of obtaining human descriptions will be introduced which utilizes comparative categorical labels to describe differences between subjects. This innovative approach has been shown to address many problems associated with absolute categorical labels - most critically, the descriptions contain more objective information and have increased discriminatory capabilities. Relative measurements of the subjects' traits can be inferred from comparative human descriptions using the Elo rating system. The resulting soft biometric signatures have been demonstrated to be robust and allow accurate recognition of subjects. Relative measurements can also be obtained from other forms of human representation. This is demonstrated using a support vector machine to determine relative measurements from gait biometric signatures - allowing retrieval of subjects from video footage by using human comparisons, bridging the semantic gap.
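
    The Elo mechanism referred to above is the standard chess-rating update. The Python sketch below shows how pairwise comparisons (e.g., "subject A is taller than subject B") can be accumulated into relative trait scores; the K-factor of 32 and the initial rating of 1500 are conventional defaults, not values taken from the paper.

```python
# Sketch of the Elo update used to turn pairwise comparisons into relative
# trait measurements; K = 32 and the 1500 starting rating are conventions.
def elo_update(r_a: float, r_b: float, outcome_a: float, k: float = 32.0):
    """outcome_a = 1.0 if A wins the comparison, 0.0 if B wins, 0.5 for a tie."""
    expected_a = 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))
    r_a_new = r_a + k * (outcome_a - expected_a)
    r_b_new = r_b + k * ((1.0 - outcome_a) - (1.0 - expected_a))
    return r_a_new, r_b_new

ratings = {"subj1": 1500.0, "subj2": 1500.0}
# one annotator judged subj1 "taller than" subj2:
ratings["subj1"], ratings["subj2"] = elo_update(ratings["subj1"], ratings["subj2"], 1.0)
print(ratings)  # subj1 drifts above 1500, subj2 below
```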

  8. Three-Dimensional Photography for Quantitative Assessment of Penile Volume-Loss Deformities in Peyronie's Disease.

    PubMed

    Margolin, Ezra J; Mlynarczyk, Carrie M; Mulhall, John P; Stember, Doron S; Stahl, Peter J

    2017-06-01

    Non-curvature penile deformities are prevalent and bothersome manifestations of Peyronie's disease (PD), but the quantitative metrics that are currently used to describe these deformities are inadequate and non-standardized, presenting a barrier to clinical research and patient care. To introduce erect penile volume (EPV) and percentage of erect penile volume loss (percent EPVL) as novel metrics that provide detailed quantitative information about non-curvature penile deformities and to study the feasibility and reliability of three-dimensional (3D) photography for measurement of quantitative penile parameters. We constructed seven penis models simulating deformities found in PD. The 3D photographs of each model were captured in triplicate by four observers using a 3D camera. Computer software was used to generate automated measurements of EPV, percent EPVL, penile length, minimum circumference, maximum circumference, and angle of curvature. The automated measurements were statistically compared with measurements obtained using water-displacement experiments, a tape measure, and a goniometer. Accuracy of 3D photography for average measurements of all parameters compared with manual measurements; inter-test, intra-observer, and inter-observer reliabilities of EPV and percent EPVL measurements as assessed by the intraclass correlation coefficient. The 3D images were captured in a median of 52 seconds (interquartile range = 45-61). On average, 3D photography was accurate to within 0.3% for measurement of penile length. It overestimated maximum and minimum circumferences by averages of 4.2% and 1.6%, respectively; overestimated EPV by an average of 7.1%; and underestimated percent EPVL by an average of 1.9%. All inter-test, inter-observer, and intra-observer intraclass correlation coefficients for EPV and percent EPVL measurements were greater than 0.75, reflective of excellent methodologic reliability. By providing highly descriptive and reliable measurements of penile parameters, 3D photography can empower researchers to better study volume-loss deformities in PD and enable clinicians to offer improved clinical assessment, communication, and documentation. This is the first study to apply 3D photography to the assessment of PD and to accurately measure the novel parameters of EPV and percent EPVL. This proof-of-concept study is limited by the lack of data in human subjects, which could present additional challenges in obtaining reliable measurements. EPV and percent EPVL are novel metrics that can be quickly, accurately, and reliably measured using computational analysis of 3D photographs and can be useful in describing non-curvature volume-loss deformities resulting from PD. Margolin EJ, Mlynarczyk CM, Mulhall JP, et al. Three-Dimensional Photography for Quantitative Assessment of Penile Volume-Loss Deformities in Peyronie's Disease. J Sex Med 2017;14:829-833. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  9. Impact of reconstruction parameters on quantitative I-131 SPECT

    NASA Astrophysics Data System (ADS)

    van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.

    2016-07-01

    Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high-energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods on these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs as well as (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast to noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be <1%, -26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR modelling is the most robust and reliable method to reconstruct accurate quantitative iodine-131 SPECT images.
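
    For reference, the triple energy window (TEW) scatter correction mentioned above estimates the scatter counts inside the photopeak window from two narrow flanking windows using a trapezoidal approximation. The sketch below illustrates the formula for a single projection bin; the counts and window widths are invented, illustrative numbers rather than the acquisition settings used in the study.

```python
# Sketch of triple-energy-window (TEW) scatter correction for one projection
# bin: scatter in the photopeak window is estimated from two narrow windows
# flanking it (trapezoidal approximation). Numbers are illustrative only.
def tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_peak):
    return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

c_peak, c_lower, c_upper = 1200.0, 30.0, 20.0   # counts in the three windows
w_peak, w_lower, w_upper = 72.8, 6.0, 6.0       # widths in keV (364 keV +/- 10% photopeak)

scatter = tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_peak)
primary = max(c_peak - scatter, 0.0)            # scatter-corrected photopeak counts
print(f"estimated scatter: {scatter:.1f}, primary counts: {primary:.1f}")
```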

  10. Fractal interrelationships in field and seismic data. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-01-07

    Fractals provide a description of physical patterns over a range of scales in both time and space. Studies presented herein examine the fractal characteristics of various geological variables such as deformed bed-lengths, fold relief, seismic reflection arrival time variations, drainage and topographic patterns, and fracture systems. The studies are also extended to consider the possibility that the fractal characteristics of these variables are interrelated. Fractal interrelationships observed in these studies provide a method for relating variations in the fractal characteristics of seismic reflection events from reservoir intervals to the fractal characteristics of reservoir fracture systems, faults, and fold distributions. The work is motivated by current exploration and development interests to detect fractured reservoirs and to accurately predict flow rates and flow patterns within the fractured reservoir. Accurate prediction requires an understanding of several reservoir properties including the fractal geometry of the reservoir fracture network. Results of these studies provide a method to remotely assess the fractal characteristics of a fractured reservoir, and help guide field development activities. The most significant outgrowth of this research is that the fractal properties of structural relief inferred from seismic data and structural cross sections provide a quantitative means to characterize and compare complex structural patterns. Production from fractured reservoirs is the result of complex structural and stratigraphic controls; hence, the import of fractal characterization to the assessment of fractured reservoirs lies in its potential to quantitatively define interrelationships between subtle structural variation and production. The potential uses are illustrated using seismic data from the Granny Creek oil field in the Appalachian Plateau.

  11. 3D visualization and quantification of bone and teeth mineralization for the study of osteo/dentinogenesis in mice models

    NASA Astrophysics Data System (ADS)

    Marchadier, A.; Vidal, C.; Ordureau, S.; Lédée, R.; Léger, C.; Young, M.; Goldberg, M.

    2011-03-01

    Research on bone and teeth mineralization in animal models is critical for understanding human pathologies. Genetically modified mice represent highly valuable models for the study of osteo/dentinogenesis defects and osteoporosis. Current investigations on mice dental and skeletal phenotype use destructive and time consuming methods such as histology and scanning microscopy. Micro-CT imaging is quicker and provides high resolution qualitative phenotypic description. However, reliable quantification of mineralization processes in mouse bone and teeth is still lacking. We have established novel CT imaging-based software for accurate qualitative and quantitative analysis of mouse mandibular bone and molars. Data were obtained from mandibles of mice lacking the Fibromodulin gene which is involved in mineralization processes. Mandibles were imaged with a micro-CT originally devoted to industrial applications (Viscom, X8060 NDT). 3D advanced visualization was performed using the VoxBox software (UsefulProgress) with ray casting algorithms. Comparison between control and defective mice mandibles was made by applying the same transfer function for each 3D data set, thus allowing shape, colour and density discrepancies to be detected. The 2D images of transverse slices of mandible and teeth were similar and even more accurate than those obtained with scanning electron microscopy. Image processing of the molars allowed the 3D reconstruction of the pulp chamber, providing a unique tool for the quantitative evaluation of dentinogenesis. This new method is highly powerful for the study of oro-facial mineralization defects in mice models, complementary and even competitive to current histological and scanning microscopy approaches.

  12. Protein Folding Free Energy Landscape along the Committor - the Optimal Folding Coordinate.

    PubMed

    Krivov, Sergei V

    2018-06-06

    Recent advances in simulation and experiment have led to dramatic increases in the quantity and complexity of produced data, which makes the development of automated analysis tools very important. A powerful approach to analyze dynamics contained in such data sets is to describe/approximate it by diffusion on a free energy landscape - free energy as a function of reaction coordinates (RC). For the description to be quantitatively accurate, RCs should be chosen in an optimal way. Recent theoretical results show that such an optimal RC exists; however, determining it for practical systems is a very difficult unsolved problem. Here we describe a solution to this problem. We describe an adaptive nonparametric approach to accurately determine the optimal RC (the committor) for an equilibrium trajectory of a realistic system. In contrast to alternative approaches, which require a functional form with many parameters to approximate an RC and thus extensive expertise with the system, the suggested approach is nonparametric and can approximate any RC with high accuracy without system specific information. To avoid overfitting for a realistically sampled system, the approach performs RC optimization in an adaptive manner by focusing optimization on less optimized spatiotemporal regions of the RC. The power of the approach is illustrated on a long equilibrium atomistic folding simulation of HP35 protein. We have determined the optimal folding RC - the committor, which was confirmed by passing a stringent committor validation test. It allowed us to determine the first quantitatively accurate protein folding free energy landscape. We have confirmed the recent theoretical results that diffusion on such a free energy profile can be used to compute exactly the equilibrium flux, the mean first passage times, and the mean transition path times between any two points on the profile. We have shown that the mean squared displacement along the optimal RC grows linearly with time as for simple diffusion. The free energy profile allowed us to obtain a direct rigorous estimate of the pre-exponential factor for the folding dynamics.
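
    As a minimal illustration of the free-energy-landscape idea (not the paper's adaptive nonparametric method), the sketch below takes a one-dimensional time series r(t) standing in for a reaction coordinate, estimates F(r) = -kT ln p(r) from a histogram, and checks whether the mean squared displacement along r grows roughly linearly with lag time, as expected for simple diffusion along a well-chosen coordinate. The surrogate trajectory is invented for illustration.

```python
# Minimal sketch: free-energy profile along a reaction coordinate and a
# linear-MSD check. The bounded random-walk trajectory is only a stand-in.
import numpy as np

rng = np.random.default_rng(0)
r = np.tanh(np.cumsum(rng.normal(size=100_000)) * 0.01)   # surrogate coordinate

counts, edges = np.histogram(r, bins=60, density=True)
centers = 0.5 * (edges[1:] + edges[:-1])
kT = 1.0                                                  # energies in units of kT
F = -kT * np.log(np.where(counts > 0, counts, np.nan))
F -= np.nanmin(F)                                         # set the minimum to zero

lags = np.array([1, 2, 4, 8, 16, 32])
msd = np.array([np.mean((r[lag:] - r[:-lag]) ** 2) for lag in lags])
print(np.round(F[25:35], 2))        # central part of the free-energy profile
print(np.round(msd / lags, 5))      # roughly constant ratio => near-linear growth
```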

  13. [Applications of 2D and 3D landscape pattern indices in landscape pattern analysis of mountainous area at county level].

    PubMed

    Lu, Chao; Qi, Wei; Li, Le; Sun, Yao; Qin, Tian-Tian; Wang, Na-Na

    2012-05-01

    Landscape pattern indices are the commonly used tools for the quantitative analysis of landscape pattern. However, the traditional 2D landscape pattern indices neglect the effects of terrain on landscape and have definite limitations in quantitatively describing the landscape patterns in mountainous areas. Taking Qixia City, a typical mountainous and hilly region in Shandong Province of East China, as a case, this paper compared the differences between 2D and 3D landscape pattern indices in quantitatively describing the landscape patterns and their dynamic changes in mountainous areas. On the basis of terrain structure analysis, a set of landscape pattern indices were selected, including area and density (class area and mean patch size), edge and shape (edge density, landscape shape index, and fractal dimension of mean patch), diversity (Shannon's diversity index and evenness index), and gathering and spread (contagion index). There existed obvious differences between the 3D class area, mean patch area, and edge density and the corresponding 2D indices, but no significant differences between the 3D landscape shape index, fractal dimension of mean patch, and Shannon's diversity index and evenness index and the corresponding 2D indices. The 3D contagion index and 2D contagion index had no difference. Because the 3D landscape pattern indices were calculated by using patch surface area and surface perimeter whereas the 2D landscape pattern indices were calculated by adopting patch projective area and projective perimeter, the 3D landscape pattern indices could be relatively accurate and efficient in describing landscape area, density and borderline in mountainous areas. However, there were no distinct differences in describing landscape shape, diversity, and gathering and spread between the 3D and 2D landscape pattern indices. Generally, by introducing 3D landscape pattern indices to topographic pattern, the description of landscape pattern and its dynamic change would be relatively accurate.
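
    The core difference between the 2D and 3D indices is the use of surface area and surface perimeter instead of their planimetric projections. The toy sketch below computes the triangulated surface area of a small elevation grid and compares it with the projected area; the DEM values and 10 m cell size are invented for illustration.

```python
# Sketch of why 3D and 2D indices differ: surface area of a small elevation
# grid (two triangles per cell) versus its planimetric (projected) area.
import numpy as np

def surface_area(dem: np.ndarray, cell: float) -> float:
    """Sum triangle areas over each grid cell of a DEM."""
    total = 0.0
    for i in range(dem.shape[0] - 1):
        for j in range(dem.shape[1] - 1):
            p00 = np.array([0.0, 0.0, dem[i, j]])
            p10 = np.array([cell, 0.0, dem[i, j + 1]])
            p01 = np.array([0.0, cell, dem[i + 1, j]])
            p11 = np.array([cell, cell, dem[i + 1, j + 1]])
            for a, b, c in ((p00, p10, p11), (p00, p11, p01)):
                total += 0.5 * np.linalg.norm(np.cross(b - a, c - a))
    return total

dem = np.array([[100., 105., 112.],
                [102., 110., 118.],
                [101., 109., 120.]])
cell = 10.0                                               # metres
planimetric = (dem.shape[0] - 1) * (dem.shape[1] - 1) * cell ** 2
print(planimetric, round(surface_area(dem, cell), 1))     # 3D area exceeds 2D area
```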

  14. A quantitative reconstruction software suite for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Namías, Mauro; Jeraj, Robert

    2017-11-01

    Quantitative Single Photon Emission Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT based attenuation correction and scatter correction from hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
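
    For orientation, the sketch below shows the multiplicative MLEM update that OSEM applies to subsets of the projections; the tiny two-pixel, three-detector system matrix is made up purely for illustration and is unrelated to the corrections implemented in the software suite.

```python
# Minimal MLEM sketch (OSEM applies the same update to subsets of projections);
# the 2-pixel / 3-detector system matrix below is invented for illustration.
import numpy as np

A = np.array([[0.8, 0.2],      # system matrix: P(detector i | emission in pixel j)
              [0.5, 0.5],
              [0.1, 0.9]])
true_activity = np.array([4.0, 1.0])
y = A @ true_activity           # noiseless measured projections

x = np.ones(2)                  # initial uniform estimate
sensitivity = A.sum(axis=0)     # A^T 1
for _ in range(50):
    expected = A @ x
    x = x / sensitivity * (A.T @ (y / expected))   # MLEM multiplicative update

print(np.round(x, 3))           # converges towards [4.0, 1.0]
```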

  15. Compact and Hybrid Feature Description for Building Extraction

    NASA Astrophysics Data System (ADS)

    Li, Z.; Liu, Y.; Hu, Y.; Li, P.; Ding, Y.

    2017-05-01

    Building extraction in aerial orthophotos is crucial for various applications. Currently, deep learning has been shown to be successful in addressing building extraction with high accuracy and high robustness. However, quite a large number of samples is required in training a classifier when using a deep learning model. In order to realize accurate and semi-interactive labelling, the performance of feature description is crucial, as it has a significant effect on the accuracy of classification. In this paper, we bring forward a compact and hybrid feature description method, in order to guarantee desirable classification accuracy of the corners on the building roof contours. The proposed descriptor is a hybrid description of an image patch constructed from 4 sets of binary intensity tests. Experiments show that, benefiting from binary description and making full use of color channels, this descriptor is not only computationally frugal, but also more accurate than SURF for building extraction.
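
    The sketch below illustrates the general idea of a descriptor built from binary intensity tests (in the spirit of BRIEF-like descriptors): each bit records whether one patch location is brighter than another, and descriptors are compared with the Hamming distance. The random sampling pattern and patch size are assumptions for illustration; the paper's hybrid descriptor additionally exploits the colour channels.

```python
# Sketch of a descriptor built from binary intensity tests on an image patch.
# The sampling pattern and sizes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
PATCH = 16                                           # patch size in pixels
N_TESTS = 128                                        # descriptor length in bits
pairs = rng.integers(0, PATCH, size=(N_TESTS, 4))    # (py, px, qy, qx) per test

def describe(patch: np.ndarray) -> np.ndarray:
    """Bit i is 1 if pixel p_i is brighter than pixel q_i."""
    bits = patch[pairs[:, 0], pairs[:, 1]] > patch[pairs[:, 2], pairs[:, 3]]
    return bits.astype(np.uint8)

def hamming(d1: np.ndarray, d2: np.ndarray) -> int:
    return int(np.count_nonzero(d1 != d2))           # distance used for matching

patch_a = rng.random((PATCH, PATCH))
patch_b = patch_a + 0.01 * rng.random((PATCH, PATCH))     # slightly perturbed copy
print(hamming(describe(patch_a), describe(patch_b)))      # small distance
```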

  16. Complexity Reduction in Large Quantum Systems: Fragment Identification and Population Analysis via a Local Optimized Minimal Basis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohr, Stephan; Masella, Michel; Ratcliff, Laura E.

    We present, within Kohn-Sham Density Functional Theory calculations, a quantitative method to identify and assess the partitioning of a large quantum mechanical system into fragments. We then introduce a simple and efficient formalism (which can be written as a generalization of other well-known population analyses) to extract, from first principles, electrostatic multipoles for these fragments. The corresponding fragment multipoles can in this way be seen as reliable (pseudo-) observables. By applying our formalism within the code BigDFT, we show that the usage of a minimal set of in-situ optimized basis functions is of utmost importance for having at the same time a proper fragment definition and an accurate description of the electronic structure. With this approach it becomes possible to simplify the modeling of environmental fragments by a set of multipoles, without notable loss of precision in the description of the active quantum mechanical region. Furthermore, this leads to a considerable reduction of the degrees of freedom by an effective coarse-graining approach, eventually also paving the way towards efficient QM/QM and QM/MM methods coupling together different levels of accuracy.

  17. Complexity Reduction in Large Quantum Systems: Fragment Identification and Population Analysis via a Local Optimized Minimal Basis

    DOE PAGES

    Mohr, Stephan; Masella, Michel; Ratcliff, Laura E.; ...

    2017-07-21

    We present, within Kohn-Sham Density Functional Theory calculations, a quantitative method to identify and assess the partitioning of a large quantum mechanical system into fragments. We then introduce a simple and efficient formalism (which can be written as a generalization of other well-known population analyses) to extract, from first principles, electrostatic multipoles for these fragments. The corresponding fragment multipoles can in this way be seen as reliable (pseudo-) observables. By applying our formalism within the code BigDFT, we show that the usage of a minimal set of in-situ optimized basis functions is of utmost importance for having at the same time a proper fragment definition and an accurate description of the electronic structure. With this approach it becomes possible to simplify the modeling of environmental fragments by a set of multipoles, without notable loss of precision in the description of the active quantum mechanical region. Furthermore, this leads to a considerable reduction of the degrees of freedom by an effective coarse-graining approach, eventually also paving the way towards efficient QM/QM and QM/MM methods coupling together different levels of accuracy.
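
    To make the notion of fragment multipoles concrete, the sketch below computes the monopole, dipole and traceless quadrupole of a fragment about its centre from a handful of point charges. This only illustrates the standard definitions; in the actual formalism the multipoles are obtained from the electronic density expanded in the optimized support functions.

```python
# Sketch of fragment multipoles from point charges standing in for the
# electronic density; the water-like fragment below is invented.
import numpy as np

def fragment_multipoles(charges, positions):
    q = np.asarray(charges, dtype=float)
    r = np.asarray(positions, dtype=float)
    d = r - r.mean(axis=0)                  # positions relative to the fragment centre
    monopole = q.sum()
    dipole = (q[:, None] * d).sum(axis=0)
    quad = np.zeros((3, 3))
    for qi, di in zip(q, d):                # traceless Cartesian quadrupole
        quad += qi * (3.0 * np.outer(di, di) - np.dot(di, di) * np.eye(3))
    return monopole, dipole, quad

q_frag = [-0.8, 0.4, 0.4]                                        # charges in e
r_frag = [[0.0, 0.0, 0.0], [1.43, 1.11, 0.0], [-1.43, 1.11, 0.0]]  # positions in bohr
M, D, Q = fragment_multipoles(q_frag, r_frag)
print(M, np.round(D, 3), np.round(Q, 2), sep="\n")
```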

  18. Economical in situ processing for orbital debris removal

    NASA Astrophysics Data System (ADS)

    Ramohalli, Kumar

    This paper proposes and develops the first description of a novel concept for the removal of large pieces of orbital debris; removing the large ones prevents the future formation of innumerable smaller ones. After a brief discussion of the growing importance of the general problem of orbital debris, the idea of utilizing local resources for clearing the debris is introduced. The fundamental concept revolves around the collection of solar energy via high-tech, light-weight, thermally stable reflectors; concentrating the radiation into highly focused beams; and carefully cutting the debris into accurate pieces that can be further used by the processing craft itself. The unusable parts are stowed behind the reflecting surfaces. At the end of these operations, the craft, with the collected debris, can process itself into a specific "shape" depending on the final disposal mode: either retrieval by the shuttle, splashdown into the ocean, or re-entry for burnup. The propulsion requirements are shown to be reasonable, through three very specific examples, using a quantitative computer animation. A description of the initial (manual, at this stage) terrestrially working hardware and future projections for this Autonomous Space Processor for Orbital Debris conclude this paper.

  19. Basic research and data analysis for the earth and ocean physics applications program and for the National Geodetic Satellite Program

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data analysis and supporting research in connection with the following objectives are discussed: (1) provide a precise and accurate geometric description of the earth's surface, (2) provide a precise and accurate mathematical description of the earth's gravitational field, and (3) determine time variations of the geometry of the ocean surface, the solid earth, the gravity field and other geophysical parameters.

  20. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    PubMed

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies, however, primarily on visual EEG scoring by experts. We introduced a model-based approach to EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independently of visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: Lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
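
    As a rough illustration only (not the authors' exact state space model), the sketch below computes log band powers of a single EEG channel in sliding windows, treats each window as a point in a spectral state space, and takes the mean distance between consecutive points as a velocity-like measure of background variability. The sampling rate, window length, frequency bands and surrogate signal are all assumptions.

```python
# Rough illustration of a spectral-variability "velocity" for one EEG channel;
# parameters and the surrogate signal are assumptions, not the paper's model.
import numpy as np

fs = 250                                      # sampling rate in Hz (assumed)
rng = np.random.default_rng(1)
eeg = rng.normal(size=fs * 60)                # 60 s of surrogate single-channel EEG

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
win = 2 * fs                                  # 2 s non-overlapping windows
states = []
for start in range(0, len(eeg) - win, win):
    seg = eeg[start:start + win]
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(seg)) ** 2
    states.append([psd[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands.values()])
states = np.log(np.array(states))             # log band powers as state coordinates

velocity = np.linalg.norm(np.diff(states, axis=0), axis=1).mean()
print(round(float(velocity), 3))              # mean step length in spectral state space
```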

  1. 77 FR 33133 - Patient Protection and Affordable Care Act; Data Collection To Support Standards Related to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-05

    ... includes both quantitative and non-quantitative limits on benefits. Examples of quantitative limits include... duration of treatment. Examples of non-quantitative limits include prior authorization and step therapy... relevant issuers would submit data and descriptive information on the...

  2. A quantitative study of the clustering of polycyclic aromatic hydrocarbons at high temperatures.

    PubMed

    Totton, Tim S; Misquitta, Alston J; Kraft, Markus

    2012-03-28

    The clustering of polycyclic aromatic hydrocarbon (PAH) molecules is investigated in the context of soot particle inception and growth using an isotropic potential developed from the benchmark PAHAP potential. This potential is used to estimate equilibrium constants of dimerisation for five representative PAH molecules based on a statistical mechanics model. Molecular dynamics simulations are also performed to study the clustering of homomolecular systems at a range of temperatures. The results from both sets of calculations demonstrate that at flame temperatures pyrene (C16H10) dimerisation cannot be a key step in soot particle formation and that much larger molecules (e.g. circumcoronene, C54H18) are required to form small clusters at flame temperatures. The importance of using accurate descriptions of the intermolecular interactions is demonstrated by comparing results to those calculated with a popular literature potential with an order of magnitude variation in the level of clustering observed. By using an accurate intermolecular potential we are able to show that physical binding of PAH molecules based on van der Waals interactions alone can only be a viable soot inception mechanism if concentrations of large PAH molecules are significantly higher than currently thought.

  3. Estimating background-subtracted fluorescence transients in calcium imaging experiments: a quantitative approach.

    PubMed

    Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe

    2013-08-01

    Calcium imaging has become a routine technique in neuroscience for subcellular to network level investigations. The fast progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would be beneficial to most of the calcium imaging research field. A background-subtracted fluorescence transients estimation method that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to both single-cell and bulk-stained tissue recordings, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
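
    The sketch below illustrates the general single-trial idea (nonlinear regression with confidence intervals) rather than the paper's full probabilistic noise model: a background plus mono-exponential transient is fitted to one noisy trace, and standard errors are read off the covariance matrix returned by the fit. All values are invented.

```python
# Minimal sketch: fit a background + mono-exponential transient to one trial
# and report standard errors from the fit covariance. Values are invented.
import numpy as np
from scipy.optimize import curve_fit

def model(t, background, amplitude, tau):
    return background + amplitude * np.exp(-t / tau)

rng = np.random.default_rng(3)
t = np.linspace(0.0, 5.0, 200)                          # seconds
true = dict(background=100.0, amplitude=40.0, tau=1.2)
data = model(t, **true) + rng.normal(scale=2.0, size=t.size)   # noisy single trial

popt, pcov = curve_fit(model, t, data, p0=[80.0, 20.0, 1.0])
stderr = np.sqrt(np.diag(pcov))
for name, value, err in zip(("background", "amplitude", "tau"), popt, stderr):
    print(f"{name}: {value:.2f} +/- {err:.2f}")

dff = (data - popt[0]) / popt[0]    # normalized background-subtracted transient (dF/F)
```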

  4. Descriptive approaches to landscape analysis

    Treesearch

    R. Burton Litton Jr.

    1979-01-01

    Descriptive landscape analyses include various procedures used to document visual/scenic resources. Historic and regional examples of landscape description represent desirable insight for contemporary professional inventory work. Routed and areal landscape inventories are discussed as basic tools. From them, qualitative and quantitative evaluations can be developed...

  5. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
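
    The simplest of the calibration routes mentioned above, external calibration against pure standards, amounts to fitting a straight line to standard intensities and inverting it for the sample. The numbers in the sketch below are invented for illustration.

```python
# Sketch of external calibration: a straight line fitted to standard
# intensities converts a sample signal into a concentration. Values invented.
import numpy as np

std_conc = np.array([0.0, 1.0, 5.0, 10.0, 50.0])                 # standards, ug/L
std_signal = np.array([120.0, 980.0, 4900.0, 9800.0, 48500.0])   # counts per second

slope, intercept = np.polyfit(std_conc, std_signal, 1)           # least-squares line

sample_signal = 25500.0
sample_conc = (sample_signal - intercept) / slope
print(f"sensitivity: {slope:.1f} cps per ug/L, sample: {sample_conc:.1f} ug/L")
```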

  6. Multifractal spectrum and lacunarity as measures of complexity of osseointegration.

    PubMed

    de Souza Santos, Daniel; Dos Santos, Leonardo Cavalcanti Bezerra; de Albuquerque Tavares Carvalho, Alessandra; Leão, Jair Carneiro; Delrieux, Claudio; Stosic, Tatijana; Stosic, Borko

    2016-07-01

    The goal of this study is to contribute to a better quantitative description of the early stages of osseointegration, by application of fractal, multifractal, and lacunarity analysis. Fractal, multifractal, and lacunarity analysis are performed on scanning electron microscopy (SEM) images of titanium implants that were first subjected to different treatment combinations of i) sand blasting, ii) acid etching, and iii) exposition to calcium phosphate, and were then submersed in a simulated body fluid (SBF) for 30 days. All the three numerical techniques are applied to the implant SEM images before and after SBF immersion, in order to provide a comprehensive set of common quantitative descriptors. It is found that implants subjected to different physicochemical treatments before submersion in SBF exhibit a rather similar level of complexity, while the great variety of crystal forms after SBF submersion reveals rather different quantitative measures (reflecting complexity), for different treatments. In particular, it is found that acid treatment, in most combinations with the other considered treatments, leads to a higher fractal dimension (more uniform distribution of crystals), lower lacunarity (lesser variation in gap sizes), and narrowing of the multifractal spectrum (smaller fluctuations on different scales). The current quantitative description has shown the capacity to capture the main features of complex images of implant surfaces, for several different treatments. Such quantitative description should provide a fundamental tool for future large scale systematic studies, considering the large variety of possible implant treatments and their combinations. Quantitative description of early stages of osseointegration on titanium implants with different treatments should help develop a better understanding of this phenomenon, in general, and provide basis for further systematic experimental studies. Clinical practice should benefit from such studies in the long term, by more ready access to implants of higher quality.
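
    Two of the measures used in the study can be sketched compactly: box-counting fractal dimension and gliding-box lacunarity on a binary (thresholded) image. The random test image below is only a stand-in for a thresholded SEM micrograph, and the box sizes are arbitrary choices.

```python
# Sketch of box-counting dimension and gliding-box lacunarity on a binary
# image; the random image is a stand-in for a thresholded SEM micrograph.
import numpy as np

rng = np.random.default_rng(7)
img = (rng.random((256, 256)) < 0.2).astype(int)      # binary test image

def box_counting_dimension(binary, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = (binary.shape[0] // s) * s, (binary.shape[1] // s) * s
        blocks = binary[:h, :w].reshape(h // s, s, w // s, s)
        counts.append((blocks.sum(axis=(1, 3)) > 0).sum())    # occupied boxes N(s)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope                                              # D from N(s) ~ s^-D

def lacunarity(binary, box=8):
    masses = []
    for i in range(binary.shape[0] - box + 1):
        for j in range(binary.shape[1] - box + 1):
            masses.append(binary[i:i + box, j:j + box].sum())
    masses = np.array(masses, dtype=float)
    return masses.var() / masses.mean() ** 2 + 1.0            # <M^2> / <M>^2

print(round(box_counting_dimension(img), 3), round(lacunarity(img), 3))
```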

  7. 40 CFR 124.57 - Public notice.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... a quantitative statement, of the thermal effluent limitations proposed under section 301 or 306; (2... brief description, including a quantitative statement, of the alternative effluent limitations, if any...

  8. 40 CFR 124.57 - Public notice.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... a quantitative statement, of the thermal effluent limitations proposed under section 301 or 306; (2... brief description, including a quantitative statement, of the alternative effluent limitations, if any...

  9. 40 CFR 124.57 - Public notice.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... a quantitative statement, of the thermal effluent limitations proposed under section 301 or 306; (2... brief description, including a quantitative statement, of the alternative effluent limitations, if any...

  10. 40 CFR 124.57 - Public notice.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... a quantitative statement, of the thermal effluent limitations proposed under section 301 or 306; (2... brief description, including a quantitative statement, of the alternative effluent limitations, if any...

  11. 40 CFR 124.57 - Public notice.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... a quantitative statement, of the thermal effluent limitations proposed under section 301 or 306; (2... brief description, including a quantitative statement, of the alternative effluent limitations, if any...

  12. Multiblob coarse-graining for mixtures of long polymers and soft colloids

    NASA Astrophysics Data System (ADS)

    Locatelli, Emanuele; Capone, Barbara; Likos, Christos N.

    2016-11-01

    Soft nanocomposites represent both a theoretical and an experimental challenge due to the high number of the microscopic constituents that strongly influence the behaviour of the systems. An effective theoretical description of such systems invokes a reduction of the degrees of freedom to be analysed, hence requiring the introduction of an efficient, quantitative, coarse-grained description. We here report on a novel coarse graining approach based on a set of transferable potentials that quantitatively reproduces properties of mixtures of linear and star-shaped homopolymeric nanocomposites. By renormalizing groups of monomers into a single effective potential between an f-functional star polymer and a homopolymer of length N0, and through a scaling argument, it will be shown how a substantial reduction of the degrees of freedom allows for a full quantitative description of the system. Our methodology is tested upon full monomer simulations for systems of different molecular weight, proving its full predictive potential.

  13. Descriptive statistics.

    PubMed

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
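
    A tiny illustration of the measures of location and spread discussed in the chapter, computed for a made-up quantitative variable:

```python
# Sketch: basic measures of location and spread for a made-up variable.
import numpy as np

values = np.array([4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 7.2, 5.0, 4.4, 5.8])

print("mean:", round(values.mean(), 2))
print("median:", round(float(np.median(values)), 2))
print("standard deviation:", round(values.std(ddof=1), 2))    # sample SD
print("IQR:", round(float(np.subtract(*np.percentile(values, [75, 25]))), 2))
```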

  14. Computer analysis of lighting style in fine art: steps towards inter-artist studies

    NASA Astrophysics Data System (ADS)

    Stork, David G.

    2011-03-01

    Stylometry in visual art - the mathematical description of artists' styles - has been based on a number of properties of works, such as color, brush stroke shape, visual texture, and measures of contours' curvatures. We introduce the concept of quantitative measures of lighting, such as statistical descriptions of spatial coherence, diffuseness, and so forth, as properties of artistic style. Some artists of the high Renaissance, such as Leonardo, worked from nature and strove to render illumination "faithfully"; photorealists, such as Richard Estes, worked from photographs and duplicated the "physics based" lighting accurately. As such, each had different motivations, methodologies, stagings, and "accuracies" in rendering lighting cues. Perceptual studies show that observers are poor judges of properties of lighting in photographs such as consistency (and thus by extension in paintings as well); computer methods such as rigorous cast-shadow analysis, occluding-contour analysis and spherical harmonic based estimation of light fields can be quite accurate. For these reasons, computer lighting analysis can provide new tools for art historical studies. We review lighting analysis in paintings such as Vermeer's Girl with a pearl earring, de la Tour's Christ in the carpenter's studio, Caravaggio's Magdalen with the smoking flame and Calling of St. Matthew, and extend our corpus to works where lighting coherence is of interest to art historians, such as Caravaggio's Adoration of the Shepherds or Nativity (1609) in the Capuchin church of Santa Maria degli Angeli. Our measure of lighting coherence may help reveal the working methods of some artists and aid diachronic studies of individual artists. We speculate on artists and art historical questions that may ultimately profit from future refinements to these new computational tools.

  15. Basic research and data analysis for the National Geodetic Satellite Program and for the Earth and Ocean Physics Application Program

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Accomplishments in the continuing programs are reported. The data were obtained in support of the following broad objectives: (1) to provide a precise and accurate geometric description of the earth's surface; (2) to provide a precise and accurate mathematical description of the earth's gravitational field; and (3) to determine time variations of the geometry of the ocean surface, the solid earth, the gravity field, and other geophysical parameters.

  16. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    ERIC Educational Resources Information Center

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  17. X-ray electron density investigation of chemical bonding in van der Waals materials

    NASA Astrophysics Data System (ADS)

    Kasai, Hidetaka; Tolborg, Kasper; Sist, Mattia; Zhang, Jiawei; Hathwar, Venkatesha R.; Filsø, Mette Ø.; Cenedese, Simone; Sugimoto, Kunihisa; Overgaard, Jacob; Nishibori, Eiji; Iversen, Bo B.

    2018-03-01

    Van der Waals (vdW) solids have attracted great attention ever since the discovery of graphene, with the essential feature being the weak chemical bonding across the vdW gap. The nature of these weak interactions is decisive for many extraordinary properties, but it is a strong challenge for current theory to accurately model long-range electron correlations. Here we use synchrotron X-ray diffraction data to precisely determine the electron density in the archetypal vdW solid, TiS2, and compare the results with density functional theory calculations. Quantitative agreement is observed for the chemical bonding description in the covalent TiS2 slabs, but significant differences are identified for the interactions across the gap, with experiment revealing more electron deformation than theory. The present data provide an experimental benchmark for testing theoretical models of weak chemical bonding.

  18. Temperature dependence of the cross section for the fragmentation of thymine via dissociative electron attachment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopyra, Janina; Abdoul-Carime, Hassan, E-mail: hcarime@ipnl.in2p3.fr

    Providing experimental values for absolute Dissociative Electron Attachment (DEA) cross sections for nucleobases at realistic biological conditions is a considerable challenge. In this work, we provide the temperature dependence of the cross section, σ, of the dehydrogenated thymine anion (T − H)⁻ produced via DEA. Within the 393-443 K temperature range, it is observed that σ varies by one order of magnitude. By extrapolating to a temperature of 313 K, the relative DEA cross section for the production of the dehydrogenated thymine anion at an incident energy of 1 eV decreases by 2 orders of magnitude and the absolute value reaches approximately 6 × 10⁻¹⁹ cm². These quantitative measurements provide a benchmark for theoretical prediction and also a contribution to a more accurate description of the effects of ionizing radiation on molecular medium.

  19. Local Descriptors of Dynamic and Nondynamic Correlation.

    PubMed

    Ramos-Cordoba, Eloy; Matito, Eduard

    2017-06-13

    Quantitatively accurate electronic structure calculations rely on the proper description of electron correlation. A judicious choice of the approximate quantum chemistry method depends upon the importance of dynamic and nondynamic correlation, which is usually assessed by scalar measures. Existing measures of electron correlation do not consider separately the regions of the Cartesian space where dynamic or nondynamic correlation is most important. We introduce real-space descriptors of dynamic and nondynamic electron correlation that admit orbital decomposition. Integration of the local descriptors yields global numbers that can be used to quantify dynamic and nondynamic correlation. Illustrative examples over different chemical systems with varying electron correlation regimes are used to demonstrate the capabilities of the local descriptors. Since the expressions only require orbitals and occupation numbers, they can be readily applied in the context of local correlation methods, hybrid methods, density matrix functional theory, and fractional-occupancy density functional theory.

  20. The clinical nurse specialist in an Irish hospital.

    PubMed

    Wickham, Sheelagh

    2011-01-01

    This study aimed to explore the activity of the clinical nurse specialist (CNS) in an acute Irish health care setting. Quantitative methodology, using a valid and reliable questionnaire, provided descriptive statistics that gave accurate data on the total population of CNSs in the health care setting. The study was set in an acute-care 750-bed hospital that had 25 CNSs in practice. The sample consisted of all 25 CNSs, the total population of CNSs working in the acute health care institution. The findings show the CNS to be active in the roles of researcher, educator, communicator, change agent, leader, and clinical specialist, but the level of activity varies between roles. The extent to which individual CNSs enact each role also varies. The findings merit further study on CNS role activity and possible variables that influence role activity.

  1. Enhancement of problem solving ability of high school students through learning with real engagement in active problem solving (REAPS) model on the concept of heat transfer

    NASA Astrophysics Data System (ADS)

    Yulindar, A.; Setiawan, A.; Liliawati, W.

    2018-05-01

    This study aims to investigate the enhancement of problem solving ability before and after learning with the Real Engagement in Active Problem Solving (REAPS) model on the concept of heat transfer. A quantitative method was used, with 35 high school students in Pontianak as the sample. Students' problem solving ability was measured through a test consisting of three open-ended (description) questions. The validity of the instrument was established through expert judgment and field testing, which yielded a validity value of 0.84. Based on the data analysis, the N-Gain value is 0.43, so the enhancement of students' problem solving ability falls in the medium category. This was partly because students were less accurate in calculating their answers and had limited time to complete the questions.
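
    The N-Gain reported above is commonly computed as Hake's normalized gain, g = (post − pre) / (max − pre), with g between 0.3 and 0.7 conventionally labelled medium. The sketch below reproduces a gain of 0.43 from invented pre/post scores; the scores and the category boundaries are assumptions, not data from the study.

```python
# Sketch of the normalized gain (Hake's g) computation; pre/post scores are
# invented and the 0.3/0.7 category boundaries are the usual convention.
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    return (post - pre) / (max_score - pre)

def category(g: float) -> str:
    return "high" if g >= 0.7 else "medium" if g >= 0.3 else "low"

pre, post = 40.0, 65.8
g = normalized_gain(pre, post)
print(round(g, 2), category(g))   # 0.43 medium
```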

  2. Quantitative characterization of the water trimer torsional manifold by terahertz laser spectroscopy and theoretical analysis. II. (H2O)3

    NASA Astrophysics Data System (ADS)

    Brown, Mac G.; Viant, Mark R.; McLaughlin, Ryan P.; Keoshian, Christy J.; Michael, Ernest; Cruzan, Jeff D.; Saykally, Richard J.; van der Avoird, Ad

    1999-11-01

    We report the measurement of two new (H2O)3 bands by terahertz laser vibration-rotation-tunneling (VRT) spectroscopy. Both bands have been assigned to torsional ("pseudorotational") transitions and are highly perturbed by Coriolis interactions. The 42.9 cm-1 band corresponds to the k=±2←±1 transition while the 65.6 cm-1 band corresponds to the k=±2←0 transition. A model Hamiltonian is derived which allowed a global fit of 361 VRT transitions of these two new bands and the previously reported torsional band at 87.1 cm-1. Each of the bifurcation tunneling components is accurately described. This global fit represents a complete description of the VRT transitions of (H2O)3 up to 150 cm-1, and complements our similar treatment of the (D2O)3 torsional dynamics.

  3. 29 CFR 1607.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (essential). (6) Sample description. A description of how the research sample was identified and selected... the size of each subgroup (essential). A description of how the research sample compares with the...). Any quantitative data which identify or define the job constructs, such as factor analyses, should be...

  4. 29 CFR 1607.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (essential). (6) Sample description. A description of how the research sample was identified and selected... the size of each subgroup (essential). A description of how the research sample compares with the...). Any quantitative data which identify or define the job constructs, such as factor analyses, should be...

  5. 29 CFR 1607.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (essential). (6) Sample description. A description of how the research sample was identified and selected... the size of each subgroup (essential). A description of how the research sample compares with the...). Any quantitative data which identify or define the job constructs, such as factor analyses, should be...

  6. 29 CFR 1607.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (essential). (6) Sample description. A description of how the research sample was identified and selected... the size of each subgroup (essential). A description of how the research sample compares with the...). Any quantitative data which identify or define the job constructs, such as factor analyses, should be...

  7. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971

  8. Accurate color synthesis of three-dimensional objects in an image

    NASA Astrophysics Data System (ADS)

    Xin, John H.; Shen, Hui-Liang

    2004-05-01

    Our study deals with color synthesis of a three-dimensional object in an image; i.e., given a single image, a target color can be accurately mapped onto the object such that the color appearance of the synthesized object closely resembles that of the actual one. As it is almost impossible to acquire the complete geometric description of the surfaces of an object in an image, this study attempted to recover the implicit description of geometry for the color synthesis. The description was obtained from either a series of spectral reflectances or the RGB signals at different surface positions on the basis of the dichromatic reflection model. The experimental results showed that this implicit image-based representation is related to the object geometry and is sufficient for accurate color synthesis of three-dimensional objects in an image. The method established is applicable to the color synthesis of both rigid and deformable objects and should contribute to color fidelity in virtual design, manufacturing, and retailing.
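
    The implicit description rests on the dichromatic reflection model, in which each pixel is a mixture of a body (diffuse) colour and a surface (specular, illuminant-coloured) term, with geometry entering only through two scalar weights. The sketch below illustrates the model and a naive recolouring step for a pixel with negligible specular contribution; all colour values are invented.

```python
# Sketch of the dichromatic reflection model: RGB = m_b * c_body + m_s * c_surface,
# where geometry enters only through the scalar weights m_b and m_s. Values invented.
import numpy as np

body_colour = np.array([0.7, 0.3, 0.2])       # object's diffuse colour
surface_colour = np.array([1.0, 1.0, 1.0])    # illuminant colour (specular term)

def dichromatic_pixel(m_body: float, m_surface: float) -> np.ndarray:
    return m_body * body_colour + m_surface * surface_colour

# naive recolouring: estimate m_b for a pixel with negligible specularity,
# then re-render it with a new target body colour
pixel = dichromatic_pixel(0.8, 0.0)
m_b_est = float(pixel @ body_colour / (body_colour @ body_colour))   # least-squares scale
new_body = np.array([0.2, 0.4, 0.8])
print(np.round(m_b_est * new_body, 3))        # synthesized pixel with the target colour
```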

  9. Crater studies: Part A: lunar crater morphometry

    USGS Publications Warehouse

    Pike, Richard J.

    1973-01-01

    Morphometry, the quantitative study of shape, complements the visual observation and photointerpretation in analyzing the most outstanding landforms of the Moon, its craters (refs. 32-1 and 32-2). All three of these interpretative tools, which were developed throughout the long history of telescopic lunar study preceding the Apollo Program, will continue to be applicable to crater analysis until detailed field work becomes possible. Although no large (>17.5 km diameter) craters were examined in situ on any of the Apollo landings, the photographs acquired from the command modules will markedly strengthen results of less direct investigations of the craters. For morphometry, the most useful materials are the orbital metric and panoramic photographs from the final three Apollo missions. These photographs permit preparation of contour maps, topographic profiles, and other numerical data that accurately portray for the first time the surface geometry of lunar craters of all sizes. Interpretations of craters no longer need be compromised by inadequate topographic data. In the pre-Apollo era, hypotheses for the genesis of lunar craters usually were constructed without any numerical descriptive data. Such speculations will have little credibility unless supported by accurate, quantitative data, especially those generated from Apollo orbital photographs. This paper presents a general study of the surface geometry of 25 far-side craters and a more detailed study of rim-crest evenness for 15 near-side and far-side craters. Analysis of this preliminary sample of Apollo 15 and 17 data, which includes craters between 1.5 and 275 km in diameter, suggests that most genetic interpretations of craters made from pre-Apollo topographic measurements may require no drastic revision. All measurements were made from topographic profiles generated on a stereoplotter at the Photogrammetric Unit of the U.S. Geological Survey, Center of Astrogeology, Flagstaff, Arizona.

  10. Assessment of a fully 3D Monte Carlo reconstruction method for preclinical PET with iodine-124

    NASA Astrophysics Data System (ADS)

    Moreau, M.; Buvat, I.; Ammour, L.; Chouin, N.; Kraeber-Bodéré, F.; Chérel, M.; Carlier, T.

    2015-03-01

    Iodine-124 is a radionuclide well suited to the labeling of intact monoclonal antibodies. Yet, accurate quantification in preclinical imaging with I-124 is challenging due to the large positron range and a complex decay scheme including high-energy gammas. The aim of this work was to assess the quantitative performance of a fully 3D Monte Carlo (MC) reconstruction for preclinical I-124 PET. The high-resolution small animal PET Inveon (Siemens) was simulated using GATE 6.1. Three system matrices (SM) of different complexity were calculated in addition to a Siddon-based ray tracing approach for comparison purposes. Each system matrix accounted for a more or less complete description of the physics processes both in the scanned object and in the PET scanner. One homogeneous water phantom and three heterogeneous phantoms including water, lungs and bones were simulated, where hot and cold regions were used to assess activity recovery as well as the trade-off between contrast recovery and noise in different regions. The benefit of accounting for scatter, attenuation, positron range and spurious coincidences occurring in the object when calculating the system matrix used to reconstruct I-124 PET images was highlighted. We found that the use of an MC SM including a thorough modelling of the detector response and physical effects in a uniform water-equivalent phantom was sufficient to achieve reasonable quantitative accuracy in homogeneous and heterogeneous phantoms. Modelling the phantom heterogeneities in the SM did not necessarily yield the most accurate estimate of the activity distribution, due to the high variance affecting many SM elements in the most sophisticated SM.
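
    For readers unfamiliar with system-matrix-based reconstruction, the sketch below shows the standard MLEM update commonly used with a system matrix; it is not the authors' GATE/Monte Carlo code, and the matrix and data here are small random placeholders rather than simulated scanner responses.

```python
# Minimal MLEM sketch: the system matrix A maps image voxels to detector bins; in the
# study above it is estimated by Monte Carlo simulation, here it is just a toy example.
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_voxels = 64, 16
A = rng.random((n_bins, n_voxels))            # system matrix (toy probabilities)
x_true = rng.random(n_voxels) * 10.0          # "true" activity distribution
y = rng.poisson(A @ x_true).astype(float)     # noisy measured projections

sensitivity = A.T @ np.ones(n_bins)           # per-voxel sensitivity, A^T 1
x = np.ones(n_voxels)                         # uniform initial estimate

for _ in range(50):                           # MLEM iterations
    expected = A @ x                          # forward projection
    ratio = y / np.maximum(expected, 1e-12)   # measured / expected
    x = x * (A.T @ ratio) / np.maximum(sensitivity, 1e-12)

print(np.round(x, 2))
print(np.round(x_true, 2))
```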

  11. 75 FR 81665 - Notice of Intent to Seek Approval to Reinstate an Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ... are both quantitative and descriptive. Quantitative information from the most recently completed... activities with respect to industrial collaboration [cir] Conducting a survey of all center participants to probe the participant satisfaction with center activities [cir] Compiling a set of quantitative...

  12. The use of cognitive task analysis to improve instructional descriptions of procedures.

    PubMed

    Clark, Richard E; Pugh, Carla M; Yates, Kenneth A; Inaba, Kenji; Green, Donald J; Sullivan, Maura E

    2012-03-01

    Surgical training relies heavily on the ability of expert surgeons to provide complete and accurate descriptions of a complex procedure. However, research from a variety of domains suggests that experts often omit critical information about the judgments, analysis, and decisions they make when solving a difficult problem or performing a complex task. In this study, we compared three methods for capturing surgeons' descriptions of how to perform the procedure for inserting a femoral artery shunt (unaided free-recall, unaided free-recall with simulation, and cognitive task analysis methods) to determine which method produced more accurate and complete results. Cognitive task analysis was approximately 70% more complete and accurate than free-recall alone or free-recall during a simulation of the procedure. Ten expert trauma surgeons at a major urban trauma center were interviewed separately and asked to describe how to perform an emergency shunt procedure. Four surgeons provided an unaided free-recall description of the shunt procedure, five surgeons provided an unaided free-recall description of the procedure using visual aids and surgical instruments (simulation), and one (chosen randomly) was interviewed using cognitive task analysis (CTA) methods. An 11th vascular surgeon approved the final CTA protocol. The CTA interview with only one expert surgeon resulted in significantly greater accuracy and completeness of the descriptions compared with the unaided free-recall interviews with multiple expert surgeons. Surgeons in the unaided group omitted nearly 70% of necessary decision steps. In the free-recall group, heavy use of simulation improved surgeons' completeness when describing the steps of the procedure. CTA significantly increases the completeness and accuracy of surgeons' instructional descriptions of surgical procedures. In addition, simulation during unaided free-recall interviews may improve the completeness of interview data. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. FAST COGNITIVE AND TASK ORIENTED, ITERATIVE DATA DISPLAY (FACTOID)

    DTIC Science & Technology

    2017-06-01

    approaches. As a result, the following assumptions guided our efforts in developing modeling and descriptive metrics for evaluation purposes... Application Evaluation. Our analytic workflow for evaluation is to first provide descriptive statistics about applications across metrics (performance... distributions for evaluation purposes because the goal of evaluation is accurate description, not inference (e.g., prediction). Outliers depicted

  14. Initial Description of a Quantitative, Cross-Species (Chimpanzee-Human) Social Responsiveness Measure

    ERIC Educational Resources Information Center

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.

    2011-01-01

    Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…

  15. 78 FR 57903 - Notice of Intent To Seek Approval To Renew an Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-20

    .... The indicators are both quantitative and descriptive. Quantitative information from the most recently... center activities with respect to industrial collaboration. [cir] Conducting a survey of all center... quantitative indicators determined by NSF to analyze the management and operation of the center. [cir...

  16. Accurate Virus Quantitation Using a Scanning Transmission Electron Microscopy (STEM) Detector in a Scanning Electron Microscope

    DTIC Science & Technology

    2017-06-29

    Accurate Virus Quantitation Using a Scanning Transmission Electron Microscopy (STEM) Detector in a Scanning Electron Microscope. Candace D Blancett1...L Norris2, Cynthia A Rossi4, Pamela J Glass3, Mei G Sun1,*. 1Pathology Division, United States Army Medical Research Institute of Infectious...Diseases (USAMRIID), 1425 Porter Street, Fort Detrick, Maryland, 21702; 2Biostatistics Division, United States Army Medical Research Institute of

  17. Molecular Modeling of Thermodynamic and Transport Properties for CO2 and Aqueous Brines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Hao; Economou, Ioannis G.; Panagiotopoulos, Athanassios Z.

    Molecular simulation techniques using classical force-fields occupy the space between ab initio quantum mechanical methods and phenomenological correlations. In particular, Monte Carlo and molecular dynamics algorithms can be used to provide quantitative predictions of thermodynamic and transport properties of fluids relevant for geologic carbon sequestration at conditions for which experimental data are uncertain or not available. These methods can cover time and length scales far exceeding those of quantum chemical methods, while maintaining transferability and predictive power lacking from phenomenological correlations. The accuracy of predictions depends sensitively on the quality of the molecular models used. Many existing fixed-point-charge models for water and aqueous mixtures fail to represent these fluid properties accurately, especially when descriptions covering broad ranges of thermodynamic conditions are needed. Recent work on the development of accurate models for water, CO2, and dissolved salts, as well as their mixtures, is summarized in this Account. Polarizable models that can respond to the different dielectric environments in aqueous versus nonaqueous phases are necessary for predictions of properties over extended ranges of temperatures and pressures. Phase compositions and densities, activity coefficients of the dissolved salts, interfacial tensions, viscosities and diffusivities can be obtained in near-quantitative agreement with available experimental data, using relatively modest computational resources. In some cases, for example, for the composition of the CO2-rich phase in coexistence with an aqueous phase, recent results from molecular simulations have helped discriminate among conflicting experimental data sets. The sensitivity of properties to the quality of the intermolecular interaction model varies significantly. Properties such as the phase compositions or electrolyte activity coefficients are much more sensitive than phase densities, viscosities, or component diffusivities. Strong confinement effects on physical properties in nanoscale media can also be directly obtained from molecular simulations. Future work on molecular modeling for CO2 and aqueous brines is likely to be focused on more systematic generation of interaction models by utilizing quantum chemical as well as direct experimental measurements. New ion models need to be developed for use with the current generation of polarizable water models, including ion–ion interactions that will allow for accurate description of dense, mixed brines. Methods will need to be devised that go beyond the use of effective potentials for incorporation of quantum effects known to be important for water, and reactive force fields developed that can handle bond creation and breaking in systems with carbonate and silicate minerals. Lastly, another area of potential future work is the integration of molecular simulation methods in multiscale models for the chemical reactions leading to mineral dissolution and flow within the porous media in underground formations.

  18. Molecular Modeling of Thermodynamic and Transport Properties for CO2 and Aqueous Brines.

    PubMed

    Jiang, Hao; Economou, Ioannis G; Panagiotopoulos, Athanassios Z

    2017-04-18

    Molecular simulation techniques using classical force-fields occupy the space between ab initio quantum mechanical methods and phenomenological correlations. In particular, Monte Carlo and molecular dynamics algorithms can be used to provide quantitative predictions of thermodynamic and transport properties of fluids relevant for geologic carbon sequestration at conditions for which experimental data are uncertain or not available. These methods can cover time and length scales far exceeding those of quantum chemical methods, while maintaining transferability and predictive power lacking from phenomenological correlations. The accuracy of predictions depends sensitively on the quality of the molecular models used. Many existing fixed-point-charge models for water and aqueous mixtures fail to represent these fluid properties accurately, especially when descriptions covering broad ranges of thermodynamic conditions are needed. Recent work on the development of accurate models for water, CO2, and dissolved salts, as well as their mixtures, is summarized in this Account. Polarizable models that can respond to the different dielectric environments in aqueous versus nonaqueous phases are necessary for predictions of properties over extended ranges of temperatures and pressures. Phase compositions and densities, activity coefficients of the dissolved salts, interfacial tensions, viscosities and diffusivities can be obtained in near-quantitative agreement with available experimental data, using relatively modest computational resources. In some cases, for example, for the composition of the CO2-rich phase in coexistence with an aqueous phase, recent results from molecular simulations have helped discriminate among conflicting experimental data sets. The sensitivity of properties to the quality of the intermolecular interaction model varies significantly. Properties such as the phase compositions or electrolyte activity coefficients are much more sensitive than phase densities, viscosities, or component diffusivities. Strong confinement effects on physical properties in nanoscale media can also be directly obtained from molecular simulations. Future work on molecular modeling for CO2 and aqueous brines is likely to be focused on more systematic generation of interaction models by utilizing quantum chemical as well as direct experimental measurements. New ion models need to be developed for use with the current generation of polarizable water models, including ion-ion interactions that will allow for accurate description of dense, mixed brines. Methods will need to be devised that go beyond the use of effective potentials for incorporation of quantum effects known to be important for water, and reactive force fields developed that can handle bond creation and breaking in systems with carbonate and silicate minerals. Another area of potential future work is the integration of molecular simulation methods in multiscale models for the chemical reactions leading to mineral dissolution and flow within the porous media in underground formations.

  19. Molecular Modeling of Thermodynamic and Transport Properties for CO2 and Aqueous Brines

    DOE PAGES

    Jiang, Hao; Economou, Ioannis G.; Panagiotopoulos, Athanassios Z.

    2017-02-24

    Molecular simulation techniques using classical force-fields occupy the space between ab initio quantum mechanical methods and phenomenological correlations. In particular, Monte Carlo and molecular dynamics algorithms can be used to provide quantitative predictions of thermodynamic and transport properties of fluids relevant for geologic carbon sequestration at conditions for which experimental data are uncertain or not available. These methods can cover time and length scales far exceeding those of quantum chemical methods, while maintaining transferability and predictive power lacking from phenomenological correlations. The accuracy of predictions depends sensitively on the quality of the molecular models used. Many existing fixed-point-charge models for water and aqueous mixtures fail to represent these fluid properties accurately, especially when descriptions covering broad ranges of thermodynamic conditions are needed. Recent work on the development of accurate models for water, CO2, and dissolved salts, as well as their mixtures, is summarized in this Account. Polarizable models that can respond to the different dielectric environments in aqueous versus nonaqueous phases are necessary for predictions of properties over extended ranges of temperatures and pressures. Phase compositions and densities, activity coefficients of the dissolved salts, interfacial tensions, viscosities and diffusivities can be obtained in near-quantitative agreement with available experimental data, using relatively modest computational resources. In some cases, for example, for the composition of the CO2-rich phase in coexistence with an aqueous phase, recent results from molecular simulations have helped discriminate among conflicting experimental data sets. The sensitivity of properties to the quality of the intermolecular interaction model varies significantly. Properties such as the phase compositions or electrolyte activity coefficients are much more sensitive than phase densities, viscosities, or component diffusivities. Strong confinement effects on physical properties in nanoscale media can also be directly obtained from molecular simulations. Future work on molecular modeling for CO2 and aqueous brines is likely to be focused on more systematic generation of interaction models by utilizing quantum chemical as well as direct experimental measurements. New ion models need to be developed for use with the current generation of polarizable water models, including ion–ion interactions that will allow for accurate description of dense, mixed brines. Methods will need to be devised that go beyond the use of effective potentials for incorporation of quantum effects known to be important for water, and reactive force fields developed that can handle bond creation and breaking in systems with carbonate and silicate minerals. Lastly, another area of potential future work is the integration of molecular simulation methods in multiscale models for the chemical reactions leading to mineral dissolution and flow within the porous media in underground formations.

  20. Far-infrared Kerr rotation spectroscopy of graphite and multilayer graphene

    NASA Astrophysics Data System (ADS)

    Levallois, Julien; Tran, Michaël; Kuzmenko, Alexey

    2012-02-01

    Graphite attracts much attention nowadays as a reference 3D material for graphene. Since the early measurements of the cyclotron effect in graphite over fifty years ago [1], a satisfactory quantitative description of this spectacular phenomenon has been missing. The analysis of magneto-optical data was hindered either by the limited set of photon energies used or by the lack of optical selectivity between electrons and holes. We overcome this issue by measuring the far-infrared magneto-optical Kerr rotation spectra [2] and achieve a highly accurate, unified microscopic description of all spectra in a broad range of magnetic fields (0.5-7 T) by rigorously taking the c-axis band dispersion and the trigonal warping into account. We find that the second- and the fourth-order cyclotron harmonics are optically almost as strong as the fundamental cyclotron resonance even at high fields. The same effects are expected to strongly influence the magneto-optical spectra of Bernal-stacked multilayer graphene and therefore play a major role in the respective applications. [1] J. K. Galt, W.A. Yager and H.W. Dail Jr., Phys. Rev. 103, 1586 (1956) [2] J. Levallois, M.K. Tran and A. B. Kuzmenko, arXiv:1110.2754v2; submitted.

  1. The problem with coal-waste dumps inventory in Upper Silesian Coal Basin

    NASA Astrophysics Data System (ADS)

    Abramowicz, Anna; Chybiorz, Ryszard

    2017-04-01

    Coal-waste dumps are a side effect of coal mining, which has been carried out in Poland for 250 years. They have a negative influence on the landscape and the environment, polluting soil, vegetation and groundwater. Their number, size and shape change over time as new wastes are produced and deposited, altering the dumps' shape and enlarging their size. Moreover, deposited wastes, especially overburnt ones, are excavated for purposes such as road construction, which also changes dump shape and size, in some cases until a dump disappears entirely. Many databases and inventory systems were created in order to control these hazards, but their shortcomings prevent reliable statistics. Three representative databases were analyzed with respect to their structure and the way waste dumps are described, classified and visualized. The main problem is the correct classification of dumps in terms of their name and type. An additional difficulty is accurate quantitative description (area and capacity). A complex database was created by comparing and verifying the information contained in the existing databases and supplementing it from separate documentation. A variability analysis of coal-waste dumps over time is also included. The project has been financed from the funds of the Leading National Research Centre (KNOW) received by the Centre for Polar Studies for the period 2014-2018.

  2. Applied mathematical problems in modern electromagnetics

    NASA Astrophysics Data System (ADS)

    Kriegsman, Gregory

    1994-05-01

    We have primarily investigated two classes of electromagnetic problems. The first contains the quantitative description of microwave heating of dispersive and conductive materials. Such problems arise, for example, when biological tissues are exposed, accidentally or purposefully, to microwave radiation. Other instances occur in ceramic processing, such as sintering and microwave-assisted chemical vapor infiltration, and in other industrial drying processes, such as the curing of paints and concrete. The second class characterizes the scattering of microwaves by complex targets which possess two or more disparate length and/or time scales. Spatially complex scatterers arise in a variety of applications, such as large gratings and slowly changing guiding structures. The former are useful in developing microstrip energy couplers, while the latter can be used to model anatomical subsystems (e.g., the open guiding structure composed of two legs and the adjoining lower torso). Temporally complex targets occur in applications involving dispersive media whose relaxation times differ by orders of magnitude from thermal and/or electromagnetic time scales. For both cases, the mathematical description of the problems gives rise to complicated, ill-conditioned boundary value problems whose accurate solutions require a blend of both asymptotic techniques, such as multiscale methods and matched asymptotic expansions, and numerical methods incorporating radiation boundary conditions, such as finite differences and finite elements.

  3. Analyzing Electric Field Morphology Through Data-Model Comparisons of the GEM IM/S Assessment Challenge Events

    NASA Technical Reports Server (NTRS)

    Liemohn, Michael W.; Ridley, Aaron J.; Kozyra, Janet U.; Gallagher, Dennis L.; Thomsen, Michelle F.; Henderson, Michael G.; Denton, Michael H.; Brandt, Pontus C.; Goldstein, Jerry

    2006-01-01

    The storm-time inner magnetospheric electric field morphology and dynamics are assessed by comparing numerical modeling results of the plasmasphere and ring current with many in situ and remote sensing data sets. Two magnetic storms are analyzed, April 22, 2001, and October 21-23, 2001, which are the events selected for the Geospace Environment Modeling (GEM) Inner Magnetosphere/Storms (IM/S) Assessment Challenge (IMSAC). The IMSAC seeks to quantify the accuracy of inner magnetospheric models as well as synthesize our understanding of this region. For each storm, the ring current-atmosphere interaction model (RAM) and the dynamic global core plasma model (DGCPM) were run together with various settings for the large-scale convection electric field and the nightside ionospheric conductance. DGCPM plasmaspheric parameters were compared with IMAGE-EUV plasmapause extractions and LANL-MPA plume locations and velocities. RAM parameters were compared with Dst*, LANL-MPA fluxes and moments, IMAGE-MENA images, and IMAGE-HENA images. Both qualitative and quantitative comparisons were made to determine the electric field morphology that allows the model results to best fit the plasma data at various times during these events. The simulations with self-consistent electric fields were, in general, better than those with prescribed field choices. This indicates that the time-dependent modulation of the inner magnetospheric electric fields by the nightside ionosphere is quite significant for accurate determination of these fields (and their effects). It was determined that a shielded Volland-Stern field description driven by the 3-hour Kp index yields accurate results much of the time, but can be quite inconsistent. The modified McIlwain field description clearly lagged in overall accuracy compared to the other fields, but matched some data sets (like Dst*) quite well. The rankings between the simulations varied depending on the storm and the individual data sets, indicating that each field description did well for some place, time, and energy range during the events, as well as doing less well in other places, times, and energies. Several unresolved issues regarding the storm-time inner magnetospheric electric field are discussed.

  4. Electronic properties of doped and defective NiO: A quantum Monte Carlo study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Hyeondeok; Luo, Ye; Ganesh, Panchapakesan

    NiO is a canonical Mott (or charge-transfer) insulator and as such is notoriously difficult to describe using density functional theory (DFT) based electronic structure methods. Doped Mott insulators such as NiO are of interest for various applications but rigorous theoretical descriptions are lacking. Here, we use quantum Monte Carlo methods, which very accurately include electron-electron interactions, to examine energetics, charge- and spin-structures of NiO with various point defects, such as vacancies or substitutional doping with potassium. The formation energy of a potassium dopant is significantly lower than for a Ni vacancy, making potassium an attractive monovalent dopant for NiO. We compare our results with DFT results that include an on-site Hubbard U (DFT+U) to account for correlations and find relatively large discrepancies for defect formation energies as well as for charge and spin redistributions in the presence of point defects. Finally, it is unlikely that single-parameter fixes of DFT can provide an accurate account of anything but a single parameter, e.g., the band gap; responses that, perhaps in addition to the band gap, depend in subtle and complex ways on ground-state properties, such as charge and spin densities, are likely to contain quantitative and qualitative errors.

  5. Spectrally resolved opacities and Rosseland and Planck mean opacities of lowly ionized gold plasmas: a detailed level-accounting investigation.

    PubMed

    Zeng, Jiaolong; Yuan, Jianmin

    2007-08-01

    Calculation details of the radiative opacity of lowly ionized gold plasmas, obtained using our fully relativistic detailed level-accounting approach, are presented to show the importance of accurate atomic data for a quantitative reproduction of the experimental observations. Even though a huge number of transition lines is involved in the radiative absorption of high-Z plasmas, so that statistical models are often believed to give a reasonable description of their opacities, we first show in detail that an adequate treatment of physical effects, in particular configuration interaction (including core-valence electron correlation), is essential for producing atomic data for the bound-bound and bound-free processes of gold plasmas that are accurate enough to correctly explain the relative intensity of the two strong absorption peaks observed experimentally near photon energies of 70 and 80 eV. A detailed study is also carried out for a sequence of gold plasmas with an average ionization degree of 10, for both spectrally resolved opacities and Rosseland and Planck means. For comparison, results obtained using an average-atom model are also given to show that, even at relatively higher matter densities, correlation effects are important for predicting the correct positions of the absorption peaks of transition arrays.
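
    For context, the sketch below shows how Planck and Rosseland mean opacities are obtained by integrating a spectrally resolved opacity over the Planck function and its temperature derivative. The opacity profile and temperature are arbitrary toy values, not gold-plasma data.

```python
# Background sketch: Planck and Rosseland mean opacities by direct numerical integration
# of a spectrally resolved opacity kappa(nu). The kappa profile below is a toy function
# with a single absorption peak; it does not reproduce any data from the paper.
import numpy as np

H = 6.62607015e-34      # Planck constant, J s
KB = 1.380649e-23       # Boltzmann constant, J/K
C = 2.99792458e8        # speed of light, m/s

def planck_B(nu, T):
    """Planck spectral radiance B_nu(T)."""
    return (2.0 * H * nu**3 / C**2) / np.expm1(H * nu / (KB * T))

def dB_dT(nu, T, dT=1.0):
    """Temperature derivative of B_nu, by central differences."""
    return (planck_B(nu, T + dT) - planck_B(nu, T - dT)) / (2.0 * dT)

def trapezoid(y, x):
    """Simple trapezoidal integration."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

T = 2.0e5                                                    # temperature, K (toy value)
nu = np.linspace(1.0e15, 5.0e16, 20000)                      # frequency grid, Hz
kappa = 1.0 + 50.0 * np.exp(-((nu - 1.7e16) / 2.0e15) ** 2)  # toy opacity, peak near 70 eV

planck_mean = trapezoid(kappa * planck_B(nu, T), nu) / trapezoid(planck_B(nu, T), nu)
rosseland_mean = trapezoid(dB_dT(nu, T), nu) / trapezoid(dB_dT(nu, T) / kappa, nu)

print(f"Planck mean:    {planck_mean:.2f}")
print(f"Rosseland mean: {rosseland_mean:.2f}")
```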

  6. Electronic properties of doped and defective NiO: A quantum Monte Carlo study

    DOE PAGES

    Shin, Hyeondeok; Luo, Ye; Ganesh, Panchapakesan; ...

    2017-12-28

    NiO is a canonical Mott (or charge-transfer) insulator and as such is notoriously difficult to describe using density functional theory (DFT) based electronic structure methods. Doped Mott insulators such as NiO are of interest for various applications but rigorous theoretical descriptions are lacking. Here, we use quantum Monte Carlo methods, which very accurately include electron-electron interactions, to examine energetics, charge- and spin-structures of NiO with various point defects, such as vacancies or substitutional doping with potassium. The formation energy of a potassium dopant is significantly lower than for a Ni vacancy, making potassium an attractive monovalent dopant for NiO. We compare our results with DFT results that include an on-site Hubbard U (DFT+U) to account for correlations and find relatively large discrepancies for defect formation energies as well as for charge and spin redistributions in the presence of point defects. Finally, it is unlikely that single-parameter fixes of DFT can provide an accurate account of anything but a single parameter, e.g., the band gap; responses that, perhaps in addition to the band gap, depend in subtle and complex ways on ground-state properties, such as charge and spin densities, are likely to contain quantitative and qualitative errors.

  7. A Quantitative Study Identifying Political Strategies Used by Principals of Dual Language Programs

    ERIC Educational Resources Information Center

    Girard, Guadalupe

    2017-01-01

    Purpose. The purpose of this quantitative study was to identify the external and internal political strategies used by principals that allow them to successfully navigate the political environment surrounding dual language programs. Methodology. This quantitative study used descriptive research to collect, analyze, and report data that identified…

  8. Discovering the Quantity of Quality: Scoring "Regional Identity" for Quantitative Research

    ERIC Educational Resources Information Center

    Miller, Daniel A.

    2008-01-01

    The variationist paradigm in sociolinguistics is at a disadvantage when dealing with variables that are traditionally treated qualitatively, e.g., "identity". This study essays to level the accuracy and descriptive value of qualitative research in a quantitative setting by rendering such a variable quantitatively accessible. To this end,…

  9. Quantitative proteomics in the field of microbiology.

    PubMed

    Otto, Andreas; Becher, Dörte; Schmidt, Frank

    2014-03-01

    Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research, from the in vitro characterization of single organisms, to unraveling the physiological implications of stress and starvation, to describing the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling as well as label-free approaches, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we focus on the vast range of techniques practically applied in current research, with an introduction to the workflows used for quantitative comparisons, a description of the advantages and disadvantages of the various methods, references to hallmark publications, and a presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Computational understanding of Li-ion batteries

    NASA Astrophysics Data System (ADS)

    Urban, Alexander; Seo, Dong-Hwa; Ceder, Gerbrand

    2016-03-01

    Over the last two decades, computational methods have made tremendous advances, and today many key properties of lithium-ion batteries can be accurately predicted by first principles calculations. For this reason, computations have become a cornerstone of battery-related research by providing insight into fundamental processes that are not otherwise accessible, such as ionic diffusion mechanisms and electronic structure effects, as well as a quantitative comparison with experimental results. The aim of this review is to provide an overview of state-of-the-art ab initio approaches for the modelling of battery materials. We consider techniques for the computation of equilibrium cell voltages, 0-Kelvin and finite-temperature voltage profiles, ionic mobility and thermal and electrolyte stability. The strengths and weaknesses of different electronic structure methods, such as DFT+U and hybrid functionals, are discussed in the context of voltage and phase diagram predictions, and we review the merits of lattice models for the evaluation of finite-temperature thermodynamics and kinetics. With such a complete set of methods at hand, first principles calculations of ordered, crystalline solids, i.e., of most electrode materials and solid electrolytes, have become reliable and quantitative. However, the description of molecular materials and disordered or amorphous phases remains an important challenge. We highlight recent exciting progress in this area, especially regarding the modelling of organic electrolytes and solid-electrolyte interfaces.
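
    As a concrete illustration of one of the quantities mentioned above, the worked example below evaluates the standard relation used for first-principles average cell voltages from total energies. The energies are invented placeholders, not computed values from the review.

```python
# Worked example of the standard average-voltage relation used in first-principles
# battery modelling (all energies below are invented placeholders):
#   V = -[E(Li_x2 Host) - E(Li_x1 Host) - (x2 - x1) * E(Li metal)] / [(x2 - x1) * e]
# With total energies per formula unit in eV and e = 1 elementary charge per Li,
# the voltage comes out directly in volts.

def average_voltage(e_lithiated, e_delithiated, x2, x1, e_li_metal):
    """Average intercalation voltage between Li contents x1 < x2 (energies in eV)."""
    delta_x = x2 - x1
    reaction_energy = e_lithiated - e_delithiated - delta_x * e_li_metal
    return -reaction_energy / delta_x

# Hypothetical DFT total energies per formula unit (eV)
E_LiMO2 = -45.30   # fully lithiated electrode
E_MO2 = -39.50     # delithiated electrode
E_Li = -1.90       # Li metal reference, per atom

print(round(average_voltage(E_LiMO2, E_MO2, x2=1.0, x1=0.0, e_li_metal=E_Li), 2), "V")
```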

  11. Validation Process for LEWICE Coupled by Use of a Navier-stokes Solver

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    2016-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth for many meteorological conditions for any aircraft surface. This report will present results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed phase and ice crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured grid flow solvers. An extensive quantitative comparison of the results against the database of ice shapes generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper will show the differences in ice shape between LEWICE 3.5 and experimental data. In addition, comparisons will be made between the lift and drag calculated on the ice shapes from experiment and those produced by LEWICE. This report will also provide a description of both programs. Quantitative geometric comparisons are shown for horn height, horn angle, icing limit, area and leading edge thickness. Quantitative comparisons of calculated lift and drag will also be shown. The comparisons show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  12. A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS

    EPA Science Inventory

    While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...

  13. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After numerous studies of the reaction dynamics, the PE company found a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable, and on this basis developed the quantitative PCR technique used in the PE7700 and PE5700 instruments. The error of this technique, however, is too large to satisfy the needs of biotechnology development and clinical research, and a better quantitative PCR technique is needed. The mathematical model presented here draws on results from related sciences and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. This model describes the functional relation between product quantity (or fluorescence intensity) and the initial template number and other reaction conditions, and accurately reflects the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this functional relation, and the accumulated product quantity can be obtained from the initial template number. When this model is used for quantitative PCR analysis, the error of the result depends only on the accuracy of the fluorescence intensity measurement, that is, on the instrument used. For example, when the fluorescence intensity is accurate to six digits and the template number is between 100 and 1,000,000, the accuracy of the quantitative result exceeds 99%. Under the same conditions and on the same instrument, the result error differs markedly between analysis methods; moreover, if the proposed quantitative analysis system is used to process the data, the results are about 80 times more accurate than those obtained with the CT method.
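
    The authors' full model is not reproduced in the abstract; as background, the sketch below shows the conventional exponential-accumulation picture that threshold-cycle (CT) quantitation relies on, which is the baseline the paper compares against. All numbers are illustrative.

```python
# Minimal sketch of the conventional exponential model behind CT-based quantitation.
import math

def product_after_cycles(n0, cycles, efficiency=1.0):
    """Product copies after c cycles: N_c = N_0 * (1 + E)^c, with 0 < E <= 1."""
    return n0 * (1.0 + efficiency) ** cycles

def initial_copies_from_ct(ct, threshold_copies, efficiency=1.0):
    """Invert the relation: N_0 = N_threshold / (1 + E)^CT."""
    return threshold_copies / (1.0 + efficiency) ** ct

threshold = 1.0e10                      # copies at which fluorescence crosses the threshold
ct_measured = 24.3                      # measured threshold cycle of an unknown sample
n0 = initial_copies_from_ct(ct_measured, threshold, efficiency=0.95)
print(f"estimated initial template number: {n0:.3g} copies")

# Sanity check: the forward model with the estimate reproduces the threshold level
print(f"{product_after_cycles(n0, ct_measured, 0.95):.3g}")
```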

  14. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
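
    As a generic illustration only (not the authors' SEEFIM procedure), the sketch below rolls up per-step failure estimates in a simple series fault tree; the step names and rates are invented.

```python
# Illustrative roll-up of per-step failure estimates: the task fails if any step fails,
# assuming independent steps. Names and rates are hypothetical.
from math import prod

step_failure_rates = {
    "select_patient": 0.02,
    "scan_medication_barcode": 0.05,
    "resolve_dose_alert": 0.10,
    "document_administration": 0.03,
}

p_task_failure = 1.0 - prod(1.0 - p for p in step_failure_rates.values())
print(f"estimated task-level failure probability: {p_task_failure:.3f}")
```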

  15. Three-dimensional segmentation of luminal and adventitial borders in serial intravascular ultrasound images

    NASA Technical Reports Server (NTRS)

    Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.

    1999-01-01

    Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and Williams index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.

  16. Using Qualitative Metasummary to Synthesize Qualitative and Quantitative Descriptive Findings

    PubMed Central

    Sandelowski, Margarete; Barroso, Julie; Voils, Corrine I.

    2008-01-01

    The new imperative in the health disciplines to be more methodologically inclusive has generated a growing interest in mixed research synthesis, or the integration of qualitative and quantitative research findings. Qualitative metasummary is a quantitatively oriented aggregation of qualitative findings originally developed to accommodate the distinctive features of qualitative surveys. Yet these findings are similar in form and mode of production to the descriptive findings researchers often present in addition to the results of bivariate and multivariable analyses. Qualitative metasummary, which includes the extraction, grouping, and formatting of findings, and the calculation of frequency and intensity effect sizes, can be used to produce mixed research syntheses and to conduct a posteriori analyses of the relationship between reports and findings. PMID:17243111
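
    The sketch below illustrates the frequency and intensity effect sizes mentioned above, using the usual definitions (frequency: the proportion of reports containing a given finding; intensity: the proportion of all distinct findings contained in a given report). The reports and findings are invented for the example.

```python
# Toy computation of frequency and intensity effect sizes for a qualitative metasummary.
findings_by_report = {
    "report_A": {"stigma", "disclosure", "side_effects"},
    "report_B": {"stigma", "adherence"},
    "report_C": {"stigma", "disclosure", "adherence", "social_support"},
}

all_findings = set().union(*findings_by_report.values())
n_reports = len(findings_by_report)

# Frequency effect size for each finding: share of reports that contain it
frequency = {
    f: sum(f in findings for findings in findings_by_report.values()) / n_reports
    for f in sorted(all_findings)
}

# Intensity effect size for each report: share of all distinct findings it contains
intensity = {
    report: len(findings) / len(all_findings)
    for report, findings in findings_by_report.items()
}

print("frequency effect sizes:", frequency)
print("intensity effect sizes:", intensity)
```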

  17. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

    As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions while avoiding aberration. However, it requires field-of-view (FoV) scanning, which often relies on mechanical translation; this not only slows down the measurement, but also introduces mechanical errors that decrease both the resolution and the accuracy of the retrieved information. In order to achieve highly accurate quantitative imaging at fast speed, a digital micromirror device (DMD) is adopted in PIE for large-FoV scanning, controlled by coding the on/off states of the DMD mirrors. Measurements were implemented using biological samples as well as a USAF resolution target, proving the high resolution of quantitative imaging with the proposed system. Considering its fast and accurate imaging capability, it is believed that the DMD-based PIE technique provides a potential solution for medical observation and measurements.
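
    For readers unfamiliar with PIE reconstruction, the sketch below shows a single ePIE-style object update at one scan position; it is not the authors' DMD implementation, and the probe, object, and "measured" amplitude are random placeholders.

```python
# Single ePIE-style update, the core step of ptychographic iterative engine reconstruction.
import numpy as np

rng = np.random.default_rng(1)
n = 64
obj = np.ones((n, n), dtype=complex)                                   # current object estimate
probe = np.exp(-((np.indices((n, n)) - n / 2) ** 2).sum(0) / 200.0)    # known illumination (Gaussian)
measured_amplitude = np.abs(np.fft.fft2(probe * rng.random((n, n))))   # |FFT| "data" for one position

def pie_update(obj, probe, measured_amplitude, alpha=1.0):
    """One PIE object update at a single scan position."""
    exit_wave = probe * obj                                  # exit wave with current estimate
    farfield = np.fft.fft2(exit_wave)
    # Enforce the measured diffraction amplitude, keep the estimated phase
    corrected = measured_amplitude * np.exp(1j * np.angle(farfield))
    new_exit_wave = np.fft.ifft2(corrected)
    # Update the object where the probe is strong
    update = np.conj(probe) / (np.max(np.abs(probe)) ** 2) * (new_exit_wave - exit_wave)
    return obj + alpha * update

obj = pie_update(obj, probe, measured_amplitude)
print(obj.shape, np.abs(obj).mean().round(3))
```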

  18. Calibration and data collection protocols for reliable lattice parameter values in electron pair distribution function studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abeykoon, A. M. Milinda; Hu, Hefei; Wu, Lijun

    2015-01-30

    Different protocols for calibrating electron pair distribution function (ePDF) measurements are explored and described for quantitative studies on nanomaterials. It is found that the most accurate approach to determine the camera length is to use a standard calibration sample of Au nanoparticles from the National Institute of Standards and Technology. Different protocols for data collection are also explored, as are possible operational errors, to find the best approaches for accurate data collection for quantitative ePDF studies.

  19. Calibration and data collection protocols for reliable lattice parameter values in electron pair distribution function (ePDF) studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abeykoon, A. M. Milinda; Hu, Hefei; Wu, Lijun

    2015-02-01

    We explore and describe different protocols for calibrating electron pair distribution function (ePDF) measurements for quantitative studies on nano-materials. We find the most accurate approach to determine the camera-length is to use a standard calibration sample of Au nanoparticles from National Institute of Standards and Technology. Different protocols for data collection are also explored, as are possible operational errors, to find the best approaches for accurate data collection for quantitative ePDF studies.

  20. Accurate quantitation standards of glutathione via traceable sulfur measurement by inductively coupled plasma optical emission spectrometry and ion chromatography

    PubMed Central

    Rastogi, L.; Dash, K.; Arunachalam, J.

    2013-01-01

    The quantitative analysis of glutathione (GSH) is important in different fields such as medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometric sulfur content of purified GSH offers an approach to its quantitation, and calibration through an appropriate certified reference material (CRM) for sulfur provides a methodology for the certification of the GSH quantity that is traceable to the SI (International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the “high performance” methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U) expressed at the 95% confidence interval for ICP-OES analyses varied from 0.1% to 0.3%, while for IC they were between 0.2% and 1.2%. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. PMID:29403814
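
    The arithmetic behind the approach is simple because glutathione contains one sulfur atom per molecule; the worked sketch below converts a traceable sulfur mass fraction into a GSH content. The measured value is an invented example, not data from the paper.

```python
# Worked stoichiometric conversion: sulfur mass fraction -> GSH mass fraction.
M_GSH = 307.32   # molar mass of glutathione, g/mol
M_S = 32.06      # molar mass of sulfur, g/mol

sulfur_mass_fraction = 0.1021     # e.g. measured by ICP-OES / IC, g S per g of sample (hypothetical)

# One mole of S corresponds to one mole of GSH
gsh_mass_fraction = sulfur_mass_fraction * (M_GSH / M_S)
print(f"GSH mass fraction: {gsh_mass_fraction:.4f} g/g  ({gsh_mass_fraction * 100:.2f} %)")
```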

  1. Preparing Tomorrow's Administrators: A Quantitative Correlation Study of the Relationship between Emotional Intelligence and Effective Leadership Practices

    ERIC Educational Resources Information Center

    May-Vollmar, Kelly

    2017-01-01

    Purpose: The purpose of this quantitative correlation study was to identify whether there is a relationship between emotional intelligence and effective leadership practices, specifically with school administrators in Southern California K-12 public schools. Methods: This study was conducted using a quantitative descriptive design, correlation…

  2. Correction for isotopic interferences between analyte and internal standard in quantitative mass spectrometry by a nonlinear calibration function.

    PubMed

    Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L

    2013-04-16

    Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
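
    The sketch below illustrates the kind of nonlinear calibration the paper motivates: if a fixed fraction of the analyte signal spills into the internal-standard channel, the response ratio becomes hyperbolic rather than linear in concentration. The functional form and data are illustrative assumptions, not the authors' exact fit.

```python
# Hedged sketch: fitting a nonlinear calibration when the analyte contributes to the
# internal-standard (IS) signal, then inverting the fit to quantify an unknown.
import numpy as np
from scipy.optimize import curve_fit

def response_ratio(conc, a, k):
    """Analyte/IS signal ratio with analyte cross-contribution into the IS channel."""
    return a * conc / (1.0 + k * conc)

# Synthetic calibration data generated with a = 0.5, k = 0.02 plus a little noise
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 25.0, 50.0])
rng = np.random.default_rng(2)
ratio = response_ratio(conc, 0.5, 0.02) * (1 + 0.01 * rng.standard_normal(conc.size))

(a_fit, k_fit), _ = curve_fit(response_ratio, conc, ratio, p0=(1.0, 0.0))

def concentration_from_ratio(r):
    """Invert the fitted calibration: r = a*c/(1 + k*c)  =>  c = r/(a - k*r)."""
    return r / (a_fit - k_fit * r)

print(f"a = {a_fit:.3f}, k = {k_fit:.4f}")
print("unknown at ratio 2.0 ->", round(concentration_from_ratio(2.0), 2))
```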

  3. 21 CFR 514.1 - Applications.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... post office address of, and must be countersigned by, an authorized attorney, agent, or official... pharmacologically related drugs. (c) Description of dosage form and quantitative composition. (ii) Scientific... quantitative statement of composition. Reasonable alternatives for any listed component may be specified. (ii...

  4. 21 CFR 514.1 - Applications.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... post office address of, and must be countersigned by, an authorized attorney, agent, or official... pharmacologically related drugs. (c) Description of dosage form and quantitative composition. (ii) Scientific... quantitative statement of composition. Reasonable alternatives for any listed component may be specified. (ii...

  5. 21 CFR 514.1 - Applications.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... post office address of, and must be countersigned by, an authorized attorney, agent, or official... pharmacologically related drugs. (c) Description of dosage form and quantitative composition. (ii) Scientific... quantitative statement of composition. Reasonable alternatives for any listed component may be specified. (ii...

  6. 21 CFR 514.1 - Applications.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... post office address of, and must be countersigned by, an authorized attorney, agent, or official... pharmacologically related drugs. (c) Description of dosage form and quantitative composition. (ii) Scientific... quantitative statement of composition. Reasonable alternatives for any listed component may be specified. (ii...

  7. 21 CFR 514.1 - Applications.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... post office address of, and must be countersigned by, an authorized attorney, agent, or official... pharmacologically related drugs. (c) Description of dosage form and quantitative composition. (ii) Scientific... quantitative statement of composition. Reasonable alternatives for any listed component may be specified. (ii...

  8. Getting the most from dermatopathology.

    PubMed

    Campbell, Gregory A; Sauber, Leslie

    2007-03-01

    Dermatohistopathology is one of the most powerful diagnostic tools in clinical dermatology. It is a process in which the veterinary clinician and the veterinary pathologist must consider themselves a team in patient care. The veterinary clinician must know when biopsies are indicated; be able to select lesions to biopsy that are likely to yield diagnostic results; skillfully procure the biopsy samples; and provide the pathologist with an accurate history, clinical description, and clinical differential diagnosis. The pathologist should have particular interest and expertise in dermatohistopathology, be readily accessible to the clinician, and be vigilant in the pursuit of an accurate histologic description and diagnosis.

  9. Accurate virus quantitation using a Scanning Transmission Electron Microscopy (STEM) detector in a scanning electron microscope.

    PubMed

    Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G

    2017-10-01

    A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Quantitative Pedagogy: A Digital Two Player Game to Examine Communicative Competence.

    PubMed

    Lopez-Rosenfeld, Matías; Carrillo, Facundo; Garbulsky, Gerry; Fernandez Slezak, Diego; Sigman, Mariano

    2015-01-01

    Inner concepts are much richer than the words that describe them. Our general objective is to inquire into the best procedures for communicating conceptual knowledge. We constructed a simplified and controlled setup, emulating important variables of pedagogy, that is amenable to quantitative analysis. To this aim, we designed a game inspired by Chinese Whispers to investigate which attributes of a description affect its capacity to faithfully convey an image. This is a two-player game with an emitter and a receiver. The emitter was shown a simple geometric figure and was asked to describe it in words, and was informed that this description would be passed to the receiver, who had to replicate the drawing from the description. We capitalized on the vast data obtained from an Android app to quantify the effect of different aspects of a description on communication precision. We show that descriptions communicate an image more effectively when they are coherent and when they are procedural. In contrast, creativity, the use of metaphors, and the use of mathematical concepts do not affect fidelity.

  11. Quantitative Pedagogy: A Digital Two Player Game to Examine Communicative Competence

    PubMed Central

    Lopez-Rosenfeld, Matías; Carrillo, Facundo; Garbulsky, Gerry; Fernandez Slezak, Diego; Sigman, Mariano

    2015-01-01

    Inner concepts are much richer than the words that describe them. Our general objective is to inquire into the best procedures for communicating conceptual knowledge. We constructed a simplified and controlled setup, emulating important variables of pedagogy, that is amenable to quantitative analysis. To this aim, we designed a game inspired by Chinese Whispers to investigate which attributes of a description affect its capacity to faithfully convey an image. This is a two-player game with an emitter and a receiver. The emitter was shown a simple geometric figure and was asked to describe it in words, and was informed that this description would be passed to the receiver, who had to replicate the drawing from the description. We capitalized on the vast data obtained from an Android app to quantify the effect of different aspects of a description on communication precision. We show that descriptions communicate an image more effectively when they are coherent and when they are procedural. In contrast, creativity, the use of metaphors, and the use of mathematical concepts do not affect fidelity. PMID:26554833

  12. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    PubMed

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. Copyright © RSNA, 2011.
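
    As a rough illustration of the workflow described above, the sketch below exports a single measurement as XML. The element and attribute names are invented for the example and do not reproduce the actual AIM schema.

```python
# Illustrative export of an image measurement to XML (hypothetical element names).
import xml.etree.ElementTree as ET

def measurement_to_xml(label, value, unit, accession):
    annotation = ET.Element("ImageAnnotation", attrib={"accession": accession})
    meas = ET.SubElement(annotation, "Measurement", attrib={"label": label})
    ET.SubElement(meas, "Value").text = f"{value:g}"
    ET.SubElement(meas, "Unit").text = unit
    return ET.tostring(annotation, encoding="unicode")

# Example: a lesion diameter measured at the workstation, ready to be stored in a
# database or merged into the report by the reporting software
print(measurement_to_xml("lesion_long_axis", 23.4, "mm", accession="A123456"))
```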

  13. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  14. Optimization of metabolite basis sets prior to quantitation in magnetic resonance spectroscopy: an approach based on quantum mechanics

    NASA Astrophysics Data System (ADS)

    Lazariev, A.; Allouche, A.-R.; Aubert-Frécon, M.; Fauvelle, F.; Piotto, M.; Elbayed, K.; Namer, I.-J.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role for diagnosis. This technique enables setting up metabolite profiles of ex vivo pathological and healthy tissue. The need to monitor diseases and pharmaceutical follow-up requires an automatic quantitation of HRMAS 1H signals. However, for several metabolites, the values of chemical shifts of proton groups may slightly differ according to the micro-environment in the tissue or cells, in particular to its pH. This hampers the accurate estimation of the metabolite concentrations mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are not correct anymore. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the mentioned signals. Optimized chemical shift values of the metabolites are obtained. This method, QM-QUEST, provides more robust fitting while limiting user involvement and respects the correct fingerprints of metabolites. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed.
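
    The shift-correction idea can be illustrated with a small synthetic example: find the frequency offset of a simulated metabolite signal by maximizing its normalized cross-correlation with the measured spectrum. Gaussian "peaks" stand in for real basis-set and in vivo spectra; this is not the QM-QUEST implementation.

```python
# Toy shift correction by maximizing the normalized cross-correlation.
import numpy as np

axis = np.linspace(-1.0, 1.0, 2001)          # chemical-shift axis (arbitrary units)

def gaussian_peak(center):
    return np.exp(-((axis - center) ** 2) / (2.0 * 0.01 ** 2))

measured = gaussian_peak(0.137)              # peak position in the tissue spectrum (pH-shifted)
basis = gaussian_peak(0.120)                 # peak position in the simulated basis set

def best_shift(reference, template, max_shift_points=200):
    """Integer shift (in points) maximizing the normalized cross-correlation."""
    best, best_score = 0, -np.inf
    for s in range(-max_shift_points, max_shift_points + 1):
        shifted = np.roll(template, s)
        score = np.dot(reference, shifted) / (np.linalg.norm(reference) * np.linalg.norm(shifted))
        if score > best_score:
            best, best_score = s, score
    return best, best_score

shift, score = best_shift(measured, basis)
step = axis[1] - axis[0]
print(f"optimal shift: {shift * step:+.4f} units (correlation {score:.3f})")
```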

  15. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe.

    PubMed

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-09

    Accurate quantitation of intracellular pH (pHi) is of great importance for revealing cellular activities and providing early warning of disease. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Here, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonant energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) serve as the energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+,Tm3+ UCNPs were used as the pHi-responsive signal and the self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can introduce relatively large uncertainty into the results. Owing to efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing was achieved, with a response of 3.56 per unit change in pHi over the range 3.0-7.0 and a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
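
    A toy illustration of the self-ratiometric readout, assuming an approximately linear ratio-versus-pH calibration over 3.0-7.0. The 3.56-per-pH-unit figure is taken from the abstract; the intercept and the sign convention are hypothetical calibration details that would be fit against buffer standards.

      # Convert the 475/645 nm intensity ratio to an estimated pHi under an
      # assumed linear calibration: ratio = slope * pH + intercept.
      def ratio_to_pH(I_475, I_645, slope=3.56, intercept=0.0):
          ratio = I_475 / I_645        # pH-responsive band over reference band
          return (ratio - intercept) / slope

      # e.g. ratio_to_pH(12.3, 4.1) -> estimated pHi for one cell or region of interest
      print(ratio_to_pH(12.3, 4.1))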

  16. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NASA Astrophysics Data System (ADS)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-01

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing the cellular activities and early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. Till now, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding the accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on förster resonant energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) were served as energy acceptor and donor, respectively. Under 980 nm excitation, upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as pHi response and self-ratiometric reference signal, respectively. This direct quantitative sensing approach has circumvented the traditional software-based subsequent processing of images which may lead to relatively large uncertainty of the results. Due to efficient FRET and fluorescence background free, a highly-sensitive and accurate sensing has been achieved, featured by 3.56 per unit change in pHi value 3.0-7.0 with deviation less than 0.43. This approach shall facilitate the researches in pHi related areas and development of the intracellular drug delivery systems.

  17. Application of the Rangeland Hydrology and Erosion Model to Ecological Site Descriptions and Management

    USDA-ARS?s Scientific Manuscript database

    The utility of Ecological Site Description (ESD) and State-and-Transition Model (STM) concepts in guiding rangeland management hinges on their ability to accurately describe and predict community dynamics and the associated consequences. For many rangeland ecosystems, plant community dynamics ar...

  18. Analytical method for the accurate determination of tricothecenes in grains using LC-MS/MS: a comparison between MRM transition and MS3 quantitation.

    PubMed

    Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn

    2012-07-01

    The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze tricothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS(3) quantitation strategies in tandem. The tricothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyldeoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate adducts and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation under MRM transition and MS(3) (linear ion trap) modes. Three experiments were made for each quantitation mode and matrix in batches over 6 days for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg kg(-1) (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three levels of spiking concentration of 40, 80, and 120 μg kg(-1). A quantitation limit of 2-6 μg kg(-1) was achieved by applying an MRM transition quantitation strategy. Under MS(3) mode, a quantitation limit of 4-10 μg kg(-1) was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS(3) quantitation, respectively. The successful utilization of MS(3) enabled accurate analyte fragmentation pattern matching and its quantitation, leading to the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are unavailable under MRM transition.
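
    For readers unfamiliar with the figures quoted above, spike recovery and relative standard deviation are computed roughly as in the sketch below; the triplicate values are made up and are not the paper's data.

      import numpy as np

      def recovery_percent(measured_ug_kg, spiked_ug_kg):
          # measured concentration as a percentage of the spiked (nominal) level
          return 100.0 * np.asarray(measured_ug_kg, dtype=float) / spiked_ug_kg

      def rsd_percent(replicates):
          # relative standard deviation of replicate measurements, in percent
          r = np.asarray(replicates, dtype=float)
          return 100.0 * r.std(ddof=1) / r.mean()

      replicates_at_80 = [76.1, 81.4, 79.0]   # hypothetical triplicate at an 80 ug/kg spike
      print(recovery_percent(replicates_at_80, 80.0).mean())   # mean recovery, %
      print(rsd_percent(replicates_at_80))                     # precision (RSD), %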

  19. Langley Atmospheric Information Retrieval System (LAIRS): System description and user's guide

    NASA Technical Reports Server (NTRS)

    Boland, D. E., Jr.; Lee, T.

    1982-01-01

    This document presents the user's guide, system description, and mathematical specifications for the Langley Atmospheric Information Retrieval System (LAIRS). It also includes a description of an optimal procedure for operational use of LAIRS. The primary objective of the LAIRS Program is to make it possible to obtain accurate estimates of atmospheric pressure, density, temperature, and winds along Shuttle reentry trajectories for use in postflight data reduction.

  20. Quantitative characterization of viscoelastic behavior in tissue-mimicking phantoms and ex vivo animal tissues.

    PubMed

    Maccabi, Ashkan; Shin, Andrew; Namiri, Nikan K; Bajwa, Neha; St John, Maie; Taylor, Zachary D; Grundfest, Warren; Saddik, George N

    2018-01-01

    Viscoelasticity of soft tissue is often related to pathology, and therefore, has become an important diagnostic indicator in the clinical assessment of suspect tissue. Surgeons, particularly within head and neck subsites, typically use palpation techniques for intra-operative tumor detection. This detection method, however, is highly subjective and often fails to detect small or deep abnormalities. Vibroacoustography (VA) and similar methods have previously been used to distinguish tissue with high-contrast, but a firm understanding of the main contrast mechanism has yet to be verified. The contributions of tissue mechanical properties in VA images have been difficult to verify given the limited literature on viscoelastic properties of various normal and diseased tissue. This paper aims to investigate viscoelasticity theory and present a detailed description of viscoelastic experimental results obtained in tissue-mimicking phantoms (TMPs) and ex vivo tissues to verify the main contrast mechanism in VA and similar imaging modalities. A spherical-tip micro-indentation technique was employed with the Hertzian model to acquire absolute, quantitative, point measurements of the elastic modulus (E), long term shear modulus (η), and time constant (τ) in homogeneous TMPs and ex vivo tissue in rat liver and porcine liver and gallbladder. Viscoelastic differences observed between porcine liver and gallbladder tissue suggest that imaging modalities which utilize the mechanical properties of tissue as a primary contrast mechanism can potentially be used to quantitatively differentiate between proximate organs in a clinical setting. These results may facilitate more accurate tissue modeling and add information not currently available to the field of systems characterization and biomedical research.
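
    A minimal sketch of the Hertzian spherical-indentation fit that yields the elastic modulus, assuming the standard contact relation F = (4/3) * E/(1-nu^2) * sqrt(R) * d^1.5 for force F, indentation depth d, tip radius R and Poisson ratio nu. The tip radius, Poisson ratio and synthetic data are assumptions; the paper's full viscoelastic analysis (eta, tau) additionally requires the time-dependent response.

      import numpy as np
      from scipy.optimize import curve_fit

      R = 0.5e-3     # indenter tip radius (m), assumed
      nu = 0.5       # Poisson ratio, ~0.5 for nearly incompressible soft tissue

      def hertz(depth, E):
          # Hertz force-indentation law for a rigid sphere on an elastic half-space
          return (4.0 / 3.0) * (E / (1 - nu ** 2)) * np.sqrt(R) * depth ** 1.5

      depth = np.linspace(0, 200e-6, 50)                               # indentation depth (m)
      force = hertz(depth, 5e3) + 1e-6 * np.random.randn(depth.size)   # synthetic data, E = 5 kPa
      E_fit, _ = curve_fit(hertz, depth, force, p0=[1e3])
      print(f"fitted elastic modulus: {E_fit[0] / 1e3:.1f} kPa")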

  1. Three dimensional motion capture applied to violin playing: A study on feasibility and characterization of the motor strategy.

    PubMed

    Ancillao, Andrea; Savastano, Bernardo; Galli, Manuela; Albertini, Giorgio

    2017-10-01

    Playing string instruments requires advanced motor skills and a long training that is often spent in uncomfortable postures that may lead to injuries or musculoskeletal disorders. It is therefore of interest to objectively characterize the motor strategy adopted by the players. In this work, we implemented a method for the quantitative analysis of the motor performance of a violin player. The proposed protocol uses an optoelectronic system and infra-red reflecting markers to track the player's motion. The method was tested on a professional violin player performing a legato bowing task. The biomechanical strategy of the upper limb and bow positioning were described by means of quantitative parameters and motion profiles. Measured quantities were: bow trajectory, angles, tracks, velocity, acceleration and jerk. A good repeatability of the bowing motion (CV < 2%) and high smoothness (jerk < 5 m/s^3) were observed. Motion profiles of shoulder, elbow and wrist were repeatable (CV < 7%) and comparable to the curves observed in other studies. Jerk and acceleration profiles demonstrated high smoothness in the ascending and descending phases of bowing. High variability was instead observed for the neck angle (CV ∼56%). "Quantitative" measurements, instead of "qualitative" observation, can support the diagnosis of motor disorders and the accurate evaluation of musicians' skills. The proposed protocol is a powerful tool for describing a musician's performance and may be useful for documenting improvements in playing ability and adjusting training strategies. Copyright © 2017 Elsevier B.V. All rights reserved.
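
    The derived quantities reported above (velocity, acceleration, jerk, and the coefficient of variation) can be obtained from a sampled trajectory roughly as follows; the sampling rate and the synthetic bowing signal are placeholders for real optoelectronic data.

      import numpy as np

      fs = 100.0                                   # sampling rate (Hz), assumed
      t = np.arange(0, 2, 1 / fs)
      x = 0.3 * np.sin(2 * np.pi * 1.0 * t)        # synthetic 1 Hz bowing displacement (m)

      v = np.gradient(x, 1 / fs)                   # velocity (m/s)
      a = np.gradient(v, 1 / fs)                   # acceleration (m/s^2)
      j = np.gradient(a, 1 / fs)                   # jerk (m/s^3)

      def cv_percent(values):
          # coefficient of variation: cycle-to-cycle repeatability in percent
          values = np.asarray(values, dtype=float)
          return 100.0 * values.std(ddof=1) / values.mean()

      stroke_durations = [0.98, 1.01, 0.99, 1.02]  # hypothetical per-stroke durations (s)
      print(cv_percent(stroke_durations))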

  2. Quantitative characterization of viscoelastic behavior in tissue-mimicking phantoms and ex vivo animal tissues

    PubMed Central

    Shin, Andrew; Namiri, Nikan K.; Bajwa, Neha; St. John, Maie; Taylor, Zachary D.; Grundfest, Warren; Saddik, George N.

    2018-01-01

    Viscoelasticity of soft tissue is often related to pathology, and therefore, has become an important diagnostic indicator in the clinical assessment of suspect tissue. Surgeons, particularly within head and neck subsites, typically use palpation techniques for intra-operative tumor detection. This detection method, however, is highly subjective and often fails to detect small or deep abnormalities. Vibroacoustography (VA) and similar methods have previously been used to distinguish tissue with high-contrast, but a firm understanding of the main contrast mechanism has yet to be verified. The contributions of tissue mechanical properties in VA images have been difficult to verify given the limited literature on viscoelastic properties of various normal and diseased tissue. This paper aims to investigate viscoelasticity theory and present a detailed description of viscoelastic experimental results obtained in tissue-mimicking phantoms (TMPs) and ex vivo tissues to verify the main contrast mechanism in VA and similar imaging modalities. A spherical-tip micro-indentation technique was employed with the Hertzian model to acquire absolute, quantitative, point measurements of the elastic modulus (E), long term shear modulus (η), and time constant (τ) in homogeneous TMPs and ex vivo tissue in rat liver and porcine liver and gallbladder. Viscoelastic differences observed between porcine liver and gallbladder tissue suggest that imaging modalities which utilize the mechanical properties of tissue as a primary contrast mechanism can potentially be used to quantitatively differentiate between proximate organs in a clinical setting. These results may facilitate more accurate tissue modeling and add information not currently available to the field of systems characterization and biomedical research. PMID:29373598

  3. Propellant Chemistry for CFD Applications

    NASA Technical Reports Server (NTRS)

    Farmer, R. C.; Anderson, P. G.; Cheng, Gary C.

    1996-01-01

    Current concepts for reusable launch vehicle design have created renewed interest in the use of RP-1 fuels for high pressure and tri-propellant propulsion systems. Such designs require the use of an analytical technology that accurately accounts for the effects of real fluid properties, combustion of large hydrocarbon fuel molecules, and the possibility of soot formation. These effects are inadequately treated in current computational fluid dynamic (CFD) codes used for propulsion system analyses. The objective of this investigation is to provide an accurate analytical description of hydrocarbon combustion thermodynamics and kinetics that is sufficiently computationally efficient to be a practical design tool when used with CFD codes such as the FDNS code. A rigorous description of real fluid properties for RP-1 and its combustion products will be derived from the literature and from experiments conducted in this investigation. Upon the establishment of such a description, the fluid description will be simplified by using the minimum of empiricism necessary to maintain accurate combustion analyses and including such empirical models into an appropriate CFD code. An additional benefit of this approach is that the real fluid properties analysis simplifies the introduction of the effects of droplet sprays into the combustion model. Typical species compositions of RP-1 have been identified, surrogate fuels have been established for analyses, and combustion and sooting reaction kinetics models have been developed. Methods for predicting the necessary real fluid properties have been developed and essential experiments have been designed. Verification studies are in progress, and preliminary results from these studies will be presented. The approach has been determined to be feasible, and upon its completion the required methodology for accurate performance and heat transfer CFD analyses for high pressure, tri-propellant propulsion systems will be available.

  4. Accurate single-shot quantitative phase imaging of biological specimens with telecentric digital holographic microscopy.

    PubMed

    Doblas, Ana; Sánchez-Ortiga, Emilio; Martínez-Corral, Manuel; Saavedra, Genaro; Garcia-Sucerquia, Jorge

    2014-04-01

    The advantages of using a telecentric imaging system in digital holographic microscopy (DHM) to study biological specimens are highlighted. To this end, the performances of nontelecentric DHM and telecentric DHM are evaluated from the quantitative phase imaging (QPI) point of view. The evaluated stability of the microscope allows single-shot QPI in DHM by using telecentric imaging systems. Quantitative phase maps of a section of the head of the Drosophila melanogaster fly and of red blood cells are obtained via single-shot DHM with no numerical postprocessing. With these maps we show that the use of telecentric DHM provides a larger field of view for a given magnification and permits more accurate QPI measurements with fewer computational operations.

  5. Stroke onset time estimation from multispectral quantitative magnetic resonance imaging in a rat model of focal permanent cerebral ischemia.

    PubMed

    McGarry, Bryony L; Rogers, Harriet J; Knight, Michael J; Jokivarsi, Kimmo T; Sierra, Alejandra; Gröhn, Olli Hj; Kauppinen, Risto A

    2016-08-01

    Quantitative T2 relaxation magnetic resonance imaging allows estimation of stroke onset time. We aimed to examine the accuracy of quantitative T1 and quantitative T2 relaxation times, alone and in combination, in estimating stroke onset time in a rat model of permanent focal cerebral ischemia, and to map the spatial distribution of elevated quantitative T1 and quantitative T2 to assess tissue status. Permanent middle cerebral artery occlusion was induced in Wistar rats. Animals were scanned at 9.4T for quantitative T1, quantitative T2, and Trace of Diffusion Tensor (Dav) up to 4 h post-middle cerebral artery occlusion. Time courses of differentials of quantitative T1 and quantitative T2 in ischemic and non-ischemic contralateral brain tissue (ΔT1, ΔT2) and volumes of tissue with elevated T1 and T2 relaxation times (f1, f2) were determined. TTC staining was used to highlight permanent ischemic damage. ΔT1, ΔT2, f1, f2, and the volume of tissue with both elevated quantitative T1 and quantitative T2 (V(Overlap)) increased with time post-middle cerebral artery occlusion, allowing stroke onset time to be estimated. V(Overlap) provided the most accurate estimate, with an uncertainty of ±25 min. At all time-points, regions with elevated relaxation times were smaller than areas with Dav-defined ischemia. Stroke onset time can be determined from quantitative T1 and quantitative T2 relaxation times and tissue volumes. Combining quantitative T1 and quantitative T2 provides the most accurate estimate and potentially identifies irreversibly damaged brain tissue. © 2016 World Stroke Organization.
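
    Schematically, onset-time estimation from a relaxation-time difference amounts to inverting a calibration curve; the linear form and the slope and intercept below are invented for illustration and are not the study's fitted values.

      # Invert a hypothetical linear calibration: the ischemic-minus-contralateral
      # T2 difference (ms) grows with time post-occlusion at slope ms/min.
      def onset_time_minutes(delta_T2_ms, slope_ms_per_min=0.10, intercept_ms=0.0):
          return (delta_T2_ms - intercept_ms) / slope_ms_per_min

      # e.g. a 12 ms elevation with this made-up calibration -> ~120 min post-occlusion
      print(onset_time_minutes(12.0))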

  6. Detection of blur artifacts in histopathological whole-slide images of endomyocardial biopsies.

    PubMed

    Hang Wu; Phan, John H; Bhatia, Ajay K; Cundiff, Caitlin A; Shehata, Bahig M; Wang, May D

    2015-01-01

    Histopathological whole-slide images (WSIs) have emerged as an objective and quantitative means for image-based disease diagnosis. However, WSIs may contain acquisition artifacts that affect downstream image feature extraction and quantitative disease diagnosis. We develop a method for detecting blur artifacts in WSIs using distributions of local blur metrics. As features, these distributions enable accurate classification of WSI regions as sharp or blurry. We evaluate our method using over 1000 portions of an endomyocardial biopsy (EMB) WSI. Results indicate that local blur metrics accurately detect blurry image regions.
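
    One common local blur metric, sketched below, is the variance of the Laplacian computed per tile (low variance suggests blur); this is a generic sharpness proxy used here for illustration and not necessarily the exact metric used by the authors.

      import numpy as np
      from scipy.ndimage import laplace

      def local_blur_map(gray_image, tile=256):
          # Tile the grayscale image and score each tile by the variance of its
          # Laplacian; the per-tile scores (or their distribution) can then be
          # fed to a sharp-vs-blurry classifier.
          h, w = gray_image.shape
          scores = []
          for r in range(0, h - tile + 1, tile):
              row = []
              for c in range(0, w - tile + 1, tile):
                  patch = gray_image[r:r + tile, c:c + tile].astype(float)
                  row.append(laplace(patch).var())
              scores.append(row)
          return np.array(scores)

      # e.g. local_blur_map(np.random.rand(1024, 1024)) -> 4x4 grid of sharpness scores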

  7. Restriction Site Tiling Analysis: accurate discovery and quantitative genotyping of genome-wide polymorphisms using nucleotide arrays

    PubMed Central

    2010-01-01

    High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at 10,000s of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197

  8. An integrated, ethically driven environmental model of clinical decision making in emergency settings.

    PubMed

    Wolf, Lisa

    2013-02-01

    To explore the relationship between multiple variables within a model of critical thinking and moral reasoning. A quantitative descriptive correlational design using a purposive sample of 200 emergency nurses. Measured variables were accuracy in clinical decision making, moral reasoning, perceived care environment, and demographics. Analysis was by bivariate correlation using Pearson's product-moment correlation coefficients, chi square, and multiple linear regression analysis. The elements identified in the integrated ethically driven environmental model of clinical decision making (IEDEM-CD) correctly depict moral reasoning and the care environment as factors significantly affecting accuracy in decision making. The integrated, ethically driven environmental model of clinical decision making is a framework useful for predicting clinical decision-making accuracy for emergency nurses in practice, with further implications for education, research and policy. The model provides a diagnostic and therapeutic framework for identifying and remediating individual and environmental challenges to accurate clinical decision making. © 2012, The Author. International Journal of Nursing Knowledge © 2012, NANDA International.

  9. Integrated Experimental and Modelling Research for Non-Ferrous Smelting and Recycling Systems

    NASA Astrophysics Data System (ADS)

    Jak, Evgueni; Hidayat, Taufiq; Shishin, Denis; Mehrjardi, Ata Fallah; Chen, Jiang; Decterov, Sergei; Hayes, Peter

    The chemistries of industrial pyrometallurgical non-ferrous smelting and recycling processes are becoming increasingly complex. Optimisation of process conditions, charge composition, temperature, oxygen partial pressure, and the partitioning of minor elements between phases and different process streams requires an accurate description of phase equilibria and thermodynamics, which are the focus of the present research. The experiments involve high temperature equilibration in controlled gas atmospheres, rapid quenching and direct measurement of equilibrium phase compositions with quantitative microanalytical techniques including electron probe X-ray microanalysis and Laser Ablation ICP-MS. The thermodynamic modelling is undertaken using the computer package FactSage with the quasi-chemical model for the liquid slag phase and other advanced models. Experimental and modelling studies are combined into an integrated research program focused on the major-element Cu-Pb-Fe-O-Si-S system, the slagging elements Al, Ca and Mg, and other minor elements. The ongoing development of the research methodologies has resulted in significant advances in research capabilities. Examples of applications are given.

  10. Electronic structure of aqueous solutions: Bridging the gap between theory and experiments.

    PubMed

    Pham, Tuan Anh; Govoni, Marco; Seidel, Robert; Bradforth, Stephen E; Schwegler, Eric; Galli, Giulia

    2017-06-01

    Predicting the electronic properties of aqueous liquids has been a long-standing challenge for quantum mechanical methods. However, it is a crucial step in understanding and predicting the key role played by aqueous solutions and electrolytes in a wide variety of emerging energy and environmental technologies, including battery and photoelectrochemical cell design. We propose an efficient and accurate approach to predict the electronic properties of aqueous solutions, on the basis of the combination of first-principles methods and experimental validation using state-of-the-art spectroscopic measurements. We present results of the photoelectron spectra of a broad range of solvated ions, showing that first-principles molecular dynamics simulations and electronic structure calculations using dielectric hybrid functionals provide a quantitative description of the electronic properties of the solvent and solutes, including excitation energies. The proposed computational framework is general and applicable to other liquids, thereby offering great promise in understanding and engineering solutions and liquid electrolytes for a variety of important energy technologies.

  11. Mapping the Coulomb Environment in Interference-Quenched Ballistic Nanowires.

    PubMed

    Gutstein, D; Lynall, D; Nair, S V; Savelyev, I; Blumin, M; Ercolani, D; Ruda, H E

    2018-01-10

    The conductance of semiconductor nanowires is strongly dependent on their electrostatic history because of the overwhelming influence of charged surface and interface states on electron confinement and scattering. We show that InAs nanowire field-effect transistor devices can be conditioned to suppress resonances that obscure quantized conduction, thereby revealing as many as six sub-bands in the conductance spectra as the Fermi level is swept across the sub-band energies. The energy level spectra extracted from conductance, coupled with detailed modeling, show the significance of the interface state charge distribution, revealing the Coulomb landscape of the nanowire device. Inclusion of self-consistent Coulomb potentials, the measured geometrical shape of the nanowire, the gate geometry, and the nonparabolicity of the conduction band provides a quantitative and accurate description of the confinement potential and the resulting energy level structure. Surfaces of the nanowire terminated by HfO2 are shown to have their interface donor density reduced by a factor of 30, signifying the passivating role played by HfO2.

  12. Electronic structure of aqueous solutions: Bridging the gap between theory and experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, Tuan Anh; Govoni, Marco; Seidel, Robert

    Predicting the electronic properties of aqueous liquids has been a long-standing challenge for quantum mechanical methods. However, it is a crucial step in understanding and predicting the key role played by aqueous solutions and electrolytes in a wide variety of emerging energy and environmental technologies, including battery and photoelectrochemical cell design. We propose an efficient and accurate approach to predict the electronic properties of aqueous solutions, on the basis of the combination of first-principles methods and experimental validation using state-of-the-art spectroscopic measurements. We present results of the photoelectron spectra of a broad range of solvated ions, showing that first-principles molecular dynamics simulations and electronic structure calculations using dielectric hybrid functionals provide a quantitative description of the electronic properties of the solvent and solutes, including excitation energies. The proposed computational framework is general and applicable to other liquids, thereby offering great promise in understanding and engineering solutions and liquid electrolytes for a variety of important energy technologies.

  13. Fast-response underwater TSP investigation of subcritical instabilities of a cylinder in crossflow

    NASA Astrophysics Data System (ADS)

    Capone, Alessandro; Klein, Christian; Di Felice, Fabio; Beifuss, Uwe; Miozzi, Massimo

    2015-10-01

    We investigate the classic cylinder-in-crossflow case to test the effectiveness of a fast-response underwater temperature-sensitive paint (TSP) coating in providing highly resolved spatial and temporal observations of the action of a flow over a bluff body surface. The flow is investigated at Reynolds numbers below 190 k, before the onset of the drag-crisis state. The obtained TSP image sequences convey an accurate description of the evolution of the main features of the fluid-cylinder interaction, such as the separation line position, the pattern of the large coherent structures acting on the cylinder's surface, and the small-scale intermittent streamwise arrays of vortices. Ad hoc data management and feature extraction techniques are proposed that allow extraction of quantitative data, such as separation line position and vortex-shedding frequency, and the results are compared to the literature. The use of TSP in water applications offers an interesting point of view on fluid-body interactions by focusing directly on the effect of the flow on the model surface.

  14. The fluid trampoline: droplets bouncing on a soap film

    NASA Astrophysics Data System (ADS)

    Bush, John; Gilet, Tristan

    2008-11-01

    We present the results of a combined experimental and theoretical investigation of droplets falling onto a horizontal soap film. Both static and vertically vibrated soap films are considered. A quasi-static description of the soap film shape yields a force-displacement relation that provides excellent agreement with experiment, and allows us to model the film as a nonlinear spring. This approach yields an accurate criterion for the transition between droplet bouncing and crossing on the static film; moreover, it allows us to rationalize the observed constancy of the contact time and scaling for the coefficient of restitution in the bouncing states. On the vibrating film, a variety of bouncing behaviours were observed, including simple and complex periodic states, multiperiodicity and chaos. A simple theoretical model is developed that captures the essential physics of the bouncing process, reproducing all observed bouncing states. Quantitative agreement between model and experiment is deduced for simple periodic modes, and qualitative agreement for more complex periodic and chaotic bouncing states.

  15. Teaching concepts of energy to Nigerian children in the 7-11 year-old age range

    NASA Astrophysics Data System (ADS)

    Urevbu, Andrew O.

    This study investigated the level of concept attainment of selected energy concepts for possible inclusion in the Nigerian (Bendel State) Primary Science Project (BPSP). Using an experimental design suggested by Solomon (1949), subjects were taught energy concepts at three levels - descriptive, comparative and quantitative. Results showed that levels of concept comprehension were hierarchical, with a significant decrease in achievement from descriptive to comparative and quantitative concepts. The results of this study suggest the need to describe levels of concept for particular grades in the elementary school curriculum and to match curriculum with thinking strategies of children.

  16. On the Helix Propensity in Generalized Born Solvent Descriptions of Modeling the Dark Proteome

    DTIC Science & Technology

    2017-01-10

    ...benchmarks of conformational sampling methods and their all-atom force fields plus solvent descriptions to accurately model structural transitions on a... ...atom simulations of proteins is the replacement of explicit water interactions with a continuum description of treating implicitly the bulk physical... ...structure was reported by Amarasinghe and coworkers (Leung et al., 2015) of the Ebola nucleoprotein NP in complex with a 28-residue peptide extracted...

  17. Behavioral Assembly Required: Particularly for Quantitative Courses

    ERIC Educational Resources Information Center

    Mazen, Abdelmagid

    2008-01-01

    This article integrates behavioral approaches into the teaching and learning of quantitative subjects with application to statistics. Focusing on the emotional component of learning, the article presents a system dynamic model that provides descriptive and prescriptive accounts of learners' anxiety. Metaphors and the metaphorizing process are…

  18. 76 FR 27326 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    .... Proposed Project Quantitative Survey of Physician Practices in Laboratory Test Ordering and Interpretation... Control and Prevention (CDC). Background and Brief Description The Quantitative Survey of Physician... collection. The survey will be funded in full by the Office of Surveillance, Epidemiology, and Laboratory...

  19. Using GPS To Teach More Than Accurate Positions.

    ERIC Educational Resources Information Center

    Johnson, Marie C.; Guth, Peter L.

    2002-01-01

    Undergraduate science majors need practice in critical thinking, quantitative analysis, and judging whether their calculated answers are physically reasonable. Develops exercises using handheld Global Positioning System (GPS) receivers. Reinforces students' abilities to think quantitatively, make realistic "back of the envelope"…

  20. Quantitative and descriptive comparison of four acoustic analysis systems: vowel measurements.

    PubMed

    Burris, Carlyn; Vorperian, Houri K; Fourakis, Marios; Kent, Ray D; Bolt, Daniel M

    2014-02-01

    This study examines the accuracy and comparability of 4 trademarked acoustic analysis software packages (AASPs): Praat, WaveSurfer, TF32, and CSL, by using synthesized and natural vowels. Features of AASPs are also described. Synthesized and natural vowels were analyzed using each of the AASP's default settings to obtain 9 acoustic measures: fundamental frequency (F0), formant frequencies (F1-F4), and formant bandwidths (B1-B4). The discrepancy between the software-measured values and the input values (synthesized, previously reported, and manual measurements) was used to assess comparability and accuracy. Basic AASP features are described. Results indicate that Praat, WaveSurfer, and TF32 generate accurate and comparable F0 and F1-F4 data for synthesized vowels and adult male natural vowels. Results varied by vowel for women and children, with some serious errors. Bandwidth measurements by AASPs were highly inaccurate as compared with manual measurements and published data on formant bandwidths. Values of F0 and F1-F4 are generally consistent and fairly accurate for adult vowels and for some child vowels using the default settings in Praat, WaveSurfer, and TF32. Manipulation of default settings yields improved output values in TF32 and CSL. Caution is recommended especially before accepting F1-F4 results for children and B1-B4 results for all speakers.

  1. Tunneling current in HfO2 and Hf0.5Zr0.5O2-based ferroelectric tunnel junction

    NASA Astrophysics Data System (ADS)

    Dong, Zhipeng; Cao, Xi; Wu, Tong; Guo, Jing

    2018-03-01

    Ferroelectric tunnel junctions (FTJs) have been intensively explored for future low power data storage and information processing applications. Among the various ferroelectric (FE) materials studied, HfO2 and Hf0.5Zr0.5O2 (HZO) have the advantage of CMOS process compatibility. The validity of the simple effective mass approximation for describing the tunneling process in these materials is examined by computing the complex band structure from ab initio simulations. The results show that the simple effective mass approximation is insufficient to describe the tunneling current in HfO2 and HZO materials, and quantitatively accurate descriptions of the complex band structures are indispensable for calculation of the tunneling current. A compact k·p Hamiltonian is parameterized against, and validated by, the ab initio complex band structures, which provides a method for efficiently and accurately computing the tunneling current in HfO2 and HZO. The device characteristics of a metal/FE/metal structure and a metal/FE/semiconductor (M-F-S) structure are investigated by using the non-equilibrium Green's function formalism with the parameterized effective Hamiltonian. The results show that the M-F-S structure offers a larger resistance window due to an extra barrier in the semiconductor region in the off state. An FTJ utilizing the M-F-S structure is beneficial for memory design.

  2. Major Source of Error in QSPR Prediction of Intrinsic Thermodynamic Solubility of Drugs: Solid vs Nonsolid State Contributions?

    PubMed

    Abramov, Yuriy A

    2015-06-01

    The main purpose of this study is to identify the major factor limiting the accuracy of quantitative structure-property relationship (QSPR) models of the thermodynamic intrinsic aqueous solubility of drug-like compounds. To do this, the thermodynamic intrinsic aqueous solubility was indirectly "measured" from the contributions of a solid-state property, ΔGfus, and a nonsolid-state property, ΔGmix, each estimated by its own QSPR model. The QSPR models of the ΔGfus and ΔGmix properties were built on a set of drug-like compounds with accurate measurements of fusion and thermodynamic solubility properties available. For consistency, the ΔGfus and ΔGmix models were developed using similar algorithms and descriptor sets, and validated against similar test compounds. Analysis of the relative performance of these two QSPR models clearly demonstrates that the solid-state contribution is the limiting factor in the accuracy and predictive power of QSPR models of thermodynamic intrinsic solubility. The analysis highlights the need to develop new descriptor sets that accurately describe the long-range order (periodicity) of the crystalline state. The proposed approach to analyzing the limitations of, and suggesting improvements to, QSPR-type models may be generalized to other applications in the pharmaceutical industry.
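
    The thermodynamic cycle behind this decomposition can be made concrete with a toy calculation, assuming the dissolution free energy is simply ΔGfus + ΔGmix and that intrinsic solubility follows from ΔG = -RT ln S; units and standard states are glossed over, and the numbers are made up.

      import math

      R = 8.314  # J/(mol K)

      def log10_solubility(dG_fus_kJ, dG_mix_kJ, T=298.15):
          # split dissolution free energy into solid-state (fusion) and mixing parts
          dG = (dG_fus_kJ + dG_mix_kJ) * 1000.0    # J/mol
          lnS = -dG / (R * T)
          return lnS / math.log(10)

      # If the fusion term is predicted poorly, its error propagates directly into
      # log S, which is the point the QSPR comparison above makes.
      print(log10_solubility(10.0, 5.0))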

  3. From The Cover: The X3LYP extended density functional for accurate descriptions of nonbond interactions, spin states, and thermochemical properties.

    PubMed

    Xu, Xin; Goddard, William A

    2004-03-02

    We derive the form for an exact exchange energy density for a density decaying with Gaussian-like behavior at long range. Based on this, we develop the X3LYP (extended hybrid functional combined with Lee-Yang-Parr correlation functional) extended functional for density functional theory to significantly improve the accuracy for hydrogen-bonded and van der Waals complexes while also improving the accuracy in heats of formation, ionization potentials, electron affinities, and total atomic energies [over the most popular and accurate method, B3LYP (Becke three-parameter hybrid functional combined with Lee-Yang-Parr correlation functional)]. X3LYP also leads to a good description of dipole moments, polarizabilities, and accurate excitation energies from s to d orbitals for transition metal atoms and ions. We suggest that X3LYP will be useful for predicting ligand binding in proteins and DNA.

  4. From The Cover: The X3LYP extended density functional for accurate descriptions of nonbond interactions, spin states, and thermochemical properties

    NASA Astrophysics Data System (ADS)

    Xu, Xin; Goddard, William A., III

    2004-03-01

    We derive the form for an exact exchange energy density for a density decaying with Gaussian-like behavior at long range. Based on this, we develop the X3LYP (extended hybrid functional combined with Lee-Yang-Parr correlation functional) extended functional for density functional theory to significantly improve the accuracy for hydrogen-bonded and van der Waals complexes while also improving the accuracy in heats of formation, ionization potentials, electron affinities, and total atomic energies [over the most popular and accurate method, B3LYP (Becke three-parameter hybrid functional combined with Lee-Yang-Parr correlation functional)]. X3LYP also leads to a good description of dipole moments, polarizabilities, and accurate excitation energies from s to d orbitals for transition metal atoms and ions. We suggest that X3LYP will be useful for predicting ligand binding in proteins and DNA.

  5. The X3LYP extended density functional for accurate descriptions of nonbond interactions, spin states, and thermochemical properties

    PubMed Central

    Xu, Xin; Goddard, William A.

    2004-01-01

    We derive the form for an exact exchange energy density for a density decaying with Gaussian-like behavior at long range. Based on this, we develop the X3LYP (extended hybrid functional combined with Lee–Yang–Parr correlation functional) extended functional for density functional theory to significantly improve the accuracy for hydrogen-bonded and van der Waals complexes while also improving the accuracy in heats of formation, ionization potentials, electron affinities, and total atomic energies [over the most popular and accurate method, B3LYP (Becke three-parameter hybrid functional combined with Lee–Yang–Parr correlation functional)]. X3LYP also leads to a good description of dipole moments, polarizabilities, and accurate excitation energies from s to d orbitals for transition metal atoms and ions. We suggest that X3LYP will be useful for predicting ligand binding in proteins and DNA. PMID:14981235

  6. Accuracy and Calibration of High Explosive Thermodynamic Equations of State

    NASA Astrophysics Data System (ADS)

    Baker, Ernest L.; Capellos, Christos; Stiel, Leonard I.; Pincay, Jack

    2010-10-01

    The Jones-Wilkins-Lee-Baker (JWLB) equation of state (EOS) was developed to more accurately describe overdriven detonation while maintaining an accurate description of high explosive products expansion work output. The increased mathematical complexity of the JWLB high explosive equations of state provides increased accuracy for practical problems of interest. Increased numbers of parameters are often justified based on improved physics descriptions but can also mean increased calibration complexity. A generalized extent of aluminum reaction Jones-Wilkins-Lee (JWL)-based EOS was developed in order to more accurately describe the observed behavior of aluminized explosives detonation products expansion. A calibration method was developed to describe the unreacted, partially reacted, and completely reacted explosive using nonlinear optimization. A reasonable calibration of a generalized extent of aluminum reaction JWLB EOS as a function of aluminum reaction fraction has not yet been achieved due to the increased mathematical complexity of the JWLB form.
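
    For context, the standard JWL pressure-volume-energy form, which JWLB generalizes with additional exponential terms and a volume-dependent omega, is evaluated below; the parameter values are illustrative and do not correspond to a calibrated explosive.

      import math

      def jwl_pressure(V, E, A, B, R1, R2, omega):
          # Standard JWL products EOS: V is relative volume, E is detonation
          # energy per unit volume; pressure carries the units of A, B and E.
          return (A * (1 - omega / (R1 * V)) * math.exp(-R1 * V)
                  + B * (1 - omega / (R2 * V)) * math.exp(-R2 * V)
                  + omega * E / V)

      # e.g. pressure at V = 2.0 for a made-up parameter set (GPa if A, B, E are in GPa)
      print(jwl_pressure(2.0, 7.0, A=600.0, B=10.0, R1=4.5, R2=1.2, omega=0.3))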

  7. Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

    ERIC Educational Resources Information Center

    Mayer, Kimberly L.; DiGennaro Reed, Florence D.

    2013-01-01

    Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…

  8. Prose Style and Critical Reading.

    ERIC Educational Resources Information Center

    Cluett, Robert

    This book is based on the York Computer Inventory of Prose Style, which seeks to provide a quantitative description of the syntactic characteristics of the literary language of specific authors over the last 400 years. After a brief theoretical introduction and a description of texts and sampling procedures, the discussion turns to specific…

  9. 41 CFR 60-3.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... group should be described (essential). (6) Sample description. A description of how the research sample... research sample compares with the relevant labor market or work force, the method by which the relevant... quantitative data which identify or define the job constructs, such as factor analyses, should be provided...

  10. 41 CFR 60-3.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... group should be described (essential). (6) Sample description. A description of how the research sample... research sample compares with the relevant labor market or work force, the method by which the relevant... quantitative data which identify or define the job constructs, such as factor analyses, should be provided...

  11. 41 CFR 60-3.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... group should be described (essential). (6) Sample description. A description of how the research sample... research sample compares with the relevant labor market or work force, the method by which the relevant... quantitative data which identify or define the job constructs, such as factor analyses, should be provided...

  12. 41 CFR 60-3.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... group should be described (essential). (6) Sample description. A description of how the research sample... research sample compares with the relevant labor market or work force, the method by which the relevant... quantitative data which identify or define the job constructs, such as factor analyses, should be provided...

  13. 41 CFR 60-3.15 - Documentation of impact and validity evidence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... group should be described (essential). (6) Sample description. A description of how the research sample... research sample compares with the relevant labor market or work force, the method by which the relevant... quantitative data which identify or define the job constructs, such as factor analyses, should be provided...

  14. Use of sentiment analysis for capturing patient experience from free-text comments posted online.

    PubMed

    Greaves, Felix; Ramirez-Cano, Daniel; Millett, Christopher; Darzi, Ara; Donaldson, Liam

    2013-11-01

    There are large amounts of unstructured, free-text information about quality of health care available on the Internet in blogs, social networks, and on physician rating websites that are not captured in a systematic way. New analytical techniques, such as sentiment analysis, may allow us to understand and use this information more effectively to improve the quality of health care. We attempted to use machine learning to understand patients' unstructured comments about their care. We used sentiment analysis techniques to categorize online free-text comments by patients as either positive or negative descriptions of their health care. We tried to automatically predict whether a patient would recommend a hospital, whether the hospital was clean, and whether they were treated with dignity from their free-text description, compared to the patient's own quantitative rating of their care. We applied machine learning techniques to all 6412 online comments about hospitals on the English National Health Service website in 2010 using Weka data-mining software. We also compared the results obtained from sentiment analysis with the paper-based national inpatient survey results at the hospital level using Spearman rank correlation for all 161 acute adult hospital trusts in England. There was 81%, 84%, and 89% agreement between quantitative ratings of care and those derived from free-text comments using sentiment analysis for cleanliness, being treated with dignity, and overall recommendation of hospital respectively (kappa scores: .40-.74, P<.001 for all). We observed mild to moderate associations between our machine learning predictions and responses to the large patient survey for the three categories examined (Spearman rho 0.37-0.51, P<.001 for all). The prediction accuracy that we have achieved using this machine learning process suggests that we are able to predict, from free-text, a reasonably accurate assessment of patients' opinion about different performance aspects of a hospital and that these machine learning predictions are associated with results of more conventional surveys.
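
    A generic sketch of this kind of sentiment-classification pipeline is shown below; the study used Weka, so this scikit-learn version with toy comments is only an illustration of the approach, not the authors' model.

      # TF-IDF features plus a logistic-regression classifier for positive/negative
      # free-text comments; the two example comments and labels are invented.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      comments = ["The ward was spotless and staff were kind",
                  "Dirty bathroom and nobody answered the call bell"]
      labels = [1, 0]   # 1 = positive / would recommend, 0 = negative

      model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
      model.fit(comments, labels)
      print(model.predict(["very clean and I would recommend this hospital"]))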

  15. Nonexperimental Quantitative Research and Its Role in Guiding Instruction

    ERIC Educational Resources Information Center

    Cook, Bryan G.; Cook, Lysandra

    2008-01-01

    Different research designs answer different questions. Educators cannot use nonexperimental quantitative research designs, such as descriptive surveys and correlational research, to determine definitively that an intervention causes improved student outcomes and is an evidence-based practice. However, such research can (a) inform educators about a…

  16. 78 FR 70074 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... after the award expires for their fiscal year of activity. The indicators are both quantitative and descriptive. Quantitative information from the most recently completed fiscal year such as: [cir] Number and... respect to industrial collaboration [cir] Conducting a survey of all center participants to probe the...

  17. Perceptions of Mentoring: Examining the Experiences of Women Superintendents

    ERIC Educational Resources Information Center

    Copeland, Scarlett M.; Calhoun, Daniel W.

    2014-01-01

    This descriptive mixed methods study gathered both quantitative and qualitative data on the mentoring experiences of women superintendents in a Southeastern state. The quantitative participants included 39 women superintendents from this state and the qualitative portion of the study was comprised of eight female superintendents purposefully…

  18. 76 FR 13674 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... expires for their fiscal year of activity. The indicators are both quantitative and descriptive. Quantitative information from the most recently completed fiscal year such as: [cir] Number and diversity of... report of center activities with respect to industrial collaboration [cir] Conducting a survey of all...

  19. 78 FR 48681 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    .... Qualitative and quantitative data will be collected through progress reports, surveys, the health impact tracking tool, and interviews. Quantitative data will be analyzed using descriptive statistics. Qualitative... States (SOTS) online surveys, (3) Interviews, and (4) Online surveys related to the Regional Network...

  20. Spatiotemporal Segmentation and Modeling of the Mitral Valve in Real-Time 3D Echocardiographic Images.

    PubMed

    Pouch, Alison M; Aly, Ahmed H; Lai, Eric K; Yushkevich, Natalie; Stoffers, Rutger H; Gorman, Joseph H; Cheung, Albert T; Gorman, Joseph H; Gorman, Robert C; Yushkevich, Paul A

    2017-09-01

    Transesophageal echocardiography is the primary imaging modality for preoperative assessment of mitral valves with ischemic mitral regurgitation (IMR). While there are well known echocardiographic insights into the 3D morphology of mitral valves with IMR, such as annular dilation and leaflet tethering, less is understood about how quantification of valve dynamics can inform surgical treatment of IMR or predict short-term recurrence of the disease. As a step towards filling this knowledge gap, we present a novel framework for 4D segmentation and geometric modeling of the mitral valve in real-time 3D echocardiography (rt-3DE). The framework integrates multi-atlas label fusion and template-based medial modeling to generate quantitatively descriptive models of valve dynamics. The novelty of this work is that temporal consistency in the rt-3DE segmentations is enforced during both the segmentation and modeling stages with the use of groupwise label fusion and Kalman filtering. The algorithm is evaluated on rt-3DE data series from 10 patients: five with normal mitral valve morphology and five with severe IMR. In these 10 data series that total 207 individual 3DE images, each 3DE segmentation is validated against manual tracing and temporal consistency between segmentations is demonstrated. The ultimate goal is to generate accurate and consistent representations of valve dynamics that can both visually and quantitatively provide insight into normal and pathological valve function.
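
    As a didactic stand-in for the temporal-consistency idea (not the paper's groupwise label fusion), the sketch below applies a scalar constant-velocity Kalman filter to a hypothetical per-frame valve measurement such as annular diameter.

      import numpy as np

      def kalman_smooth(z, dt=1.0, q=1e-3, r=1e-2):
          # state = [value, rate]; constant-velocity model with process noise q
          # and measurement noise r (both assumed values).
          x = np.array([z[0], 0.0])
          P = np.eye(2)
          F = np.array([[1.0, dt], [0.0, 1.0]])
          Q = q * np.eye(2)
          H = np.array([[1.0, 0.0]])
          out = []
          for zk in z:
              x = F @ x                          # predict
              P = F @ P @ F.T + Q
              y = zk - H @ x                     # update with the new measurement
              S = H @ P @ H.T + r
              K = (P @ H.T) / S
              x = x + K.flatten() * y
              P = (np.eye(2) - K @ H) @ P
              out.append(x[0])
          return np.array(out)

      print(kalman_smooth([30.1, 30.4, 29.8, 30.6, 30.2]))   # mm, hypothetical frames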

  1. Effects of biases in domain wall network evolution. II. Quantitative analysis

    NASA Astrophysics Data System (ADS)

    Correia, J. R. C. C. C.; Leite, I. S. C. R.; Martins, C. J. A. P.

    2018-04-01

    Domain walls form at phase transitions which break discrete symmetries. In a cosmological context, they often overclose the Universe (contrary to observational evidence), although one may prevent this by introducing biases or forcing anisotropic evolution of the walls. In a previous work [Correia et al., Phys. Rev. D 90, 023521 (2014), 10.1103/PhysRevD.90.023521], we numerically studied the evolution of various types of biased domain wall networks in the early Universe, confirming that anisotropic networks ultimately reach scaling while those with a biased potential or biased initial conditions decay. We also found that the analytic decay law obtained by Hindmarsh was in good agreement with simulations of biased potentials, but not of biased initial conditions, and suggested that the difference was related to the Gaussian approximation underlying the analytic law. Here, we extend our previous work in several ways. For the cases of biased potential and biased initial conditions, we study in detail the field distributions in the simulations, confirming that the validity (or not) of the Gaussian approximation is the key difference between the two cases. For anisotropic walls, we carry out a more extensive set of numerical simulations and compare them to the canonical velocity-dependent one-scale model for domain walls, finding that the model accurately predicts the linear scaling regime after isotropization. Overall, our analysis provides a quantitative description of the cosmological evolution of these networks.

  2. Concentration of Nicotine and Glycols in 27 Electronic Cigarette Formulations

    PubMed Central

    Peace, Michelle R.; Baird, Tyson R.; Smith, Nathaniel; Wolf, Carl E.; Poklis, Justin L.; Poklis, Alphonse

    2016-01-01

    Personal battery-powered vaporizers or electronic cigarettes were developed to deliver a nicotine vapor such that smokers could simulate smoking tobacco without the inherent pathology of inhaled tobacco smoke. Electronic cigarettes and their e-cigarette liquid formulations are virtually unregulated. These formulations are typically composed of propylene glycol and/or glycerin, flavoring components and an active drug, such as nicotine. Twenty-seven e-cigarette liquid formulations containing nicotine at between 6 and 22 mg/L were acquired within the USA and analyzed by various methods to determine their contents. They were screened by Direct Analysis in Real Time™ Mass Spectrometry (DART-MS). Nicotine was confirmed and quantitated by high-performance liquid chromatography–tandem mass spectrometry, and the glycol composition was confirmed and quantitated by gas chromatography–mass spectrometry. The DART-MS screening method was able to consistently identify the exact mass peaks resulting from the protonated molecular ion of nicotine, glycol and a number of flavor additives within 5 mmu. Nicotine concentrations were determined to range from 45 to 131% of the stated label concentration, with 18 of the 27 having >10% variance. Glycol composition was generally accurate to the product description, with only one exception, where the propylene glycol to glycerin percentage ratio was stated as 50:50 and the determined concentration of propylene glycol to glycerin was 81:19 (% v/v). No unlabeled glycols were detected in these formulations. PMID:27165804
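
    The "percent of stated label concentration" figures are obtained essentially as below; the two products and their values are hypothetical.

      def percent_of_label(measured_mg_L, labeled_mg_L):
          # measured nicotine concentration as a percentage of the label claim
          return 100.0 * measured_mg_L / labeled_mg_L

      products = {"brand A": (18.0, 13.1), "brand B": (6.0, 7.9)}   # (label, measured), mg/L
      for name, (label, measured) in products.items():
          pct = percent_of_label(measured, label)
          flag = "FLAG" if abs(pct - 100.0) > 10.0 else "ok"
          print(f"{name}: {pct:.0f}% of label ({flag})")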

  3. s -wave scattering length of a Gaussian potential

    NASA Astrophysics Data System (ADS)

    Jeszenszki, Peter; Cherny, Alexander Yu.; Brand, Joachim

    2018-04-01

    We provide accurate expressions for the s -wave scattering length for a Gaussian potential well in one, two, and three spatial dimensions. The Gaussian potential is widely used as a pseudopotential in the theoretical description of ultracold-atomic gases, where the s -wave scattering length is a physically relevant parameter. We first describe a numerical procedure to compute the value of the s -wave scattering length from the parameters of the Gaussian, but find that its accuracy is limited in the vicinity of singularities that result from the formation of new bound states. We then derive simple analytical expressions that capture the correct asymptotic behavior of the s -wave scattering length near the bound states. Expressions that are increasingly accurate in wide parameter regimes are found by a hierarchy of approximations that capture an increasing number of bound states. The small number of numerical coefficients that enter these expressions is determined from accurate numerical calculations. The approximate formulas combine the advantages of the numerical and approximate expressions, yielding an accurate and simple description from the weakly to the strongly interacting limit.
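
    A numerical sketch of the three-dimensional zero-energy procedure described above, under the assumptions that hbar^2/2m = 1 and that the well has the form V(r) = -V0 exp(-(r/sigma)^2) (the paper's convention may differ): integrate the radial equation outward from the origin and read the scattering length off the asymptote u(r) ~ (r - a).

      import numpy as np
      from scipy.integrate import solve_ivp

      def scattering_length(V0, sigma, r_max=30.0):
          # zero-energy radial equation u''(r) = V(r) u(r) in units hbar^2/2m = 1
          def rhs(r, y):
              u, du = y
              V = -V0 * np.exp(-(r / sigma) ** 2)
              return [du, V * u]
          sol = solve_ivp(rhs, (1e-8, r_max), [0.0, 1.0], rtol=1e-10, atol=1e-12)
          u, du = sol.y[0, -1], sol.y[1, -1]
          # outside the well u(r) is proportional to (r - a), so a = r - u/u'
          return r_max - u / du

      print(scattering_length(V0=1.0, sigma=1.0))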

  4. Comparison of rapid descriptive sensory methodologies: Free-Choice Profiling, Flash Profile and modified Flash Profile.

    PubMed

    Liu, Jing; Bredie, Wender L P; Sherman, Emma; Harbertson, James F; Heymann, Hildegarde

    2018-04-01

    Rapid sensory methods have been developed as alternatives to traditional sensory descriptive analysis methods. Among them, Free-Choice Profiling (FCP) and Flash Profile (FP) are two that have been known for many years. The objectives of this work were to compare the rating-based FCP and the ranking-based FP methods; to evaluate the impact of adding adjustments to the FP approach; and to investigate the influence of the number of assessors on the outcome of the modified FP. To achieve these aims, a conventional descriptive analysis (DA), FCP, FP and a modified version of FP were carried out. Red wines made from grapes of different maturity and with different ethanol concentrations were used for sensory testing. This study showed that DA provided more detailed and accurate information on the products than FCP and FP, through a quantitative measure of the intensity of sensory attributes. However, the panel hours for conducting DA were higher than those for the rapid methods, and FP was even able to separate the samples to a higher degree than DA. When comparing FCP and FP, this study showed that the ranking-based FP provided a clearer separation of samples than the rating-based FCP, but the latter was an easier task for most assessors. When assessors were restricted in their use of attributes in FP, the sample space became clearer and the ranking task was simplified. The FP protocol with restricted attribute sets therefore seems to be a promising approach for efficient screening of sensory properties in wine. When the number of assessors was increased from 10 to 20 for the modified FP, the outcome tended to be slightly more stable; however, one should consider the degree of panel training when deciding the optimal number of assessors for conducting FP. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Getting the Right Answers for the Right Reasons: Toward Predictive Molecular Simulations of Water with Many-Body Potential Energy Functions.

    PubMed

    Paesani, Francesco

    2016-09-20

    The central role played by water in fundamental processes relevant to different disciplines, including chemistry, physics, biology, materials science, geology, and climate research, cannot be overemphasized. It is thus not surprising that, since the pioneering work by Stillinger and Rahman, many theoretical and computational studies have attempted to develop a microscopic description of the unique properties of water under different thermodynamic conditions. Consequently, numerous molecular models based on either molecular mechanics or ab initio approaches have been proposed over the years. However, despite continued progress, the correct prediction of the properties of water from small gas-phase clusters to the liquid phase and ice through a single molecular model remains challenging. To a large extent, this is due to the difficulties encountered in the accurate modeling of the underlying hydrogen-bond network, in which both the number and strength of the hydrogen bonds vary continuously as a result of a subtle interplay between energetic, entropic, and nuclear quantum effects. In the past decade, the development of efficient algorithms for correlated electronic structure calculations of small molecular complexes, accompanied by tremendous progress in the analytical representation of multidimensional potential energy surfaces, opened the doors to the design of highly accurate potential energy functions built upon rigorous representations of the many-body expansion (MBE) of the interaction energies. This Account provides a critical overview of the performance of the MB-pol many-body potential energy function through a systematic analysis of energetic, structural, thermodynamic, and dynamical properties as well as of vibrational spectra of water from the gas to the condensed phase. It is shown that MB-pol achieves unprecedented accuracy across all phases of water through a quantitative description of each individual term of the MBE, with a physically correct representation of both short- and long-range many-body contributions. Comparisons with experimental data probing different regions of the water potential energy surface from clusters to bulk demonstrate that MB-pol represents a major step toward the long-sought-after "universal model" capable of accurately describing the molecular properties of water under different conditions and in different environments. Along this path, future challenges include the extension of the many-body scheme adopted by MB-pol to the description of generic solutes as well as the integration of MB-pol in an efficient theoretical and computational framework to model acid-base reactions in aqueous environments. In this context, given the nontraditional form of the MB-pol energy and force expressions, synergistic efforts by theoretical/computational chemists/physicists and computer scientists will be critical for the development of high-performance software for many-body molecular dynamics simulations.
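
    For context, the many-body expansion (MBE) referred to in this Account decomposes the interaction energy of an N-molecule system into additive n-body contributions (standard definition, not specific to MB-pol):

      E_N = \sum_i E^{1B}(i) + \sum_{i<j} E^{2B}(i,j) + \sum_{i<j<k} E^{3B}(i,j,k) + \cdots, \qquad E^{2B}(i,j) = E(i,j) - E(i) - E(j),

    with analogous definitions for the higher-order terms; the low-order terms are the ones represented explicitly in potentials of this type.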

  6. Supramolecular assembly affording a ratiometric two-photon fluorescent nanoprobe for quantitative detection and bioimaging.

    PubMed

    Wang, Peng; Zhang, Cheng; Liu, Hong-Wen; Xiong, Mengyi; Yin, Sheng-Yan; Yang, Yue; Hu, Xiao-Xiao; Yin, Xia; Zhang, Xiao-Bing; Tan, Weihong

    2017-12-01

    Quantitative fluorescence analyses of vital biomolecules are in great demand in biomedical science owing to their advantages of rapid, sensitive, non-damaging and specific identification. However, available fluorescence strategies for quantitative detection are usually hard to design and achieve. Inspired by supramolecular chemistry, a two-photon-excited fluorescent supramolecular nanoplatform (TPSNP) was designed for quantitative analysis with three parts: host molecules (β-CD polymers), a guest fluorophore of sensing probes (Np-Ad) and a guest internal reference (NpRh-Ad). In this strategy, the TPSNP possesses the merits of (i) improved water-solubility and biocompatibility; (ii) increased tissue penetration depth for bioimaging by two-photon excitation; (iii) quantitative and tunable assembly of functional guest molecules to obtain optimized detection conditions; (iv) a common approach that avoids the limitation of complicated design by adjustment of the sensing probes; and (v) accurate quantitative analysis by virtue of the reference molecules. As a proof of concept, we utilized the two-photon fluorescent probe NHS-Ad-based TPSNP-1 to realize accurate quantitative analysis of hydrogen sulfide (H2S), with high sensitivity and good selectivity in live cells, deep tissues and ex vivo-dissected organs, suggesting that the TPSNP is an ideal quantitative indicator for clinical samples. Moreover, the TPSNP will pave the way for designing and preparing advanced supramolecular sensors for biosensing and biomedicine.

  7. Quantitative single-photon emission computed tomography/computed tomography for technetium pertechnetate thyroid uptake measurement

    PubMed Central

    Lee, Hyunjong; Kim, Ji Hyun; Kang, Yeon-koo; Moon, Jae Hoon; So, Young; Lee, Won Woo

    2016-01-01

    Objectives: Technetium pertechnetate (99mTcO4) is a radioactive tracer used to assess thyroid function with a thyroid uptake system (TUS). However, the TUS often fails to deliver accurate measurements of the percent thyroid uptake (%thyroid uptake) of 99mTcO4. Here, we investigated the usefulness of quantitative single-photon emission computed tomography/computed tomography (SPECT/CT) after injection of 99mTcO4 in detecting thyroid function abnormalities. Materials and methods: We retrospectively reviewed data from 50 patients (male:female = 15:35; age, 46.2 ± 16.3 years; 17 Graves disease, 13 thyroiditis, and 20 euthyroid). All patients underwent 99mTcO4 quantitative SPECT/CT (185 MBq = 5 mCi), which yielded %thyroid uptake and standardized uptake values (SUV). Twenty-one (10 Graves disease and 11 thyroiditis) of the 50 patients also underwent conventional %thyroid uptake measurements using a TUS. Results: Quantitative SPECT/CT parameters (%thyroid uptake, SUVmean, and SUVmax) were highest in Graves disease, second highest in euthyroid, and lowest in thyroiditis (P < 0.0001, Kruskal–Wallis test). The TUS significantly overestimated the %thyroid uptake compared with SPECT/CT (P < 0.0001, paired t test) because other 99mTcO4 sources in addition to the thyroid, such as the salivary glands and saliva, contributed to the %thyroid uptake result by TUS, whereas the %thyroid uptake, SUVmean and SUVmax from SPECT/CT were associated with the functional status of the thyroid. Conclusions: Quantitative SPECT/CT is more accurate than the conventional TUS for measuring 99mTcO4 %thyroid uptake. Quantitative measurements using SPECT/CT may facilitate more accurate assessment of thyroid tracer uptake. PMID:27399139
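
    For readers unfamiliar with the metric, the standardized uptake value reported by quantitative SPECT/CT is conventionally defined as (textbook, body-weight-normalized form; not specific to this study)

      \mathrm{SUV} = \frac{\text{tissue activity concentration}}{\text{injected activity} / \text{body weight}},

    so an SUV of 1 corresponds to the tracer being distributed uniformly throughout the body.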

  8. Quantitative characterization of metastatic disease in the spine. Part I. Semiautomated segmentation using atlas-based deformable registration and the level set method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardisty, M.; Gordon, L.; Agarwal, P.

    2007-08-15

    Quantitative assessment of metastatic disease in bone is often considered immeasurable and, as such, patients with skeletal metastases are often excluded from clinical trials. In order to effectively quantify the impact of metastatic tumor involvement in the spine, accurate segmentation of the vertebra is required. Manual segmentation can be accurate but involves extensive and time-consuming user interaction. Potential solutions to automating segmentation of metastatically involved vertebrae are demons deformable image registration and level set methods. The purpose of this study was to develop a semiautomated method to accurately segment tumor-bearing vertebrae using the aforementioned techniques. By maintaining the morphology of an atlas, the demons-level set composite algorithm was able to accurately differentiate between trans-cortical tumors and surrounding soft tissue of identical intensity. The algorithm successfully segmented both the vertebral body and trabecular centrum of tumor-involved and healthy vertebrae. This work validates our approach as equivalent in accuracy to an experienced user.

  9. 78 FR 53336 - List of Fisheries for 2013

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-29

    ... provided on the LOF are solely used for descriptive purposes and will not be used in determining future... this information to determine whether the fishery can be classified on the LOF based on quantitative... does not have a quantitative estimate of the number of mortalities and serious injuries of pantropical...

  10. 21 CFR 809.10 - Labeling for in vitro diagnostic products.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... procedure, e.g., qualitative or quantitative. (3) Summary and explanation of the test. Include a short... provides other than quantitative results, provide an adequate description of expected results. (10... are met: (i) For a product in the laboratory research phase of development, and not represented as an...

  11. 21 CFR 809.10 - Labeling for in vitro diagnostic products.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... procedure, e.g., qualitative or quantitative. (3) Summary and explanation of the test. Include a short... provides other than quantitative results, provide an adequate description of expected results. (10... are met: (i) For a product in the laboratory research phase of development, and not represented as an...

  12. 21 CFR 809.10 - Labeling for in vitro diagnostic products.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... procedure, e.g., qualitative or quantitative. (3) Summary and explanation of the test. Include a short... provides other than quantitative results, provide an adequate description of expected results. (10... are met: (i) For a product in the laboratory research phase of development, and not represented as an...

  13. 21 CFR 809.10 - Labeling for in vitro diagnostic products.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... procedure, e.g., qualitative or quantitative. (3) Summary and explanation of the test. Include a short... provides other than quantitative results, provide an adequate description of expected results. (10... are met: (i) For a product in the laboratory research phase of development, and not represented as an...

  14. Visual communication with Haitian women: a look at pictorial literacy.

    PubMed

    Gustafson, M B

    1986-06-01

    A study of village women in Haiti presenting baseline data from their responses to stylized health education pictures is reported. The study questioned the concept that pictorial messages were accurately recognized and self-explanatory to nonliterate Haitian village women. The investigator, who used a descriptive survey, sought answers to a major and a related question: what do nonliterate Haitian village women recognize in selected health education pictures, and are differences in picture recognition traceable to the complexity of the pictures? There were 110 women (25 from a mountain village, 25 from a plains village, 25 from a seacoast village, and 35 urban dwellers) who responded to 9 health education pictures. The women ranged in age from 18 to 80 years; 32 (29%) had attended school for anywhere from an "unknown time" to 8 years, and 47% of those who had attended school indicated that they could read. The investigator rated the verbatim responses to the pictures for accuracy as accurate, overinclusive, underinclusive, inaccurate, or do not know. The quantitative analysis of these data revealed that accuracy levels decreased as the complexity level increased, best shown by the 129 (39%) accurate responses at the low complexity level, 6 (1.8%) at the moderate level, and no accurate responses at the high complexity level. An unexpected finding was that the highest number of inaccurate responses (n = 83, 25.1%) occurred at the low complexity level, while the moderate and high levels both showed 36 (10.8%). In addition to these differences in picture recognition accuracy by picture complexity, significant differences on the chi-square test confirmed that picture recognition is traceable to the complexity of the picture. These findings are consistent with the picture complexity studies of Holmes, Jelliffe, and Kwansa.

  15. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    PubMed

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion in the market of cham-cham, a traditional Indian dairy product, is expected in the near future with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production, and to find out the attributes that govern much of the variation in sensory scores of this product, using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p < 0.05) differences in sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4% of the variation in the sensory data. Factor scores of each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancid and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring attributes of cham-cham that contribute most to its sensory acceptability.

  16. Range and energetics of charge hopping in organic semiconductors

    NASA Astrophysics Data System (ADS)

    Abdalla, Hassan; Zuo, Guangzheng; Kemerink, Martijn

    2017-12-01

    The recent upswing in attention for the thermoelectric properties of organic semiconductors (OSCs) adds urgency to the need for a quantitative description of the range and energetics of hopping transport in OSCs under relevant circumstances, i.e., around room temperature (RT). In particular, the degree to which hops beyond the nearest neighbor must be accounted for at RT is still largely unknown. Here, measurements of charge and energy transport in doped OSCs are combined with analytical modeling to reach the univocal conclusion that variable-range hopping is the proper description in a large class of disordered OSCs at RT. To obtain quantitative agreement with experiment, one needs to account for the modification of the density of states by ionized dopants. These Coulomb interactions give rise to a deep tail of trap states that is independent of the material's initial energetic disorder. Insertion of this effect into a classical Mott-type variable-range hopping model allows one to give a quantitative description of temperature-dependent conductivity and thermopower measurements on a wide range of disordered OSCs. In particular, the model explains the commonly observed quasiuniversal power-law relation between the Seebeck coefficient and the conductivity.
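
    The classical Mott variable-range-hopping expression invoked above has the familiar stretched-exponential form (standard textbook result; the model in this work additionally modifies the density of states through dopant-induced Coulomb tails):

      \sigma(T) = \sigma_0 \exp\!\left[-\left(\frac{T_0}{T}\right)^{1/(d+1)}\right],

    where d is the dimensionality (giving the well-known 1/4 exponent in three dimensions) and T_0 is set by the localization length and the density of states at the Fermi level.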

  17. Estimation of the number of fluorescent end-members for quantitative analysis of multispectral FLIM data.

    PubMed

    Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A

    2014-05-19

    Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identifying the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa identified healthy from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in term of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide quantitative description of tissue biochemical composition.
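
    As a rough illustration of the linear-unmixing step described above (a minimal sketch using synthetic mono-exponential decays and non-negative least squares, not the authors' blind algorithm), per-pixel abundances can be estimated once candidate end-member decays are available:

      import numpy as np
      from scipy.optimize import nnls

      t = np.linspace(0, 20, 200)                        # ns; illustrative time axis
      endmembers = np.stack([np.exp(-t / tau) for tau in (0.5, 2.0, 6.0)], axis=1)

      true_abundance = np.array([0.2, 0.5, 0.3])
      pixel_decay = endmembers @ true_abundance + 0.01 * np.random.randn(t.size)

      abundance, _ = nnls(endmembers, pixel_decay)       # non-negative least squares
      abundance /= abundance.sum()                       # relative contributions
      print(abundance)

    The blind method of the paper goes further by also estimating the number of end-members and their characteristic decays directly from the data.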

  18. A unique charge-coupled device/xenon arc lamp based imaging system for the accurate detection and quantitation of multicolour fluorescence.

    PubMed

    Spibey, C A; Jackson, P; Herick, K

    2001-03-01

    In recent years the use of fluorescent dyes in biological applications has dramatically increased. The continual improvement in the capabilities of these fluorescent dyes demands increasingly sensitive detection systems that provide accurate quantitation over a wide linear dynamic range. In the field of proteomics, the detection, quantitation and identification of very low abundance proteins are of extreme importance in understanding cellular processes. Therefore, the instrumentation used to acquire an image of such samples, for spot picking and identification by mass spectrometry, must be sensitive enough not only to maximise the sensitivity and dynamic range of the staining dyes but also, as importantly, to adapt to the ever-changing portfolio of fluorescent dyes as they become available. Just as the available fluorescent probes are improving and evolving, so are the users' application requirements. Therefore, the instrumentation chosen must be flexible to address and adapt to those changing needs. As a result, a highly competitive market for the supply and production of such dyes and the instrumentation for their detection and quantitation has emerged. The instrumentation currently available is based on either laser/photomultiplier tube (PMT) scanning or lamp/charge-coupled device (CCD) based mechanisms. This review briefly discusses the advantages and disadvantages of both system types for fluorescence imaging, gives a technical overview of CCD technology and describes in detail a unique xenon arc lamp/CCD-based instrument from PerkinElmer Life Sciences. The Wallac-1442 ARTHUR is unique in its ability to scan large areas at high resolution and give accurate, selectable excitation over the whole of the UV/visible range. It operates by filtering both the excitation and emission wavelengths, providing optimal and accurate measurement and quantitation of virtually any available dye and allowing excellent spectral resolution between different fluorophores. This flexibility and excitation accuracy are key to multicolour applications and to future adaptation of the instrument to address new application requirements and newly emerging dyes.

  19. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating polarization radar to enhance the wingbeat frequency method of identification, are also presented.

  20. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additively manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features of both while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D-printed titanium lattice implants.

  1. Changes in body composition of neonatal piglets during growth

    USDA-ARS?s Scientific Manuscript database

    During studies of neonatal piglet growth it is important to be able to accurately assess changes in body composition. Previous studies have demonstrated that quantitative magnetic resonance (QMR) provides precise and accurate measurements of total body fat mass, lean mass and total body water in non...

  2. Assessing Adult Learner's Numeracy as Related to Gender and Performance in Arithmetic

    ERIC Educational Resources Information Center

    Awofala, Adeneye O. A.; Anyikwa, Blessing E.

    2014-01-01

    The study investigated adult learner numeracy as related to gender and performance in arithmetic among 32 Nigerian adult learners from one government accredited adult literacy centre in Lagos State using the quantitative research method within the blueprint of descriptive survey design. Data collected were analysed using the descriptive statistics…

  3. 5-Aminolevulinic acid-induced protoporphyrin IX fluorescence in meningioma: qualitative and quantitative measurements in vivo.

    PubMed

    Valdes, Pablo A; Bekelis, Kimon; Harris, Brent T; Wilson, Brian C; Leblond, Frederic; Kim, Anthony; Simmons, Nathan E; Erkmen, Kadir; Paulsen, Keith D; Roberts, David W

    2014-03-01

    The use of 5-aminolevulinic acid (ALA)-induced protoporphyrin IX (PpIX) fluorescence has shown promise as a surgical adjunct for maximizing the extent of surgical resection in gliomas. To date, the clinical utility of 5-ALA in meningiomas is not fully understood, with most descriptive studies using qualitative approaches to 5-ALA-PpIX. To assess the diagnostic performance of 5-ALA-PpIX fluorescence during surgical resection of meningioma. ALA was administered to 15 patients with meningioma undergoing PpIX fluorescence-guided surgery at our institution. At various points during the procedure, the surgeon performed qualitative, visual assessments of fluorescence by using the surgical microscope, followed by a quantitative fluorescence measurement by using an intraoperative probe. Specimens were collected at each point for subsequent neuropathological analysis. Clustered data analysis of variance was used to ascertain a difference between groups, and receiver operating characteristic analyses were performed to assess diagnostic capabilities. Red-pink fluorescence was observed in 80% (12/15) of patients, with visible fluorescence generally demonstrating a strong, homogenous character. Quantitative fluorescence measured diagnostically significant PpIX concentrations (cPpIx) in both visibly and nonvisibly fluorescent tissues, with significantly higher cPpIx in both visibly fluorescent (P < .001) and tumor tissue (P = .002). Receiver operating characteristic analyses also showed diagnostic accuracies up to 90% for differentiating tumor from normal dura. ALA-induced PpIX fluorescence guidance is a potential and promising adjunct in accurately detecting neoplastic tissue during meningioma resective surgery. These results suggest a broader reach for PpIX as a biomarker for meningiomas than was previously noted in the literature.

  4. 5-Aminolevulinic Acid-Induced Protoporphyrin IX Fluorescence in Meningioma: Qualitative and Quantitative Measurements In Vivo

    PubMed Central

    Valdes, Pablo A.; Bekelis, Kimon; Harris, Brent T.; Wilson, Brian C.; Leblond, Frederic; Kim, Anthony; Simmons, Nathan E.; Erkmen, Kadir; Paulsen, Keith D.; Roberts, David W.

    2014-01-01

    BACKGROUND The use of 5-aminolevulinic acid (ALA)-induced protoporphyrin IX (PpIX) fluorescence has shown promise as a surgical adjunct for maximizing the extent of surgical resection in gliomas. To date, the clinical utility of 5-ALA in meningiomas is not fully understood, with most descriptive studies using qualitative approaches to 5-ALA-PpIX. OBJECTIVE To assess the diagnostic performance of 5-ALA-PpIX fluorescence during surgical resection of meningioma. METHODS ALA was administered to 15 patients with meningioma undergoing PpIX fluorescence-guided surgery at our institution. At various points during the procedure, the surgeon performed qualitative, visual assessments of fluorescence by using the surgical microscope, followed by a quantitative fluorescence measurement by using an intra-operative probe. Specimens were collected at each point for subsequent neuropathological analysis. Clustered data analysis of variance was used to ascertain a difference between groups, and receiver operating characteristic analyses were performed to assess diagnostic capabilities. RESULTS Red-pink fluorescence was observed in 80% (12/15) of patients, with visible fluorescence generally demonstrating a strong, homogenous character. Quantitative fluorescence measured diagnostically significant PpIX concentrations (CPpIx) in both visibly and nonvisibly fluorescent tissues, with significantly higher CPpIx in both visibly fluorescent (P < .001) and tumor tissue (P = .002). Receiver operating characteristic analyses also showed diagnostic accuracies up to 90% for differentiating tumor from normal dura. CONCLUSION ALA-induced PpIX fluorescence guidance is a potential and promising adjunct in accurately detecting neoplastic tissue during meningioma resective surgery. These results suggest a broader reach for PpIX as a biomarker for meningiomas than was previously noted in the literature. PMID:23887194
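
    The receiver operating characteristic (ROC) analysis used in both versions of this study can be reproduced generically as follows (a sketch with invented PpIX concentrations and pathology labels, purely to illustrate how diagnostic accuracy is computed):

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      # Hypothetical data: measured PpIX concentration per biopsy and pathology label (1 = tumor)
      c_ppix = np.array([0.02, 0.05, 0.11, 0.40, 0.08, 0.95, 1.30, 0.03, 0.70, 0.25])
      label = np.array([0, 0, 0, 1, 0, 1, 1, 0, 1, 1])

      fpr, tpr, thresholds = roc_curve(label, c_ppix)
      print("AUC:", roc_auc_score(label, c_ppix))

      best = np.argmax(tpr - fpr)  # threshold maximizing Youden's J = sensitivity + specificity - 1
      print("optimal threshold:", thresholds[best])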

  5. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  6. Quantitative PCR for Detection and Enumeration of Genetic Markers of Bovine Fecal Pollution

    EPA Science Inventory

    Accurate assessment of health risks associated with bovine (cattle) fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for the detection of two recently described cow feces-spec...

  7. NNLOPS accurate associated HW production

    NASA Astrophysics Data System (ADS)

    Astill, William; Bizon, Wojciech; Re, Emanuele; Zanderighi, Giulia

    2016-06-01

    We present a next-to-next-to-leading order accurate description of associated HW production consistently matched to a parton shower. The method is based on reweighting events obtained with the HW plus one jet NLO accurate calculation implemented in POWHEG, extended with the MiNLO procedure, to reproduce NNLO accurate Born distributions. Since the Born kinematics is more complex than the cases treated before, we use a parametrization of the Collins-Soper angles to reduce the number of variables required for the reweighting. We present phenomenological results at 13 TeV, with cuts suggested by the Higgs Cross section Working Group.

  8. Quantitation of spatially-localized proteins in tissue samples using MALDI-MRM imaging.

    PubMed

    Clemis, Elizabeth J; Smith, Derek S; Camenzind, Alexander G; Danell, Ryan M; Parker, Carol E; Borchers, Christoph H

    2012-04-17

    MALDI imaging allows the creation of a "molecular image" of a tissue slice. This image is reconstructed from the ion abundances in spectra obtained while rastering the laser over the tissue. These images can then be correlated with tissue histology to detect potential biomarkers of, for example, aberrant cell types. MALDI, however, is known to have problems with ion suppression, making it difficult to correlate measured ion abundance with concentration. It would be advantageous to have a method which could provide more accurate protein concentration measurements, particularly for screening applications or for precise comparisons between samples. In this paper, we report the development of a novel MALDI imaging method for the localization and accurate quantitation of proteins in tissues. This method involves optimization of in situ tryptic digestion, followed by reproducible and uniform deposition of an isotopically labeled standard peptide from a target protein onto the tissue, using an aerosol-generating device. Data are acquired by MALDI multiple reaction monitoring (MRM) mass spectrometry (MS), and accurate peptide quantitation is determined from the ratio of MRM transitions for the endogenous unlabeled proteolytic peptides to the corresponding transitions from the applied isotopically labeled standard peptides. In a parallel experiment, the quantity of the labeled peptide applied to the tissue was determined using a standard curve generated from MALDI time-of-flight (TOF) MS data. This external calibration curve was then used to determine the quantity of endogenous peptide in a given area. All standard curves generated by this method had coefficients of determination greater than 0.97. These proof-of-concept experiments using MALDI MRM-based imaging show the feasibility of the precise and accurate quantitation of tissue protein concentrations over 2 orders of magnitude, while maintaining the spatial localization information for the proteins.
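
    The quantitation logic described above can be summarized in a few lines (a simplified sketch with invented numbers; the actual workflow works per imaged area from MRM transition ratios and a MALDI-TOF calibration curve):

      import numpy as np

      # Hypothetical external calibration of the labeled standard: spotted amount (fmol) vs. TOF signal
      spotted_fmol = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
      tof_signal = np.array([210.0, 405.0, 1010.0, 1980.0, 4050.0])
      slope, intercept = np.polyfit(spotted_fmol, tof_signal, 1)

      # Amount of labeled peptide deposited on the imaged area, from its measured TOF signal
      labeled_fmol = (1500.0 - intercept) / slope

      # Endogenous peptide from the ratio of MRM transition intensities (unlabeled / labeled)
      light_over_heavy = 0.62
      print(light_over_heavy * labeled_fmol)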

  9. Mobility-Related Teacher Turnover and the Unequal Distribution of Experienced Teachers in Turkey

    ERIC Educational Resources Information Center

    Özoglu, Murat

    2015-01-01

    This study investigates the issue of mobility-related teacher turnover in Turkey through both quantitative and qualitative methods. The quantitative findings derived from descriptive and correlational analyses of countrywide teacher-assignment and transfer data indicate that a high rate of mobility-related turnover is observed in the…

  10. A Quantitative Exploration of the Relationship between Patient Health and Electronic Personal Health Records

    ERIC Educational Resources Information Center

    Hines, Denise Williams

    2009-01-01

    The use of electronic personal health records is becoming increasingly more popular as healthcare providers, healthcare and government leaders, and patients are seeking ways to improve healthcare quality and to decrease costs (Abrahamsen, 2007). This quantitative, descriptive correlational study examined the relationship between the degree of…

  11. Competency-Based Education: A Quantitative Study of the U.S. Air Force Noncommissioned Officer Academy

    ERIC Educational Resources Information Center

    Houser, Bonnie L.

    2017-01-01

    There are relatively few empirical studies that examine whether using a competency-based education (CBE) approach results in increased student learning or achievement when compared to traditional education approaches. This study uses a quantitative research methodology, a nonexperimental comparative descriptive research design, and a two-group…

  12. An Investigation of Civilians Preparedness to Compete with Individuals with Military Experience for Army Board Select Acquisition Positions

    DTIC Science & Technology

    2017-05-25

    ... The research employed a mixed research methodology – quantitative with descriptive statistical analysis and qualitative with a thematic analysis approach – using interviews to collect the data. The interviews included demographic and open-ended questions ...

  13. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    ERIC Educational Resources Information Center

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a…

  14. A Quantitative Description of FBI Public Relations.

    ERIC Educational Resources Information Center

    Gibson, Dirk C.

    1997-01-01

    States that the Federal Bureau of Investigation (FBI) had the most successful media relations program of all government agencies from the 1930s to the 1980s. Uses quantitative analysis to show why those media efforts were successful. Identifies themes that typified the verbal component of FBI publicity and the broad spectrum of mass communication…

  15. Effect of Group versus Individual Assessments on Coursework among Undergraduates in Tanzania: Implications for Continuous Assessments in Universities

    ERIC Educational Resources Information Center

    Mbalamula, Yazidu Saidi

    2018-01-01

    The study analyzes students' performance scores in formative assessments depicting the individual and group settings. A case study design was adopted using quantitative approach to extract data of 198 undergraduate students. Data were analyzed quantitatively using descriptive statistics--means and frequencies; spearman correlations, multiple…

  16. A Quantitative Study of the Effectiveness of Teacher Recruitment Strategies in a Rural Midwestern State

    ERIC Educational Resources Information Center

    Kane, Rose Etta

    2010-01-01

    A problem in American education is that rural schools have difficulty recruiting licensed teachers. Teacher shortages in mathematics, science, foreign language, and special education are more acute in rural areas. The purpose of this quantitative descriptive survey study was to examine specific recruiting strategies and newly hired licensed…

  17. Strengthening Student Engagement with Quantitative Subjects in a Business Faculty

    ERIC Educational Resources Information Center

    Warwick, Jon; Howard, Anna

    2014-01-01

    This paper reflects on the results of research undertaken at a large UK university relating to the teaching of quantitative subjects within a Business Faculty. It builds on a simple model of student engagement and, through the description of three case studies, describes research undertaken and developments implemented to strengthen aspects of the…

  18. Linking descriptive geology and quantitative machine learning through an ontology of lithological concepts

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Huber, R.; Robertson, J.; Cox, S. J. D.; Woodcock, R.

    2014-12-01

    Despite the recent explosion of quantitative geological data, geology remains a fundamentally qualitative science. Numerical data only constitute a certain part of data collection in the geosciences. In many cases, geological observations are compiled as text into reports and annotations on drill cores, thin sections or drawings of outcrops. The observations are classified into concepts such as lithology, stratigraphy, geological structure, etc. These descriptions are semantically rich and are generally supported by more quantitative observations using geochemical analyses, XRD, hyperspectral scanning, etc, but the goal is geological semantics. In practice it has been difficult to bring the different observations together due to differing perception or granularity of classification in human observation, or the partial observation of only some characteristics using quantitative sensors. In the past years many geological classification schemas have been transferred into ontologies and vocabularies, formalized using RDF and OWL, and published through SPARQL endpoints. Several lithological ontologies were compiled by stratigraphy.net and published through a SPARQL endpoint. This work is complemented by the development of a Python API to integrate this vocabulary into Python-based text mining applications. The applications for the lithological vocabulary and Python API are automated semantic tagging of geochemical data and descriptions of drill cores, machine learning of geochemical compositions that are diagnostic for lithological classifications, and text mining for lithological concepts in reports and geological literature. This combination of applications can be used to identify anomalies in databases, where composition and lithological classification do not match. It can also be used to identify lithological concepts in the literature and infer quantitative values. The resulting semantic tagging opens new possibilities for linking these diverse sources of data.
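
    To make the workflow concrete, a lithological vocabulary published through a SPARQL endpoint can be queried from Python roughly as follows (the endpoint URL is a placeholder, not the actual stratigraphy.net service; SKOS labels are assumed):

      import requests

      ENDPOINT = "https://example.org/sparql"  # placeholder endpoint
      QUERY = """
      PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
      SELECT ?concept ?label WHERE {
        ?concept skos:prefLabel ?label .
        FILTER(CONTAINS(LCASE(STR(?label)), "basalt"))
      } LIMIT 10
      """

      resp = requests.get(ENDPOINT, params={"query": QUERY},
                          headers={"Accept": "application/sparql-results+json"})
      for row in resp.json()["results"]["bindings"]:
          print(row["concept"]["value"], row["label"]["value"])

    Matched concept URIs can then be attached to text spans in reports or to geochemical records, which is the semantic-tagging step described above.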

  19. Ecological Change, Sliding Baselines and the Importance of Historical Data: Lessons from Combing Observational and Quantitative Data on a Temperate Reef Over 70 Years

    PubMed Central

    Gatti, Giulia; Bianchi, Carlo Nike; Parravicini, Valeriano; Rovere, Alessio; Peirano, Andrea; Montefalcone, Monica; Massa, Francesco; Morri, Carla

    2015-01-01

    Understanding the effects of environmental change on ecosystems requires the identification of baselines that may act as reference conditions. However, the continuous change of these references challenges our ability to define the true natural status of ecosystems. The so-called sliding baseline syndrome can be overcome through the analysis of quantitative time series, which are, however, extremely rare. Here we show how combining historical quantitative data with descriptive 'naturalistic' information arranged in a chronological chain allows highlighting long-term trends and can be used to inform present conservation schemes. We analysed the long-term change of a coralligenous reef, a marine habitat endemic to the Mediterranean Sea. The coralligenous assemblages of Mesco Reef (Ligurian Sea, NW Mediterranean) have been studied, although discontinuously, since 1937, thus making available both detailed descriptive information and scanty quantitative data: while the former was useful to understand the natural history of the ecosystem, the analysis of the latter was of paramount importance to provide a formal measure of change over time. Epibenthic assemblages remained comparatively stable until the 1990s, when species replacement, invasion by alien algae, and biotic homogenisation occurred within a few years, leading to a new and completely different ecosystem state. The shift experienced by the coralligenous assemblages of Mesco Reef was probably induced by a combination of seawater warming and local human pressures, the latter mainly resulting in increased water turbidity; in turn, cumulative stress may have favoured the establishment of alien species. This study showed that the combined analysis of quantitative and descriptive historical data represents precious knowledge for understanding ecosystem trends over time and helps identify baselines for ecological management. PMID:25714413

  20. Investigating the Validity of Two Widely Used Quantitative Text Tools

    ERIC Educational Resources Information Center

    Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne

    2018-01-01

    In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…

  1. NEW TARGET AND CONTROL ASSAYS FOR QUANTITATIVE POLYMERASE CHAIN REACTION (QPCR) ANALYSIS OF ENTEROCOCCI IN WATER

    EPA Science Inventory

    Enterococci are frequently monitored in water samples as indicators of fecal pollution. Attention is now shifting from culture based methods for enumerating these organisms to more rapid molecular methods such as QPCR. Accurate quantitative analyses by this method requires highly...

  2. Equations for description of nonlinear standing waves in constant-cross-sectioned resonators.

    PubMed

    Bednarik, Michal; Cervenka, Milan

    2014-03-01

    This work is focused on investigation of applicability of two widely used model equations for description of nonlinear standing waves in constant-cross-sectioned resonators. The investigation is based on the comparison of numerical solutions of these model equations with solutions of more accurate model equations whose validity has been verified experimentally in a number of published papers.

  3. Voice Identification: Levels-of-Processing and the Relationship between Prior Description Accuracy and Recognition Accuracy.

    ERIC Educational Resources Information Center

    Walter, Todd J.

    A study examined whether a person's ability to accurately identify a voice is influenced by factors similar to those proposed by the Supreme Court for eyewitness identification accuracy. In particular, the Supreme Court has suggested that a person's prior description accuracy of a suspect, degree of attention to a suspect, and confidence in…

  4. A new LC-MS based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous system

    PubMed Central

    Wang, Shunhai; Bobst, Cedric E.; Kaltashov, Igor A.

    2018-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein which is viewed as a promising drug carrier to target the central nervous system due to its ability to penetrate the blood-brain barrier (BBB). Among the many challenges during the development of Tf-based therapeutics, sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult due to the presence of abundant endogenous Tf. Herein, we describe the development of a new LC-MS based method for sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous hTf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed O18-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation. PMID:26307718

  5. The generalized Sellmeier equation for air

    PubMed Central

    Voronin, A. A.; Zheltikov, A. M.

    2017-01-01

    We present a compact, uniform generalized Sellmeier-equation (GSE) description of air refraction and its dispersion that remains highly accurate within an ultrabroad spectral range from the ultraviolet to the long-wavelength infrared. While the standard Sellmeier equation (SSE) for atmospheric air is not intended for the description of air refractivity in the mid-infrared and long-wavelength infrared, failing beyond roughly 2.5 μm, our generalization of this equation is shown to agree remarkably well with full-scale air-refractivity calculations involving over half a million atmospheric absorption lines, providing a highly accurate description of air refractivity in the range of wavelengths from 0.3 to 13 μm. With its validity range being substantially broader than the applicability range of the SSE and its accuracy being at least an order of magnitude higher than the accuracy that the SSE can provide even within its validity range, the GSE-based approach offers a powerful analytical tool for the rapidly progressing mid- and long-wavelength-infrared optics of the atmosphere. PMID:28836624
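
    For orientation, the standard Sellmeier form that the paper generalizes writes the refractivity as a sum of resonance terms (generic form; the GSE coefficients themselves are given in the paper):

      n^2(\lambda) - 1 = \sum_i \frac{B_i\,\lambda^2}{\lambda^2 - C_i},

    where λ is the vacuum wavelength and B_i, C_i are fitted resonance strengths and positions; the generalization extends this fit so that it remains accurate from 0.3 to 13 μm.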

  6. How variations in distance affect eyewitness reports and identification accuracy.

    PubMed

    Lindsay, R C L; Semmler, Carolyn; Weber, Nathan; Brewer, Neil; Lindsay, Marilyn R

    2008-12-01

    Witnesses observe crimes at various distances and the courts have to interpret their testimony given the likely quality of witnesses' views of events. We examined how accurately witnesses judged the distance between themselves and a target person, and how distance affected description accuracy, choosing behavior, and identification test accuracy. Over 1,300 participants were approached during normal daily activities, and asked to observe a target person at one of a number of possible distances. Under a Perception, Immediate Memory, or Delayed Memory condition, witnesses provided a brief description of the target, estimated the distance to the target, and then examined a 6-person target-present or target-absent lineup to see if they could identify the target. Errors in distance judgments were often substantial. Description accuracy was mediocre and did not vary systematically with distance. Identification choosing rates were not affected by distance, but decision accuracy declined with distance. Contrary to previous research, a 15-m viewing distance was not critical for discriminating accurate from inaccurate decisions.

  7. Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts

    DOE PAGES

    Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.

    2017-01-18

    Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. Here this work will demonstrate that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.

  8. Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.

    Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. Here this work will demonstrate that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.
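
    As background for the voltammetric quantitation discussed in these two records, the peak current in cyclic voltammetry under semi-infinite linear diffusion follows the Randles–Ševčík relation (textbook expression; the corrections for uncompensated resistance and cylindrical diffusion developed here go beyond it):

      i_p = 0.4463\, n F A C \sqrt{\frac{n F v D}{R T}},

    where n is the number of electrons transferred, A the electrode area, C the bulk concentration, D the diffusion coefficient and v the scan rate; deviations from the proportionality between i_p and C at high weight loadings are what the simulation-based corrections address.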

  9. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    PubMed Central

    Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill

    2017-01-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but the trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed theoretical reasoning on the effectiveness of these markers in trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continuous search for better alternatives is encouraged to enhance marker-based predictions for an individual quantitative trait in molecular plant breeding. PMID:28729875

  10. Monitoring the injured brain: registered, patient specific atlas models to improve accuracy of recovered brain saturation values

    NASA Astrophysics Data System (ADS)

    Clancy, Michael; Belli, Antonio; Davies, David; Lucas, Samuel J. E.; Su, Zhangjie; Dehghani, Hamid

    2015-07-01

    The subject of superficial contamination and signal origins remains a widely debated topic in the field of Near Infrared Spectroscopy (NIRS), yet the concept of using the technology to monitor an injured brain, in a clinical setting, poses additional challenges concerning the quantitative accuracy of recovered parameters. Using high density diffuse optical tomography probes, quantitatively accurate parameters from different layers (skin, bone and brain) can be recovered from subject specific reconstruction models. This study assesses the use of registered atlas models for situations where subject specific models are not available. Data simulated from subject specific models were reconstructed using the 8 registered atlas models implementing a regional (layered) parameter recovery in NIRFAST. A 3-region recovery based on the atlas model yielded recovered brain saturation values which were accurate to within 4.6% (percentage error) of the simulated values, validating the technique. The recovered saturations in the superficial regions were not quantitatively accurate. These findings highlight differences in superficial (skin and bone) layer thickness between the subject and atlas models. This layer thickness mismatch was propagated through the reconstruction process decreasing the parameter accuracy.

  11. Robust and fast characterization of OCT-based optical attenuation using a novel frequency-domain algorithm for brain cancer detection

    NASA Astrophysics Data System (ADS)

    Yuan, Wu; Kut, Carmen; Liang, Wenxuan; Li, Xingde

    2017-03-01

    Cancer is known to alter the local optical properties of tissues. The detection of OCT-based optical attenuation provides a quantitative method to efficiently differentiate cancer from non-cancer tissues. In particular, the intraoperative use of quantitative OCT is able to provide direct visual guidance in real time for accurate identification of cancer tissues, especially those without any obvious structural layers, such as brain cancer. However, current methods are suboptimal in providing high-speed and accurate OCT attenuation mapping for intraoperative brain cancer detection. In this paper, we report a novel frequency-domain (FD) algorithm to enable robust and fast characterization of optical attenuation as derived from OCT intensity images. The performance of this FD algorithm was compared with traditional fitting methods by analyzing datasets containing images from freshly resected human brain cancer and from a silica phantom acquired by a 1310 nm swept-source OCT (SS-OCT) system. With a graphics processing unit (GPU)-based CUDA C/C++ implementation, this new attenuation mapping algorithm can offer robust and accurate quantitative interpretation of OCT images in real time during brain surgery.
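
    For comparison, a widely used depth-resolved attenuation estimator (not the frequency-domain algorithm proposed in this work) computes, for each pixel, the attenuation coefficient from the ratio of the local OCT signal to the total signal remaining below it:

      import numpy as np

      def depth_resolved_attenuation(intensity, dz):
          """Per-pixel attenuation [1/unit of dz] from a linear-scale OCT A-line or B-scan.
          intensity: array with depth along axis 0; assumes nearly all light is
          attenuated within the imaged depth range."""
          below = np.cumsum(intensity[::-1], axis=0)[::-1] - intensity  # signal below each pixel
          return intensity / (2.0 * dz * (below + 1e-12))               # avoid division by zero

      z = np.arange(0.0, 2.0, 0.005)            # mm; synthetic A-line with mu = 2 mm^-1
      a_line = np.exp(-2.0 * 2.0 * z)
      print(depth_resolved_attenuation(a_line, 0.005)[:5])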

  12. Quantitative Methods in Library and Information Science Literature: Descriptive vs. Inferential Statistics.

    ERIC Educational Resources Information Center

    Brattin, Barbara C.

    Content analysis was performed on the top six core journals for 1990 in library and information science to determine the extent of research in the field. Articles (n=186) were examined for descriptive or inferential statistics and separately for the presence of mathematical models. Results show a marked (14%) increase in research for 1990,…

  13. Thermogravimetric analysis for rapid assessment of moisture diffusivity in polydisperse powder and thin film matrices.

    PubMed

    Thirunathan, Praveena; Arnz, Patrik; Husny, Joeska; Gianfrancesco, Alessandro; Perdana, Jimmy

    2018-03-01

    Accurate description of moisture diffusivity is key to precisely understanding and predicting moisture transfer behaviour in a matrix. Unfortunately, measuring moisture diffusivity is not trivial, especially at low moisture values and/or elevated temperatures. This paper presents a novel experimental procedure to accurately measure moisture diffusivity based on a thermogravimetric approach. The procedure is capable of measuring diffusivity even at elevated temperatures (>70°C) and low moisture values (>1%). Diffusivity was extracted from the experimental data using the "regular regime approach". The approach was tailored to determine diffusivity from thin films and from poly-dispersed powdered samples. Subsequently, the measured diffusivity was validated by comparison with available literature data, showing good agreement. The ability of this approach to accurately measure diffusivity across a wider range of temperatures provides better insight into the temperature dependence of diffusivity. Thus, this approach can be crucial for ensuring accurate description and prediction of moisture transfer, especially at elevated temperatures. Copyright © 2017 Elsevier Ltd. All rights reserved.
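
    For a thin film in the falling-rate ("regular") regime, the late-time solution of Fick's second law reduces to a single exponential, so an effective diffusivity can be read from the slope of the log-linear drying curve (standard result for constant D and one-sided drying of a film of thickness L, stated here as background rather than as the paper's exact working equations):

      \frac{m(t) - m_\infty}{m_0 - m_\infty} \approx \frac{8}{\pi^2}\exp\!\left(-\frac{\pi^2 D t}{4 L^2}\right) \quad\Rightarrow\quad D = -\frac{4 L^2}{\pi^2}\times \text{slope of } \ln\!\left[\frac{m(t)-m_\infty}{m_0-m_\infty}\right] \text{ vs. } t,

    where m(t) is the sample mass recorded thermogravimetrically, m_0 the initial mass and m_∞ the equilibrium mass.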

  14. Highly Accurate Quantitative Analysis Of Enantiomeric Mixtures from Spatially Frequency Encoded 1H NMR Spectra.

    PubMed

    Plainchont, Bertrand; Pitoux, Daisy; Cyrille, Mathieu; Giraud, Nicolas

    2018-02-06

    We propose an original concept to accurately measure enantiomeric excesses from proton NMR spectra, which combines high-resolution techniques based on spatial encoding of the sample with the use of optically active, weakly orienting solvents. We show that it is possible to accurately simulate dipolar-edited spectra of enantiomers dissolved in a chiral liquid crystalline phase, and to use these simulations to calibrate integrations measured on experimental data in order to perform a quantitative chiral analysis. This approach is demonstrated on a chemical intermediate for which optical purity is an essential criterion. We find a very good correlation between the experimental and calculated integration ratios extracted from G-SERF spectra, which paves the way to a general method for the determination of enantiomeric excesses based on the observation of 1H nuclei.

  15. Accurate quantitative CF-LIBS analysis of both major and minor elements in alloys via iterative correction of plasma temperature and spectral intensity

    NASA Astrophysics Data System (ADS)

    Shuxia, ZHAO; Lei, ZHANG; Jiajia, HOU; Yang, ZHAO; Wangbao, YIN; Weiguang, MA; Lei, DONG; Liantuan, XIAO; Suotang, JIA

    2018-03-01

    The chemical composition of alloys directly determines their mechanical behaviors and application fields. Accurate and rapid analysis of both major and minor elements in alloys plays a key role in metallurgical quality control and material classification processes. A quantitative calibration-free laser-induced breakdown spectroscopy (CF-LIBS) analysis method, which carries out a combined correction of plasma temperature and spectral intensity using a second-order iterative algorithm and two boundary standard samples, is proposed to realize accurate composition measurements. Experimental results show that, compared to conventional CF-LIBS analysis, the relative errors for the major elements Cu and Zn and the minor element Pb in copper-lead alloys have been reduced from 12%, 26% and 32% to 1.8%, 2.7% and 13.4%, respectively. The measurement accuracy for all elements has been improved substantially.
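
    CF-LIBS methods generally rely on a Boltzmann-plot estimate of the plasma temperature; the sketch below shows only that single, standard ingredient, not the paper's two-sample, second-order iterative correction. All spectral line data are invented for illustration.

      # Minimal sketch of one ingredient of CF-LIBS, the Boltzmann-plot temperature:
      # ln(I * lambda / (g * A)) = -E_k / (k_B * T) + const, so T comes from the slope.
      import numpy as np

      K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

      def boltzmann_temperature(intensity, wavelength_nm, g_k, A_ki, E_k_eV):
          """Plasma temperature (K) from the slope of ln(I*lambda/(g*A)) vs E_k."""
          y = np.log(np.asarray(intensity) * np.asarray(wavelength_nm)
                     / (np.asarray(g_k) * np.asarray(A_ki)))
          slope, _ = np.polyfit(np.asarray(E_k_eV, dtype=float), y, 1)
          return -1.0 / (K_B_EV * slope)

      # Hypothetical emission lines of a single species (all values invented)
      E_k = np.array([3.8, 4.5, 5.1, 5.8, 6.4])               # upper-level energies, eV
      g_k = np.array([3, 5, 7, 5, 9])                          # statistical weights
      A_ki = np.array([2.0e7, 1.5e7, 3.0e7, 1.0e7, 2.5e7])     # transition probabilities, 1/s
      wl = np.array([510.0, 480.0, 465.0, 450.0, 430.0])       # wavelengths, nm
      T_true = 9000.0                                          # K, used to synthesize intensities
      I = g_k * A_ki / wl * np.exp(-E_k / (K_B_EV * T_true))
      print(f"recovered T ~ {boltzmann_temperature(I, wl, g_k, A_ki, E_k):.0f} K")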

  16. Electronically excited and ionized states in condensed phase: Theory and applications

    NASA Astrophysics Data System (ADS)

    Sadybekov, Arman

    Predictive modeling of chemical processes in silico is a goal of the 21st century. While robust and accurate methods exist for ground-state properties, reliable methods for excited states are still lacking and require further development. Electronically excited states are formed by interactions of matter with light and are responsible for key processes in solar energy harvesting, vision, artificial sensors, and photovoltaic applications. The greatest challenge to overcome on the way to a quantitative description of light-induced processes is accurate inclusion of the effect of the environment on excited states. All of the above-mentioned processes occur in solution or in the solid state. Yet there are few methodologies for studying excited states in the condensed phase. Application of highly accurate and robust methods, such as equation-of-motion coupled-cluster (EOM-CC) theory, is limited by high computational cost and steep scaling, precluding a full quantum mechanical treatment of the entire system. In this thesis we present successful applications of the EOM-CC family of methods to studies of excited states in the liquid phase and build a hierarchy of models for including solvent effects. The first part of the thesis shows that a simple gas-phase model is sufficient to quantitatively analyze excited states in liquid benzene, while the latter part emphasizes the importance of explicit treatment of solvent molecules in the case of glycine in aqueous solution. In chapter 2, we use a simple dimer model to describe exciton formation in liquid and solid benzene. We show that sampling dimer structures extracted from liquid benzene is sufficient to correctly predict excited-state properties of the liquid. Our calculations explain experimentally observed features, which helped clarify the mechanism of excimer formation in liquid benzene. Furthermore, we shed light on the differences between dimer configurations in the first solvation shell of liquid benzene and in the unit cell of solid benzene, and discuss the impact of these differences on the formation of the excimer state. In chapter 3, we present a theoretical approach for calculating core-level states in the condensed phase. The approach is based on EOM-CC and the effective fragment potential (EFP) method. By introducing an approximate treatment of double excitations in the EOM-CCSD (EOM-CC with single and double substitutions) ansatz, we addressed the poor convergence encountered for core-level states and significantly reduced computational costs. While the approximations introduce relatively large errors in the absolute values of transition energies, the errors are systematic. Consequently, chemical shifts, i.e., changes in ionization energies relative to reference systems, are reproduced reasonably well. Using different protonation forms of solvated glycine as a benchmark system, we showed that our protocol is capable of reproducing the experimental chemical shifts with quantitative accuracy. The results demonstrate that chemical shifts are very sensitive to solvent interactions and that explicit treatment of the solvent, such as with EFP, is essential for achieving quantitative accuracy. In chapter 4, we outline future directions and discuss possible applications of the developed computational protocol for predicting core chemical shifts in larger systems.

  17. Doppler-broadened NICE-OHMS beyond the cavity-limited weak absorption condition - II: Experimental verification

    NASA Astrophysics Data System (ADS)

    Hausmaninger, Thomas; Silander, Isak; Ma, Weiguang; Axner, Ove

    2016-01-01

    Doppler-broadened (Db) noise-immune cavity-enhanced optical heterodyne molecular spectrometry (NICE-OHMS) is normally described by an expression, here termed the conventional (CONV) description, that is restricted to the conventional cavity-limited weak absorption condition (CCLWA), i.e. when the single-pass absorbance is significantly smaller than the empty cavity losses, i.e. when α0L ≪ π/F. To describe NICE-OHMS signals beyond this limit, two simplified extended descriptions (termed the extended locking and extended transmission description, ELET, and the extended locking and full transmission description, ELFT), which are assumed to be valid under the relaxed cavity-limited weak absorption condition (RCLWA), i.e. when α0L < π/F, and a full description (denoted FULL), presumed to be valid also when the α0L < π/F condition does not hold, have recently been derived in an accompanying work (Ma W, et al. Doppler-broadened NICE-OHMS beyond the cavity-limited weak absorption condition - I. Theoretical Description. J Quant Spectrosc Radiat Transfer, 2015, http://dx.doi.org/10.1016/j.jqsrt.2015.09.007). The present work constitutes an experimental verification and assessment of the validity of these descriptions, performed in the Doppler limit for a set of Fα0L/π values (up to 3.5); it is shown under which conditions the various descriptions are valid. It is concluded that for samples with Fα0L/π up to 0.01, all descriptions replicate the data well. It is shown that the CONV description is adequate and provides accurate assessments of the signal strength (and thereby the analyte concentration) up to Fα0L/π of around 0.1, while the ELET is accurate for Fα0L/π up to around 0.3. The ELFT description mimics the Db NICE-OHMS signal well for Fα0L/π up to around unity, while the FULL description is adequate for all Fα0L/π values investigated. Access to these descriptions both increases considerably the dynamic range of the technique and facilitates calibration using certified reference gases, which thereby significantly broadens the applicability of the Db NICE-OHMS technique.
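
    As a reading aid, the small helper below simply encodes the approximate validity thresholds for Fα0L/π quoted in the abstract (about 0.1 for CONV, 0.3 for ELET, unity for ELFT, with FULL valid throughout); the function name and the example finesse and absorbance values are assumptions, not data from the study.

      # Minimal helper encoding the validity ranges quoted above for the saturation
      # parameter F*alpha0*L/pi (thresholds approximate; purely illustrative).
      from math import pi

      def nice_ohms_description(F, alpha0, L):
          """Suggest which Db NICE-OHMS description should be adequate."""
          x = F * alpha0 * L / pi
          if x <= 0.1:
              return "CONV (and all extended descriptions)"
          if x <= 0.3:
              return "ELET (also ELFT and FULL)"
          if x <= 1.0:
              return "ELFT (also FULL)"
          return "FULL"

      # Example: cavity finesse 5000 and single-pass absorbance alpha0*L = 2e-4
      print(nice_ohms_description(F=5000, alpha0=2e-4, L=1.0))  # -> "ELFT (also FULL)"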

  18. SINGLE INSTITUTION VARIABILITY IN INTENSITY MODULATED RADIATION TARGET DELINEATION FOR CANINE NASAL NEOPLASIA.

    PubMed

    Christensen, Neil I; Forrest, Lisa J; White, Pamela J; Henzler, Margaret; Turek, Michelle M

    2016-11-01

    Contouring variability is a significant barrier to the accurate delivery and reporting of radiation therapy. The aim of this descriptive study was to determine the variation in contouring radiation targets and organs at risk by participants within our institution. We also aimed to determine whether all individuals contoured the same normal tissues. Two canine nasal tumor datasets were selected and contoured by two ACVR-certified radiation oncologists and two radiation oncology residents from the same institution. Eight structures were consistently contoured, including the right and left eyes, the right and left lenses, the brain, the gross tumor volume (GTV), the clinical target volume (CTV), and the planning target volume (PTV). The spinal cord, hard and soft palate, and bulla were contoured on 50% of datasets. Variation in contouring occurred in both targets and normal tissues at risk and was particularly significant for the GTV, CTV, and PTV. The mean metric score and Dice similarity coefficient were below the threshold criteria in 37.5-50% and 12.5-50% of structures, respectively, quantitatively indicating contouring variation. This study refutes our hypothesis that minimal variation in target and normal tissue delineation occurs. The variation in contouring may contribute to different tumor response and toxicity for any given patient. Our results also highlight the difficulty associated with replicating published radiation protocols or treatments, as even with a complete contouring description the outcome of treatment is still fundamentally influenced by the individual contouring the patient. © 2016 American College of Veterinary Radiology.
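
    The Dice similarity coefficient reported above has a standard definition, DSC = 2|A∩B|/(|A|+|B|); a minimal sketch on binary contour masks is given below. It is not the software used in the study, and the example masks are arbitrary.

      # Minimal sketch of the Dice similarity coefficient used above to compare two
      # contours rendered as binary masks: DSC = 2|A ∩ B| / (|A| + |B|).
      import numpy as np

      def dice(mask_a, mask_b):
          a = np.asarray(mask_a, bool)
          b = np.asarray(mask_b, bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      # Two overlapping square "contours" on a small grid (illustrative only)
      a = np.zeros((50, 50), bool); a[10:30, 10:30] = True
      b = np.zeros((50, 50), bool); b[15:35, 15:35] = True
      print(f"DSC = {dice(a, b):.2f}")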

  19. High performance thin layer chromatography (HPTLC) and high performance liquid chromatography (HPLC) for the qualitative and quantitative analysis of Calendula officinalis-advantages and limitations.

    PubMed

    Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana

    2014-09-01

    Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant, allowing identification and quantification of its main constituents. The aims of this study were to compare HPTLC and HPLC for qualitative and quantitative analysis of the major constituents of Calendula officinalis and to investigate the effect of different extraction techniques on the composition of C. officinalis extracts from different parts of the plant. HPTLC was found to be effective for qualitative analysis; however, HPLC was more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting, as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. [Doppler echocardiography of tricuspid insufficiency. Methods of quantification].

    PubMed

    Loubeyre, C; Tribouilloy, C; Adam, M C; Mirode, A; Trojette, F; Lesbre, J P

    1994-01-01

    The evaluation of tricuspid incompetence has benefitted considerably from the development of Doppler ultrasound. In addition to direct analysis of the valves, which provides information about the mechanism involved, this method can provide an accurate evaluation, mainly through use of the Doppler mode. Alongside newer criteria still under evaluation (mainly the convergence zone of the regurgitant jet), several indices are recognised as good quantitative parameters: extension of the regurgitant jet into the right atrium, anterograde tricuspid flow, the laminar nature of the regurgitant flow, and analysis of flow in the supra-hepatic veins. The evaluation remains only semi-quantitative, since calculation of the regurgitation fraction from pulsed Doppler does not appear to be reliable; an accurate semi-quantitative evaluation is nevertheless possible through careful and consistent use of all the available criteria. The authors discuss the value of the various evaluation criteria mentioned in the literature and attempt to define a practical approach.

  1. Finding the bottom and using it

    PubMed Central

    Sandoval, Ruben M.; Wang, Exing; Molitoris, Bruce A.

    2014-01-01

    Maximizing the 2-photon parameters used in acquiring images for quantitative intravital microscopy, especially when high sensitivity is required, remains an open area of investigation. Here we present data on correctly setting the black level of the photomultiplier tube amplifier by adjusting the offset to allow accurate quantitation of low-intensity processes. When the black level is set too high, some low-intensity pixel values become zero and a nonlinear degradation in sensitivity occurs, rendering otherwise quantifiable low-intensity values virtually undetectable. Initial studies using a series of increasing offsets for a sequence of concentrations of fluorescent albumin in vitro revealed a loss of sensitivity at higher offsets for lower albumin concentrations. A similar decrease in sensitivity, and therefore in the ability to correctly determine the glomerular permeability coefficient of albumin, occurred in vivo at higher offsets. Finding the offset that yields accurate and linear data is essential for quantitative analysis when high sensitivity is required. PMID:25313346

  2. Quantitative fluorescence tomography using a trimodality system: in vivo validation

    PubMed Central

    Lin, Yuting; Barber, William C.; Iwanczyk, Jan S.; Roeck, Werner W.; Nalcioglu, Orhan; Gulsen, Gultekin

    2010-01-01

    A fully integrated trimodality fluorescence, diffuse optical, and x-ray computed tomography (FT/DOT/XCT) system for small animal imaging is reported in this work. The main purpose of this system is to obtain quantitatively accurate fluorescence concentration images using a multimodality approach. XCT offers anatomical information, while DOT provides the necessary background optical property map to improve FT image accuracy. The quantitative accuracy of this trimodality system is demonstrated in vivo. In particular, we show that a 2-mm-diam fluorescence inclusion located 8 mm deep in a nude mouse can only be localized when functional a priori information from DOT is available. However, the error in the recovered fluorophore concentration is nearly 87%. On the other hand, the fluorophore concentration can be accurately recovered within 2% error when both DOT functional and XCT structural a priori information are utilized together to guide and constrain the FT reconstruction algorithm. PMID:20799770

  3. Development of a standardized job description for healthcare managers of metabolic syndrome management programs in Korean community health centers.

    PubMed

    Lee, Youngjin; Choo, Jina; Cho, Jeonghyun; Kim, So-Nam; Lee, Hye-Eun; Yoon, Seok-Jun; Seomun, GyeongAe

    2014-03-01

    This study aimed to develop a job description for healthcare managers of metabolic syndrome management programs using task analysis. Exploratory research was performed by using the Developing a Curriculum method, the Intervention Wheel model, and focus group discussions. Subsequently, we conducted a survey of 215 healthcare workers from 25 community health centers to verify that the job description we created was accurate. We defined the role of healthcare managers. Next, we elucidated the tasks of healthcare managers and performed needs analysis to examine the frequency, importance, and difficulty of each of their duties. Finally, we verified that our job description was accurate. Based on the 8 duties, 30 tasks, and 44 task elements assigned to healthcare managers, we found that the healthcare managers functioned both as team coordinators responsible for providing multidisciplinary health services and nurse specialists providing health promotion services. In terms of importance and difficulty of tasks performed by the healthcare managers, which were measured using a determinant coefficient, the highest-ranked task was planning social marketing (15.4), while the lowest-ranked task was managing human resources (9.9). A job description for healthcare managers may provide basic data essential for the development of a job training program for healthcare managers working in community health promotion programs. Copyright © 2014. Published by Elsevier B.V.

  4. Quantitative description of respiration processes in meso-eutrophic and eutrophic freshwater environments.

    PubMed

    Kiersztyn, Bartosz; Kauppinen, Elsi S; Kaliński, Tomasz; Chróst, Ryszard; Siuda, Waldemar

    2018-06-01

    We propose a modification of the measurement methodology that allows the overall respiration rate (VResp) close to in situ conditions, the size of the labile, respirable organic matter pool (OMResp), and its turnover time (Tt) to be calculated. In addition to the respiration of dissolved substrates by free-living bacteria, the respiration of attached bacteria and other planktonic organisms is also taken into account. In a case study, we applied this modified, quantitative description of respiration processes to surface waters of lakes of different trophic status: meso-eutrophic and eutrophic. In both types of environment, VResp oscillated between 1.0 μmol C l⁻¹ h⁻¹ and 3.0 μmol C l⁻¹ h⁻¹, and the size of the OMResp pool varied from 39.3 μM C to 828.7 μM C. Despite higher OMResp concentrations in eutrophic lakes, we found a lower susceptibility of OM to respiration in eutrophic than in meso-eutrophic lakes, but similar VResp in both lake types. We conclude that the proposed method allows a fast quantitative description of labile organic matter utilization by aerobic aquatic microorganisms. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Quantitative study of Xanthosoma violaceum leaf surfaces using RIMAPS and variogram techniques.

    PubMed

    Favret, Eduardo A; Fuentes, Néstor O; Molina, Ana M

    2006-08-01

    Two new imaging techniques, rotated image with maximum averaged power spectrum (RIMAPS) and the variogram, are presented for the study and description of leaf surfaces. Xanthosoma violaceum was analyzed to illustrate the characteristics of both techniques, each of which produces a quantitative description of leaf surface topography. RIMAPS combines digitized image rotation with the Fourier transform, and is used to detect pattern orientation and characteristics of the surface topography. The variogram relates the mathematical variance of a surface to the area of the sample window observed, and gives the typical scale lengths of the surface patterns. RIMAPS detects the morphological variations in the surface topography pattern between fresh and dried (herbarium) samples of the leaf. The variogram method finds the characteristic dimensions of the leaf microstructure, i.e., cell length, papilla diameter, etc., showing that there are no significant differences between dry and fresh samples. The results show the robustness of RIMAPS and variogram analyses for detecting, distinguishing, and characterizing leaf surfaces, as well as for giving scale lengths. Both techniques are tools for the biologist to study variations of the leaf surface when different patterns are present. The use of RIMAPS and the variogram opens a wide spectrum of possibilities by providing a systematic, quantitative description of leaf surface topography.
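
    The variogram idea described above, relating surface variance to the area of the observation window, can be sketched as follows; this is an illustrative implementation on a synthetic texture, not the authors' code, and all names and parameters are assumptions.

      # Minimal sketch of the variogram idea: compute the variance of image intensity
      # inside square windows of increasing size, averaged over random window positions.
      import numpy as np

      def window_variance_curve(image, sizes, n_samples=200, seed=0):
          rng = np.random.default_rng(seed)
          img = np.asarray(image, float)
          H, W = img.shape
          curve = []
          for s in sizes:
              vs = []
              for _ in range(n_samples):
                  r = rng.integers(0, H - s + 1)
                  c = rng.integers(0, W - s + 1)
                  vs.append(img[r:r + s, c:c + s].var())
              curve.append((s * s, float(np.mean(vs))))  # (window area, mean variance)
          return curve

      # Example on a synthetic periodic "leaf texture"
      y, x = np.mgrid[0:256, 0:256]
      texture = np.sin(x / 6.0) + 0.3 * np.sin(y / 15.0)
      for area, var in window_variance_curve(texture, sizes=[4, 8, 16, 32, 64]):
          print(area, round(var, 3))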

  6. Reporting and Interpreting Quantitative Research Findings: What Gets Reported and Recommendations for the Field

    ERIC Educational Resources Information Center

    Larson-Hall, Jenifer; Plonsky, Luke

    2015-01-01

    This paper presents a set of guidelines for reporting on five types of quantitative data issues: (1) Descriptive statistics, (2) Effect sizes and confidence intervals, (3) Instrument reliability, (4) Visual displays of data, and (5) Raw data. Our recommendations are derived mainly from various professional sources related to L2 research but…

  7. Contextualising Mathematics Self-Efficacy and Performance among Rural High School Students in the Philippines

    ERIC Educational Resources Information Center

    Causapin, Mark

    2016-01-01

    This paper expands the scope of typical quantitative studies on mathematics self-efficacy by including a short ethnography of the students' daily classroom experiences. It attempts to provide a "thicker description" and a context in which these beliefs could be interpreted. Using both qualitative and quantitative sets of data, it is…

  8. Educational Preparation for the Role of the School Nurse: Perceptions of School Nurses in Washington State

    ERIC Educational Resources Information Center

    Newell, Mary E.

    2013-01-01

    The purpose of this quantitative research study was to identify the perceptions of currently practicing school nurses regarding their baccalaureate nursing education and determine if they felt adequately prepared to effectively practice in the role of a school nurse. A descriptive, quantitative on-line survey was conducted of Washington State…

  9. Students' Misconceptions about the Ozone Layer and the Effect of Internet-Based Media on It

    ERIC Educational Resources Information Center

    Gungordu, Nahide; Yalcin-Celik, Ayse; Kilic, Ziya

    2017-01-01

    In this study, students' misconceptions about the ozone layer were investigated, looking specifically at the effect internet-based media has on the formation of these misconceptions. Quantitative and qualitative research approaches were used to perform the research. As part of the quantitative portion of the research, the descriptive survey…

  10. A Quantitative Features Analysis of Recommended No- and Low-Cost Preschool E-Books

    ERIC Educational Resources Information Center

    Parette, Howard P.; Blum, Craig; Luthin, Katie

    2015-01-01

    In recent years, recommended e-books have drawn increasing attention from early childhood education professionals. This study applied a quantitative descriptive features analysis of cost (n = 70) and no-cost (n = 60) e-books recommended by the Texas Computer Education Association. While t tests revealed no statistically significant differences…

  11. Arterial pressure transfer characteristics: effects of travel time.

    PubMed

    Westerhof, Berend E; Guelen, Ilja; Stok, Wim J; Wesseling, Karel H; Spaan, Jos A E; Westerhof, Nico; Bos, Willem Jan; Stergiopulos, Nikos

    2007-02-01

    We investigated the quantitative contribution of all local conduit arterial, blood, and distal load properties to the pressure transfer function from the brachial artery to the aorta. The model was based on anatomical data, Young's modulus, wall viscosity, blood viscosity, and blood density. A three-element windkessel represented the distal arterial tree. Sensitivity analysis was performed in terms of the frequency and magnitude of the peak of the transfer function and in terms of systolic, diastolic, and pulse pressure in the aorta. The root mean square error (RMSE) described the accuracy of the wave-shape prediction. The percent change of these variables for a 25% alteration of each of the model parameters was calculated. Vessel length and diameter were found to be the most important parameters determining pressure transfer. Systolic and diastolic pressure changed <3% and RMSE was <1.8 mmHg for a 25% change in vessel length and diameter. To investigate how arterial tapering influences the pressure transfer, a single uniform lossless tube was modeled. This simplification introduced only small errors in systolic and diastolic pressures (1% and 0%, respectively), although wave shape was less well described (RMSE approximately 2.1 mmHg). Local (arm) vasodilation affects the transfer function little, because it has limited effect on the reflection coefficient. Since vessel length and diameter translate into travel time, this single parameter can describe the transfer accurately. We suggest that an accurate, individualized description of pressure transfer can be obtained from a travel time that is, preferably, measured noninvasively.
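
    To make the modelling idea concrete, the sketch below implements a generic single lossless-tube transfer function terminated by a three-element windkessel, with the brachial-to-aorta transfer written in terms of travel time. All parameter values are invented, and this is a simplified stand-in rather than the authors' anatomically detailed model.

      # Minimal sketch (illustrative parameter values in arbitrary units): a uniform
      # lossless tube with travel time tau and characteristic impedance Zc, terminated
      # by a three-element windkessel Z_L(w) = R1 + R2 / (1 + j*w*R2*C). The standard
      # tube transfer is P(L)/P(0) = (1 + G) e^{-j w tau} / (1 + G e^{-2 j w tau}),
      # with reflection coefficient G = (Z_L - Zc) / (Z_L + Zc); the brachial-to-aorta
      # (distal-to-proximal) transfer used here is its reciprocal.
      import numpy as np

      def brachial_to_aorta_transfer(freq_hz, tau=0.05, Zc=0.2, R1=0.2, R2=1.0, C=1.5):
          w = 2 * np.pi * np.asarray(freq_hz, float)
          ZL = R1 + R2 / (1 + 1j * w * R2 * C)
          G = (ZL - Zc) / (ZL + Zc)
          forward = (1 + G) * np.exp(-1j * w * tau) / (1 + G * np.exp(-2j * w * tau))
          return 1.0 / forward

      f = np.arange(1, 11)  # harmonics of a 1 Hz heart rate
      H = brachial_to_aorta_transfer(f)
      print(np.round(np.abs(H), 2))  # gain from brachial to aortic pressure per harmonic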

  12. Design of a positioning system for a holographic otoscope

    NASA Astrophysics Data System (ADS)

    Dobrev, I.; Flores Moreno, J. M.; Furlong, C.; Harrington, E. J.; Rosowski, J. J.; Scarpino, C.

    2010-08-01

    Current ear examination procedures provide mostly qualitative information, which results in insufficient or erroneous descriptions of the patient's hearing. Much more quantitative and accurate results can be achieved with a holographic otoscope system currently under development. This project investigates various ways of accurately positioning and stabilizing the system under real-life conditions, in an attempt to bring this new technology to hospitals and clinics and improve the quality of treatments and operations on the human ear. The project is focused on developing a mechatronic system capable of positioning the holographic otoscope at the patient's ear and maintaining its relative orientation during the examination. The system will be guided by the examiner but will maintain the chosen position automatically. To achieve that, various trajectories are being measured for existing otoscopes guided by doctors under real medical conditions. Based on these measurements, various kinematic configurations will be synthesized, and their stability and accuracy will be simulated and optimized with FEA. For simplification, the mechanism will contain no actuators, only adjustable friction elements in a haptic feedback control system. This renders the positioning system safe and easily applicable to current examination rooms. Other means of stabilizing the system are being investigated, such as custom-designed packaging of all of the otoscope subsystems, interferometric compensation for the heartbeat-induced vibration of the tympanic membrane, and methods for monitoring and actively responding to the motion of the patient's head.

  13. Untangling the Effect of Head Acceleration on Brain Responses to Blast Waves

    PubMed Central

    Mao, Haojie; Unnikrishnan, Ginu; Rakesh, Vineet; Reifman, Jaques

    2015-01-01

    Multiple injury-causing mechanisms, such as wave propagation, skull flexure, cavitation, and head acceleration, have been proposed to explain blast-induced traumatic brain injury (bTBI). An accurate, quantitative description of the individual contribution of each of these mechanisms may be necessary to develop preventive strategies against bTBI. However, to date, despite numerous experimental and computational studies of bTBI, this question remains elusive. In this study, using a two-dimensional (2D) rat head model, we quantified the contribution of head acceleration to the biomechanical response of brain tissues when exposed to blast waves in a shock tube. We compared brain pressure at the coup, middle, and contre-coup regions between a 2D rat head model capable of simulating all mechanisms (i.e., the all-effects model) and an acceleration-only model. From our simulations, we determined that head acceleration contributed 36–45% of the maximum brain pressure at the coup region, had a negligible effect on the pressure at the middle region, and was responsible for the low pressure at the contre-coup region. Our findings also demonstrate that the current practice of measuring rat brain pressures close to the center of the brain would record only two-thirds of the maximum pressure observed at the coup region. Therefore, to accurately capture the effects of acceleration in experiments, we recommend placing a pressure sensor near the coup region, especially when investigating the acceleration mechanism using different experimental setups. PMID:26458125

  14. Quantitative descriptions of generalized arousal, an elementary function of the vertebrate brain

    PubMed Central

    Quinkert, Amy Wells; Vimal, Vivek; Weil, Zachary M.; Reeke, George N.; Schiff, Nicholas D.; Banavar, Jayanth R.; Pfaff, Donald W.

    2011-01-01

    We review a concept of the most primitive, fundamental function of the vertebrate CNS, generalized arousal (GA). Three independent lines of evidence indicate the existence of GA: statistical, genetic, and mechanistic. Here we ask, is this concept amenable to quantitative analysis? Answering in the affirmative, four quantitative approaches have proven useful: (i) factor analysis, (ii) information theory, (iii) deterministic chaos, and (iv) application of a Gaussian equation. It strikes us that, to date, not just one but at least four different quantitative approaches seem necessary for describing different aspects of scientific work on GA. PMID:21555568

  15. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  16. Quantitative Large-Scale Three-Dimensional Imaging of Human Kidney Biopsies: A Bridge to Precision Medicine in Kidney Disease.

    PubMed

    Winfree, Seth; Dagher, Pierre C; Dunn, Kenneth W; Eadon, Michael T; Ferkowicz, Michael; Barwinska, Daria; Kelly, Katherine J; Sutton, Timothy A; El-Achkar, Tarek M

    2018-06-05

    Kidney biopsy remains the gold standard for uncovering the pathogenesis of acute and chronic kidney diseases. However, the ability to perform high resolution, quantitative, molecular and cellular interrogation of this precious tissue is still at a developing stage compared to other fields such as oncology. Here, we discuss recent advances in performing large-scale, three-dimensional (3D), multi-fluorescence imaging of kidney biopsies and quantitative analysis referred to as 3D tissue cytometry. This approach allows the accurate measurement of specific cell types and their spatial distribution in a thick section spanning the entire length of the biopsy. By uncovering specific disease signatures, including rare occurrences, and linking them to the biology in situ, this approach will enhance our understanding of disease pathogenesis. Furthermore, by providing accurate quantitation of cellular events, 3D cytometry may improve the accuracy of prognosticating the clinical course and response to therapy. Therefore, large-scale 3D imaging and cytometry of kidney biopsy is poised to become a bridge towards personalized medicine for patients with kidney disease. © 2018 S. Karger AG, Basel.

  17. Improved sample preparation of glyphosate and methylphosphonic acid by EPA method 6800A and time-of-flight mass spectrometry using novel solid-phase extraction.

    PubMed

    Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip

    2012-02-01

    The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.
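
    The core isotope dilution (IDMS) calculation can be sketched as a simple mixing equation solved for the analyte amount; the version below ignores the full molecular isotope-pattern correction discussed in the paper, and all abundance values are invented for illustration.

      # Minimal sketch of single-spike isotope dilution: mix a known amount n_spike of
      # isotopically enriched spike with the sample, measure the blend ratio
      # R_m = (natural-form signal)/(enriched-form signal), and solve the mixing
      # equation for the analyte amount n_x.
      def idms_amount(n_spike, R_measured, a_nat, b_nat, a_spk, b_spk):
          """
          n_spike    : amount of spike added (e.g. nmol)
          R_measured : measured natural-to-enriched signal ratio in the blend
          a_nat, b_nat : fractions of the natural analyte in the natural/enriched channels
          a_spk, b_spk : same fractions for the enriched spike material
          """
          return n_spike * (R_measured * b_spk - a_spk) / (a_nat - R_measured * b_nat)

      # Example: 10 nmol of 99%-enriched spike, measured blend ratio of 0.5
      n_x = idms_amount(n_spike=10.0, R_measured=0.5,
                        a_nat=0.99, b_nat=0.01, a_spk=0.01, b_spk=0.99)
      print(f"analyte amount ~ {n_x:.2f} nmol")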

  18. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria for pulmonary impairments. In general, rib movement is assessed under fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique for quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). A bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, forming a velocity map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to accurately quantify and distinguish rib movements from those of other lung structures. The limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movement: limited movement appeared as reduced rib velocity vectors and a left-right asymmetric distribution on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movement without additional radiation dose.

  19. Qualitative and quantitative descriptions of glenohumeral motion.

    PubMed

    Hill, A M; Bull, A M J; Wallace, A L; Johnson, G R

    2008-02-01

    Joint modelling plays an important role in qualitative and quantitative descriptions of both normal and abnormal joints, as well as in predicting the outcomes of alterations to joints in orthopaedic practice and research. Contemporary modelling efforts have focussed upon the major articulations of the lower limb. Well-constrained arthrokinematics can form the basis of manageable kinetic and dynamic mathematical predictions. In order to limit the computational burden of shoulder complex modelling, glenohumeral joint representations in both limited and complete shoulder girdle models have undergone a generic simplification. As such, glenohumeral joint models are often based upon kinematic descriptions with inadequate degrees of freedom (DOF) for clinical purposes and applications. Qualitative descriptions of glenohumeral motion range from the parody of a hinge joint to the complex realism of a spatial joint. In developing a model, a clear idea of its intended application is required. Clinical applicability of a model requires both descriptive and predictive output potential, and as such a high level of validation is required. Without sufficient appreciation of the clinical intention behind the arthrokinematic foundation of a model, error is all too easily introduced. Mathematical description of joint motion serves to quantify all relevant clinical parameters. Commonly, both the Euler angle and helical (screw) axis methods have been applied to the glenohumeral joint, although concordance between these methods and the classical anatomical appreciation of joint motion is limited, resulting in miscommunication between clinician and engineer. Compounding these inconsistencies in motion quantification are gimbal lock and sequence dependency.

  20. Communicating Experimental Findings in Single Case Design Research: How to Use Celeration Values and Celeration Multipliers to Measure Direction, Magnitude, and Change of Slope

    ERIC Educational Resources Information Center

    Datchuk, Shawn M.; Kubina, Richard M., Jr.

    2011-01-01

    The accumulation of scientific knowledge greatly depends upon the critical review of experimental findings by ones peers. In single case design research, experimenters present findings with graphical displays of data and narrative description of a visual analysis. To aid in efficient and accurate description of experimental findings, the research…

  1. A Project Manager’s Personal Attributes as Predictors for Success

    DTIC Science & Technology

    2007-03-01

    Northouse (2004) explains that leadership is a highly researched topic with much written about it. Yet, a definitive description of this phenomenon is difficult to...express because of its complexity. Even though leadership has varied descriptions and conceptualizations, Northouse states that the concept of...characteristic of leadership is not an accurate predictor of performance. Leadership is a complex, multi-faceted attribute (Northouse, 2004) and specific

  2. Why do patients in acute care hospitals fall? Can falls be prevented?

    PubMed

    Dykes, Patricia C; Carroll, Diane L; Hurley, Ann C; Benoit, Angela; Middleton, Blackford

    2009-06-01

    Obtain the views of nurses and assistants as to why patients in acute care hospitals fall. Despite a large quantitative evidence base for guiding fall risk assessment and not needing highly technical, scarce, or expensive equipment to prevent falls, falls are serious problems in hospitals. Basic content analysis methods were used to interpret descriptive data from 4 focus groups with nurses (n = 23) and 4 with assistants (n = 19). A 2-person consensus approach was used for analysis. Positive and negative components of 6 concepts (patient report, information access, signage, environment, teamwork, and involving patient/family) formed 2 core categories, knowledge/communication and capability/actions, which are facilitators or barriers, respectively, to preventing falls. Two conditions are required to reduce patient falls. First, a patient care plan including current and accurate fall risk status with associated tailored and feasible interventions needs to be easily and immediately accessible to all stakeholders (entire healthcare team, patients, and family). Second, stakeholders must use that information plus their own knowledge and skills and patient and hospital resources to carry out the plan.

  3. CT-based definition of thoracic lymph node stations: an atlas from the University of Michigan.

    PubMed

    Chapet, Olivier; Kong, Feng-Ming; Quint, Leslie E; Chang, Andrew C; Ten Haken, Randall K; Eisbruch, Avraham; Hayman, James A

    2005-09-01

    Accurate delineation of the mediastinal and hilar lymph node regions is essential for a reproducible definition of target volumes used in conformal irradiation of non-small-cell lung cancer. The goal of this work was to generate a consensus to delineate these nodal regions based on definitions from the American Joint Committee on Cancer. A dedicated thoracic radiologist, thoracic surgeon, medical physicist, and three radiation oncologists were gathered to generate a three-dimensional radiologic description for the mediastinal and hilar nodal regions on axial CT scans. This paper proposes an atlas of most of the lymph node stations described by Mountain and Dresler. The CT boundaries of lymph node stations 1-2, 3, 4, 5, 6, 7, 8, 10-11 were defined on axial CT, along with image illustrations. These CT-based illustrative definitions will provide guidelines for clinical practice and studies evaluating incidental radiation in radiotherapy. Studies are ongoing at the University of Michigan to measure quantitatively the incidental nodal radiation received by patients with non-small-cell lung cancer.

  4. Instability of elliptic liquid jets: Temporal linear stability theory and experimental analysis

    NASA Astrophysics Data System (ADS)

    Amini, Ghobad; Lv, Yu; Dolatabadi, Ali; Ihme, Matthias

    2014-11-01

    The instability dynamics of inviscid liquid jets issuing from elliptical orifices is studied, and effects of the surrounding gas and the liquid surface tension on the stability behavior are investigated. A dispersion relation for the zeroth azimuthal (axisymmetric) instability mode is derived. Consistency of the analysis is confirmed by demonstrating that these equations reduce to the well-known dispersion equations for the limiting cases of round and planar jets. It is shown that the effect of the ellipticity is to increase the growth rate over a large range of wavenumbers in comparison to those of a circular jet. For higher Weber numbers, at which capillary forces have a stabilizing effect, the growth rate decreases with increasing ellipticity. Similar to circular and planar jets, increasing the density ratio between gas and liquid increases the growth of disturbances significantly. These theoretical investigations are complemented by experiments to validate the local linear stability results. Comparisons of predicted growth rates with measurements over a range of jet ellipticities confirm that the theoretical model provides a quantitatively accurate description of the instability dynamics in the Rayleigh and first wind-induced regimes.

  5. Predicting the electronic properties of aqueous solutions from first-principles

    NASA Astrophysics Data System (ADS)

    Schwegler, Eric; Pham, Tuan Anh; Govoni, Marco; Seidel, Robert; Bradforth, Stephen; Galli, Giulia

    Predicting the electronic properties of aqueous liquids has been a long-standing challenge for quantum-mechanical methods. Yet it is a crucial step in understanding and predicting the key role played by aqueous solutions and electrolytes in a wide variety of emerging energy and environmental technologies, including battery and photoelectrochemical cell design. Here we propose an efficient and accurate approach to predict the electronic properties of aqueous solutions, based on the combination of first-principles methods and experimental validation using state-of-the-art spectroscopic measurements. We present results for the photoelectron spectra of a broad range of solvated ions, showing that first-principles molecular dynamics simulations and electronic structure calculations using dielectric hybrid functionals provide a quantitative description of the electronic properties, including excitation energies, of both the solvent and the solutes. The proposed computational framework is general and applicable to other liquids, thereby offering great promise for understanding and engineering solutions and liquid electrolytes for a variety of important energy technologies. Part of this work was performed under the auspices of the U.S. Department of Energy at LLNL under Contract DE-AC52-07A27344.

  6. Why Do Patients in Acute Care Hospitals Fall? Can Falls Be Prevented?

    PubMed Central

    Dykes, Patricia C.; Carroll, Diane L.; Hurley, Ann C.; Benoit, Angela; Middleton, Blackford

    2011-01-01

    Objective Obtain the views of nurses and assistants as to why patients in acute care hospitals fall. Background Despite a large quantitative evidence base for guiding fall risk assessment and not needing highly technical, scarce, or expensive equipment to prevent falls, falls are serious problems in hospitals. Methods Basic content analysis methods were used to interpret descriptive data from 4 focus groups with nurses (n = 23) and 4 with assistants (n = 19). A 2-person consensus approach was used for analysis. Results Positive and negative components of 6 concepts—patient report, information access, signage, environment, teamwork, and involving patient/family—formed 2 core categories: knowledge/communication and capability/actions that are facilitators or barriers, respectively, to preventing falls. Conclusion Two conditions are required to reduce patient falls. A patient care plan including current and accurate fall risk status with associated tailored and feasible interventions needs to be easily and immediately accessible to all stakeholders (entire healthcare team, patients, and family). Second, stakeholders must use that information plus their own knowledge and skills and patient and hospital resources to carry out the plan. PMID:19509605

  7. Classification of Normal and Apoptotic Cells from Fluorescence Microscopy Images Using Generalized Polynomial Chaos and Level Set Function.

    PubMed

    Du, Yuncheng; Budman, Hector M; Duever, Thomas A

    2016-06-01

    Accurate automated quantitative analysis of living cells based on fluorescence microscopy images can be very useful for fast evaluation of experimental outcomes and cell culture protocols. In this work, an algorithm is developed for fast differentiation of normal and apoptotic viable Chinese hamster ovary (CHO) cells. For effective segmentation of cell images, a stochastic segmentation algorithm is developed by combining a generalized polynomial chaos expansion with a level set function-based segmentation algorithm. This approach provides a probabilistic description of the segmented cellular regions along the boundary, from which it is possible to calculate morphological changes related to apoptosis, i.e., the curvature and length of a cell's boundary. These features are then used as inputs to a support vector machine (SVM) classifier that is trained to distinguish between normal and apoptotic viable states of CHO cell images. The use of morphological features obtained from the stochastic level set segmentation of cell images in combination with the trained SVM classifier is more efficient in terms of differentiation accuracy as compared with the original deterministic level set method.
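
    The final classification step described above, an SVM trained on boundary-derived morphological features, can be sketched with scikit-learn as below. The feature distributions are synthetic stand-ins (not data from the paper), and the stochastic level-set segmentation itself is not reproduced here.

      # Minimal sketch of the SVM classification step on two hypothetical morphological
      # features (mean boundary curvature and boundary length); all data are synthetic.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)
      n = 200
      # Assumed feature trends: apoptotic cells -> shorter, more curved boundaries
      normal = np.column_stack([rng.normal(0.10, 0.02, n), rng.normal(260, 30, n)])
      apoptotic = np.column_stack([rng.normal(0.18, 0.03, n), rng.normal(180, 25, n)])
      X = np.vstack([normal, apoptotic])
      y = np.array([0] * n + [1] * n)  # 0 = normal, 1 = apoptotic

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
      print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")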

  8. Molecular description of steady supersonic free jets

    NASA Astrophysics Data System (ADS)

    Montero, S.

    2017-09-01

    A detailed analysis of the non-local thermal equilibrium (n-LTE) problem in the paraxial zone of silence of supersonic free jets is reported. The study is based on a hybrid approach that combines the Navier-Stokes equations with a kinetic equation derived from the generalized Boltzmann (Waldmann-Snider) equation. The resulting system is solved for those flow quantities not easily amenable to experimental measurement (translational temperature, flow velocity, and entropy) in terms of the quantities that can be measured accurately (distance, number density, populations of rotational states, and their gradients). The reported solutions are essentially exact and are formulated in terms of macroscopic quantities as well as elementary collision processes. Emphasis is placed on the influence of dissipative (viscous and diabatic) effects on the flow and of the breakdown of thermal equilibrium on the evolution of entropy and translational temperature. The influence of inelastic collisions on these effects is analysed in depth. The reported equations are aimed at optimizing the experimental characterization of the n-LTE problem and its quantitative interpretation in terms of state-to-state rates for inelastic collisions.

  9. A Taphonomic Study Exploring the Differences in Decomposition Rate and Manner between Frozen and Never Frozen Domestic Pigs (Sus scrofa).

    PubMed

    Roberts, Lindsey G; Dabbs, Gretchen R

    2015-05-01

    This research examined differences in decomposition rate and manner of domestic pig subjects (Sus scrofa) in never frozen (control) and previously frozen (experimental) research conditions. Eight control and experimental subjects were placed in an identical outdoor research environment. Daily quantitative and qualitative measurements were collected: abdominal circumference, total body score (TBS), temperature, photographs, descriptive decomposition stages, and visual observations. Field necropsies were performed at accumulated degree days (ADD) between 50 and 300 (Celsius). Paired samples t-tests of ADD to TBS >3.0, TBS >9.5, and TBS >16.0 indicate the rate of decomposition of experimental subjects was significantly slower than controls at both TBS >3 and >9.5 (p = 0.003 and p = 0.002, respectively). A suite of qualitative indicators of predecomposition freezing is also reported. The differences between experimental and control subjects suggest previously frozen subjects should not be used in taphonomic research, as results do not accurately reflect the "normal" taphonomic condition. © 2015 American Academy of Forensic Sciences.
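
    Accumulated degree days (ADD), used above to time the necropsies, are conventionally computed as a running sum of mean daily temperatures above a 0 °C base; a minimal sketch with invented temperatures follows (not the study's weather data).

      # Minimal sketch of accumulated degree days (ADD): running sum of mean daily
      # temperatures in Celsius, with days below the 0 °C base contributing zero.
      def accumulated_degree_days(daily_mean_temps_c, base_c=0.0):
          add, running = 0.0, []
          for t in daily_mean_temps_c:
              add += max(t - base_c, 0.0)
              running.append(add)
          return running

      daily_means = [18.5, 21.0, 19.2, 23.4, 25.1, 22.0]  # illustrative values only
      print(accumulated_degree_days(daily_means))  # ADD reaches ~129 after six days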

  10. Discourse Markers in Composition Writings: The Case of Iranian Learners of English as a Foreign Language

    ERIC Educational Resources Information Center

    Jalilifar, Alireza

    2008-01-01

    The aim of this study was to investigate discourse markers in descriptive compositions of 90 Iranian students who were selected from two universities. Without any instruction, they were given a topic to write a descriptive composition per week for 8 weeks. 598 compositions were collected, and they were analyzed qualitatively and quantitatively by…

  11. TPS as an Effective Technique to Enhance the Students' Achievement on Writing Descriptive Text

    ERIC Educational Resources Information Center

    Sumarsih, M. Pd.; Sanjaya, Dedi

    2013-01-01

    Students' achievement in writing descriptive text is very low; in this study, Think Pair Share (TPS) is applied to address the problem. Action research is conducted, and both qualitative and quantitative techniques are applied. The subjects of this research are grade VIII students at a junior high school in Indonesia. From…

  12. 75 FR 4323 - Additional Quantitative Fit-testing Protocols for the Respiratory Protection Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ... respirators (500 and 1000 for protocols 1 and 2, respectively). However, OSHA could not evaluate the results... the values of these descriptive statistics for revised PortaCount® QNFT protocols 1 (at RFFs of 100 and 500) and 2 (at RFFs of 200 and 1000). Table 2--Descriptive Statistics for RFFs of 100 and 200...

  13. Description of Adsorption in Liquid Chromatography under Nonideal Conditions.

    PubMed

    Ortner, Franziska; Ruppli, Chantal; Mazzotti, Marco

    2018-05-15

    A thermodynamically consistent description of binary adsorption in reversed-phase chromatography is presented, accounting for thermodynamic nonidealities in the liquid and adsorbed phases. The investigated system involves the adsorbent Zorbax 300SB-C18, as well as phenetole and 4-tert-butylphenol as solutes and methanol and water as inert components forming the eluent. The description is based on adsorption isotherms, which are a function of the liquid-phase activities, to account for nonidealities in the liquid phase. Liquid-phase activities are calculated with a UNIQUAC model established in this work, based on experimental phase equilibrium data. The binary interaction in the adsorbed phase is described by the adsorbed solution theory, assuming an ideal (ideal adsorbed solution theory) or real (real adsorbed solution theory) adsorbed phase. Implementation of the established adsorption model in a chromatographic code achieves a quantitative description of experimental elution profiles, with feed compositions exploiting the entire miscible region, and involving a broad range of different eluent compositions (methanol/water). The quantitative agreement of the model and experimental data serves as a confirmation of the underlying physical (thermodynamic) concepts and of their applicability to a broad range of operating conditions.

  14. Quantitative interpretation of the magnetic susceptibility frequency dependence

    NASA Astrophysics Data System (ADS)

    Ustra, Andrea; Mendonça, Carlos A.; Leite, Aruã; Jovane, Luigi; Trindade, Ricardo I. F.

    2018-05-01

    Low-field mass-specific magnetic susceptibility (MS) measurements using multifrequency alternating fields are commonly used to evaluate the concentration of ferrimagnetic particles in the transition from superparamagnetic (SP) to stable single domain (SSD) behaviour. In classical palaeomagnetic analyses, this measurement serves as a preliminary assessment of rock samples, providing rapid, non-destructive, economical and easily obtained information on magnetic properties. The SP-SSD transition is relevant in environmental studies because it has been associated with several geological and biogeochemical processes affecting magnetic mineralogy. MS is a complex function of mineral type and grain-size distribution, as well as of measuring parameters such as external field magnitude and frequency. In this work, we propose a new technique to obtain quantitative information on grain-size variations of magnetic particles in the SP-SSD transition by inverting the frequency-dependent susceptibility. We introduce a descriptive parameter, named the 'limiting frequency effect', that provides an accurate estimation of the MS loss with frequency. Numerical simulations show that the methodology is capable of fitting the data and recovering model parameters in many practical situations. Real-data applications to magnetite nanoparticles and to core samples from sediments of the Poggio le Guaine section of the Umbria-Marche Basin (Italy) provide additional information not clearly recognized when interpreting cruder MS data. Caution is needed when interpreting frequency dependence in terms of single relaxation processes, which are not universally applicable and depend upon the nature of the magnetic mineral in the material. Nevertheless, the proposed technique is a promising tool for SP-SSD content analyses.

  15. A Pilot Study on Integrating Videography and Environmental Microbial Sampling to Model Fecal Bacterial Exposures in Peri-Urban Tanzania.

    PubMed

    Julian, Timothy R; Pickering, Amy J

    2015-01-01

    Diarrheal diseases are a leading cause of under-five mortality and morbidity in sub-Saharan Africa. Quantitative exposure modeling provides opportunities to investigate the relative importance of fecal-oral transmission routes (e.g. hands, water, food) responsible for diarrheal disease. Modeling, however, requires accurate descriptions of individuals' interactions with the environment (i.e., activity data). Such activity data are largely lacking for people in low-income settings. In the present study, we collected activity data and microbiological sampling data to develop a quantitative microbial exposure model for two female caretakers in peri-urban Tanzania. Activity data were combined with microbiological data of contacted surfaces and fomites (e.g. broom handle, soil, clothing) to develop example exposure profiles describing second-by-second estimates of fecal indicator bacteria (E. coli and enterococci) concentrations on the caretaker's hands. The study demonstrates the application and utility of video activity data to quantify exposure factors for people in low-income countries and apply these factors to understand fecal contamination exposure pathways. This study provides both a methodological approach for the design and implementation of larger studies, and preliminary data suggesting contacts with dirt and sand may be important mechanisms of hand contamination. Increasing the scale of activity data collection and modeling to investigate individual-level exposure profiles within target populations for specific exposure scenarios would provide opportunities to identify the relative importance of fecal-oral disease transmission routes.

  16. Concentration of Nicotine and Glycols in 27 Electronic Cigarette Formulations.

    PubMed

    Peace, Michelle R; Baird, Tyson R; Smith, Nathaniel; Wolf, Carl E; Poklis, Justin L; Poklis, Alphonse

    2016-07-01

    Personal battery-powered vaporizers or electronic cigarettes were developed to deliver a nicotine vapor such that smokers could simulate smoking tobacco without the inherent pathology of inhaled tobacco smoke. Electronic cigarettes and their e-cigarette liquid formulations are virtually unregulated. These formulations are typically composed of propylene glycol and/or glycerin, flavoring components and an active drug, such as nicotine. Twenty-seven e-cigarette liquid formulations that contain nicotine between 6 and 22 mg/L were acquired within the USA and analyzed by various methods to determine their contents. They were screened by Direct Analysis in Real Time™ Mass Spectrometry (DART-MS). Nicotine was confirmed and quantitated by high-performance liquid chromatography-tandem mass spectrometry, and the glycol composition was confirmed and quantitated by gas chromatography-mass spectrometry. The DART-MS screening method was able to consistently identify the exact mass peaks resulting from the protonated molecular ion of nicotine, glycol and a number of flavor additives within 5 mmu. Nicotine concentrations were determined to range from 45 to 131% of the stated label concentration, with 18 of the 27 formulations having >10% variance. Glycol composition was generally consistent with the product description, with only one exception where the propylene glycol to glycerin percentage ratio was stated as 50:50 and the determined concentration of propylene glycol to glycerin was 81:19 (% v/v). No unlabeled glycols were detected in these formulations. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Arabidopsis phenotyping through Geometric Morphometrics.

    PubMed

    Manacorda, Carlos A; Asurmendi, Sebastian

    2018-06-18

    Recently, much technical progress has been achieved in the field of plant phenotyping. High-throughput platforms and the development of improved algorithms for rosette image segmentation now make it possible to extract shape and size parameters for genetic, physiological and environmental studies on a large scale. The development of low-cost phenotyping platforms and freeware resources also makes it possible to widely expand phenotypic analysis tools for Arabidopsis. However, objective descriptors of shape parameters that can be used independently of the platform and segmentation software are still lacking, and shape descriptions still rely on ad hoc, sometimes contradictory, descriptors, which could make comparisons difficult and perhaps inaccurate. Modern geometric morphometrics is a family of methods in quantitative biology proposed to be the main source of data and analytical tools in the emerging field of phenomics. Based on the location of landmarks (corresponding points) over imaged specimens and by combining geometry, multivariate analysis and powerful statistical techniques, these tools offer the possibility to reproducibly and accurately account for shape variations amongst groups and to measure them in shape distance units. Here, a particular scheme of landmark placement on Arabidopsis rosette images is proposed to study shape variation in the case of viral infection processes. Shape differences between controls and infected plants are quantified throughout the infectious process and visualized. Quantitative comparisons between two unrelated ssRNA+ viruses are shown and reproducibility issues are assessed. Combined with the newest automated platforms and plant segmentation procedures, geometric morphometric tools could boost phenotypic feature extraction and processing in an objective, reproducible manner.
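    The landmark-based comparisons described above rest on Procrustes superimposition, which removes position, scale and rotation before shape differences are measured in shape distance units. The sketch below is a generic ordinary Procrustes fit of one landmark configuration onto a reference; it is not the specific rosette landmark scheme proposed in the paper.

        import numpy as np

        def procrustes_align(ref, target):
            """Align 'target' onto 'ref' (both k x 2 landmark arrays) and return
            the aligned configuration and the Procrustes (shape) distance."""
            ref = np.asarray(ref, dtype=float)
            target = np.asarray(target, dtype=float)
            # Remove position: centre both configurations on their centroids.
            ref_c = ref - ref.mean(axis=0)
            tgt_c = target - target.mean(axis=0)
            # Remove scale: divide by centroid size.
            ref_c = ref_c / np.linalg.norm(ref_c)
            tgt_c = tgt_c / np.linalg.norm(tgt_c)
            # Remove rotation: orthogonal Procrustes solution via SVD
            # (this variant allows reflection; restrict the determinant if needed).
            u, _, vt = np.linalg.svd(tgt_c.T @ ref_c)
            aligned = tgt_c @ (u @ vt)
            return aligned, np.linalg.norm(aligned - ref_c)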

  18. A unified model of density limit in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Zanca, P.; Sattin, F.; Escande, D. F.; Pucella, G.; Tudisco, O.

    2017-05-01

    In this work we identify by analytical and numerical means the conditions for the existence of a magnetic and thermal equilibrium of a cylindrical plasma, in the presence of Ohmic and/or additional power sources, heat conduction and radiation losses by light impurities. The boundary of the solution space that admits realistic temperature profiles with small edge values mathematically takes the form of a density limit (DL). Compared with previous similar analyses, the present work benefits from dealing with a more accurate set of equations. This refinement is elementary, but decisive, since it discloses a tenuous dependence of the DL on the thermal transport for configurations with an applied electric field. Thanks to this property, the DL scaling law is recovered in almost identical form for two largely different devices, the ohmic tokamak and the reversed field pinch. In particular, both share a Greenwald scaling, depending linearly on the plasma current, that is quantitatively consistent with experimental results. In the tokamak case the DL dependence on any additional heating approximately follows a 0.5 power law, which is compatible with L-mode experiments. For a purely externally heated configuration, taken as a cylindrical approximation of the stellarator, the DL dependence on transport is found to be stronger. By adopting suitable transport models, the DL takes on a Sudo-like form, in fair agreement with LHD experiments. Overall, the model provides a good zeroth-order quantitative description of the DL, applicable to widely different configurations.
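    For reference, the Greenwald scaling mentioned above is conventionally written as follows (standard literature form, quoted for orientation rather than reproduced from this paper):

        n_G\,[10^{20}\,\mathrm{m^{-3}}] \;=\; \frac{I_p\,[\mathrm{MA}]}{\pi a^{2}\,[\mathrm{m^{2}}]}

    so that, at fixed minor radius a, the limiting density depends linearly on the plasma current I_p, which is the linear current dependence the abstract refers to.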

  19. Student nurse-educators' construction of teacher identity from a self-evaluation perspective: A quantitative case study.

    PubMed

    Mukumbang, Ferdinand C; Alindekane, Leka Marcel

    2017-04-01

    The aim of this study was to explore the teacher identity formation dynamics of student nurse-educators with respect to subject matter, pedagogy and didactics. A case study using a descriptive quantitative design was employed. Using a cross-sectional approach, data were collected in 2014 using a self-administered questionnaire. Participants were asked to self-evaluate their teaching competencies on the nursing subject matter, pedagogical expertise and didactical expertise. Using descriptive analysis, we determined the central tendencies of the constructs. The descriptive analysis revealed a very small variance (0.0011) and standard deviation (0.04) among the means of the three constructs, which indicates a fair balance in the contribution of subject matter, pedagogy and didactics towards teacher identity formation. Nursing student-educators can thus achieve a balanced combination of subject-matter, pedagogical and didactical expertise during the formation of their teacher identity. This could be indicative of how effective the training programme is in helping the students achieve a balanced teacher identity.

  20. Quantitative prediction of phase transformations in silicon during nanoindentation

    NASA Astrophysics Data System (ADS)

    Zhang, Liangchi; Basak, Animesh

    2013-08-01

    This paper establishes the first quantitative relationship between the phases transformed in silicon and the shape characteristics of nanoindentation curves. Based on an integrated analysis using TEM and the unit cell properties of the phases, the volumes of the phases that emerge in a nanoindentation are formulated as a function of the pop-out size and the depth of the nanoindentation impression. This simple formula enables a fast, accurate and quantitative prediction of the phases in a nanoindentation cycle, which has been impossible before.

  1. Accuracy and precision of pseudo-continuous arterial spin labeling perfusion during baseline and hypercapnia: a head-to-head comparison with ¹⁵O H₂O positron emission tomography.

    PubMed

    Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J

    2014-05-15

    Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of the exact accuracy and precision compared to the gold standard. (15)O H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with (15)O H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to (15)O H₂O PET with a comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Determination of exposure multiples of human metabolites for MIST assessment in preclinical safety species without using reference standards or radiolabeled compounds.

    PubMed

    Ma, Shuguang; Li, Zhiling; Lee, Keun-Joong; Chowdhury, Swapan K

    2010-12-20

    A simple, reliable, and accurate method was developed for the quantitative assessment of metabolite coverage in preclinical safety species by mixing equal volumes of human plasma with blank plasma of the animal species and vice versa, followed by analysis using high-resolution full-scan accurate mass spectrometry. This approach provided results comparable (within ±15%) to those obtained from regulated bioanalysis and did not require synthetic standards or radiolabeled compounds. In addition, both qualitative and quantitative data were obtained from a single LC-MS analysis on all metabolites and, therefore, the coverage of any metabolite of interest can be obtained.
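    Because the cross-mixed samples share an identical matrix, the exposure multiple for each metabolite reduces to a simple ratio of the measured responses. The sketch below shows that calculation; the variable names, example numbers and the adequacy check are illustrative assumptions rather than the authors' implementation.

        # Hypothetical sketch of an exposure-multiple estimate from a mixed-matrix run.
        def exposure_multiple(peak_area_animal_mix, peak_area_human_mix):
            """Animal-to-human exposure multiple for one metabolite.

            peak_area_animal_mix: response in (animal plasma + blank human plasma)
            peak_area_human_mix:  response in (human plasma + blank animal plasma)
            """
            return peak_area_animal_mix / peak_area_human_mix

        ratio = exposure_multiple(1.8e6, 1.2e6)
        print(f"exposure multiple ~ {ratio:.2f}"
              + ("  (animal coverage at or above human exposure)" if ratio >= 1.0 else ""))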

  3. The Definition, Rationale, and Effects of Thresholding in OCT Angiography.

    PubMed

    Cole, Emily D; Moult, Eric M; Dang, Sabin; Choi, WooJhon; Ploner, Stefan B; Lee, ByungKun; Louzada, Ricardo; Novais, Eduardo; Schottenhamml, Julia; Husvogt, Lennart; Maier, Andreas; Fujimoto, James G; Waheed, Nadia K; Duker, Jay S

    2017-01-01

    To examine the definition, rationale, and effects of thresholding in OCT angiography (OCTA). A theoretical description of OCTA thresholding in combination with qualitative and quantitative analysis of the effects of OCTA thresholding in eyes from a retrospective case series. Four eyes were qualitatively examined: 1 from a 27-year-old control, 1 from a 78-year-old exudative age-related macular degeneration (AMD) patient, 1 from a 58-year-old myopic patient, and 1 from a 77-year-old nonexudative AMD patient with geographic atrophy (GA). One eye from a 75-year-old nonexudative AMD patient with GA was quantitatively analyzed. A theoretical thresholding model and a qualitative and quantitative description of the dependency of OCTA on thresholding level. Due to the presence of system noise, OCTA thresholding is a necessary step in forming OCTA images; however, thresholding can complicate the relationship between blood flow and OCTA signal. Thresholding in OCTA can cause significant artifacts, which should be considered when interpreting and quantifying OCTA images.
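    To make the thresholding step concrete, the sketch below masks an OCTA decorrelation image at a cutoff derived from a noise (flow-free) region; the choice of the multiplier k is exactly the kind of level decision whose downstream effects the study examines. The noise-region input and the mean-plus-k-sigma rule are illustrative assumptions, not any OCTA device's actual algorithm.

        import numpy as np

        def threshold_octa(octa, noise_region, k=2.0):
            """Zero out OCTA decorrelation values below an assumed noise floor.

            octa:         2-D array of decorrelation values
            noise_region: values sampled from a static, flow-free area
            k:            multiplier on the noise standard deviation; changing it
                          changes any vessel-density metric computed afterwards
            """
            cutoff = noise_region.mean() + k * noise_region.std()
            thresholded = np.where(octa > cutoff, octa, 0.0)
            vessel_density = (thresholded > 0).mean()   # fraction of pixels retained
            return thresholded, vessel_density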

  4. The history of facial palsy and spasm

    PubMed Central

    Sajadi, Mohamad-Reza M.; Tabatabaie, Seyed Mahmoud

    2011-01-01

    Although Sir Charles Bell was the first to provide the anatomic basis for the condition that bears his name, in recent years researchers have shown that other European physicians provided earlier clinical descriptions of peripheral cranial nerve 7 palsy. In this article, we describe the history of facial distortion by Greek, Roman, and Persian physicians, culminating in Razi's detailed description in al-Hawi. Razi distinguished facial muscle spasm from paralysis, distinguished central from peripheral lesions, gave the earliest description of loss of forehead wrinkling, and gave the earliest known description of bilateral facial palsy. In doing so, he accurately described the clinical hallmarks of a condition that we recognize as Bell palsy. PMID:21747074

  5. Using an Educational Electronic Documentation System to Help Nursing Students Accurately Identify Nursing Diagnoses

    ERIC Educational Resources Information Center

    Pobocik, Tamara J.

    2013-01-01

    The use of technology and electronic medical records in healthcare has exponentially increased. This quantitative research project used a pretest/posttest design, and reviewed how an educational electronic documentation system helped nursing students to identify the accurate related to statement of the nursing diagnosis for the patient in the case…

  6. A Deeper Look at How Teachers Say What They Say: A Quantitative Modality Analysis of Teacher-to-Teacher Talk

    ERIC Educational Resources Information Center

    Kosko, Karl W.; Herbst, Patricio

    2012-01-01

    Analysis of teacher-to-teacher talk provides researchers with useful information regarding the teaching profession and teachers' perspectives. This article provides a description of a method, with accompanying example, examining teacher-to-teacher talk by incorporating semantic modality and examining trends of its usage in a quantitative manner.…

  7. A quantitative topographic analysis of the Sky Islands: a closer examination of the topography-biodiversity relationship in the Madrean Archipelago

    Treesearch

    David Coblentz; Kurt H. Riitters

    2005-01-01

    The relationship between topography and biodiversity is well documented in the Madrean Archipelago. However, despite this recognition, most biogeographical studies concerning the role of topography have relied primarily on a qualitative description of the landscape. Using an algorithm that operates on a high-resolution digital elevation model we present a quantitative...

  8. Tuition Reductions: A Quantitative Analysis of the Prevalence, Circumstances and Outcomes of an Emerging Pricing Strategy in Higher Education

    ERIC Educational Resources Information Center

    Kottich, Sarah

    2017-01-01

    This study analyzed tuition reductions in the private not-for-profit sector of higher education, utilizing a quantitative descriptive and correlational approach with secondary data analysis. It resulted in a listing of 45 institutions with verified tuition reductions from 2007 to 2017, more than previously thought. It found that the…

  9. Reference charts for young stands — a quantitative methodology for assessing tree performance

    Treesearch

    Lance A. Vickers; David R. Larsen; Benjamin O. Knapp; John M. Kabrick; Daniel C. Dey

    2017-01-01

    Reference charts have long been used in the medical field for quantitative clinical assessment of juvenile development by plotting distribution quantiles for a selected attribute (e.g., height) against age for specified peer populations.We propose that early stand dynamics is an area of study that could benefit from the descriptions and analyses offered by similar...

  10. An Exploratory Analysis of the Navy Personnel Support Delivery Model

    DTIC Science & Technology

    2017-09-01

    technology-competent generation. Our efforts are focused on providing a quantitative effort to understanding past trends in Personnel Support Detachment (PSD) and Customer Service Desk (CSD) transactions that may aid manpower...

  11. Safety First: A Quantitative Study on Teachers' Perceptions of School Climate in Rural Louisiana Schools

    ERIC Educational Resources Information Center

    Brumfield-Sanders, Tongia M.

    2017-01-01

    The purpose of this descriptive quantitative study was to explore the perceptions of school safety among middle and high school teachers in rural Louisiana. In order to achieve this objective, a specific research question was formulated pertaining to teacher perceptions. The Safe Communities Safe Schools (SCSS) survey was used to assess teachers'…

  12. An accurate density functional theory for the vapor-liquid interface of associating chain molecules based on the statistical associating fluid theory for potentials of variable range

    NASA Astrophysics Data System (ADS)

    Gloor, Guy J.; Jackson, George; Blas, Felipe J.; del Río, Elvira Martín; de Miguel, Enrique

    2004-12-01

    A Helmholtz free energy density functional is developed to describe the vapor-liquid interface of associating chain molecules. The functional is based on the statistical associating fluid theory with attractive potentials of variable range (SAFT-VR) for the homogeneous fluid [A. Gil-Villegas, A. Galindo, P. J. Whitehead, S. J. Mills, G. Jackson, and A. N. Burgess, J. Chem. Phys. 106, 4168 (1997)]. A standard perturbative density functional theory (DFT) is constructed by partitioning the free energy density into a reference term (which incorporates all of the short-range interactions, and is treated locally) and an attractive perturbation (which incorporates the long-range dispersion interactions). In our previous work [F. J. Blas, E. Martín del Río, E. de Miguel, and G. Jackson, Mol. Phys. 99, 1851 (2001); G. J. Gloor, F. J. Blas, E. Martín del Río, E. de Miguel, and G. Jackson, Fluid Phase Equil. 194, 521 (2002)] we used a mean-field version of the theory (SAFT-HS) in which the pair correlations were neglected in the attractive term. This provides only a qualitative description of the vapor-liquid interface, due to the inadequate mean-field treatment of the vapor-liquid equilibria. Two different approaches are used to include the correlations in the attractive term: in the first, the free energy of the homogeneous fluid is partitioned such that the effects of correlations are incorporated in the local reference term; in the second, a density averaged correlation function is incorporated into the perturbative term in a similar way to that proposed by Toxvaerd [S. Toxvaerd, J. Chem. Phys. 64, 2863 (1976)]. The latter is found to provide the most accurate description of the vapor-liquid surface tension on comparison with new simulation data for a square-well fluid of variable range. The SAFT-VR DFT is used to examine the effect of molecular chain length and association on the surface tension. Different association schemes (dimerization, straight and branched chain formation, and network structures) are examined separately. The surface tension of the associating fluid is found to be bounded between the nonassociating and fully associated limits (both of which correspond to equivalent nonassociating systems). The temperature dependence of the surface tension is found to depend strongly on the balance between the strength and range of the association, and on the particular association scheme. In the case of a system with a strong but very localized association interaction, the surface tension exhibits the characteristic "s shaped" behavior with temperature observed in fluids such as water and alkanols. The various types of curves observed in real substances can be reproduced by the theory. It is very gratifying that a DFT based on the SAFT-VR free energy can provide an accurate quantitative description of the surface tension of both the model and experimental systems.

  13. 40 CFR 35.6555 - Competition.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... specifications must include a clear and accurate description of the technical requirements and the qualitative... innovative technologies. (2) The recipient must avoid the use of detailed product specifications if at all...

  14. Experimental Null Method to Guide the Development of Technical Procedures and to Control False-Positive Discovery in Quantitative Proteomics.

    PubMed

    Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun

    2015-10-02

    Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performances of quantitative analysis. As a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true-positives/negatives are known and span a wide concentration range. It was observed that the EN method correctly reflects the null distribution in a proteomic system and accurately measures the false altered-protein discovery rate (FADR). In summary, the EN method provides a straightforward, practical, and accurate alternative to statistics-based approaches for the development and evaluation of proteomic experiments and can be universally adapted to various types of quantitative techniques.
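    One way to read the EN idea is that applying the same significance criteria to the replicate-vs-replicate (null) comparison yields an empirical count of false discoveries. The sketch below turns such counts into a false altered-protein discovery rate; the cutoff logic, names and numbers are a simplified illustration, not the published implementation.

        # Simplified illustration of an experimental-null false discovery estimate.
        def fadr(null_hits, case_hits):
            """False altered-protein discovery rate implied by the experimental null.

            null_hits: proteins called "altered" in the replicate-vs-replicate
                       (experimental null) comparison at a chosen ratio/p-value cutoff
            case_hits: proteins called "altered" in the case-vs-control comparison
                       at the same cutoff
            """
            return 0.0 if case_hits == 0 else null_hits / case_hits

        print(f"estimated FADR = {fadr(null_hits=12, case_hits=240):.1%}")   # -> 5.0%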

  15. A new liquid chromatography-mass spectrometry-based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous systems.

    PubMed

    Wang, Shunhai; Bobst, Cedric E; Kaltashov, Igor A

    2015-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein that is viewed as a promising drug carrier to target the central nervous system as a result of its ability to penetrate the blood-brain barrier. Among the many challenges during the development of Tf-based therapeutics, the sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult because of the presence of abundant endogenous Tf. Herein, we describe the development of a new liquid chromatography-mass spectrometry-based method for the sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous human serum Tf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed (18)O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision, and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation.

  16. Performance Evaluation and Quantitative Accuracy of Multipinhole NanoSPECT/CT Scanner for Theranostic Lu-177 Imaging

    NASA Astrophysics Data System (ADS)

    Gupta, Arun; Kim, Kyeong Yun; Hwang, Donghwi; Lee, Min Sun; Lee, Dong Soo; Lee, Jae Sung

    2018-06-01

    SPECT plays an important role in peptide receptor-targeted radionuclide therapy using theranostic radionuclides such as Lu-177 for the treatment of various cancers. However, SPECT studies must be quantitatively accurate because the reliable assessment of tumor uptake and tumor-to-normal tissue ratios can only be performed using quantitatively accurate images. Hence, it is important to evaluate the performance parameters and quantitative accuracy of preclinical SPECT systems for therapeutic radioisotopes before conducting pre- and post-therapy SPECT imaging or dosimetry studies. In this study, we evaluated the system performance and quantitative accuracy of the NanoSPECT/CT scanner for Lu-177 imaging using point source and uniform phantom studies. We measured the recovery coefficient, uniformity, spatial resolution, system sensitivity and calibration factor for the mouse whole-body standard aperture. We also performed the experiments using Tc-99m to compare the results with those of Lu-177. We found a recovery coefficient of more than 70% for Lu-177 at the optimum noise level when nine iterations were used. The spatial resolution of Lu-177 with and without an added uniform background was comparable to that of Tc-99m in the axial, radial and tangential directions. The system sensitivity measured for Lu-177 was almost a factor of three lower than that of Tc-99m.

  17. Multi-laboratory comparison of quantitative PCR assays for detection and quantification of Fusarium virguliforme from soybean roots and soil

    USDA-ARS?s Scientific Manuscript database

    Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...

  18. ACVP-02: Plasma SIV/SHIV RNA Viral Load Measurements through the AIDS and Cancer Virus Program Quantitative Molecular Diagnostics Core | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The SIV plasma viral load assay performed by the Quantitative Molecular Diagnostics Core (QMDC) utilizes reagents specifically designed to detect and accurately quantify the full range of SIV/SHIV viral variants and clones in common usage in the rese

  19. Water immersion facility general description, spacecraft design division, crew station branch

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The Water Immersion Facility provides an accurate, safe, neutral buoyancy simulation of zero gravity conditions for development of equipment and procedures, and the training of crews. A detailed description is given of some of the following systems: (1) water tank and support equipment; (2) communications systems; (3) environmental control and liquid cooled garment system (ECS/LCG); (4) closed circuit television system; and (5) medical support system.

  20. High precision analytical description of the allowed β spectrum shape

    NASA Astrophysics Data System (ADS)

    Hayen, Leendert; Severijns, Nathal; Bodek, Kazimierz; Rozpedzik, Dagmara; Mougeot, Xavier

    2018-01-01

    A fully analytical description of the allowed β spectrum shape is given in view of ongoing and planned measurements. Its study forms an invaluable tool in the search for physics beyond the standard electroweak model and the weak magnetism recoil term. Contributions stemming from finite size corrections, mass effects, and radiative corrections are reviewed. Particular focus is placed on atomic and chemical effects, where the existing description is extended and analytically provided. The effects of QCD-induced recoil terms are discussed, and cross-checks were performed for different theoretical formalisms. Special attention was given to a comparison of the treatment of nuclear structure effects in different formalisms. Corrections were derived for both Fermi and Gamow-Teller transitions, and methods of analytical evaluation thoroughly discussed. In its integrated form, calculated f values were in agreement with the most precise numerical results within the aimed-for precision. The need for an accurate evaluation of weak magnetism contributions was stressed, and the possible significance of the oft-neglected induced pseudoscalar interaction was noted. Together with improved atomic corrections, an analytical description was presented of the allowed β spectrum shape accurate to a few parts in 10⁻⁴ down to 1 keV for low to medium Z nuclei, thereby extending the work by previous authors by nearly an order of magnitude.
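    For orientation, the leading-order allowed spectrum that these corrections multiply can be written in the standard textbook form (W the total electron energy, W_0 the endpoint energy, p the electron momentum, F the Fermi function), with a shape factor C(Z, W) collecting the finite-size, radiative, atomic and recoil-order terms discussed above:

        \frac{dN}{dW} \;\propto\; p\, W\, (W_0 - W)^2\, F(Z, W)\, C(Z, W)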

  1. Technical description of endoscopic ultrasonography with fine-needle aspiration for the staging of lung cancer.

    PubMed

    Kramer, Henk; van Putten, John W G; Douma, W Rob; Smidt, Alie A; van Dullemen, Hendrik M; Groen, Harry J M

    2005-02-01

    Endoscopic ultrasonography (EUS) is a novel method for staging of the mediastinum in lung cancer patients. The recent development of linear scanners enables safe and accurate fine-needle aspiration (FNA) of mediastinal and upper abdominal structures under real-time ultrasound guidance. However, various methods and equipment for mediastinal EUS-FNA are being used throughout the world, and a detailed description of the procedures is lacking. A thorough description of linear EUS-FNA is needed. A step-by-step description of the linear EUS-FNA procedure as performed in our hospital will be provided. Ultrasonographic landmarks will be shown on images. The procedure will be related to published literature, with a systematic literature search. EUS-FNA is an outpatient procedure under conscious sedation. The typical linear EUS-FNA procedure starts with examination of the retroperitoneal area. After this, systematic scanning of the mediastinum is performed at intervals of 1-2 cm. Abnormalities are noted, and FNA of the abnormalities can be performed. Specimens are assessed for cellularity on-site. The entire procedure takes 45-60 min. EUS-FNA is minimally invasive, accurate, and fast. Anatomical areas can be reached that are inaccessible for cervical mediastinoscopy. EUS-FNA is useful for the staging of lung cancer or the assessment and diagnosis of abnormalities in the posterior mediastinum.

  2. Systematic Standardized and Individualized Assessment of Masticatory Cycles Using Electromagnetic 3D Articulography and Computer Scripts

    PubMed Central

    Arias, Alain; Lezcano, María Florencia; Saravia, Diego; Dias, Fernando José

    2017-01-01

    Masticatory movements have been studied for decades in odontology; a better understanding of them could improve dental treatments. The aim of this study was to describe an innovative, accurate, and systematic method of analyzing masticatory cycles, generating comparable quantitative data. The masticatory cycles of 5 volunteers (Class I, 19 ± 1.7 years) without articular or dental occlusion problems were evaluated using 3D electromagnetic articulography supported by MATLAB software. The method allows the trajectory morphology of the set of chewing cycles to be analyzed from different views and angles. It was also possible to individualize the trajectory of each cycle providing accurate quantitative data, such as number of cycles, cycle areas in frontal view, and the ratio between each cycle area and the frontal mandibular border movement area. There was a moderate negative correlation (−0.61) between the area and the number of cycles: the greater the cycle area, the smaller the number of repetitions. Finally, it was possible to evaluate the area of the cycles through time, which did not reveal a standardized behavior. The proposed method provided reproducible, intelligible, and accurate quantitative and graphical data, suggesting that it is promising and may be applied in different clinical situations and treatments. PMID:29075647

  3. Systematic Standardized and Individualized Assessment of Masticatory Cycles Using Electromagnetic 3D Articulography and Computer Scripts.

    PubMed

    Fuentes, Ramón; Arias, Alain; Lezcano, María Florencia; Saravia, Diego; Kuramochi, Gisaku; Dias, Fernando José

    2017-01-01

    Masticatory movements have been studied for decades in odontology; a better understanding of them could improve dental treatments. The aim of this study was to describe an innovative, accurate, and systematic method of analyzing masticatory cycles, generating comparable quantitative data. The masticatory cycles of 5 volunteers (Class I, 19 ± 1.7 years) without articular or dental occlusion problems were evaluated using 3D electromagnetic articulography supported by MATLAB software. The method allows the trajectory morphology of the set of chewing cycles to be analyzed from different views and angles. It was also possible to individualize the trajectory of each cycle providing accurate quantitative data, such as number of cycles, cycle areas in frontal view, and the ratio between each cycle area and the frontal mandibular border movement area. There was a moderate negative correlation (-0.61) between the area and the number of cycles: the greater the cycle area, the smaller the number of repetitions. Finally, it was possible to evaluate the area of the cycles through time, which did not reveal a standardized behavior. The proposed method provided reproducible, intelligible, and accurate quantitative and graphical data, suggesting that it is promising and may be applied in different clinical situations and treatments.

  4. Linking Policy | Smokefree 60+

    Cancer.gov

    Links to individual pages within the Smokefree 60+ website are permissible, provided attribution is made to 60plus.smokefree.gov and any descriptive notes accurately reflect the content of the linked page(s).

  5. Quantitative Phase Imaging in a Volume Holographic Microscope

    NASA Astrophysics Data System (ADS)

    Waller, Laura; Luo, Yuan; Barbastathis, George

    2010-04-01

    We demonstrate a method for quantitative phase imaging in a Volume Holographic Microscope (VHM) from a single exposure, describe the properties of the system and show experimental results. The VHM system uses a multiplexed volume hologram (VH) to laterally separate images from different focal planes. This 3D intensity information is then used to solve the transport of intensity equation (TIE) and recover phase quantitatively. We discuss the modifications to the technique that were made in order to give accurate results.
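    The phase-recovery step mentioned above solves the transport of intensity equation, k ∂I/∂z = −∇⊥·(I ∇⊥φ). Under the common uniform-intensity simplification this reduces to a Poisson equation that can be inverted with FFTs; the sketch below implements that simplified solver and is not the VHM-specific processing used by the authors.

        import numpy as np

        def tie_phase_uniform(dI_dz, I0, wavelength, pixel_size, reg=1e-9):
            """Recover phase from an axial intensity derivative via the simplified TIE.

            Assumes a nearly uniform in-focus intensity I0, so that
            laplacian(phi) = -(k / I0) * dI/dz, solved in Fourier space.
            """
            k = 2 * np.pi / wavelength
            ny, nx = dI_dz.shape
            fx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
            fy = 2 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
            kx, ky = np.meshgrid(fx, fy)
            lap = -(kx**2 + ky**2)                    # Fourier symbol of the Laplacian
            phi_hat = np.fft.fft2(-(k / I0) * dI_dz) / (lap - reg)
            phi_hat[0, 0] = 0.0                       # phase is defined up to a constant
            return np.real(np.fft.ifft2(phi_hat))

        # dI_dz would typically be estimated from two laterally separated focal planes:
        # dI_dz = (I_plus - I_minus) / (2 * dz)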

  6. On some approaches to model reversible magnetization processes

    NASA Astrophysics Data System (ADS)

    Chwastek, K.; Baghel, A. P. S.; Sai Ram, B.; Borowik, B.; Daniel, L.; Kulkarni, S. V.

    2018-04-01

    This paper focuses on the problem of how reversible magnetization processes are taken into account in contemporary descriptions of hysteresis curves. For comparison, three versions of the phenomenological T(x) model based on hyperbolic tangent mapping are considered. Two of them are based on summing the output of the hysteresis operator with a linear or nonlinear mapping. The third description is inspired by the concept of the product Preisach model. Total susceptibility is modulated with a magnetization-dependent function. The models are verified using measurement data for grain-oriented electrical steel. The proposed third description represents minor loops most accurately.
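    To indicate the kind of hyperbolic tangent mapping these descriptions build on, a generic pair of ascending/descending branch functions can be written as below. This is an illustrative construction in the spirit of the T(x) approach, not the exact parameterization of any of the three model variants compared in the paper:

        M_\uparrow(H) = M_s \tanh\!\left(\frac{H - H_c}{a}\right) + b, \qquad
        M_\downarrow(H) = M_s \tanh\!\left(\frac{H + H_c}{a}\right) - b,

    where b is a constant chosen so that the two branches close at the loop tips, and the reversible contribution is then added either through an extra (linear or nonlinear) mapping or by modulating the susceptibility with a magnetization-dependent function, as in the variants discussed above.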

  7. High Accuracy Transistor Compact Model Calibrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hembree, Charles E.; Mar, Alan; Robertson, Perry J.

    2015-09-01

    Typically, transistors are modeled by the application of calibrated nominal and range models. These models consist of differing parameter values that describe the location and the upper and lower limits of a distribution of some transistor characteristic such as current capacity. Correspondingly, when using this approach, high degrees of accuracy of the transistor models are not expected, since the set of models is a surrogate for a statistical description of the devices. The use of these types of models describes expected performance considering the extremes of process or transistor deviations. In contrast, circuits that have very stringent accuracy requirements require modeling techniques with higher accuracy. Since these accurate models have low error in transistor descriptions, they can be used to describe part-to-part variations as well as to give an accurate description of a single circuit instance. Thus, models that meet these stipulations also enable the quantification of margins with respect to a functional threshold and of the uncertainties in these margins. Given this need, new high-accuracy model calibration techniques for bipolar junction transistors have been developed and are described in this report.

  8. 28 CFR 20.22 - Certification of compliance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Local Criminal History Record Information Systems § 20.22 Certification of compliance. (a) Each State to... development of complete and accurate criminal history record information; (4) A description of existing system...

  9. 28 CFR 20.22 - Certification of compliance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Local Criminal History Record Information Systems § 20.22 Certification of compliance. (a) Each State to... development of complete and accurate criminal history record information; (4) A description of existing system...

  10. 28 CFR 20.22 - Certification of compliance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Local Criminal History Record Information Systems § 20.22 Certification of compliance. (a) Each State to... development of complete and accurate criminal history record information; (4) A description of existing system...

  11. 28 CFR 20.22 - Certification of compliance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Local Criminal History Record Information Systems § 20.22 Certification of compliance. (a) Each State to... development of complete and accurate criminal history record information; (4) A description of existing system...

  12. 28 CFR 20.22 - Certification of compliance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Local Criminal History Record Information Systems § 20.22 Certification of compliance. (a) Each State to... development of complete and accurate criminal history record information; (4) A description of existing system...

  13. Physiologically based pharmacokinetic modeling of (18)F-SiFAlin-Asp3-PEG1-TATE in AR42J tumor bearing mice.

    PubMed

    Maaß, Christian; Rivas, Jose Ricardo Avelar; Attarwala, Ali Asgar; Hardiansyah, Deni; Niedermoser, Sabrina; Litau, Shanna; Wängler, Carmen; Wängler, Björn; Glatting, Gerhard

    2016-04-01

    Peptide receptor radionuclide therapy (PRRT) is commonly performed in the treatment of neuroendocrine tumors (NET), where somatostatin analogs (DOTATATE) are radiolabeled with (90)Y, (68)Ga or (111)In for pre-therapeutic and therapeutic purposes. Quantitative evaluation of the biokinetic data can be performed by using physiologically based pharmacokinetic (PBPK) models. Knowledge about the biodistribution in a pre-clinical setting would allow the translation from bench to bedside to be optimized. The aim of this study was to develop a PBPK model to describe the biodistribution of a novel sst2-targeting radiotracer. Biokinetic data of six mice after injection of (18)F-SiFAlin-Asp3-PEG1-TATE were investigated using two PBPK models. The PBPK models describe the biodistribution of the tracer in the tumor, kidneys, liver, remainder and whole body through blood flow to these organs and the processes of absorption, distribution, metabolism and excretion. A recently published sst2 PBPK model for humans (model 1) was used to describe the data. Physiological information in this model was adapted to that of a mouse. Model 1 was further modified by implementing receptor-mediated endocytosis (model 2). Model parameters were fitted to the biokinetic data of each mouse. Model selection was performed by calculating Akaike weights wi using the corrected Akaike Information Criterion (AICc). The implementation of receptor-mediated endocytosis considerably improved the description of the biodistribution (Akaike weights w1=0% and w2=100% for model 1 and 2, respectively). The resulting time-integrated activity coefficients determined by model 2 were (0.05 ± 0.02) h for tumor, (0.11 ± 0.01) h for kidneys and (0.02 ± 0.01) h for liver. Simply downscaling a human PBPK model does not allow for an accurate description of (18)F-SiFAlin-Asp3-PEG1-TATE in mice. The biokinetics of this tracer can be accurately and adequately described using a physiologically based pharmacokinetic model including receptor-mediated endocytosis. Thus, an optimized translation from bench to bedside is possible. Copyright © 2016 Elsevier Inc. All rights reserved.
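    The model selection above relies on the corrected Akaike Information Criterion and Akaike weights, which follow from standard formulas; a generic sketch is given below (assuming least-squares fitting with Gaussian errors, not the authors' fitting code).

        import numpy as np

        def aicc(ss_residual, n_points, n_params):
            """Corrected AIC for a least-squares fit (Gaussian error assumption)."""
            aic = n_points * np.log(ss_residual / n_points) + 2 * n_params
            return aic + 2 * n_params * (n_params + 1) / (n_points - n_params - 1)

        def akaike_weights(aicc_values):
            """w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2), delta_i = AICc_i - min(AICc)."""
            delta = np.asarray(aicc_values, dtype=float)
            delta = delta - delta.min()
            w = np.exp(-0.5 * delta)
            return w / w.sum()

        # e.g. akaike_weights([152.3, 138.9]) -> weights strongly favouring the second model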

  14. Quantitative disease resistance: to better understand parasite-mediated selection on major histocompatibility complex

    PubMed Central

    Westerdahl, Helena; Asghar, Muhammad; Hasselquist, Dennis; Bensch, Staffan

    2012-01-01

    We outline a descriptive framework of how candidate alleles of the immune system associate with infectious diseases in natural populations of animals. Three kinds of alleles can be separated when both prevalence of infection and infection intensity are measured—qualitative disease resistance, quantitative disease resistance and susceptibility alleles. Our descriptive framework demonstrates why alleles for quantitative resistance and susceptibility cannot be separated based on prevalence data alone, but are distinguishable on infection intensity. We then present a case study to evaluate a previous finding of a positive association between prevalence of a severe avian malaria infection (GRW2, Plasmodium ashfordi) and a major histocompatibility complex (MHC) class I allele (B4b) in great reed warblers Acrocephalus arundinaceus. Using the same dataset, we find that individuals with allele B4b have lower GRW2 infection intensities than individuals without this allele. Therefore, allele B4b provides quantitative resistance rather than increasing susceptibility to infection. This implies that birds carrying B4b can mount an immune response that suppresses the acute-phase GRW2 infection, while birds without this allele cannot and may die. We argue that it is important to determine whether MHC alleles related to infections are advantageous (quantitative and qualitative resistance) or disadvantageous (susceptibility) to obtain a more complete picture of pathogen-mediated balancing selection. PMID:21733902

  15. Quantitative disease resistance: to better understand parasite-mediated selection on major histocompatibility complex.

    PubMed

    Westerdahl, Helena; Asghar, Muhammad; Hasselquist, Dennis; Bensch, Staffan

    2012-02-07

    We outline a descriptive framework of how candidate alleles of the immune system associate with infectious diseases in natural populations of animals. Three kinds of alleles can be separated when both prevalence of infection and infection intensity are measured--qualitative disease resistance, quantitative disease resistance and susceptibility alleles. Our descriptive framework demonstrates why alleles for quantitative resistance and susceptibility cannot be separated based on prevalence data alone, but are distinguishable on infection intensity. We then present a case study to evaluate a previous finding of a positive association between prevalence of a severe avian malaria infection (GRW2, Plasmodium ashfordi) and a major histocompatibility complex (MHC) class I allele (B4b) in great reed warblers Acrocephalus arundinaceus. Using the same dataset, we find that individuals with allele B4b have lower GRW2 infection intensities than individuals without this allele. Therefore, allele B4b provides quantitative resistance rather than increasing susceptibility to infection. This implies that birds carrying B4b can mount an immune response that suppresses the acute-phase GRW2 infection, while birds without this allele cannot and may die. We argue that it is important to determine whether MHC alleles related to infections are advantageous (quantitative and qualitative resistance) or disadvantageous (susceptibility) to obtain a more complete picture of pathogen-mediated balancing selection.

  16. Molecular acidity: An accurate description with information-theoretic approach in density functional reactivity theory.

    PubMed

    Cao, Xiaofang; Rong, Chunying; Zhong, Aiguo; Lu, Tian; Liu, Shubin

    2018-01-15

    Molecular acidity is one of the important physicochemical properties of a molecular system, yet its accurate calculation and prediction are still an unresolved problem in the literature. In this work, we propose to make use of the quantities from the information-theoretic (IT) approach in density functional reactivity theory and provide an accurate description of molecular acidity from a completely new perspective. To illustrate our point, five different categories of acidic series, singly and doubly substituted benzoic acids, singly substituted benzenesulfinic acids, benzeneseleninic acids, phenols, and alkyl carboxylic acids, have been thoroughly examined. We show that using IT quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy, information gain, Onicescu information energy, and relative Rényi entropy, one is able to simultaneously predict experimental pKa values of these different categories of compounds. Because of the universality of the quantities employed in this work, which are all density dependent, our approach should be general and be applicable to other systems as well. © 2017 Wiley Periodicals, Inc.

  17. Tomosynthesis can facilitate accurate measurement of joint space width under the condition of the oblique incidence of X-rays in patients with rheumatoid arthritis.

    PubMed

    Ono, Yohei; Kashihara, Rina; Yasojima, Nobutoshi; Kasahara, Hideki; Shimizu, Yuka; Tamura, Kenichi; Tsutsumi, Kaori; Sutherland, Kenneth; Koike, Takao; Kamishima, Tamotsu

    2016-06-01

    Accurate evaluation of joint space width (JSW) is important in the assessment of rheumatoid arthritis (RA). In clinical radiography of bilateral hands, the oblique incidence of X-rays is unavoidable, which may cause perceptional or measurement error of JSW. The objective of this study was to examine whether tomosynthesis, a recently developed modality, can facilitate a more accurate evaluation of JSW than radiography under the condition of oblique incidence of X-rays. We investigated quantitative errors derived from the oblique incidence of X-rays by imaging phantoms simulating various finger joint spaces using radiographs and tomosynthesis images. We then compared the qualitative results of the modified total Sharp score of a total of 320 joints from 20 patients with RA between these modalities. A quantitative error was prominent when the location of the phantom was shifted along the JSW direction. Modified total Sharp scores of tomosynthesis images were significantly higher than those of radiography, that is to say JSW was regarded as narrower in tomosynthesis than in radiography when finger joints were located where the oblique incidence of X-rays is expected in the JSW direction. Tomosynthesis can facilitate accurate evaluation of JSW in finger joints of patients with RA, even with oblique incidence of X-rays. Accurate evaluation of JSW is necessary for the management of patients with RA. Through phantom and clinical studies, we demonstrate that tomosynthesis may achieve more accurate evaluation of JSW.

  18. Quantitative Phase Microscopy for Accurate Characterization of Microlens Arrays

    NASA Astrophysics Data System (ADS)

    Grilli, Simonetta; Miccio, Lisa; Merola, Francesco; Finizio, Andrea; Paturzo, Melania; Coppola, Sara; Vespini, Veronica; Ferraro, Pietro

    Microlens arrays are of fundamental importance in a wide variety of applications in optics and photonics. This chapter deals with an accurate digital holography-based characterization of both liquid and polymeric microlenses fabricated by an innovative pyro-electrowetting process. The actuation of liquid and polymeric films is obtained through the use of pyroelectric charges generated into polar dielectric lithium niobate crystals.

  19. Quantitation and detection of vanadium in biologic and pollution materials

    NASA Technical Reports Server (NTRS)

    Gordon, W. A.

    1974-01-01

    A review is presented of special considerations and methodology for determining vanadium in biological and air pollution materials. In addition to descriptions of specific analysis procedures, general sections are included on quantitation of analysis procedures, sample preparation, blanks, and methods of detection of vanadium. Most of the information presented is applicable to the determination of other trace elements in addition to vanadium.

  20. A Historical Analysis of Internal Review

    DTIC Science & Technology

    1981-03-01

    the background material presented. In such a study as this, the absence of quantitative data forces narrative descriptions and arguments vice... definitive graphic displays. The chapter sought to convey a sense of history and development of auditing in general and internal review in particular. In...measure represents the closest feasible way of measuring the accomplishment of an objective that cannot itself be expressed quantitatively. Such a measure

  1. Beyond Technology, an Analysis of the Perceived Impact of Transformational Leadership and Contingent Rewards as Extrinsic Motivation on Virtual Team Member Satisfaction and Leadership Effectiveness: A Quantitative Study

    ERIC Educational Resources Information Center

    Mawanda, Haruna Juko

    2012-01-01

    The primary purpose of this nonexperimental, correlational, and descriptive quantitative study was to gain an empirical understanding of the effects of transformational leadership and contingent reward as extrinsic motivation on employee satisfaction with leadership and leadership effectiveness in virtual team workplace environments.…

  2. Resistant Behaviors by People with Alzheimer Dementia and Traumatic Brain Injury

    DTIC Science & Technology

    2017-09-01

    participants has completed the information for the research team to have collected quantitative data on caregiver burden and family quality of life for...those adverse behaviors. The combined qualitative, quantitative, and economic analyses will also provide pertinent information regarding the general...

  3. Sociolinguistically Informed Natural Language Processing: Automating Irony Detection

    DTIC Science & Technology

    2017-10-23

    Aim 2. To analyze when existing ML and NLP technologies fail to detect ironic intent empirically. We specifically proposed to assess quantitatively (using the collected dataset) ...of the embedding reddit thread, and the other comments in this thread...

  4. A New Approach for the Quantitative Evaluation of Drawings in Children with Learning Disabilities

    ERIC Educational Resources Information Center

    Galli, Manuela; Vimercati, Sara Laura; Stella, Giacomo; Caiazzo, Giorgia; Norveti, Federica; Onnis, Francesca; Rigoldi, Chiara; Albertini, Giorgio

    2011-01-01

    A new method for a quantitative and objective description of drawing and for the quantification of drawing ability in children with learning disabilities (LD) is hereby presented. Twenty-four normally developing children (N) (age 10.6 ± 0.5) and 18 children with learning disabilities (LD) (age 10.3 ± 2.4) took part in…

  5. Automated selected reaction monitoring software for accurate label-free protein quantification.

    PubMed

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  6. Magnetic Resonance Imaging of Intracranial Hypotension: Diagnostic Value of Combined Qualitative Signs and Quantitative Metrics.

    PubMed

    Aslan, Kerim; Gunbey, Hediye Pinar; Tomak, Leman; Ozmen, Zafer; Incesu, Lutfi

    The aim of this study was to investigate whether the use of quantitative metrics (mamillopontine distance [MPD], pontomesencephalic angle, and mesencephalon anterior-posterior/medial-lateral diameter ratios) in combination with qualitative signs (dural enhancement, subdural collections/hematoma, venous engorgement, pituitary gland enlargement, and tonsillar herniation) provides a more accurate diagnosis of intracranial hypotension (IH). The quantitative metrics and qualitative signs of 34 patients and 34 control subjects were assessed by 2 independent observers. Receiver operating characteristic (ROC) curves were used to evaluate the diagnostic performance of the quantitative metrics and qualitative signs, and optimum cutoff values of the quantitative metrics for the diagnosis of IH were found with ROC analysis. Combined ROC curves were computed for combinations of quantitative metrics and qualitative signs to determine diagnostic accuracy; sensitivity, specificity, and positive and negative predictive values were calculated, and the best-performing combination was identified. Whereas MPD and pontomesencephalic angle were significantly lower in patients with IH when compared with the control group (P < 0.001), the mesencephalon anterior-posterior/medial-lateral diameter ratio was significantly higher (P < 0.001). For qualitative signs, the highest individual discriminative power was dural enhancement, with an area under the ROC curve (AUC) of 0.838. For quantitative metrics, the highest individual discriminative power was MPD, with an AUC of 0.947. The best accuracy in the diagnosis of IH was obtained by the combination of dural enhancement, venous engorgement, and MPD, with an AUC of 1.00. This study showed that the combined use of dural enhancement, venous engorgement, and MPD had a diagnostic accuracy of 100% for the diagnosis of IH. Therefore, a more accurate IH diagnosis can be provided by combining quantitative metrics with qualitative signs.

  7. Accurate expansion of cylindrical paraxial waves for its straightforward implementation in electromagnetic scattering

    NASA Astrophysics Data System (ADS)

    Naserpour, Mahin; Zapata-Rodríguez, Carlos J.

    2018-01-01

    The evaluation of vector wave fields can be accurately performed by means of diffraction integrals, differential equations and also series expansions. In this paper, a Bessel series expansion whose basis relies on the exact solution of the Helmholtz equation in cylindrical coordinates is theoretically developed for the straightforward yet accurate description of low-numerical-aperture focal waves. The validity of this approach is confirmed by explicit application to Gaussian beams and apertured focused fields in the paraxial regime. Finally, we discuss how our procedure can be favorably implemented in scattering problems.
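
    As an illustration of the general idea of expanding a paraxial field in a Bessel basis, the sketch below computes a Fourier-Bessel series (over the zeros of J0) for a Gaussian amplitude profile and checks the reconstruction error. This is a generic textbook expansion assumed for illustration, not the specific Helmholtz-based basis developed in the paper.

    ```python
    import numpy as np
    from scipy.integrate import trapezoid
    from scipy.special import j0, j1, jn_zeros

    # Fourier-Bessel expansion of a Gaussian amplitude profile f(r) = exp(-(r/w)^2)
    # on 0 <= r <= R, using the basis J0(alpha_n r / R) built on the zeros of J0.
    R, w, n_terms = 5.0, 1.0, 30
    alphas = jn_zeros(0, n_terms)
    r = np.linspace(0.0, R, 2000)
    f = np.exp(-(r / w) ** 2)

    # Orthogonality with weight r gives c_n = 2/(R^2 J1(alpha_n)^2) * int f J0 r dr
    coeffs = [2.0 / (R**2 * j1(a) ** 2) * trapezoid(f * j0(a * r / R) * r, r) for a in alphas]
    approx = sum(c * j0(a * r / R) for c, a in zip(coeffs, alphas))
    print("Maximum reconstruction error:", np.abs(approx - f).max())
    ```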

  8. Quantitative measurements of electromechanical response with a combined optical beam and interferometric atomic force microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Labuda, Aleksander; Proksch, Roger

    An ongoing challenge in atomic force microscope (AFM) experiments is the quantitative measurement of cantilever motion. The vast majority of AFMs use the optical beam deflection (OBD) method to infer the deflection of the cantilever. The OBD method is easy to implement, has impressive noise performance, and tends to be mechanically robust. However, it represents an indirect measurement of the cantilever displacement, since it is fundamentally an angular rather than a displacement measurement. Here, we demonstrate a metrological AFM that combines an OBD sensor with a laser Doppler vibrometer (LDV) to enable accurate measurements of the cantilever velocity and displacement. The OBD/LDV AFM allows a host of quantitative measurements to be performed, including in-situ measurements of cantilever oscillation modes in piezoresponse force microscopy. As an example application, we demonstrate how this instrument can be used for accurate quantification of piezoelectric sensitivity, a longstanding goal in the electromechanical community.

  9. Shot noise-limited Cramér-Rao bound and algorithmic sensitivity for wavelength shifting interferometry

    NASA Astrophysics Data System (ADS)

    Chen, Shichao; Zhu, Yizheng

    2017-02-01

    Sensitivity is a critical index for measuring the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, which is a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.

  10. Separation and quantitation of polyethylene glycols 400 and 3350 from human urine by high-performance liquid chromatography.

    PubMed

    Ryan, C M; Yarmush, M L; Tompkins, R G

    1992-04-01

    Polyethylene glycol 3350 (PEG 3350) is useful as an orally administered probe to measure in vivo intestinal permeability to macromolecules. Previous methods to detect polyethylene glycol (PEG) excreted in the urine have been hampered by inherent inaccuracies associated with liquid-liquid extraction and turbidimetric analysis. For accurate quantitation by previous methods, radioactive labels were required. This paper describes a method to separate and quantitate PEG 3350 and PEG 400 in human urine that is independent of radioactive labels and is accurate in clinical practice. The method uses sized regenerated cellulose membranes and mixed ion-exchange resin for sample preparation and high-performance liquid chromatography with refractive index detection for analysis. The 24-h excretion for normal individuals after an oral dose of 40 g of PEG 3350 and 5 g of PEG 400 was 0.12 +/- 0.04% of the original dose of PEG 3350 and 26.3 +/- 5.1% of the original dose of PEG 400.

  11. A quantitative test of population genetics using spatiogenetic patterns in bacterial colonies.

    PubMed

    Korolev, Kirill S; Xavier, João B; Nelson, David R; Foster, Kevin R

    2011-10-01

    It is widely accepted that population-genetics theory is the cornerstone of evolutionary analyses. Empirical tests of the theory, however, are challenging because of the complex relationships between space, dispersal, and evolution. Critically, we lack quantitative validation of the spatial models of population genetics. Here we combine analytics, on- and off-lattice simulations, and experiments with bacteria to perform quantitative tests of the theory. We study two bacterial species, the gut microbe Escherichia coli and the opportunistic pathogen Pseudomonas aeruginosa, and show that spatiogenetic patterns in colony biofilms of both species are accurately described by an extension of the one-dimensional stepping-stone model. We use one empirical measure, genetic diversity at the colony periphery, to parameterize our models and show that we can then accurately predict another key variable: the degree of short-range cell migration along an edge. Moreover, the model allows us to estimate other key parameters, including effective population size (density) at the expansion frontier. While our experimental system is a simplification of natural microbial communities, we argue that it constitutes proof of principle that the spatial models of population genetics can quantitatively capture organismal evolution.
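
    A minimal caricature of stepping-stone dynamics at an expanding frontier is sketched below as a one-dimensional voter-type model that tracks heterozygosity at the colony periphery. The lattice size, generation count, and synchronous update rule are illustrative assumptions, not the models fitted in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    L, generations = 500, 200                 # frontier sites and growth steps (hypothetical)
    frontier = rng.integers(0, 2, L)          # two neutral alleles, initially well mixed

    heterozygosity = []
    for _ in range(generations):
        # each frontier site copies a random nearest neighbour (periodic boundary),
        # a crude analogue of local dispersal during range expansion
        offsets = rng.choice([-1, 1], size=L)
        frontier = frontier[(np.arange(L) + offsets) % L]
        p = frontier.mean()
        heterozygosity.append(2 * p * (1 - p))

    print("Heterozygosity at the periphery, first vs last step:",
          round(heterozygosity[0], 3), round(heterozygosity[-1], 3))
    ```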

  12. Quantitative real-time reverse transcription polymerase chain reaction: normalization to rRNA or single housekeeping genes is inappropriate for human tissue biopsies.

    PubMed

    Tricarico, Carmela; Pinzani, Pamela; Bianchi, Simonetta; Paglierani, Milena; Distante, Vito; Pazzagli, Mario; Bustin, Stephen A; Orlando, Claudio

    2002-10-15

    Careful normalization is essential when using quantitative reverse transcription polymerase chain reaction assays to compare mRNA levels between biopsies from different individuals or cells undergoing different treatments. Generally this involves the use of internal controls, such as mRNA specified by a housekeeping gene, ribosomal RNA (rRNA), or accurately quantitated total RNA. The aim of this study was to compare these methods and determine which one can provide the most accurate and biologically relevant quantitative results. Our results show significant variation in the expression levels of 10 commonly used housekeeping genes and 18S rRNA, both between individuals and between biopsies taken from the same patient. Furthermore, in 23 breast cancer samples mRNA and protein levels of a regulated gene, vascular endothelial growth factor (VEGF), correlated only when normalized to total RNA, as did microvessel density. Finally, mRNA levels of VEGF and the most popular housekeeping gene, glyceraldehyde-3-phosphate dehydrogenase (GAPDH), were significantly correlated in the colon. Our results suggest that the use of internal standards comprising single housekeeping genes or rRNA is inappropriate for studies involving tissue biopsies.
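
    The normalization issue discussed above can be illustrated with a toy delta-Ct calculation: when the housekeeping gene itself varies between biopsies, normalizing to it distorts the target-gene ratio relative to normalization against equal amounts of total RNA. All Ct values below are invented for illustration.

    ```python
    import numpy as np

    # Invented Ct values for the same target gene (e.g. VEGF) in two biopsies
    ct_target = np.array([24.1, 26.3])
    ct_housekeeper = np.array([18.0, 20.5])       # housekeeping gene drifts between biopsies
    total_rna_ng = np.array([100.0, 100.0])       # equal total RNA loaded in both reactions

    expression = 2.0 ** (-ct_target)              # assumes ~100% amplification efficiency
    per_total_rna = expression / total_rna_ng     # normalization to total RNA
    per_housekeeper = 2.0 ** (-(ct_target - ct_housekeeper))   # classic delta-Ct

    print("Biopsy1/Biopsy2 ratio, normalized to total RNA:  ",
          round(per_total_rna[0] / per_total_rna[1], 2))
    print("Biopsy1/Biopsy2 ratio, normalized to housekeeper:",
          round(per_housekeeper[0] / per_housekeeper[1], 2))
    ```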

  13. The Effect of Auditory and Visual Motion Picture Descriptive Modalities in Teaching Perceptual-Motor Skills Used in the Grading of Cereal Grains.

    ERIC Educational Resources Information Center

    Hannemann, James William

    This study was designed to discover whether a student learns to imitate the skills demonstrated in a motion picture more accurately when the supportive descriptive terminology is presented in an auditory (spoken) form or in a visual (captions) form. A six-minute color 16mm film was produced--"Determining the Test Weight per Bushel of Yellow Corn".…

  14. Sexing chick mRNA: A protocol based on quantitative real-time polymerase chain reaction.

    PubMed

    Wan, Z; Lu, Y; Rui, L; Yu, X; Li, Z

    2017-03-01

    The accurate identification of sex in birds is important for research on avian sex determination and differentiation. Polymerase chain reaction (PCR)-based methods have been widely applied for the molecular sexing of birds. However, these methods have used genomic DNA. Here, we present the first sexing protocol for chick mRNA based on real-time quantitative PCR. We demonstrate that this method can accurately determine sex using mRNA from chick gonads and other tissues, such as heart, liver, spleen, lung, and muscle. The strategy of this protocol also may be suitable for other species in which sex is determined by the inheritance of sex chromosomes (ZZ male and ZW female). © 2016 Poultry Science Association Inc.

  15. Film/chemistry selection for the earth resources technology satellite /ERTS/ ground data handling system

    NASA Technical Reports Server (NTRS)

    Shaffer, R. M.

    1973-01-01

    A detailed description is given of the methods of choose the duplication film and chemistry currently used in the NASA-ERTS Ground Data Handling System. The major ERTS photographic duplication goals are given as background information to justify the specifications for the desirable film/chemistry combination. Once these specifications were defined, a quantitative evaluation program was designed and implemented to determine if any recommended combinations could meet the ERTS laboratory specifications. The specifications include tone reproduction, granularity, MTF and cosmetic effects. A complete description of the techniques used to measure the test response variables is given. It is anticipated that similar quantitative techniques could be used on other programs to determine the optimum film/chemistry consistent with the engineering goals of the program.

  16. Growth of wormlike micelles in nonionic surfactant solutions: Quantitative theory vs. experiment.

    PubMed

    Danov, Krassimir D; Kralchevsky, Peter A; Stoyanov, Simeon D; Cook, Joanne L; Stott, Ian P; Pelan, Eddie G

    2018-06-01

    Despite the considerable advances of molecular-thermodynamic theory of micelle growth, agreement between theory and experiment has been achieved only in isolated cases. A general theory that can provide self-consistent quantitative description of the growth of wormlike micelles in mixed surfactant solutions, including the experimentally observed high peaks in viscosity and aggregation number, is still missing. As a step toward the creation of such theory, here we consider the simplest system - nonionic wormlike surfactant micelles from polyoxyethylene alkyl ethers, C i E j . Our goal is to construct a molecular-thermodynamic model that is in agreement with the available experimental data. For this goal, we systematized data for the micelle mean mass aggregation number, from which the micelle growth parameter was determined at various temperatures. None of the available models can give a quantitative description of these data. We constructed a new model, which is based on theoretical expressions for the interfacial-tension, headgroup-steric and chain-conformation components of micelle free energy, along with appropriate expressions for the parameters of the model, including their temperature and curvature dependencies. Special attention was paid to the surfactant chain-conformation free energy, for which a new more general formula was derived. As a result, relatively simple theoretical expressions are obtained. All parameters that enter these expressions are known, which facilitates the theoretical modeling of micelle growth for various nonionic surfactants in excellent agreement with the experiment. The constructed model can serve as a basis that can be further upgraded to obtain quantitative description of micelle growth in more complicated systems, including binary and ternary mixtures of nonionic, ionic and zwitterionic surfactants, which determines the viscosity and stability of various formulations in personal-care and house-hold detergency. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Analyses and descriptions of geochemical samples from the Rich Mountain Roadless Area, Fannin and Gilmer counties, Georgia

    USGS Publications Warehouse

    Sears, C.M.; Foose, M.P.; Day, G.W.; Ericksen, M.S.

    1983-01-01

    Semi-quantitative spectrographic analyses for 31 elements on rock, soil, fine-grained stream sediment, bulk stream sediment, and panned stream sediment samples collected in the Rich Mountain Roadless Area, Fannin and Gilmer Counties, Georgia, are reported here. Atomic absorption analyses for gold and fluorometric analyses for uranium are also reported. Brief descriptions of all rock samples analyzed are included.

  18. Self-descriptions on LinkedIn: Recruitment or friendship identity?

    PubMed

    Garcia, Danilo; Cloninger, Kevin M; Granjard, Alexandre; Molander-Söderholm, Kristian; Amato, Clara; Sikström, Sverker

    2018-04-26

    We used quantitative semantics to find clusters of words in LinkedIn users' self-descriptions to an employer or a friend. Some of these clusters discriminated between worker and friend conditions (e.g., flexible vs. caring) and between LinkedIn users with high and low education (e.g., analytical vs. messy). © 2018 The Institute of Psychology, Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.

  19. Descriptions and identifications of strangers by youth and adult eyewitnesses.

    PubMed

    Pozzulo, Joanna D; Warren, Kelly L

    2003-04-01

    Two studies varying target gender and mode of target exposure were conducted to compare the quantity, nature, and accuracy of free recall person descriptions provided by youths and adults. In addition, the relation among age, identification accuracy, and number of descriptors reported was considered. Youths (10-14 years) reported fewer descriptors than adults. Exterior facial descriptors (e.g., hair items) were predominant and accurately reported by youths and adults. Accuracy was consistently problematic for youths when reporting body descriptors (e.g., height, weight) and interior facial features. Youths reported a similar number of descriptors when making accurate versus inaccurate identification decisions. This pattern also was consistent for adults. With target-absent lineups, the difference in the number of descriptors reported between adults and youths was greater when making a false positive versus correct rejection.

  20. Inferring phenomenological models of Markov processes from data

    NASA Astrophysics Data System (ADS)

    Rivera, Catalina; Nemenman, Ilya

    Microscopically accurate modeling of stochastic dynamics of biochemical networks is hard due to the extremely high dimensionality of the state space of such networks. Here we propose an algorithm for inference of phenomenological, coarse-grained models of Markov processes describing the network dynamics directly from data, without the intermediate step of microscopically accurate modeling. The approach relies on the linear nature of the Chemical Master Equation and uses Bayesian Model Selection for identification of parsimonious models that fit the data. When applied to synthetic data from the Kinetic Proofreading process (KPR), a common mechanism used by cells for increasing specificity of molecular assembly, the algorithm successfully uncovers the known coarse-grained description of the process. This phenomenological description has been noticed previously, but here it is derived in an automated manner by the algorithm. James S. McDonnell Foundation Grant No. 220020321.
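
    A rough sketch of the model-selection step is given below using the Bayesian information criterion as a stand-in for full Bayesian model selection; the candidate models, log-likelihoods, and parameter counts are hypothetical.

    ```python
    import numpy as np

    def bic(log_likelihood, n_params, n_obs):
        """Bayesian information criterion; lower values indicate more parsimonious fits."""
        return n_params * np.log(n_obs) - 2.0 * log_likelihood

    # Hypothetical fits of coarse-grained Markov models with increasing state counts
    candidates = {"2-state": (-1520.4, 3), "3-state": (-1498.7, 8), "4-state": (-1497.9, 15)}
    n_obs = 1000

    scores = {name: bic(ll, k, n_obs) for name, (ll, k) in candidates.items()}
    print(scores)
    print("Selected model:", min(scores, key=scores.get))
    ```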

  1. Path generation algorithm for UML graphic modeling of aerospace test software

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditional aerospace software testing engineers rely on their own work experience and on communication with the software developers to write the test software description and test cases by hand, which is time-consuming, inefficient, and prone to omissions. With the high-reliability MBT tool developed by our company, a single modeling pass can automatically generate the test case documents, which is efficient and accurate. Expressing requirements accurately with a UML model depends on the paths that can be reached through it. Existing path generation algorithms are either too simple, unable to combine branch paths and loops into complete paths, or too cumbersome, generating meaningless path combinations that are superfluous for aerospace software testing. Drawing on our aerospace engineering experience, we developed a path generation algorithm tailored to UML graphical descriptions of aerospace test software.
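
    A minimal sketch of one way such a path generation step could work is shown below: a depth-first enumeration over a directed activity-style graph in which each edge may be traversed at most once, so that loops are exercised without generating unbounded path combinations. The graph and node names are hypothetical and the algorithm is a generic illustration, not the tailored algorithm described above.

    ```python
    def enumerate_paths(graph, start, end, max_edge_visits=1):
        """Enumerate start->end paths, allowing each edge at most `max_edge_visits`
        traversals so that loops are exercised once without infinite recursion."""
        paths, edge_count = [], {}

        def dfs(node, path):
            if node == end:
                paths.append(list(path))
                return
            for nxt in graph.get(node, []):
                edge = (node, nxt)
                if edge_count.get(edge, 0) < max_edge_visits:
                    edge_count[edge] = edge_count.get(edge, 0) + 1
                    dfs(nxt, path + [nxt])
                    edge_count[edge] -= 1

        dfs(start, [start])
        return paths

    # Hypothetical activity graph with a branch and a loop back to node "B"
    graph = {"A": ["B"], "B": ["C", "D"], "C": ["B", "E"], "D": ["E"]}
    for p in enumerate_paths(graph, "A", "E"):
        print(" -> ".join(p))
    ```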

  2. A knowledge-based potential with an accurate description of local interactions improves discrimination between native and near-native protein conformations.

    PubMed

    Ferrada, Evandro; Vergara, Ismael A; Melo, Francisco

    2007-01-01

    The correct discrimination between native and near-native protein conformations is essential for achieving accurate computer-based protein structure prediction. However, this has proven to be a difficult task, since currently available physical energy functions, empirical potentials and statistical scoring functions are still limited in achieving this goal consistently. In this work, we assess and compare the ability of different full-atom knowledge-based potentials to discriminate between native protein structures and near-native protein conformations generated by comparative modeling. Using a benchmark of 152 near-native protein models and their corresponding native structures that encompass several different folds, we demonstrate that the incorporation of close non-bonded pairwise atom terms improves the discriminating power of the empirical potentials. Since the direct and unbiased derivation of close non-bonded terms from current experimental data is not possible, we obtained and used those terms from the corresponding pseudo-energy functions of a non-local knowledge-based potential. It is shown that this methodology significantly improves the discrimination between native and near-native protein conformations, suggesting that a proper description of close non-bonded terms is important to achieve a more complete and accurate description of native protein conformations. Some external knowledge-based energy functions that are widely used in model assessment performed poorly, indicating that the benchmark of models and the specific discrimination task tested in this work constitute a difficult challenge.

  3. Proton-driven spin diffusion in rotating solids via reversible and irreversible quantum dynamics

    PubMed Central

    Veshtort, Mikhail; Griffin, Robert G.

    2011-01-01

    Proton-driven spin diffusion (PDSD) experiments in rotating solids have received a great deal of attention as a potential source of distance constraints in large biomolecules. However, the quantitative relationship between the molecular structure and observed spin diffusion has remained obscure due to the lack of an accurate theoretical description of the spin dynamics in these experiments. We start by presenting a detailed relaxation theory of PDSD in rotating solids that provides such a description. The theory applies to both conventional and radio-frequency-assisted PDSD experiments and extends to the non-Markovian regime to include such phenomena as rotational resonance (R2). The basic kinetic equation of the theory in the non-Markovian regime has the form of a memory function equation, with the role of the memory function played by the correlation function. The key assumption used in the derivation of this equation expresses the intuitive notion of the irreversible dissipation of coherences in macroscopic systems. Accurate expressions for the correlation functions and for the spin diffusion constants are given. The theory predicts that the spin diffusion constants governing the multi-site PDSD can be approximated by the constants observed in the two-site diffusion. Direct numerical simulations of PDSD dynamics via the reversible Liouville-von Neumann equation are presented to support and complement the theory. Remarkably, an exponential decay of the difference magnetization can be observed in such simulations in systems consisting of only 12 spins. This is a unique example of a real physical system whose typically macroscopic and apparently irreversible behavior can be traced via reversible microscopic dynamics. An accurate value for the spin diffusion constant can usually be obtained through direct simulations of PDSD in systems consisting of two 13C nuclei and about ten 1H nuclei from their nearest environment. Spin diffusion constants computed by this method are in excellent agreement with the spin diffusion constants obtained through equations given by the relaxation theory of PDSD. The constants resulting from these two approaches were also in excellent agreement with the results of 2D rotary resonance recoupling proton-driven spin diffusion (R3-PDSD) experiments performed in three model compounds, where magnetization exchange occurred over distances up to 4.9 Å. With the methodology presented, highly accurate internuclear distances can be extracted from such data. Relayed transfer of magnetization between distant nuclei appears to be the main (and apparently resolvable) source of uncertainty in such measurements. The non-Markovian kinetic equation was applied to the analysis of the R2 spin dynamics. The conventional semi-phenomenological treatment of relaxation in R2 has been shown to be equivalent to the assumption of the Lorentzian spectral density function in the relaxation theory of PDSD. As this assumption is a poor approximation in real physical systems, the conventional R2 treatment is likely to carry a significant model error that has not been recognized previously. The relaxation theory of PDSD appears to provide an accurate, parameter-free alternative. Predictions of this theory agreed well with the full quantum mechanical simulations of the R2 dynamics in the few simple model systems we considered. PMID:21992326

  4. Embryonic development of lake whitefish Coregonus clupeaformis: a staging series, analysis of growth and effects of fixation.

    PubMed

    Sreetharan, S; Thome, C; Mitz, C; Eme, J; Mueller, C A; Hulley, E N; Manzon, R G; Somers, C M; Boreham, D R; Wilson, J Y

    2015-09-01

    A reference staging series of 18 morphological stages of laboratory reared lake whitefish Coregonus clupeaformis is provided. The developmental processes of blastulation, gastrulation, neurulation as well as development of the eye, circulatory system, chromatophores and mouth are included and accompanied by detailed descriptions and live imaging. Quantitative measurements of embryo size and mass were taken at each developmental stage. Eggs were 3·19 ± 0·16 mm (mean ± s.d.) in diameter at fertilization and embryos reached a total length (LT) of 14·25 ± 0·41 mm at hatch. Separated yolk and embryo dry mass were 0·25 ± 0·08 mg and 1·39 ± 0·17 mg, respectively, at hatch. The effects of two common preservatives (formalin and ethanol) were examined throughout development and post hatch. Embryo LT significantly decreased following fixation at all points in development. A correction factor to estimate live LT from corresponding fixed LT was determined as live LT = (fixed LT)(1·025). Eye diameter and yolk area measurements significantly increased in fixed compared with live embryos up to 85-90% development for both measurements. The described developmental stages can be generalized to teleost species, and are particularly relevant for the study of coregonid development due to additionally shared developmental characteristics. The results of this study and staging series are therefore applicable across various research streams encompassing numerous species that require accurate staging of embryos and descriptions of morphological development. © 2015 The Fisheries Society of the British Isles.
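
    For concreteness, the reported correction factor can be applied as a one-line calculation; the fixed length used below is a hypothetical value chosen to round-trip to the reported live length at hatch.

    ```python
    def live_length(fixed_lt_mm, correction=1.025):
        """Estimate live total length from a formalin/ethanol-fixed measurement,
        using the correction factor reported in the abstract."""
        return fixed_lt_mm * correction

    print(live_length(13.90))  # a fixed embryo of 13.90 mm corresponds to ~14.25 mm live
    ```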

  5. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    NASA Astrophysics Data System (ADS)

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods, and Statistical Methods Practicum aim to equip Mathematics Education students with descriptive and inferential statistics. Students' understanding of descriptive and inferential statistics is important in the Mathematics Education Department, especially for those whose final project involves quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their data, and to relate the independent and dependent variables defined in their research. In fact, when students carried out final projects involving quantitative research, it was not rare to find them making mistakes in drawing conclusions and in choosing the hypothesis-testing procedure; as a result, they reached incorrect conclusions, a serious error for anyone doing quantitative research. Several outcomes were gained from implementing reflective pedagogy in the teaching and learning process of the Statistical Methods and Statistical Methods Practicum courses: 1. Twenty-two students passed the course and one student did not. 2. The highest grade, A, was achieved by 18 students. 3. According to all the students, they were able to develop their critical stance through the learning process in this course. 4. All students agreed that the learning process they underwent in the course enabled them to build care for one another.

  6. The Development, Description and Appraisal of an Emergent Multimethod Research Design to Study Workforce Changes in Integrated Care Interventions.

    PubMed

    Busetto, Loraine; Luijkx, Katrien; Calciolari, Stefano; González-Ortiz, Laura G; Vrijhoef, Hubertus J M

    2017-03-08

    In this paper, we provide a detailed and explicit description of the processes and decisions underlying and shaping the emergent multimethod research design of our study on workforce changes in integrated chronic care. The study was originally planned as mixed method research consisting of a preliminary literature review and quantitative check of these findings via a Delphi panel. However, when the findings of the literature review were not appropriate for quantitative confirmation, we chose to continue our qualitative exploration of the topic via qualitative questionnaires and secondary analysis of two best practice case reports. The resulting research design is schematically described as an emergent and interactive multimethod design with multiphase combination timing. In doing so, we provide other researchers with a set of theory- and experience-based options to develop their own multimethod research and provide an example for more detailed and structured reporting of emergent designs. We argue that the terminology developed for the description of mixed methods designs should also be used for multimethod designs such as the one presented here.

  7. A spectral approach for the quantitative description of cardiac collagen network from nonlinear optical imaging.

    PubMed

    Masè, Michela; Cristoforetti, Alessandro; Avogaro, Laura; Tessarolo, Francesco; Piccoli, Federico; Caola, Iole; Pederzolli, Carlo; Graffigna, Angelo; Ravelli, Flavia

    2015-01-01

    The assessment of collagen structure in cardiac pathology, such as atrial fibrillation (AF), is essential for a complete understanding of the disease. This paper introduces a novel methodology for the quantitative description of collagen network properties, based on the combination of nonlinear optical microscopy with a spectral approach of image processing and analysis. Second-harmonic generation (SHG) microscopy was applied to atrial tissue samples from cardiac surgery patients, providing label-free, selective visualization of the collagen structure. The spectral analysis framework, based on 2D-FFT, was applied to the SHG images, yielding a multiparametric description of collagen fiber orientation (angle and anisotropy indexes) and texture scale (dominant wavelength and peak dispersion indexes). The proof-of-concept application of the methodology showed the capability of our approach to detect and quantify differences in the structural properties of the collagen network in AF versus sinus rhythm patients. These results suggest the potential of our approach in the assessment of collagen properties in cardiac pathologies related to a fibrotic structural component.
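
    The core spectral idea, estimating a dominant fiber orientation from the 2D Fourier power spectrum, can be sketched as below on a synthetic striped image standing in for an SHG acquisition. The image, mask radii, and simple peak-picking step are illustrative assumptions, not the multiparametric indexes defined in the paper.

    ```python
    import numpy as np

    # Synthetic striped texture (stand-in for an SHG image): stripes oriented at ~120 deg
    N = 256
    y, x = np.mgrid[0:N, 0:N]
    theta = np.deg2rad(30.0)                      # direction of the spatial wavevector
    img = np.sin(2 * np.pi * (x * np.cos(theta) + y * np.sin(theta)) / 16.0)

    # Centred power spectrum
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    ky, kx = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
    angle = np.arctan2(ky, kx)
    radius = np.hypot(kx, ky)
    mask = (radius > 2) & (radius < N // 2)       # ignore the DC peak and the corners

    # The spectral peak lies along the wavevector; fibers run perpendicular to it
    peak = np.argmax(power[mask])
    fiber_angle = (np.rad2deg(angle[mask][peak]) + 90.0) % 180.0
    print(f"Estimated fiber orientation: ~{fiber_angle:.0f} deg")
    ```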

  8. Quantitative descriptive analysis of Italian polenta produced with different corn cultivars.

    PubMed

    Zeppa, Giuseppe; Bertolino, Marta; Rolle, Luca

    2012-01-30

    Polenta is a porridge-like dish, generally made by mixing cornmeal with salt water and stirring constantly while cooking over a low heat. It can be eaten plain, straight from the pan, or topped with various foods (cheeses, meat, sausages, fish, etc.). It is most popular in northern Italy but can also be found in Switzerland, Austria, Croatia, Argentina and other countries in Eastern Europe and South America. Despite this diffusion, there are no data concerning the sensory characteristics of this product. A research study was therefore carried out to define the lexicon for a sensory profile of polenta and relationships with corn cultivars. A lexicon with 13 sensory parameters was defined and validated before references were determined. After panel training, the sensory profiles of 12 autochthonous maize cultivars were defined. The results of this research highlighted that quantitative descriptive analysis can also be used for the sensory description of polenta, and that the defined lexicon can be used to describe the sensory qualities of polenta for both basic research, such as maize selection, and product development. Copyright © 2011 Society of Chemical Industry.

  9. Multi-scale modeling of diffusion-controlled reactions in polymers: renormalisation of reactivity parameters.

    PubMed

    Everaers, Ralf; Rosa, Angelo

    2012-01-07

    The quantitative description of polymeric systems requires hierarchical modeling schemes, which bridge the gap between the atomic scale, relevant to chemical or biomolecular reactions, and the macromolecular scale, where the longest relaxation modes occur. Here, we use the formalism for diffusion-controlled reactions in polymers developed by Wilemski, Fixman, and Doi to discuss the renormalisation of the reactivity parameters in polymer models with varying spatial resolution. In particular, we show that the adjustments are independent of chain length. As a consequence, it is possible to match reaction times between descriptions with different resolution for relatively short reference chains and to use the coarse-grained model to make quantitative predictions for longer chains. We illustrate our results by a detailed discussion of the classical problem of chain cyclization in the Rouse model, which offers the simplest example of a multi-scale description, if we consider differently discretized Rouse models for the same physical system. Moreover, we are able to explore different combinations of compact and non-compact diffusion in the local and large-scale dynamics by varying the embedding dimension.

  10. Hispanic nurses' experiences of bias in the workplace.

    PubMed

    Moceri, Joane T

    2014-01-01

    The continuing issue of health inequity for Hispanics highlights the importance of retaining Hispanic nurses in the workplace. This article describes the use of short answers such as "Describe the bias you experienced" and "If a patient refused care, what was the reason given?" to increase understandings about bias through the descriptions of Hispanic nurses. In this study, bias was defined as those implicit negative stereotypes and attitudes that negatively affect judgments about, evaluations of, and actions toward others. For this qualitative component of a descriptive study employing both qualitative and quantitative methods, 111 Hispanic nurses responded to open-ended questions about experiences of bias that were included with a survey tool and demographic questionnaire. Three themes emerged: being overlooked and undervalued, having to prove competency, and living with "only-ness." Respect was an overarching concept. The written descriptions of bias provided depth and understanding to the quantitative findings. Nurse leaders are well positioned to develop and implement strategies to more effectively support Hispanic nurses and to promote nonbiased interactions in the workplace. Retaining Hispanic nurses is a vital component to address issues of health inequity for Hispanic patients.

  11. Using multiple PCR and CE with chemiluminescence detection for simultaneous qualitative and quantitative analysis of genetically modified organism.

    PubMed

    Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan

    2008-09-01

    In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system had been demonstrated for qualitative and quantitative analysis of genetically modified organism-the detection of Roundup Ready Soy (RRS) samples was presented as an example. The promoter, terminator, function and two reference genes of RRS were amplified with multiplex PCR simultaneously. After that, the multiplex PCR products were labeled with acridinium ester at the 5'-terminal through an amino modification and then analyzed by the proposed CE-CL system. Reproducibility of analysis times and peak heights for the CE-CL analysis were determined to be better than 0.91 and 3.07% (RSD, n=15), respectively, for three consecutive days. It was shown that this method could accurately and qualitatively detect RRS standards and the simulative samples. The evaluation in terms of quantitative analysis of RRS provided by this new method was confirmed by comparing our assay results with those of the standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dyes. The results showed a good coherence between the two methods. This approach demonstrated the possibility for accurate qualitative and quantitative detection of GM plants in a single run.

  12. Serial Scanning and Registration of High Resolution Quantitative Computed Tomography Volume Scans for the Determination of Local Bone Density Changes

    NASA Technical Reports Server (NTRS)

    Whalen, Robert T.; Napel, Sandy; Yan, Chye H.

    1996-01-01

    Progress in development of the methods required to study bone remodeling as a function of time is reported. The following topics are presented: 'A New Methodology for Registration Accuracy Evaluation', 'Registration of Serial Skeletal Images for Accurately Measuring Changes in Bone Density', and 'Precise and Accurate Gold Standard for Multimodality and Serial Registration Method Evaluations.'

  13. Nonsurgical Brain Activity Recovery From a Cap Containing Multiple Electroencephalogram Recording Sites

    DTIC Science & Technology

    2006-09-01

    The available record text consists of fragmentary reference citations rather than an abstract. The recoverable fragments include a reference attributed to Astin (December 1965); Agranovich, "The theory of operators with dominant main diagonal. I.", Positivity, Volume 2 (1998), page 153 ff.; "A Spherical Harmonics Solution for Radiative Transfer Problems with Reflecting Boundaries and Internal Sources", Journal of Quantitative Spectroscopy; and "A Quantitative Description of Membrane Current and its Application to Conduction and Excitation in Nerve", Journal of Physiology (1952).

  14. Automated bone segmentation from large field of view 3D MR images of the hip joint

    NASA Astrophysics Data System (ADS)

    Xia, Ying; Fripp, Jurgen; Chandra, Shekhar S.; Schwarz, Raphael; Engstrom, Craig; Crozier, Stuart

    2013-10-01

    Accurate bone segmentation in the hip joint region from magnetic resonance (MR) images can provide quantitative data for examining pathoanatomical conditions such as femoroacetabular impingement through to varying stages of osteoarthritis to monitor bone and associated cartilage morphometry. We evaluate two state-of-the-art methods (multi-atlas and active shape model (ASM) approaches) on bilateral MR images for automatic 3D bone segmentation in the hip region (proximal femur and innominate bone). Bilateral MR images of the hip joints were acquired at 3T from 30 volunteers. Image sequences included water-excitation dual echo stead state (FOV 38.6 × 24.1 cm, matrix 576 × 360, thickness 0.61 mm) in all subjects and multi-echo data image combination (FOV 37.6 × 23.5 cm, matrix 576 × 360, thickness 0.70 mm) for a subset of eight subjects. Following manual segmentation of femoral (head-neck, proximal-shaft) and innominate (ilium+ischium+pubis) bone, automated bone segmentation proceeded via two approaches: (1) multi-atlas segmentation incorporating non-rigid registration and (2) an advanced ASM-based scheme. Mean inter- and intra-rater reliability Dice's similarity coefficients (DSC) for manual segmentation of femoral and innominate bone were (0.970, 0.963) and (0.971, 0.965). Compared with manual data, mean DSC values for femoral and innominate bone volumes using automated multi-atlas and ASM-based methods were (0.950, 0.922) and (0.946, 0.917), respectively. Both approaches delivered accurate (high DSC values) segmentation results; notably, ASM data were generated in substantially less computational time (12 min versus 10 h). Both automated algorithms provided accurate 3D bone volumetric descriptions for MR-based measures in the hip region. The highly computational efficient ASM-based approach is more likely suitable for future clinical applications such as extracting bone-cartilage interfaces for potential cartilage segmentation.
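
    The Dice similarity coefficient used above as the agreement metric is straightforward to compute; a minimal sketch on toy binary masks follows (the masks are placeholders, not MR segmentations).

    ```python
    import numpy as np

    def dice(mask_a, mask_b):
        """Dice similarity coefficient between two binary segmentation masks."""
        a, b = mask_a.astype(bool), mask_b.astype(bool)
        intersection = np.logical_and(a, b).sum()
        return 2.0 * intersection / (a.sum() + b.sum())

    # Toy 2D masks standing in for manual vs automated bone segmentations
    manual = np.zeros((64, 64), dtype=bool); manual[16:48, 16:48] = True
    auto = np.zeros((64, 64), dtype=bool); auto[18:50, 16:48] = True
    print(f"DSC = {dice(manual, auto):.3f}")
    ```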

  16. Theory of bi-molecular association dynamics in 2D for accurate model and experimental parameterization of binding rates

    PubMed Central

    Yogurtcu, Osman N.; Johnson, Margaret E.

    2015-01-01

    The dynamics of association between diffusing and reacting molecular species are routinely quantified using simple rate-equation kinetics that assume both well-mixed concentrations of species and a single rate constant for parameterizing the binding rate. In two dimensions (2D), however, even when systems are well-mixed, the assumption of a single characteristic rate constant for describing association is not generally accurate, due to the properties of diffusional searching in dimensions d ≤ 2. Establishing rigorous bounds for discriminating between 2D reactive systems that will be accurately described by rate equations with a single rate constant, and those that will not, is critical for both modeling and experimentally parameterizing binding reactions restricted to surfaces such as cellular membranes. We show here that in regimes of intrinsic reaction rate (ka) and diffusion (D) parameters ka/D > 0.05, a single rate constant cannot be fit to the dynamics of concentrations of associating species independently of the initial conditions. Instead, a more sophisticated multi-parametric description than rate equations is necessary to robustly characterize bimolecular reactions from experiment. Our quantitative bounds derive from our new analysis of 2D rate behavior predicted from Smoluchowski theory. Using a recently developed single particle reaction-diffusion algorithm we extend here to 2D, we are able to test and validate the predictions of Smoluchowski theory and several other theories of reversible reaction dynamics in 2D for the first time. Finally, our results also mean that simulations of reactive systems in 2D using rate equations must be undertaken with caution when reactions have ka/D > 0.05, regardless of the simulation volume. We introduce here a simple formula for an adaptive, concentration-dependent rate constant for these chemical kinetics simulations, which improves on existing formulas to better capture non-equilibrium reaction dynamics from dilute to dense systems. PMID:26328828
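
    The reported bound is easy to apply in practice, since in 2D both the intrinsic rate ka and the diffusion coefficient D carry units of area per time, making ka/D dimensionless. A trivial helper is sketched below with hypothetical membrane-protein values.

    ```python
    def single_rate_constant_ok(ka_um2_per_s, d_um2_per_s, threshold=0.05):
        """In 2D the intrinsic rate ka and the diffusion coefficient D both have units
        of area/time, so ka/D is dimensionless; below ~0.05 a single rate constant is
        expected to describe the association kinetics."""
        return (ka_um2_per_s / d_um2_per_s) <= threshold

    # Hypothetical membrane species
    print(single_rate_constant_ok(0.01, 1.0))   # True: rate-equation description adequate
    print(single_rate_constant_ok(0.10, 1.0))   # False: multi-parametric description needed
    ```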

  17. Intraoperative perception and estimates on extent of resection during awake glioma surgery: overcoming the learning curve.

    PubMed

    Lau, Darryl; Hervey-Jumper, Shawn L; Han, Seunggu J; Berger, Mitchel S

    2018-05-01

    OBJECTIVE There is ample evidence that extent of resection (EOR) is associated with improved outcomes for glioma surgery. However, it is often difficult to accurately estimate EOR intraoperatively, and surgeon accuracy has yet to be reviewed. In this study, the authors quantitatively assessed the accuracy of intraoperative perception of EOR during awake craniotomy for tumor resection. METHODS A single-surgeon experience of performing awake craniotomies for tumor resection over a 17-year period was examined. Retrospective review of operative reports for quantitative estimation of EOR was recorded. Definitive EOR was based on postoperative MRI. Analysis of accuracy of EOR estimation was examined both as a general outcome (gross-total resection [GTR] or subtotal resection [STR]), and quantitatively (5% within EOR on postoperative MRI). Patient demographics, tumor characteristics, and surgeon experience were examined. The effects of accuracy on motor and language outcomes were assessed. RESULTS A total of 451 patients were included in the study. Overall accuracy of intraoperative perception of whether GTR or STR was achieved was 79.6%, and overall accuracy of quantitative perception of resection (within 5% of postoperative MRI) was 81.4%. There was a significant difference (p = 0.049) in accuracy for gross perception over the 17-year period, with improvement over the later years: 1997-2000 (72.6%), 2001-2004 (78.5%), 2005-2008 (80.7%), and 2009-2013 (84.4%). Similarly, there was a significant improvement (p = 0.015) in accuracy of quantitative perception of EOR over the 17-year period: 1997-2000 (72.2%), 2001-2004 (69.8%), 2005-2008 (84.8%), and 2009-2013 (93.4%). This improvement in accuracy is demonstrated by the significantly higher odds of correctly estimating quantitative EOR in the later years of the series on multivariate logistic regression. Insular tumors were associated with the highest accuracy of gross perception (89.3%; p = 0.034), but lowest accuracy of quantitative perception (61.1% correct; p < 0.001) compared with tumors in other locations. Even after adjusting for surgeon experience, this particular trend for insular tumors remained true. The absence of 1p19q co-deletion was associated with higher quantitative perception accuracy (96.9% vs 81.5%; p = 0.051). Tumor grade, recurrence, diagnosis, and isocitrate dehydrogenase-1 (IDH-1) status were not associated with accurate perception of EOR. Overall, new neurological deficits occurred in 8.4% of cases, and 42.1% of those new neurological deficits persisted after the 3-month follow-up. Correct quantitative perception was associated with lower postoperative motor deficits (2.4%) compared with incorrect perceptions (8.0%; p = 0.029). There were no detectable differences in language outcomes based on perception of EOR. CONCLUSIONS The findings from this study suggest that there is a learning curve associated with the ability to accurately assess intraoperative EOR during glioma surgery, and it may take more than a decade to be truly proficient. Understanding the factors associated with this ability to accurately assess EOR will provide safer surgeries while maximizing tumor resection.

  18. On soft clipping of Zernike moments for deblurring and enhancement of optical point spread functions

    NASA Astrophysics Data System (ADS)

    Becherer, Nico; Jödicke, Hanna; Schlosser, Gregor; Hesser, Jürgen; Zeilfelder, Frank; Männer, Reinhard

    2006-02-01

    Blur and noise originating from the physical imaging processes degrade microscope data. Accurate deblurring techniques, however, require an accurate estimate of the underlying point-spread function (PSF). A good representation of PSFs can be achieved with Zernike polynomials, since they offer a compact representation in which low-order coefficients represent typical aberrations of optical wavefronts while noise is concentrated in higher-order coefficients. A quantitative description of the (Gaussian) noise distribution over the Zernike moments of various orders is given, which is the basis for the new soft-clipping approach to denoising PSFs. Instead of discarding moments beyond a certain order, those Zernike moments that are more sensitive to noise are dampened according to the measured distribution and the adopted noise model. Further, a new scheme to combine experimental and theoretical PSFs in Zernike space is presented. In our experimental reconstructions, using the new improved PSF raised the correlation between the reconstructed and original volume by 15% in typical cases and by up to 85% in the case of thin fibre structures, compared with reconstructions using a non-improved PSF. Finally, we demonstrate the advantages of our approach on 3D images from confocal microscopes by generating visually improved volumes. Additionally, we present a method to render the reconstructed results using a new, almost artifact-free volume rendering method. The new approach is based on a shear-warp technique, wavelet data encoding techniques, and a recent approach that approximates the gray value distribution with a super-spline model.
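
    The soft-clipping idea, damping rather than discarding noise-sensitive high-order moments, can be sketched as a smooth weighting of a coefficient vector. The logistic weight, cutoff, and coefficient values below are illustrative assumptions, not the distribution-matched damping used by the authors.

    ```python
    import numpy as np

    def soft_clip(zernike_coeffs, cutoff_order=8, softness=3.0):
        """Smoothly attenuate noise-dominated high-order Zernike moments instead of
        hard truncation. Coefficients are assumed ordered by increasing order;
        the logistic weight and its parameters are illustrative only."""
        orders = np.arange(len(zernike_coeffs))
        weights = 1.0 / (1.0 + np.exp((orders - cutoff_order) / softness))
        return zernike_coeffs * weights

    # Hypothetical coefficients: low orders carry the PSF shape, high orders mostly noise
    rng = np.random.default_rng(2)
    coeffs = np.concatenate([np.array([1.0, 0.5, 0.3, 0.2]), 0.05 * rng.standard_normal(16)])
    print(np.round(soft_clip(coeffs), 4))
    ```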

  19. An analytical treatment for three neutrino oscillations in the Earth

    NASA Astrophysics Data System (ADS)

    Aguilar-Arevalo, A. A.; D'Olivo, J. C.; Supanitsky, A. D.

    2012-08-01

    A simple, and at the same time accurate, description of the Earth matter effects on the oscillations between three neutrino flavors is given in terms of the Magnus expansion for the evolution operator.

  20. Quantitative characterization of the spatial distribution of particles in materials: Application to materials processing

    NASA Technical Reports Server (NTRS)

    Parse, Joseph B.; Wert, J. A.

    1991-01-01

    Inhomogeneities in the spatial distribution of second phase particles in engineering materials are known to affect certain mechanical properties. Progress in this area has been hampered by the lack of a convenient method for quantitative description of the spatial distribution of the second phase. This study intends to develop a broadly applicable method for the quantitative analysis and description of the spatial distribution of second phase particles. The method was designed to operate on a desktop computer. The Dirichlet tessellation technique (geometrical method for dividing an area containing an array of points into a set of polygons uniquely associated with the individual particles) was selected as the basis of an analysis technique implemented on a PC. This technique is being applied to the production of Al sheet by PM processing methods; vacuum hot pressing, forging, and rolling. The effect of varying hot working parameters on the spatial distribution of aluminum oxide particles in consolidated sheet is being studied. Changes in distributions of properties such as through-thickness near-neighbor distance correlate with hot-working reduction.
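
    A minimal sketch of the Dirichlet (Voronoi) tessellation step and of a simple near-neighbour distance descriptor, using SciPy on synthetic particle centroids, is given below; it is a generic illustration rather than the PC implementation described above.

    ```python
    import numpy as np
    from scipy.spatial import Voronoi, cKDTree

    rng = np.random.default_rng(3)
    particles = rng.uniform(0.0, 100.0, size=(200, 2))   # synthetic centroids (arbitrary units)

    # Dirichlet (Voronoi) tessellation: one cell per particle
    vor = Voronoi(particles)
    print("Voronoi cells constructed for", len(vor.point_region), "particles")

    # Near-neighbour distance distribution as a simple descriptor of clustering
    dist, _ = cKDTree(particles).query(particles, k=2)   # k=1 is the point itself
    nnd = dist[:, 1]
    print(f"Mean near-neighbour distance: {nnd.mean():.2f}, CV: {nnd.std() / nnd.mean():.2f}")
    ```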

  1. Taxonomy of Macromotettixoides with the description of a new species (Tetrigidae, Metrodorinae)

    PubMed Central

    Zha, Ling-Sheng; Yu, Feng-Ming; Boonmee, Saranyaphat; Eungwanichayapant, Prapassorn D.; Wen, Ting-Chi

    2017-01-01

    Descriptions of the flying organs and generic characteristics of the genus Macromotettixoides Zheng, Wei & Jiang are currently imprecise. Macromotettixoides is reviewed and compared with allied genera. A re-description is undertaken and a determination key to Macromotettixoides is provided. Macromotettixoides parvula Zha & Wen, sp. n., from the Guizhou Karst Region, China, is described and illustrated with photographs. Observations on the ecology and habits of the new species are recorded. Four current species of Hyboella Hancock are transferred to Macromotettixoides. Variations of the flying organs and tegminal sinus in the Tetrigidae are discussed, which will help to describe them accurately. PMID:28228664

  2. A review of quantitative structure-property relationships for the fate of ionizable organic chemicals in water matrices and identification of knowledge gaps.

    PubMed

    Nolte, Tom M; Ragas, Ad M J

    2017-03-22

    Many organic chemicals are ionizable by nature. After use and release into the environment, various fate processes determine their concentrations, and hence exposure to aquatic organisms. In the absence of suitable data, such fate processes can be estimated using Quantitative Structure-Property Relationships (QSPRs). In this review we compiled available QSPRs from the open literature and assessed their applicability towards ionizable organic chemicals. Using quantitative and qualitative criteria we selected the 'best' QSPRs for sorption, (a)biotic degradation, and bioconcentration. The results indicate that many suitable QSPRs exist, but some critical knowledge gaps remain. Specifically, future focus should be directed towards the development of QSPR models for biodegradation in wastewater and sediment systems, direct photolysis and reaction with singlet oxygen, as well as additional reactive intermediates. Adequate QSPRs for bioconcentration in fish exist, but more accurate assessments can be achieved using pharmacologically based toxicokinetic (PBTK) models. No adequate QSPRs exist for bioconcentration in non-fish species. Due to the high variability of chemical and biological species as well as environmental conditions in QSPR datasets, accurate predictions for specific systems and inter-dataset conversions are problematic, for which standardization is needed. For all QSPR endpoints, additional data requirements involve supplementing the current chemical space covered and accurately characterizing the test systems used.

  3. Fluctuation localization imaging-based fluorescence in situ hybridization (fliFISH) for accurate detection and counting of RNA copies in single cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yi; Hu, Dehong; Markillie, Lye Meng

    Quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple FISH sub-probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific binding of sub-probes and tissue autofluorescence, limiting its accuracy. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on blinking frequencies of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses blinking frequency patterns, emitted from a transcript bound to multiple sub-probes, which are distinct from blinking patterns emitted from partial or nonspecifically bound sub-probes and autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor, NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct blinking frequency patterns, laying the foundation for reliable single-cell transcriptomics.

  4. CPTAC Accelerates Precision Proteomics Biomedical Research | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The accurate quantitation of proteins or peptides using Mass Spectrometry (MS) is gaining prominence in the biomedical research community as an alternative method for analyte measurement. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) investigators have been at the forefront in the promotion of reproducible MS techniques, through the development and application of standardized proteomic methods for protein quantitation on biologically relevant samples.

  5. Factors That Contribute to Assay Variation in Quantitative Analysis of Sex Steroid Hormones Using Liquid and Gas Chromatography-Mass Spectrometry

    ERIC Educational Resources Information Center

    Xu, Xia; Veenstra, Timothy D.

    2012-01-01

    The list of physiological events in which sex steroids play a role continues to increase. To decipher the roles that sex steroids play in any condition requires high quality cohorts of samples and assays that provide highly accurate quantitative measures. Liquid and gas chromatography coupled with mass spectrometry (LC-MS and GC-MS) have…

  6. Methods for characterizing convective cryoprobe heat transfer in ultrasound gel phantoms.

    PubMed

    Etheridge, Michael L; Choi, Jeunghwan; Ramadhyani, Satish; Bischof, John C

    2013-02-01

    While cryosurgery has proven capable of treating a variety of conditions, it has met with some resistance among physicians, in part due to shortcomings in the ability to predict treatment outcomes. Here we attempt to address several key issues related to predictive modeling by demonstrating methods for accurately characterizing heat transfer from cryoprobes, reporting temperature-dependent thermal properties for ultrasound gel (a convenient tissue phantom) down to cryogenic temperatures, and demonstrating the ability of convective exchange heat transfer boundary conditions to accurately describe freezing in the case of single and multiple interacting cryoprobes. Temperature-dependent changes in the specific heat and thermal conductivity for ultrasound gel are reported down to -150 °C for the first time here, and these data were used to accurately describe freezing in ultrasound gel in subsequent modeling. Freezing around one and two interacting cryoprobes was characterized in the ultrasound gel phantom by mapping the temperature in and around the "iceball" with carefully placed thermocouple arrays. These experimental data were fit with finite-element modeling in COMSOL Multiphysics, which was used to investigate the sensitivity and effectiveness of convective boundary conditions in describing heat transfer from the cryoprobes. Heat transfer at the probe tip was described in terms of a convective coefficient and the cryogen temperature. While model accuracy depended strongly on spatial (i.e., along the exchange surface) variation in the convective coefficient, it was much less sensitive to spatial and transient variations in the cryogen temperature parameter. The optimized-fit convective exchange conditions for the single-probe case also provided close agreement with the experimental data for the case of two interacting cryoprobes, suggesting that this basic characterization and modeling approach can be extended to accurately describe more complicated, multiprobe freezing geometries. Accurately characterizing cryoprobe behavior in phantoms requires detailed knowledge of the freezing medium's properties throughout the range of expected temperatures and an appropriate description of the heat transfer across the probe's exchange surfaces. Here we demonstrate that convective exchange boundary conditions provide an accurate and versatile description of heat transfer from cryoprobes, offering potential advantages over the traditional constant surface heat flux and constant surface temperature descriptions. In addition, although this study was conducted on Joule-Thomson type cryoprobes, the general methodologies should extend to any probe that is based on convective exchange with a cryogenic fluid.
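
    To make the role of a convective (Robin) exchange boundary concrete, the sketch below runs a one-dimensional explicit finite-difference conduction model with a convective condition at the probe surface. The material properties, heat transfer coefficient, and absence of phase change are simplifying assumptions for illustration only, not the fitted values or the COMSOL model used in the study.

    ```python
    import numpy as np

    # Placeholder constant properties for a gel-like phantom (no phase change) --
    # illustrative values only, not the measured temperature-dependent properties.
    k, rho, cp = 0.6, 1000.0, 4000.0             # W/(m K), kg/m^3, J/(kg K)
    alpha = k / (rho * cp)
    h, T_cryogen, T_init = 500.0, -150.0, 20.0   # W/(m^2 K), degC, degC

    L, n = 0.05, 101                             # 5 cm slab, grid points
    dx = L / (n - 1)
    dt = 0.2 * dx**2 / alpha                     # conservative explicit time step
    T = np.full(n, T_init)

    for _ in range(2000):
        Tn = T.copy()
        # interior nodes: explicit conduction update
        T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
        # convective (Robin) condition at the probe surface x = 0, from an energy
        # balance on the half cell: rho*cp*(dx/2)*dT0/dt = h*(T_cryo - T0) + k*(T1 - T0)/dx
        T[0] = Tn[0] + dt / (rho * cp * dx / 2) * (h * (T_cryogen - Tn[0]) + k * (Tn[1] - Tn[0]) / dx)
        T[-1] = T_init                           # far boundary held at ambient

    print(f"Surface temperature after {2000 * dt:.0f} s: {T[0]:.1f} degC")
    ```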

  7. A multi-subject evaluation of uncertainty in anatomical landmark location on shoulder kinematic description.

    PubMed

    Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J

    2009-04-01

    An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has the potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency in results across multiple subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1st and 99th percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.
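
    The probabilistic analysis summarized above can be illustrated with a simple Monte Carlo sketch: perturb landmark coordinates with zero-mean Gaussian noise, rebuild an anatomical coordinate frame, and report the spread of the resulting Euler angles. The frame construction and angle sequence below are simplified assumptions, not the study's anatomical definitions.

        import numpy as np
        from scipy.spatial.transform import Rotation

        def frame_from_landmarks(p_origin, p_x, p_plane):
            """Build an orthonormal frame from three landmarks (simplified convention)."""
            x = p_x - p_origin
            x /= np.linalg.norm(x)
            z = np.cross(x, p_plane - p_origin)
            z /= np.linalg.norm(z)
            y = np.cross(z, x)
            return np.column_stack([x, y, z])  # rotation matrix, columns = axes

        rng = np.random.default_rng(0)
        landmarks = {"origin": np.array([0.0, 0.0, 0.0]),
                     "x_ref": np.array([150.0, 0.0, 0.0]),
                     "plane_ref": np.array([0.0, 100.0, 0.0])}  # mm, hypothetical positions

        sigma = 4.0  # mm, landmark location standard deviation
        angles = []
        for _ in range(5000):
            noisy = {k: v + rng.normal(0.0, sigma, 3) for k, v in landmarks.items()}
            M = frame_from_landmarks(noisy["origin"], noisy["x_ref"], noisy["plane_ref"])
            angles.append(Rotation.from_matrix(M).as_euler("YXZ", degrees=True))

        angles = np.array(angles)
        lo, hi = np.percentile(angles, [1, 99], axis=0)
        print("1-99 percentile envelope per Euler angle (deg):", hi - lo)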

  8. Improvement of medical content in the curriculum of biomedical engineering based on assessment of students outcomes.

    PubMed

    Abdulhay, Enas; Khnouf, Ruba; Haddad, Shireen; Al-Bashir, Areen

    2017-08-04

    Improvement of medical content in Biomedical Engineering curricula based on a qualitative assessment process or on a comparison with another high-standard program has been approached by a number of studies. However, the quantitative assessment tools have not been emphasized. The quantitative assessment tools can be more accurate and robust in challenging multidisciplinary fields such as Biomedical Engineering, which includes biomedicine elements mixed with technology aspects. The major limitations of the previous research are the high dependence on surveys or pure qualitative approaches as well as the absence of strong focus on medical outcomes without implicit confusion with the technical ones. The proposed work presents the development and evaluation of an accurate/robust quantitative approach to the improvement of the medical content in the challenging multidisciplinary BME curriculum. The work presents quantitative assessment tools and subsequent improvement of curriculum medical content applied, as an illustrative example, to the ABET (Accreditation Board for Engineering and Technology, USA) accredited biomedical engineering BME department at Jordan University of Science and Technology. The quantitative results of assessment of curriculum/course, capstone, exit exam, course assessment by student (CAS) as well as of surveys completed by alumni, seniors, employers and training supervisors were, first, mapped to the expected students' outcomes related to the medical field (SOsM). The collected data were then analyzed and discussed to find curriculum weak points by tracking shortcomings in the degree of achievement of every outcome. Finally, actions were taken to fill in the gaps of the curriculum. Actions were also mapped to the students' medical outcomes (SOsM). Weighted averages of obtained quantitative values, mapped to SOsM, accurately indicated the achievement levels of all outcomes as well as the necessary improvements to be made in the curriculum. Mapping the improvements to SOsM also helps in the assessment of the following cycle. The suggested assessment tools can be generalized and extended to any other BME department. Robust improvement of medical content in the BME curriculum can subsequently be achieved.

  9. 78 FR 72119 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-02

    ... objective, independent, third party to evaluate that the questionnaire has a format and scope that minimizes... impact indicators. These indicators are both quantitative and descriptive and may include, for example...

  10. A Descriptive Study of the Factors Influencing the Degree to Which Fourth-, Fifth-, and Sixth-Grade Virtual Education Students Perceive a Sense of Virtual Community

    ERIC Educational Resources Information Center

    Gerth, Dana A.

    2013-01-01

    Review of literature revealed a shortage of research describing the development of K-12 virtual communities and the absence of a tool to measure sense of virtual community in K-12 virtual education students. The purpose of this descriptive, quantitative study was to examine the perception of a sense of virtual community from the perspective of…

  11. Field Research Facility Data Integration Framework Data Management Plan: Survey Lines Dataset

    DTIC Science & Technology

    2016-08-01

    CHL and its District partners. The beach morphology surveys on which this report focuses provide quantitative measures of the dynamic nature of... topography and volume change. The morphology surveys are conducted over a series of 26 shore-perpendicular profile lines spaced 50... Table 1 of the report lists the FRF survey lines dataset input data and products (e.g., ASCII LARC survey text files).

  12. DigitalHuman (DH): An Integrative Mathematical Model ofHuman Physiology

    NASA Technical Reports Server (NTRS)

    Hester, Robert L.; Summers, Richard L.; lIescu, Radu; Esters, Joyee; Coleman, Thomas G.

    2010-01-01

    Mathematical models and simulation are important tools in discovering the key causal relationships governing physiological processes and improving medical intervention when physiological complexity is a central issue. We have developed a model of integrative human physiology called DigitalHuman (DH) consisting of approximately 5000 variables modeling human physiology, describing cardiovascular, renal, respiratory, endocrine, neural and metabolic physiology. Users can view time-dependent solutions and interactively introduce perturbations by altering numerical parameters to investigate new hypotheses. The variables, parameters and quantitative relationships as well as all other model details are described in XML text files. All aspects of the model, including the mathematical equations describing the physiological processes, are written in XML open source, text-readable files. Model structure is based upon empirical data of physiological responses documented within the peer-reviewed literature. The model can be used to understand proposed physiological mechanisms and physiological interactions that may not be otherwise intuitively evident. Some of the current uses of this model include the analyses of renal control of blood pressure, the central role of the liver in creating and maintaining insulin resistance, and the mechanisms causing orthostatic hypotension in astronauts. Additionally, the open-source aspect of the modeling environment allows any investigator to add detailed descriptions of human physiology to test new concepts. The model accurately predicts both qualitative and, more importantly, quantitative changes in clinically and experimentally observed responses. DigitalHuman provides scientists a modeling environment to understand the complex interactions of integrative physiology. This research was supported by NIH HL 51971, NSF EPSCoR, and NASA.

  13. Quantitative and Comprehensive Decomposition of the Ion Atmosphere around Nucleic Acids

    PubMed Central

    Bai, Yu; Greenfeld, Max; Travers, Kevin; Chu, Vincent B.; Lipfert, Jan; Doniach, Sebastian; Herschlag, Daniel

    2011-01-01

    The ion atmosphere around nucleic acids critically affects biological and physical processes such as chromosome packing, RNA folding, and molecular recognition. However, the dynamic nature of the ion atmosphere renders it difficult to characterize. The basic thermodynamic description of this atmosphere, a full accounting of the type and number of associated ions, has remained elusive. Here we provide the first complete accounting of the ion atmosphere, using buffer equilibration and atomic emission spectroscopy (BE-AES) to accurately quantitate the cation association and anion depletion. We have examined the influence of ion size and charge on ion occupancy around simple, well-defined DNA molecules. The relative affinity of monovalent and divalent cations correlates inversely with their size. Divalent cations associate preferentially over monovalent cations; e.g., with Na+ in four-fold excess of Mg2+ (20 vs. 5 mM), the ion atmosphere nevertheless has three-fold more Mg2+ than Na+. Further, the dicationic polyamine putrescine2+ does not compete effectively for association relative to divalent metal ions, presumably because of its lower charge density. These and other BE-AES results can be used to evaluate and guide the improvement of electrostatic treatments. As a first step, we compare the BE-AES results to predictions from the widely-used nonlinear Poisson Boltzmann (NLPB) theory and assess the applicability and precision of this theory. In the future, BE-AES in conjunction with improved theoretical models, can be applied to complex binding and folding equilibria of nucleic acids and their complexes, to parse the electrostatic contribution from the overall thermodynamics of important biological processes. PMID:17990882

  14. Modeling and analysis of cell membrane systems with probabilistic model checking

    PubMed Central

    2011-01-01

    Background Recently there has been a growing interest in the application of Probabilistic Model Checking (PMC) for the formal specification of biological systems. PMC is able to exhaustively explore all states of a stochastic model and can provide valuable insights into its behavior that are more difficult to see using only traditional methods for system analysis such as deterministic and stochastic simulation. In this work we propose a stochastic model for the description and analysis of the sodium-potassium exchange pump. The sodium-potassium pump is a membrane transport system present in all animal cells and capable of moving sodium and potassium ions against their concentration gradients. Results We present a quantitative formal specification of the pump mechanism in the PRISM language, taking into consideration a discrete chemistry approach and the Law of Mass Action. We also present an analysis of the system using quantitative properties in order to verify the pump reversibility and understand the pump behavior using trend labels for the transition rates of the pump reactions. Conclusions Probabilistic model checking can be used along with other well-established approaches such as simulation and differential equations to better understand pump behavior. Using PMC we can determine whether specific events happen, such as the potassium outside the cell running out in all model traces. We can also gain a more detailed perspective on its behavior, such as determining its reversibility and why its normal operation becomes slow over time. This knowledge can be used to direct experimental research and make it more efficient, leading to faster and more accurate scientific discoveries. PMID:22369714
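
    To make the discrete-chemistry, mass-action view concrete, the following sketch runs a Gillespie stochastic simulation of a toy exchange reaction that swaps intracellular Na for extracellular K. The species, stoichiometry, and rate constants are invented for illustration and are not the PRISM specification used in the paper.

        import random

        # Toy state: ion counts (arbitrary, illustrative numbers)
        state = {"Na_in": 300, "Na_out": 100, "K_in": 100, "K_out": 300}

        # Two mass-action "reactions", one per pump cycle direction:
        #   forward: 3 Na_in + 2 K_out -> 3 Na_out + 2 K_in
        #   reverse: 3 Na_out + 2 K_in -> 3 Na_in + 2 K_out
        K_FWD, K_REV = 1e-9, 1e-11  # made-up rate constants

        def propensities(s):
            fwd = K_FWD * s["Na_in"] ** 3 * s["K_out"] ** 2 if s["Na_in"] >= 3 and s["K_out"] >= 2 else 0.0
            rev = K_REV * s["Na_out"] ** 3 * s["K_in"] ** 2 if s["Na_out"] >= 3 and s["K_in"] >= 2 else 0.0
            return fwd, rev

        t, t_end = 0.0, 10.0
        random.seed(1)
        while t < t_end:
            fwd, rev = propensities(state)
            total = fwd + rev
            if total == 0:
                break
            t += random.expovariate(total)        # waiting time to the next event
            if random.random() < fwd / total:     # choose which reaction fires
                state.update(Na_in=state["Na_in"] - 3, Na_out=state["Na_out"] + 3,
                             K_out=state["K_out"] - 2, K_in=state["K_in"] + 2)
            else:
                state.update(Na_in=state["Na_in"] + 3, Na_out=state["Na_out"] - 3,
                             K_out=state["K_out"] + 2, K_in=state["K_in"] - 2)

        print(t, state)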

  15. Pneumocystis jirovecii detection in asymptomatic patients: what does its natural history tell us?

    PubMed Central

    Alanio, Alexandre; Bretagne, Stéphane

    2017-01-01

    Pneumocystis jirovecii is an unusual ascomycetous fungus that can be detected in the lungs of healthy individuals. Transmission from human to human is one of its main characteristics in comparison with other fungi responsible for invasive infections. P. jirovecii is transmitted through the air between healthy individuals, who are considered to be the natural reservoir, at least transiently. In immunocompromised patients, P. jirovecii multiplies, leading to subacute infections and acute life-threatening pneumonia, called Pneumocystis pneumonia [PCP]. PCP is caused by genotypically distinct mixtures of organisms in more than 90% of cases, reinforcing the hypothesis that there is constant inhalation of P. jirovecii from different contacts over time, although reactivation of latent organisms from previous exposures may be possible. Detection of P. jirovecii DNA without any symptoms or related radiological signs has been called “colonization”. This situation could be considered as the result of recent exposure to P. jirovecii that could evolve towards PCP, raising the issue of cotrimoxazole prophylaxis for at-risk quantitative polymerase chain reaction (qPCR)-positive immunocompromised patients. The more accurate way to diagnose PCP is the use of real-time quantitative PCR, which prevents amplicon contamination and allows determination of the fungal load that is mandatory to interpret the qPCR results and manage the patient appropriately. The detection of P. jirovecii in respiratory samples of immunocompromised patients should be considered for potential risk of developing PCP. Many challenges still need to be addressed, including a better description of transmission, characterization of organisms present at low level, and prevention of environmental exposure during immunodepression. PMID:28649366

  16. The near-symmetry of proteins.

    PubMed

    Bonjack-Shterengartz, Maayan; Avnir, David

    2015-04-01

    The majority of protein oligomers form clusters which are nearly symmetric. Understanding of that imperfection, its origins, and perhaps also its advantages requires the conversion of the currently used vague qualitative descriptive language of the near-symmetry into an accurate quantitative measure that will allow one to answer questions such as: "What is the degree of symmetry deviation of the protein?", "How do these deviations compare within a family of proteins?", and so on. We developed quantitative methods to answer this type of question, which are capable of analyzing the whole protein, its backbone or selected portions of it, down to comparison of symmetry-related specific amino acids, and which are capable of visualizing the various levels of symmetry deviations in the form of symmetry maps. We have applied these methods on an extensive list of homomers and heteromers and found that apparently all proteins never reach perfect symmetry. Strikingly, even homomeric protein clusters are never ideally symmetric. We also found that the main burden of symmetry distortion is on the amino acids near the symmetry axis; that it is mainly the more hydrophilic amino acids that take part in symmetry-distortive interactions; and more. The remarkable ability of heteromers to preserve near-symmetry, despite the different sequences, was also shown and analyzed. The comprehensive literature on the suggested advantages of symmetric oligomerization raises a yet-unsolved key question: If symmetry is so advantageous, why do proteins stop shy of perfect symmetry? Some tentative answers to be tested in further studies are suggested in a concluding outlook. © 2014 Wiley Periodicals, Inc.

  17. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data

    PubMed Central

    Chen, Yi-Hau

    2017-01-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample sizes. This limitation also leads to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, the sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. We implemented the T2-statistic into an R package T2GA, which is available at https://github.com/roqe/T2GA. PMID:28622336

  18. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.

    PubMed

    Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin

    2017-06-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample sizes. This limitation also leads to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, the sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. We implemented the T2-statistic into an R package T2GA, which is available at https://github.com/roqe/T2GA.
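
    As a rough illustration of a multivariate pathway test that uses an externally supplied covariance rather than the sample covariance, the sketch below computes a Hotelling-type T2 statistic for a vector of mean log-ratios against a covariance assembled from interaction confidence scores. The score-to-covariance mapping and the chi-square reference distribution are simplifying assumptions, not the authors' exact construction.

        import numpy as np
        from scipy import stats

        def t2_pathway(log_ratios, confidence):
            """Hotelling-type T^2 for one pathway.

            log_ratios: (n_samples, p) array of protein log expression ratios.
            confidence: (p, p) symmetric matrix of interaction confidence scores in [0, 1].
            """
            n, p = log_ratios.shape
            xbar = log_ratios.mean(axis=0)
            sd = log_ratios.std(axis=0, ddof=1)
            # Knowledge-based covariance: confidences act as correlations (assumed form).
            corr = confidence.copy()
            np.fill_diagonal(corr, 1.0)
            sigma = corr * np.outer(sd, sd)
            t2 = n * xbar @ np.linalg.solve(sigma, xbar)
            pval = stats.chi2.sf(t2, df=p)  # asymptotic reference, illustrative only
            return t2, pval

        rng = np.random.default_rng(42)
        data = rng.normal(0.3, 1.0, size=(6, 4))   # 6 samples, 4 proteins (toy data)
        conf = np.full((4, 4), 0.6)                # hypothetical STRING-like scores
        print(t2_pathway(data, conf))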

  19. Accurate Construction of Photoactivated Localization Microscopy (PALM) Images for Quantitative Measurements

    PubMed Central

    Coltharp, Carla; Kessler, Rene P.; Xiao, Jie

    2012-01-01

    Localization-based superresolution microscopy techniques such as Photoactivated Localization Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) have allowed investigations of cellular structures with unprecedented optical resolutions. One major obstacle to interpreting superresolution images, however, is the overcounting of molecule numbers caused by fluorophore photoblinking. Using both experimental and simulated images, we determined the effects of photoblinking on the accurate reconstruction of superresolution images and on quantitative measurements of structural dimension and molecule density made from those images. We found that structural dimension and relative density measurements can be made reliably from images that contain photoblinking-related overcounting, but accurate absolute density measurements, and consequently faithful representations of molecule counts and positions in cellular structures, require the application of a clustering algorithm to group localizations that originate from the same molecule. We analyzed how applying a simple algorithm with different clustering thresholds (tThresh and dThresh) affects the accuracy of reconstructed images, and developed an easy method to select optimal thresholds. We also identified an empirical criterion to evaluate whether an imaging condition is appropriate for accurate superresolution image reconstruction with the clustering algorithm. Both the threshold selection method and imaging condition criterion are easy to implement within existing PALM clustering algorithms and experimental conditions. The main advantage of our method is that it generates a superresolution image and molecule position list that faithfully represents molecule counts and positions within a cellular structure, rather than only summarizing structural properties into ensemble parameters. This feature makes it particularly useful for cellular structures of heterogeneous densities and irregular geometries, and allows a variety of quantitative measurements tailored to specific needs of different biological systems. PMID:23251611
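
    The grouping step described above can be sketched as follows: localizations that fall within a spatial radius and a temporal gap of an existing cluster are attributed to the same molecule. The threshold names echo the abstract (dThresh, tThresh), but the greedy single-pass grouping below is a simplified illustration, not the authors' published algorithm.

        import math

        def group_localizations(locs, d_thresh, t_thresh):
            """Greedy grouping of (x, y, frame) localizations into putative molecules.

            locs must be sorted by frame. Returns one (x, y) centroid per cluster.
            """
            clusters = []  # each: running sums, count, and last frame seen
            for x, y, frame in locs:
                for c in clusters:
                    cx, cy = c["sx"] / c["n"], c["sy"] / c["n"]
                    if frame - c["last"] <= t_thresh and math.hypot(x - cx, y - cy) <= d_thresh:
                        c["sx"] += x; c["sy"] += y; c["n"] += 1; c["last"] = frame
                        break
                else:
                    clusters.append({"sx": x, "sy": y, "n": 1, "last": frame})
            return [(c["sx"] / c["n"], c["sy"] / c["n"]) for c in clusters]

        # Two blinking bursts of one molecule plus a distant molecule (toy data, nm / frames)
        locs = [(100.0, 100.0, 1), (102.0, 99.0, 2), (101.0, 101.0, 5), (500.0, 500.0, 3)]
        print(group_localizations(locs, d_thresh=30.0, t_thresh=10))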

  20. Quantitation of hepatitis B virus DNA in plasma using a sensitive cost-effective "in-house" real-time PCR assay.

    PubMed

    Daniel, Hubert Darius J; Fletcher, John G; Chandy, George M; Abraham, Priya

    2009-01-01

    Sensitive nucleic acid testing for the detection and accurate quantitation of hepatitis B virus (HBV) is necessary to reduce transmission through blood and blood products and for monitoring patients on antiviral therapy. The aim of this study is to standardize an "in-house" real-time HBV polymerase chain reaction (PCR) for accurate quantitation and screening of HBV. The "in-house" real-time assay was compared with a commercial assay using 30 chronically infected individuals and 70 blood donors who are negative for hepatitis B surface antigen, hepatitis C virus (HCV) antibody and human immunodeficiency virus (HIV) antibody. Further, 30 HBV-genotyped samples were tested to evaluate the "in-house" assay's capacity to detect genotypes prevalent among individuals attending this tertiary care hospital. The lower limit of detection of this "in-house" HBV real-time PCR was assessed against the WHO international standard and found to be 50 IU/mL. The interassay and intra-assay coefficient of variation (CV) of this "in-house" assay ranged from 1.4% to 9.4% and 0.0% to 2.3%, respectively. Virus loads as estimated with this "in-house" HBV real-time assay correlated well with the commercial artus HBV RG PCR assay ( r = 0.95, P < 0.0001). This assay can be used for the detection and accurate quantitation of HBV viral loads in plasma samples. This assay can be employed for the screening of blood donations and can potentially be adapted to a multiplex format for simultaneous detection of HBV, HIV and HCV to reduce the cost of testing in blood banks.
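
    For reference, the inter- and intra-assay coefficients of variation quoted above are the standard deviation divided by the mean of replicate measurements, expressed as a percentage; the replicate viral-load values below are invented for illustration.

        import statistics

        def percent_cv(replicates):
            """Coefficient of variation (%) of replicate viral-load measurements."""
            return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

        intra_run = [4.95e3, 5.05e3, 5.10e3]   # IU/mL, same run (hypothetical)
        inter_run = [4.6e3, 5.2e3, 5.4e3]      # IU/mL, different runs (hypothetical)
        print(f"intra-assay CV: {percent_cv(intra_run):.1f}%, inter-assay CV: {percent_cv(inter_run):.1f}%")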

  1. Photogrammetry of the Human Brain: A Novel Method for Three-Dimensional Quantitative Exploration of the Structural Connectivity in Neurosurgery and Neurosciences.

    PubMed

    De Benedictis, Alessandro; Nocerino, Erica; Menna, Fabio; Remondino, Fabio; Barbareschi, Mattia; Rozzanigo, Umberto; Corsini, Francesco; Olivetti, Emanuele; Marras, Carlo Efisio; Chioffi, Franco; Avesani, Paolo; Sarubbo, Silvio

    2018-04-13

    Anatomic awareness of the structural connectivity of the brain is mandatory for neurosurgeons, to select the most effective approaches for brain resections. Although standard microdissection is a validated technique to investigate the different white matter (WM) pathways and to verify the results of tractography, the possibility of interactive exploration of the specimens and reliable acquisition of quantitative information has not been described. Photogrammetry is a well-established technique allowing an accurate metrology on highly defined three-dimensional (3D) models. The aim of this work is to propose the application of the photogrammetric technique for supporting the 3D exploration and the quantitative analysis on the cerebral WM connectivity. The main perisylvian pathways, including the superior longitudinal fascicle and the arcuate fascicle were exposed using the Klingler technique. The photogrammetric acquisition followed each dissection step. The point clouds were registered to a reference magnetic resonance image of the specimen. All the acquisitions were coregistered into an open-source model. We analyzed 5 steps, including the cortical surface, the short intergyral fibers, the indirect posterior and anterior superior longitudinal fascicle, and the arcuate fascicle. The coregistration between the magnetic resonance imaging mesh and the point clouds models was highly accurate. Multiple measures of distances between specific cortical landmarks and WM tracts were collected on the photogrammetric model. Photogrammetry allows an accurate 3D reproduction of WM anatomy and the acquisition of unlimited quantitative data directly on the real specimen during the postdissection analysis. These results open many new promising neuroscientific and educational perspectives and also optimize the quality of neurosurgical treatments. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Thermally induced oscillations in fluid flow

    NASA Technical Reports Server (NTRS)

    Zuber, N.

    1970-01-01

    Theoretical investigation distinguishes the various mechanisms responsible for oscillations of pressure, temperature, and flow velocity, derives a quantitative description of the most troublesome mechanisms, and develops a capability to predict the occurrence of unstable flow.

  3. The Structure of Segmental Errors in the Speech of Deaf Children.

    ERIC Educational Resources Information Center

    Levitt, H.; And Others

    1980-01-01

    A quantitative description of the segmental errors occurring in the speech of deaf children is developed. Journal availability: Elsevier North Holland, Inc., 52 Vanderbilt Avenue, New York, NY 10017. (Author)

  4. 78 FR 56942 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    .... These indicators are both quantitative and descriptive and may include, for example, the characteristics... Centers, and to evaluate the progress of the program. Estimate of Burden: 185 hours per center for 223...

  5. 78 FR 50452 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    .... These indicators are both quantitative and descriptive and may include, for example, the characteristics... information to continue funding of the Centers, and to evaluate the progress of the program. Estimate of...

  6. 77 FR 32143 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-31

    ... contractor. These indicators are both quantitative and descriptive and may include, for example, the... the Centers, and to evaluate the progress of the program. Estimate of Burden: 100 hours per center for...

  7. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  8. Accurate Estimate of Some Propagation Characteristics for the First Higher Order Mode in Graded Index Fiber with Simple Analytic Chebyshev Method

    NASA Astrophysics Data System (ADS)

    Dutta, Ivy; Chowdhury, Anirban Roy; Kumbhakar, Dharmadas

    2013-03-01

    Using a Chebyshev power-series approach, accurate descriptions of the first higher-order (LP11) mode of graded-index fibers having three different profile shape functions are presented in this paper and applied to predict their propagation characteristics. These characteristics include fractional power guided through the core, excitation efficiency, and Petermann I and II spot sizes with their approximate analytic formulations. We show that, whereas two and three Chebyshev points in the LP11 mode approximation give fairly accurate results, the values based on our calculations involving four Chebyshev points match excellently with available exact numerical results.
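
    As a generic illustration of the power-series idea (not the paper's specific LP11 field expansion or collocation scheme), the sketch below fits low-order Chebyshev series of increasing length to a sample radial profile using NumPy's Chebyshev utilities.

        import numpy as np
        from numpy.polynomial import chebyshev as C

        # Hypothetical normalized radial field profile on [0, 1] (illustrative stand-in)
        r = np.linspace(0.0, 1.0, 200)
        field = r * np.exp(-2.0 * r**2)   # rough LP11-like shape: zero on axis, single lobe

        for n_points in (2, 3, 4):
            coeffs = C.chebfit(r, field, deg=n_points - 1)   # series with n_points coefficients
            approx = C.chebval(r, coeffs)
            err = np.max(np.abs(approx - field))
            print(f"{n_points} Chebyshev coefficients: max abs error = {err:.3e}")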

  9. Quantitative Description of Crystal Nucleation and Growth from in Situ Liquid Scanning Transmission Electron Microscopy.

    PubMed

    Ievlev, Anton V; Jesse, Stephen; Cochell, Thomas J; Unocic, Raymond R; Protopopescu, Vladimir A; Kalinin, Sergei V

    2015-12-22

    Recent advances in liquid cell (scanning) transmission electron microscopy (S)TEM have enabled in situ nanoscale investigations of controlled nanocrystal growth mechanisms. Here, we experimentally and quantitatively investigated the nucleation and growth mechanisms of Pt nanostructures from an aqueous solution of K2PtCl6. Averaged statistical, network, and local approaches have been used for the data analysis and the description of both collective particle dynamics and local growth features. In particular, interaction between neighboring particles has been revealed and attributed to reduction of the platinum concentration in the vicinity of the particle boundary. The local approach for solving the inverse problem showed that particle dynamics can be simulated by a stationary diffusional model. The obtained results are important for understanding nanocrystal formation and growth processes and for optimization of synthesis conditions.

  10. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.

  11. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
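
    The fitting techniques listed above (linear, quadratic, and exponential models of time-series data) can be sketched as below; the sample data are invented, and the exponential fit uses a simple log-linear transform, which is one common but not the only approach.

        import numpy as np

        t = np.arange(12, dtype=float)                      # e.g., months
        y = 5.0 * np.exp(0.18 * t) + np.random.default_rng(7).normal(0, 0.5, t.size)

        lin = np.polyfit(t, y, 1)                           # linear trend: y ~ a*t + b
        quad = np.polyfit(t, y, 2)                          # quadratic trend
        log_a, log_b = np.polyfit(t, np.log(y), 1)          # exponential: ln y ~ log_a*t + log_b

        print("linear coeffs:", lin)
        print("quadratic coeffs:", quad)
        print(f"exponential model: y ≈ {np.exp(log_b):.2f} * exp({log_a:.3f} * t)")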

  12. A quantitative index of soil development from field descriptions: Examples from a chronosequence in central California

    USGS Publications Warehouse

    Harden, J.W.

    1982-01-01

    A soil development index has been developed in order to quantitatively measure the degree of soil profile development. This index, which combines eight soil field properties with soil thickness, is designed from field descriptions of the Merced River chronosequence in central California. These eight properties are: clay films, texture plus wet consistence, rubification (color hue and chroma), structure, dry consistence, moist consistence, color value, and pH. Other properties described in the field can be added when more soils are studied. Most of the properties change systematically within the 3 m.y. age span of the Merced River chronosequence. The absence of properties on occasion does not significantly affect the index. Individual quantified field properties, as well as the integrated index, are examined and compared as functions of soil depth and age. © 1982.
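
    One simple way to turn such field descriptions into an index, sketched below under assumed 0-to-1 property normalizations, is to average the normalized property scores within each horizon and sum the horizon values weighted by thickness; this illustrates the general idea only and is not Harden's exact normalization or point assignments.

        def profile_development_index(horizons):
            """Sum over horizons of (mean normalized property score) x (thickness in m).

            Each horizon: {"thickness_m": float, "scores": {property: value in [0, 1]}}
            """
            index = 0.0
            for h in horizons:
                scores = h["scores"].values()
                index += (sum(scores) / len(scores)) * h["thickness_m"]
            return index

        # Hypothetical two-horizon profile with normalized field-property scores
        profile = [
            {"thickness_m": 0.25, "scores": {"clay_films": 0.2, "rubification": 0.4, "structure": 0.5}},
            {"thickness_m": 0.60, "scores": {"clay_films": 0.6, "rubification": 0.7, "structure": 0.4}},
        ]
        print(f"profile development index: {profile_development_index(profile):.2f}")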

  13. From information theory to quantitative description of steric effects.

    PubMed

    Alipour, Mojtaba; Safari, Zahra

    2016-07-21

    Immense efforts have been made in the literature to apply the information theory descriptors for investigating the electronic structure theory of various systems. In the present study, the information theoretic quantities, such as Fisher information, Shannon entropy, Onicescu information energy, and Ghosh-Berkowitz-Parr entropy, have been used to present a quantitative description for one of the most widely used concepts in chemistry, namely the steric effects. Taking the experimental steric scales for the different compounds as benchmark sets, there are reasonable linear relationships between the experimental scales of the steric effects and theoretical values of steric energies calculated from information theory functionals. Perusing the results obtained from the information theoretic quantities with the two representations of electron density and shape function, the Shannon entropy has the best performance for the purpose. The usefulness of considering the contributions of functional groups' steric energies and geometries, on the one hand, and of dissecting the effects of both global and local information measures simultaneously, on the other, has also been explored. Furthermore, the utility of the information functionals for the description of steric effects in several chemical transformations, such as electrophilic and nucleophilic reactions and host-guest chemistry, has been analyzed. The functionals of information theory correlate remarkably with the stability of systems and experimental scales. Overall, these findings show that the information theoretic quantities can be introduced as quantitative measures of steric effects and provide further evidence of the quality of information theory toward helping theoreticians and experimentalists to interpret different problems in real systems.
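
    For orientation, the information-theoretic quantities named above are commonly defined in terms of the electron density ρ(r) as follows (standard textbook forms; the working expressions in the paper, e.g., for the Ghosh-Berkowitz-Parr entropy or the shape-function analogues, may differ in detail):

        S_S = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r}) \, d\mathbf{r}            (Shannon entropy)
        I_F = \int \frac{|\nabla \rho(\mathbf{r})|^2}{\rho(\mathbf{r})} \, d\mathbf{r}   (Fisher information)
        E_O = \int \rho^2(\mathbf{r}) \, d\mathbf{r}                                 (Onicescu information energy, second order)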

  14. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... establish accurate baseline data against which future measurements may be compared. A descriptive report... radioactive tracer survey; (iii) A temperature or noise log; (iv) A casing inspection log, if required by the...

  15. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... establish accurate baseline data against which future measurements may be compared. A descriptive report... radioactive tracer survey; (iii) A temperature or noise log; (iv) A casing inspection log, if required by the...

  16. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... establish accurate baseline data against which future measurements may be compared. A descriptive report... radioactive tracer survey; (iii) A temperature or noise log; (iv) A casing inspection log, if required by the...

  17. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... establish accurate baseline data against which future measurements may be compared. A descriptive report... radioactive tracer survey; (iii) A temperature or noise log; (iv) A casing inspection log, if required by the...

  18. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... establish accurate baseline data against which future measurements may be compared. A descriptive report... radioactive tracer survey; (iii) A temperature or noise log; (iv) A casing inspection log, if required by the...

  19. Accurate estimation of object location in an image sequence using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Kasturi, Rangachar

    1994-01-01

    In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight-path.

  20. Accurate Energies and Orbital Description in Semi-Local Kohn-Sham DFT

    NASA Astrophysics Data System (ADS)

    Lindmaa, Alexander; Kuemmel, Stephan; Armiento, Rickard

    2015-03-01

    We present our progress on a scheme in semi-local Kohn-Sham density-functional theory (KS-DFT) for improving the orbital description while still retaining the level of accuracy of the usual semi-local exchange-correlation (xc) functionals. DFT is a widely used tool for first-principles calculations of properties of materials. A given task normally requires a balance of accuracy and computational cost, which is well achieved with semi-local DFT. However, commonly used semi-local xc functionals have important shortcomings which often can be attributed to features of the corresponding xc potential. One shortcoming is an overly delocalized representation of localized orbitals. Recently a semi-local GGA-type xc functional was constructed to address these issues, however, it has the trade-off of lower accuracy of the total energy. We discuss the source of this error in terms of a surplus energy contribution in the functional that needs to be accounted for, and offer a remedy for this issue which formally stays within KS-DFT, and, which does not harshly increase the computational effort. The end result is a scheme that combines accurate total energies (e.g., relaxed geometries) with an improved orbital description (e.g., improved band structure).

  1. Accurate Identification of MCI Patients via Enriched White-Matter Connectivity Network

    NASA Astrophysics Data System (ADS)

    Wee, Chong-Yaw; Yap, Pew-Thian; Brownyke, Jeffery N.; Potter, Guy G.; Steffens, David C.; Welsh-Bohmer, Kathleen; Wang, Lihong; Shen, Dinggang

    Mild cognitive impairment (MCI), often a prodromal phase of Alzheimer's disease (AD), is frequently considered to be a good target for early diagnosis and therapeutic interventions of AD. The recent emergence of reliable network characterization techniques has made it possible to understand neurological disorders at the whole-brain connectivity level. Accordingly, we propose a network-based multivariate classification algorithm, using a collection of measures derived from white-matter (WM) connectivity networks, to accurately identify MCI patients from normal controls. An enriched description of WM connections, utilizing six physiological parameters, i.e., fiber penetration count, fractional anisotropy (FA), mean diffusivity (MD), and principal diffusivities (λ1, λ2, λ3), results in six connectivity networks for each subject to account for the connection topology and the biophysical properties of the connections. Upon parcellating the brain into 90 regions-of-interest (ROIs), the average statistics of each ROI in relation to the remaining ROIs are extracted as features for classification. These features are then sieved to select the most discriminant subset of features for building an MCI classifier via support vector machines (SVMs). Cross-validation results indicate better diagnostic power of the proposed enriched WM connection description than simple description with any single physiological parameter.
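
    A generic version of the classification step (feature selection followed by an SVM, evaluated by cross-validation) might look like the sketch below; the feature matrix is random stand-in data, and the selector and kernel choices are assumptions rather than the authors' exact settings.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 540))        # 40 subjects x (90 ROIs x 6 networks) stand-in features
        y = rng.integers(0, 2, size=40)       # 0 = control, 1 = MCI (random labels, illustration only)

        clf = make_pipeline(StandardScaler(),
                            SelectKBest(f_classif, k=50),   # keep the 50 most discriminant features
                            SVC(kernel="linear", C=1.0))
        scores = cross_val_score(clf, X, y, cv=5)
        print("cross-validated accuracy:", scores.mean())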

  2. Accuracy of commercially available c-reactive protein rapid tests in the context of undifferentiated fevers in rural Laos.

    PubMed

    Phommasone, Koukeo; Althaus, Thomas; Souvanthong, Phonesavanh; Phakhounthong, Khansoudaphone; Soyvienvong, Laxoy; Malapheth, Phatthaphone; Mayxay, Mayfong; Pavlicek, Rebecca L; Paris, Daniel H; Dance, David; Newton, Paul; Lubell, Yoel

    2016-02-04

    C-Reactive Protein (CRP) has been shown to be an accurate biomarker for discriminating bacterial from viral infections in febrile patients in Southeast Asia. Here we investigate the accuracy of existing rapid qualitative and semi-quantitative tests as compared with a quantitative reference test to assess their potential for use in remote tropical settings. Blood samples were obtained from consecutive patients recruited to a prospective fever study at three sites in rural Laos. At each site, one of three rapid qualitative or semi-quantitative tests was performed, as well as a corresponding quantitative NycoCard Reader II as a reference test. We estimated the sensitivity and specificity of the three tests against a threshold of 10 mg/L, and kappa values for the agreement of the two semi-quantitative tests with the results of the reference test. All three tests showed high sensitivity, specificity and kappa values as compared with the NycoCard Reader II. With a threshold of 10 mg/L, the sensitivity of the tests ranged from 87-98% and the specificity from 91-98%. The weighted kappa values for the semi-quantitative tests were 0.7 and 0.8. The use of CRP rapid tests could offer an inexpensive and effective approach to improve the targeting of antibiotics in remote settings where health facilities are basic and laboratories are absent. This study demonstrates that accurate CRP rapid tests are commercially available; evaluations of their clinical impact and cost-effectiveness at point of care are warranted.

  3. Utility of high-resolution accurate MS to eliminate interferences in the bioanalysis of ribavirin and its phosphate metabolites.

    PubMed

    Wei, Cong; Grace, James E; Zvyaga, Tatyana A; Drexler, Dieter M

    2012-08-01

    The polar nucleoside drug ribavirin (RBV) combined with IFN-α is a front-line treatment for chronic hepatitis C virus infection. RBV acts as a prodrug and exerts its broad antiviral activity primarily through its active phosphorylated metabolite ribavirin 5'-triphosphate (RTP), and also possibly through ribavirin 5'-monophosphate (RMP). To study RBV transport, diffusion, metabolic clearance and its impact on drug-metabolizing enzymes, an LC-MS method is needed to simultaneously quantify RBV and its phosphorylated metabolites (RTP, ribavirin 5'-diphosphate and RMP). In a recombinant human UGT1A1 assay, the assay buffer components uridine and its phosphorylated derivatives are isobaric with RBV and its phosphorylated metabolites, leading to significant interference when analyzed by LC-MS in the nominal mass resolution mode. Presented here is an LC-MS method employing LC coupled with full-scan high-resolution accurate MS analysis for the simultaneous quantitative determination of RBV, RMP, ribavirin 5'-diphosphate and RTP by differentiating RBV and its phosphorylated metabolites from uridine and its phosphorylated derivatives by accurate mass, thus avoiding interference. The developed LC-high-resolution accurate MS method allows for quantitation of RBV and its phosphorylated metabolites, eliminating the interferences from uridine and its phosphorylated derivatives in recombinant human UGT1A1 assays.

  4. A method for evaluating the fatigue crack growth in spiral notch torsion fracture toughness test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jy -An John; Tan, Ting

    The spiral notch torsion test (SNTT) has been a recent breakthrough in measuring fracture toughness for different materials, including metals, ceramics, concrete, and polymer composites. Due to its high geometry constraint and unique loading condition, SNTT can be used to measure the fracture toughness with smaller specimens without concern of size effects. The application of SNTT to brittle materials has proved to be successful. The micro-cracks induced by the original notches in brittle materials could ensure crack growth in SNTT samples. Therefore, no fatigue pre-cracks are needed. The application of SNTT to ductile materials to generate valid toughness data will require a test sample with sufficient crack length. Fatigue pre-crack growth techniques are employed to introduce a sharp crack front into the sample. Previously, only rough calculations were applied to estimate the compliance evolution in the SNTT crack growth process, while accurate quantitative descriptions have never been attempted. This generates an urgent need to understand the crack evolution during the SNTT fracture testing process of ductile materials. Here, the newly developed governing equations for estimating SNTT crack growth are discussed.

  5. Comparison of LEWICE and GlennICE in the SLD Regime

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Potapczuk, Mark G.; Levinson, Laurie H.

    2008-01-01

    A research project is underway at the NASA Glenn Research Center (GRC) to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. This report will present results from two different computer programs. The first program, LEWICE version 3.2.2, has been reported on previously. The second program is GlennICE version 0.1. An extensive comparison of the results in a quantifiable manner against the database of ice shapes that have been generated in the GRC Icing Research Tunnel (IRT) has also been performed, including additional data taken to extend the database in the Super-cooled Large Drop (SLD) regime. This paper will show the differences in ice shape between LEWICE 3.2.2, GlennICE, and experimental data. This report will also provide a description of both programs. Comparisons are then made to recent additions to the SLD database and selected previous cases. Quantitative comparisons are shown for horn height, horn angle, icing limit, area, and leading edge thickness. The results show that the predicted results for both programs are within the accuracy limits of the experimental data for the majority of cases.

  6. A level set method for determining critical curvatures for drainage and imbibition.

    PubMed

    Prodanović, Masa; Bryant, Steven L

    2006-12-15

    An accurate description of the mechanics of pore level displacement of immiscible fluids could significantly improve the predictions from pore network models of capillary pressure-saturation curves, interfacial areas and relative permeability in real porous media. If we assume quasi-static displacement, at constant pressure and surface tension, pore scale interfaces are modeled as constant mean curvature surfaces, which are not easy to calculate. Moreover, the extremely irregular geometry of natural porous media makes it difficult to evaluate surface curvature values and corresponding geometric configurations of two fluids. Finally, accounting for the topological changes of the interface, such as splitting or merging, is nontrivial. We apply the level set method for tracking and propagating interfaces in order to robustly handle topological changes and to obtain geometrically correct interfaces. We describe a simple but robust model for determining critical curvatures for throat drainage and pore imbibition. The model is set up for quasi-static displacements but it nevertheless captures both reversible and irreversible behavior (Haines jump, pore body imbibition). The pore scale grain boundary conditions are extracted from model porous media and from imaged geometries in real rocks. The method gives quantitative agreement with measurements and with other theories and computational approaches.
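
    For context, the quasi-static interface description referred to above is commonly expressed through the Young-Laplace relation between capillary pressure and interface curvature, together with the level set evolution equation; these are standard forms with symbols chosen here for illustration:

        P_c = \sigma \, \kappa, \qquad \kappa = \nabla \cdot \left( \frac{\nabla \phi}{|\nabla \phi|} \right), \qquad \frac{\partial \phi}{\partial t} + F \, |\nabla \phi| = 0

    where σ is the interfacial tension, κ the total interface curvature (sum of the principal curvatures), φ the level set function whose zero level set is the fluid-fluid interface, and F the speed of the front along its normal.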

  7. Quantification of complex modular architecture in plants.

    PubMed

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  8. Tubulin Beta-3 Chain as a New Candidate Protein Biomarker of Human Skin Aging: A Preliminary Study

    PubMed Central

    2017-01-01

    Skin aging is a complex process, and many efforts have been made to identify new and specific targets that could help to diagnose, prevent, and treat skin aging. Several studies concerning skin aging have analyzed the changes in gene expression, and very few investigations have been performed at the protein level. Moreover, none of these proteomic studies has used a global quantitative labeled proteomic offgel approach that allows a more accurate description of the aging phenotype. We applied such an approach on human primary keratinocytes obtained from sun-nonexposed skin biopsies of young and elderly women. A total of 517 unique proteins were identified, and 58 proteins were significantly differentially expressed, with 40 downregulated and 18 upregulated with aging. Gene ontology and pathway analysis performed on these 58 putative biomarkers of skin aging evidenced that these dysregulated proteins were mostly involved in metabolism and cellular processes such as cell cycle and signaling pathways. Change of expression of the tubulin beta-3 chain was confirmed by western blot on samples originating from several donors. Thus, this study suggests the tubulin beta-3 chain as a promising biomarker of skin aging. PMID:28626498

  9. Equations of state of detonation products: ammonia and methane

    NASA Astrophysics Data System (ADS)

    Lang, John; Dattelbaum, Dana; Goodwin, Peter; Garcia, Daniel; Coe, Joshua; Leiding, Jeffery; Gibson, Lloyd; Bartram, Brian

    2015-06-01

    Ammonia (NH3) and methane (CH4) are two principal product gases resulting from explosives detonation, and the decomposition of other organic materials under shockwave loading (such as foams). Accurate thermodynamic descriptions of these gases are important for understanding the detonation performance of high explosives. However, shock compression data often do not exist for molecular species in the dense gas phase, and are limited in the fluid phase. Here, we present equation of state measurements of elevated initial density ammonia and methane gases dynamically compressed in gas-gun driven plate impact experiments. Pressure and density of the shocked gases on the principal Hugoniot were determined from direct particle velocity and shock wave velocity measurements recorded using optical velocimetry (Photonic Doppler velocimetry (PDV) and VISAR (velocity interferometer system for any reflector)). Streak spectroscopy and 5-color pyrometry were further used to measure the emission from the shocked gases, from which the temperatures of the shocked gases were estimated. Up to 0.07 GPa, ammonia was not observed to ionize, with temperature remaining below 7000 K. These results provide quantitative measurements of the Hugoniot locus for improving equations of state models of detonation products.
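
    The pressure and density values on the principal Hugoniot follow from the Rankine-Hugoniot jump conditions, which relate the measured shock velocity U_s and particle velocity u_p to the shocked state for a material initially at rest at density ρ_0 and pressure P_0 (standard relations, quoted here for reference):

        P - P_0 = \rho_0 \, U_s \, u_p, \qquad \rho = \rho_0 \, \frac{U_s}{U_s - u_p}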

  10. Automatic tissue image segmentation based on image processing and deep learning

    NASA Astrophysics Data System (ADS)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in multimodality imaging, especially in fusing structural images offered by CT and MRI with functional images collected by optical technologies or other novel imaging technologies. In addition, image segmentation provides a detailed structural description for quantitative visualization of the distribution of treatment light in the human body when incorporated with 3D light transport simulation methods. Here we used image enhancement, operators, and morphometry methods to extract the accurate contours of different tissues such as skull, cerebrospinal fluid (CSF), grey matter (GM) and white matter (WM) on 5 fMRI head image datasets. Then we utilized a convolutional neural network to realize automatic segmentation of images in a deep learning manner. We also introduced parallel computing. Such approaches greatly reduced the processing time compared to manual and semi-automatic segmentation and are of great importance in improving speed and accuracy as more and more samples are learned. Our results can be used as a criterion when diagnosing diseases such as cerebral atrophy, which is caused by pathological changes in gray matter or white matter. We demonstrated the great potential of such combined image processing and deep learning approaches for automatic tissue image segmentation in personalized medicine, especially in monitoring and treatment.

  11. Polarized light microscopy for 3-dimensional mapping of collagen fiber architecture in ocular tissues.

    PubMed

    Yang, Bin; Jan, Ning-Jiun; Brazile, Bryn; Voorhees, Andrew; Lathrop, Kira L; Sigal, Ian A

    2018-04-06

    Collagen fibers play a central role in normal eye mechanics and pathology. In ocular tissues, collagen fibers exhibit a complex 3-dimensional (3D) fiber orientation, with both in-plane (IP) and out-of-plane (OP) orientations. Imaging techniques traditionally applied to the study of ocular tissues only quantify IP fiber orientation, providing little information on OP fiber orientation. Accurate description of the complex 3D fiber microstructures of the eye requires quantifying full 3D fiber orientation. Herein, we present 3dPLM, a technique based on polarized light microscopy developed to quantify both IP and OP collagen fiber orientations of ocular tissues. The performance of 3dPLM was examined by simulation and experimental verification and validation. The experiments demonstrated an excellent agreement between extracted and true 3D fiber orientation. Both IP and OP fiber orientations can be extracted from the sclera and the cornea, providing previously unavailable quantitative 3D measures and insight into the tissue microarchitecture. Together, the results demonstrate that 3dPLM is a powerful imaging technique for the analysis of ocular tissues. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Coalescence and genetic diversity in sexual populations under selection.

    PubMed

    Neher, Richard A; Kessinger, Taylor A; Shraiman, Boris I

    2013-09-24

    In sexual populations, selection operates neither on the whole genome, which is repeatedly taken apart and reassembled by recombination, nor on individual alleles that are tightly linked to the chromosomal neighborhood. The resulting interference between linked alleles reduces the efficiency of selection and distorts patterns of genetic diversity. Inference of evolutionary history from diversity shaped by linked selection requires an understanding of these patterns. Here, we present a simple but powerful scaling analysis identifying the unit of selection as the genomic "linkage block" with a characteristic length determined in a self-consistent manner by the condition that the rate of recombination within the block is comparable to the fitness differences between different alleles of the block. We find that an asexual model with the strength of selection tuned to that of the linkage block provides an excellent description of genetic diversity and the site frequency spectra compared with computer simulations. This linkage block approximation is accurate for the entire spectrum of strength of selection and is particularly powerful in scenarios with many weakly selected loci. The latter limit allows us to characterize coalescence, genetic diversity, and the speed of adaptation in the infinitesimal model of quantitative genetics.

  13. A method for evaluating the fatigue crack growth in spiral notch torsion fracture toughness test

    DOE PAGES

    Wang, Jy -An John; Tan, Ting

    2018-05-21

    The spiral notch torsion test (SNTT) has been a recent breakthrough in measuring fracture toughness for different materials, including metals, ceramics, concrete, and polymer composites. Due to its high geometry constraint and unique loading condition, SNTT can be used to measure fracture toughness with smaller specimens without concern about size effects. The application of SNTT to brittle materials has proved successful: the micro-cracks induced by the original notches in brittle materials ensure crack growth in SNTT samples, so no fatigue pre-cracks are needed. Applying SNTT to ductile materials to generate valid toughness data, however, requires a test sample with sufficient crack length, and fatigue pre-crack growth techniques are employed to introduce a sharp crack front into the sample. Previously, only rough calculations were applied to estimate the compliance evolution in the SNTT crack growth process, while accurate quantitative descriptions have never been attempted. This generates an urgent need to understand the crack evolution during SNTT fracture testing of ductile materials. Here, newly developed governing equations for estimating SNTT crack growth are discussed.

  14. Cation solvation with quantum chemical effects modeled by a size-consistent multi-partitioning quantum mechanics/molecular mechanics method.

    PubMed

    Watanabe, Hiroshi C; Kubillus, Maximilian; Kubař, Tomáš; Stach, Robert; Mizaikoff, Boris; Ishikita, Hiroshi

    2017-07-21

    In the condensed phase, quantum chemical properties such as many-body effects and intermolecular charge fluctuations are critical determinants of the solvation structure and dynamics. Thus, a quantum mechanical (QM) molecular description is required for both solute and solvent to incorporate these properties. However, it is challenging to conduct molecular dynamics (MD) simulations for condensed systems of sufficient scale when adopting QM potentials. To overcome this problem, we recently developed the size-consistent multi-partitioning (SCMP) quantum mechanics/molecular mechanics (QM/MM) method and realized stable and accurate MD simulations applying the QM potential to a benchmark system. In the present study, as the first application of the SCMP method, we have investigated the structures and dynamics of Na⁺, K⁺, and Ca²⁺ solutions based on nanosecond-scale sampling, 100 times longer than that of conventional QM-based sampling. We have also evaluated two dynamic properties, the diffusion coefficient and difference spectra, with high statistical certainty; the calculation of these properties has not previously been possible within the conventional QM/MM framework. Based on our analysis, we have quantitatively evaluated the quantum chemical solvation effects, which show distinct differences between the cations.

  15. Bridging the gap between system and cell: The role of ultra-high field MRI in human neuroscience.

    PubMed

    Turner, Robert; De Haan, Daniel

    2017-01-01

    The volume of published research at the levels of systems and cellular neuroscience continues to increase at an accelerating rate. At the same time, progress in psychiatric medicine has stagnated and scientific confidence in cognitive psychology research is under threat due to careless analysis methods and underpowered experiments. With the advent of ultra-high field MRI, with submillimeter image voxels, imaging neuroscience holds the potential to bridge the cellular and systems levels. Use of these accurate and precisely localized quantitative measures of brain activity may go far in providing more secure foundations for psychology, and hence for more appropriate treatment and management of psychiatric illness. However, fundamental issues regarding the construction of testable mechanistic models using imaging data require careful consideration. This chapter summarizes the characteristics of acceptable models of brain function and provides concise descriptions of the relevant types of neuroimaging data that have recently become available. Approaches to data-driven experiments and analyses are described that may lead to more realistic conceptions of the competences of neural assemblages, as they vary across the brain's complex neuroanatomy. © 2017 Elsevier B.V. All rights reserved.

  16. Fundamentals, achievements and challenges in the electrochemical sensing of pathogens.

    PubMed

    Monzó, Javier; Insua, Ignacio; Fernandez-Trillo, Francisco; Rodriguez, Paramaconi

    2015-11-07

    Electrochemical sensors are powerful tools widely used in industrial, environmental and medical applications. The versatility of electrochemical methods allows for the investigation of chemical composition in real time and in situ. Electrochemical detection of specific biological molecules is a powerful means for detecting disease-related markers. In the last 10 years, highly sensitive and specific methods have been developed to detect waterborne and foodborne pathogens. In this review, we classify the different electrochemical techniques used for the qualitative and quantitative detection of pathogens. The robustness of electrochemical methods allows for accurate detection even in heterogeneous and impure samples. We present a fundamental description of the three major electrochemical sensing methods used in the detection of pathogens and the advantages and disadvantages of each of these methods. In each section, we highlight recent breakthroughs, including the utilisation of microfluidics, immunomagnetic separation and multiplexing for the detection of multiple pathogens in a single device. We also include recent studies describing new strategies for the design of future immunosensing systems and protocols. The high sensitivity and selectivity, together with the portability and the cost-effectiveness of the instrumentation, enhance the demand for further development in the electrochemical detection of microbes.

  17. Automated multi-day tracking of marked mice for the analysis of social behaviour.

    PubMed

    Ohayon, Shay; Avni, Ofer; Taylor, Adam L; Perona, Pietro; Roian Egnor, S E

    2013-09-30

    A quantitative description of animal social behaviour is informative for behavioural biologists and clinicians developing drugs to treat social disorders. Social interaction in a group of animals has been difficult to measure because behaviour develops over long periods of time and requires tedious manual scoring, which is subjective and often non-reproducible. Computer-vision systems with the ability to measure complex social behaviour automatically would have a transformative impact on biology. Here, we present a method for tracking group-housed mice individually as they freely interact over multiple days. Each mouse is bleach-marked with a unique fur pattern. The patterns are automatically learned by the tracking software and used to infer identities. Trajectories are analysed to measure behaviour as it develops over days, beyond the range of acute experiments. We demonstrate how our system may be used to study the development of place preferences, associations and social relationships by tracking four mice continuously for five days. Our system enables accurate and reproducible characterisation of wild-type mouse social behaviour and paves the way for high-throughput long-term observation of the effects of genetic, pharmacological and environmental manipulations. Published by Elsevier B.V.

  18. Memory and obesity affect the population dynamics of asexual freshwater planarians

    NASA Astrophysics Data System (ADS)

    Dunkel, Jörn; Talbot, Jared; Schötz, Eva-Maria

    2011-04-01

    Asexual reproduction in multicellular organisms is a complex biophysical process that is not yet well understood quantitatively. Here, we report a detailed population study for the asexual freshwater planarian Schmidtea mediterranea, which can reproduce via transverse fission due to a large stem cell contingent. Our long-term observations of isolated non-interacting planarian populations reveal that the characteristic fission waiting time distributions for head and tail fragments differ significantly from each other. The stochastic fission dynamics of tail fragments exhibits non-negligible memory effects, implying that an accurate mathematical description of future data should be based on non-Markovian tree models. By comparing the effective growth of non-interacting planarian populations with those of self-interacting populations, we are able to quantify the influence of interactions between flatworms and physical conditions on the population growth. A surprising result is the non-monotonic relationship between effective population growth rate and nutrient supply: planarians exhibit a tendency to become 'obese' if the feeding frequency exceeds a critical level, resulting in a decreased reproduction activity. This suggests that these flatworms, which possess many genes homologous to those of humans, could become a new model system for studying dietary effects on reproduction and regeneration in multicellular organisms.

  19. Re-thinking the classification of autism spectrum disorders

    PubMed Central

    Lord, Catherine; Jones, Rebecca M.

    2012-01-01

    Background The nosology of autism spectrum disorders (ASD) is at a critical point in history as the field seeks to better define dimensions of social-communication deficits and restricted/repetitive behaviors on an individual level for both clinical and neurobiological purposes. These different dimensions also suggest an increasing need for quantitative measures that accurately map their differences, independent of developmental factors such as age, language level and IQ. Method Psychometric measures, clinical observation as well as genetic, neurobiological and physiological research from toddlers, children and adults with ASD are reviewed. Results The question of how to conceptualize ASDs along dimensions versus categories is discussed within the nosology of autism and the proposed changes to the DSM-5 and ICD-11. Differences across development are incorporated into the new classification frameworks. Conclusions It is crucial to balance the needs of clinical practice in ASD diagnostic systems, with neurobiologically based theories that address the associations between social-communication and restricted/repetitive dimensions in individuals. Clarifying terminology, improving description of the core features of ASD and other dimensions that interact with them and providing more valid and reliable ways to quantify them, both for research and clinical purposes, will move forward both practice and science. PMID:22486486

  20. Tubulin Beta-3 Chain as a New Candidate Protein Biomarker of Human Skin Aging: A Preliminary Study.

    PubMed

    Lehmann, Sylvia G; Bourgoin-Voillard, Sandrine; Seve, Michel; Rachidi, Walid

    2017-01-01

    Skin aging is a complex process, and many efforts have been made to identify new and specific targets that could help to diagnose, prevent, and treat it. Several studies concerning skin aging have analyzed changes in gene expression, but very few investigations have been performed at the protein level. Moreover, none of these proteomic studies has used a global quantitative labeled proteomic OFFGEL approach, which allows a more accurate description of the aging phenotype. We applied such an approach to human primary keratinocytes obtained from sun-nonexposed skin biopsies of young and elderly women. A total of 517 unique proteins were identified, and 58 proteins were significantly differentially expressed, with 40 downregulated and 18 upregulated with aging. Gene ontology and pathway analysis performed on these 58 putative biomarkers of skin aging indicated that the dysregulated proteins are mostly involved in metabolism and cellular processes such as cell cycle and signaling pathways. The change in expression of the tubulin beta-3 chain was confirmed by western blot on samples originating from several donors. Thus, this study suggests the tubulin beta-3 chain as a promising biomarker of skin aging.

  1. Aretaeus of Cappadocia and the first description of diabetes.

    PubMed

    Laios, Konstantinos; Karamanou, Marianna; Saridaki, Zenia; Androutsos, George

    2012-01-01

    The name Aretaeus of Cappadocia has been linked with diabetes more than that of any other physician of antiquity, his texts forming a sophisticated synthesis of the previous knowledge on this disease copiously supplemented by his own observations. Gifted with a unique faculty for observing pathologic phenomena, he was able to elaborate upon earlier texts enriching them with his own original findings and numerous thoughtful reflections. Among the many diseases he dealt with, Aretaeus has bequeathed to us an outstandingly vivid and accurate description of diabetes.

  2. A microscopic description of black hole evaporation via holography

    DOE PAGES

    Berkowitz, Evan; Hanada, Masanori; Maltz, Jonathan

    2016-07-19

    Here, we propose a description of how a large, cold black hole (black zero-brane) in type IIA superstring theory evaporates into freely propagating D0-branes, by solving the dual gauge theory quantitatively. The energy spectrum of emitted D0-branes is parametrically close to thermal when the black hole is large. The black hole, while initially cold, gradually becomes an extremely hot and stringy object as it evaporates. As it emits D0-branes, its emission rate speeds up and it evaporates completely without leaving any remnant. Hence this system provides us with a concrete holographic description of black hole evaporation without information loss.

  3. A microscopic description of black hole evaporation via holography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berkowitz, Evan; Hanada, Masanori; Maltz, Jonathan

    Here, we propose a description of how a large, cold black hole (black zero-brane) in type IIA superstring theory evaporates into freely propagating D0-branes, by solving the dual gauge theory quantitatively. The energy spectrum of emitted D0-branes is parametrically close to thermal when the black hole is large. The black hole, while initially cold, gradually becomes an extremely hot and stringy object as it evaporates. As it emits D0-branes, its emission rate speeds up and it evaporates completely without leaving any remnant. Hence this system provides us with a concrete holographic description of black hole evaporation without information loss.

  4. Digital Imaging

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Digital imaging is the computer-processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments locates objects within an image and measures them to extract quantitative information. Applications include CAT scanners, radiography, and microscopy in medicine, as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology and is accurate and cost-efficient.

  5. Strategy for Extracting DNA from Clay Soil and Detecting a Specific Target Sequence via Selective Enrichment and Real-Time (Quantitative) PCR Amplification ▿

    PubMed Central

    Yankson, Kweku K.; Steck, Todd R.

    2009-01-01

    We present a simple strategy for isolating and accurately enumerating target DNA from high-clay-content soils: desorption with buffers, an optional magnetic capture hybridization step, and quantitation via real-time PCR. With the developed technique, μg quantities of DNA were extracted from mg samples of pure kaolinite and a field clay soil. PMID:19633108

  6. Quantitative analysis of naphthenic acids in water by liquid chromatography-accurate mass time-of-flight mass spectrometry.

    PubMed

    Hindle, Ralph; Noestheden, Matthew; Peru, Kerry; Headley, John

    2013-04-19

    This study details the development of a routine method for quantitative analysis of oil sands naphthenic acids, a complex class of compounds found naturally and as contaminants in oil sands process waters from Alberta's Athabasca region. Expanding beyond classical naphthenic acids (CnH2n-zO2), compounds conforming to the formula CnH2n-zOx (where 2 ≤ x ≤ 4) were examined in commercial naphthenic acid and environmental water samples. HPLC facilitated a five-fold reduction in ion suppression when compared to the more commonly used flow injection analysis. A comparison of 39 model naphthenic acids revealed significant variability in response factors, demonstrating the necessity of using naphthenic acid mixtures for quantitation, rather than model compounds. It was also demonstrated that naphthenic acid heterogeneity (commercial and environmental) necessitates establishing a single NA mix as the standard against which all quantitation is performed. The authors present the first ISO 17025 accredited method for the analysis of naphthenic acids in water using HPLC high resolution accurate mass time-of-flight mass spectrometry. The method detection limit was 1 mg/L total oxy-naphthenic acids (Sigma technical mix). Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Quantitative inference of population response properties across eccentricity from motion-induced maps in macaque V1

    PubMed Central

    Chen, Ming; Wu, Si; Lu, Haidong D.; Roe, Anna W.

    2013-01-01

    Interpreting population responses in the primary visual cortex (V1) remains a challenge, especially with the advent of techniques measuring activations of large cortical areas simultaneously with high precision. For successful interpretation, a quantitatively precise model prediction is of great importance. In this study, we investigate how accurately a spatiotemporal filter (STF) model predicts average response profiles to coherently drifting random dot motion obtained by optical imaging of intrinsic signals in V1 of anesthetized macaques. We establish that orientation difference maps, obtained by subtracting responses to orthogonal axes of motion, invert with increasing drift speeds, consistent with the motion streak effect. Consistent with perception, the speed at which the map inverts (the critical speed) depends on cortical eccentricity and systematically increases from foveal to parafoveal. We report that critical speeds and response maps to drifting motion are excellently reproduced by the STF model. Our study thus suggests that the STF model is quantitatively accurate enough to be used as a first model of choice for interpreting responses obtained with intrinsic imaging methods in V1. We show further that this good quantitative correspondence opens the possibility to infer otherwise not easily accessible population receptive field properties from responses to complex stimuli, such as drifting random dot motion. PMID:23197457

  8. Quantitative Live-Cell Confocal Imaging of 3D Spheroids in a High-Throughput Format.

    PubMed

    Leary, Elizabeth; Rhee, Claire; Wilks, Benjamin T; Morgan, Jeffrey R

    2018-06-01

    Accurately predicting the human response to new compounds is critical to a wide variety of industries. Standard screening pipelines (including both in vitro and in vivo models) often lack predictive power. Three-dimensional (3D) culture systems of human cells, a more physiologically relevant platform, could provide a high-throughput, automated means to test the efficacy and/or toxicity of novel substances. However, the challenge of obtaining high-magnification confocal z stacks of 3D spheroids and understanding their respective quantitative limitations must be overcome first. To address this challenge, we developed a method to form spheroids of reproducible size at precise spatial locations across a 96-well plate. Spheroids of variable radii were labeled with four different fluorescent dyes and imaged with a high-throughput confocal microscope. 3D renderings of the spheroids had a complex bowl-like appearance. We systematically analyzed these confocal z stacks to determine the depth of imaging and the effect of spheroid size and dyes on quantitation. Furthermore, we have shown that the loss of fluorescence with imaging depth can be addressed through the use of ratio imaging. Overall, understanding both the limitations of confocal imaging and the tools to correct for these limits is critical for developing accurate quantitative assays using 3D spheroids.
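
    As a minimal sketch of ratio imaging in general (not the authors' exact analysis), the snippet below divides a measurement channel by a reference channel voxel by voxel, so that depth-dependent attenuation common to both channels cancels; the array names, shapes, and epsilon guard are hypothetical.

        import numpy as np

        def ratio_image(signal_stack, reference_stack, eps=1e-6):
            """Voxel-wise ratio of two confocal z-stacks of identical shape.

            Attenuation that scales both channels equally at a given depth
            cancels in the ratio, which is the premise of ratio imaging.
            """
            return signal_stack.astype(float) / (reference_stack.astype(float) + eps)

        # Hypothetical synthetic stacks ordered (z, y, x).
        sig = np.random.rand(32, 64, 64)
        ref = np.random.rand(32, 64, 64) + 0.5
        ratio = ratio_image(sig, ref)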

  9. Freight Terminals Operating Environment

    DOT National Transportation Integrated Search

    1981-06-01

    The research analysis has been directed toward (1) developing a realistic, quantitative description of the structure of the economic zones that are centered upon medium-size urban areas, (2) determining the nature of traffic in manufactured goods whi...

  10. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods for qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  11. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology

    PubMed Central

    Counsell, Alyssa; Cribbie, Robert A.; Harlow, Lisa. L.

    2016-01-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada. PMID:28042199

  12. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology.

    PubMed

    Counsell, Alyssa; Cribbie, Robert A; Harlow, Lisa L

    2016-08-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada.

  13. A simplified and accurate detection of the genetically modified wheat MON71800 with one calibrator plasmid.

    PubMed

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Park, Sunghoon; Shin, Min-Ki; Moon, Gui Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2015-06-01

    With the increasing number of genetically modified (GM) events, unauthorized GMO releases into the food market have increased dramatically, and many countries have developed detection tools for them. This study described qualitative and quantitative detection methods for the unauthorized GM wheat MON71800 using a reference plasmid (pGEM-M71800). The wheat acetyl-CoA carboxylase (acc) gene was used as the endogenous gene. The plasmid pGEM-M71800, which contains both the acc gene and the event-specific target MON71800, was constructed as a positive control for the qualitative and quantitative analyses. The limit of detection in the qualitative PCR assay was approximately 10 copies. In the quantitative PCR assay, the standard deviation and relative standard deviation repeatability values ranged from 0.06 to 0.25 and from 0.23% to 1.12%, respectively. This study supplies a powerful and very simple but accurate detection strategy for the unauthorized GM wheat MON71800 that utilizes a single calibrator plasmid. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Digital PCR Quantitation of Muscle Mitochondrial DNA: Age, Fiber Type, and Mutation-Induced Changes.

    PubMed

    Herbst, Allen; Widjaja, Kevin; Nguy, Beatrice; Lushaj, Entela B; Moore, Timothy M; Hevener, Andrea L; McKenzie, Debbie; Aiken, Judd M; Wanagat, Jonathan

    2017-10-01

    Definitive quantitation of mitochondrial DNA (mtDNA) and mtDNA deletion mutation abundances would help clarify the role of mtDNA instability in aging. To more accurately quantify mtDNA, we applied the emerging technique of digital polymerase chain reaction to individual muscle fibers and muscle homogenates from aged rodents. Individual fiber mtDNA content correlated with fiber type and decreased with age. We adapted a digital polymerase chain reaction deletion assay that was accurate in mixing experiments to a mutation frequency of 0.03% and quantitated an age-induced increase in deletion frequency from rat muscle homogenates. Importantly, the deletion frequency measured in muscle homogenates strongly correlated with electron transport chain-deficient fiber abundance determined by histochemical analyses. These data clarify the temporal accumulation of mtDNA deletions that lead to electron chain-deficient fibers, a process culminating in muscle fiber loss. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
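
    Digital PCR quantitation rests on Poisson statistics over the reaction partitions. As a generic illustration of that standard correction (not the authors' specific assay or partition counts), the sketch below converts hypothetical positive-partition counts into copies per partition and a deletion frequency.

        import math

        def copies_per_partition(positive, total):
            """Poisson-corrected mean copies per partition: lambda = -ln(1 - p)."""
            return -math.log(1.0 - positive / total)

        # Hypothetical counts from a total-mtDNA assay and a deletion-specific assay
        # run on the same sample across 20,000 partitions each.
        lam_total = copies_per_partition(9500, 20000)
        lam_deleted = copies_per_partition(12, 20000)

        deletion_frequency = lam_deleted / lam_total
        print("deletion frequency: %.4f%%" % (100 * deletion_frequency))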

  15. Quantitative characterization of surface topography using spectral analysis

    NASA Astrophysics Data System (ADS)

    Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars

    2017-03-01

    Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
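
    As a minimal sketch of a one-dimensional PSD computation (a generic periodogram of a topography line scan with invented sampling parameters, not the authors' code), consider:

        import numpy as np

        def psd_1d(height, dx):
            """One-sided periodogram estimate of the 1D PSD of a line scan.

            height: surface heights sampled at uniform spacing dx.
            Returns spatial frequencies (cycles per unit length) and the
            periodogram (dx / n) * |FFT|^2.  Prefactor conventions
            (one- vs two-sided, 2*pi per wavevector) differ across the
            literature; adjust to match the PSD definition you adopt.
            """
            n = height.size
            h = height - height.mean()
            hq = np.fft.rfft(h)
            freqs = np.fft.rfftfreq(n, d=dx)
            return freqs, (dx / n) * np.abs(hq) ** 2

        # Hypothetical line scan: 1024 points at 10 nm spacing.
        z = np.random.normal(0.0, 1e-9, 1024)
        f, C = psd_1d(z, 10e-9)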

  16. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on quantification methods that use mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and builds a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with a better dynamic range.
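
    freeQuant's own algorithm is not spelled out here; as a generic illustration of length-normalized spectral counting (the NSAF measure commonly used in label-free quantification, not necessarily freeQuant's exact formula), the sketch below computes relative protein abundances from spectral counts and sequence lengths.

        def nsaf(spectral_counts, lengths):
            """Normalized spectral abundance factor per protein.

            NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j), where SpC is the
            MS/MS spectral count and L the protein sequence length.
            """
            saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
            total = sum(saf.values())
            return {p: v / total for p, v in saf.items()}

        # Hypothetical three-protein example.
        counts = {"P1": 120, "P2": 45, "P3": 10}
        lens = {"P1": 600, "P2": 150, "P3": 100}
        print(nsaf(counts, lens))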

  17. Analysis of ribosomal RNA stability in dead cells of wine yeast by quantitative PCR.

    PubMed

    Sunyer-Figueres, Merce; Wang, Chunxiao; Mas, Albert

    2018-04-02

    During wine production, some yeasts enter a Viable But Not Culturable (VBNC) state, which may influence the quality and stability of the final wine through remnant metabolic activity or by resuscitation. Culture-independent techniques are used for obtaining an accurate estimation of the number of live cells, and quantitative PCR could be the most accurate technique. As a marker of cell viability, rRNA was evaluated by analyzing its stability in dead cells. The species-specific stability of rRNA was tested in Saccharomyces cerevisiae, as well as in three species of non-Saccharomyces yeast (Hanseniaspora uvarum, Torulaspora delbrueckii and Starmerella bacillaris). High temperature and antimicrobial dimethyl dicarbonate (DMDC) treatments were efficient in lysing the yeast cells. rRNA gene and rRNA (as cDNA) were analyzed over 48 h after cell lysis by quantitative PCR. The results confirmed the stability of rRNA for 48 h after the cell lysis treatments. To sum up, rRNA may not be a good marker of cell viability in the wine yeasts that were tested. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. A second generation distributed point polarizable water model.

    PubMed

    Kumar, Revati; Wang, Fang-Fang; Jenness, Glen R; Jordan, Kenneth D

    2010-01-07

    A distributed point polarizable model (DPP2) for water, with explicit terms for charge penetration, induction, and charge transfer, is introduced. The DPP2 model accurately describes the interaction energies in small and large water clusters and also gives an average internal energy per molecule and radial distribution functions of liquid water in good agreement with experiment. A key to the success of the model is its accurate description of the individual terms in the n-body expansion of the interaction energies.
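
    The n-body expansion referred to at the end is the standard decomposition of a cluster interaction energy (a textbook relation, reproduced here for orientation rather than quoted from this record):

        E_{\mathrm{int}} = \sum_{i<j} \Delta E_{ij}
                         + \sum_{i<j<k} \Delta E_{ijk}
                         + \sum_{i<j<k<l} \Delta E_{ijkl} + \cdots

    where the two-body terms \Delta E_{ij} are pair interaction energies and the higher terms are successive three-body, four-body, ... corrections.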

  19. Dissecting innate immune responses with the tools of systems biology.

    PubMed

    Smith, Kelly D; Bolouri, Hamid

    2005-02-01

    Systems biology strives to derive accurate predictive descriptions of complex systems such as innate immunity. The innate immune system is essential for host defense, yet the resulting inflammatory response must be tightly regulated. Current understanding indicates that this system is controlled by complex regulatory networks, which maintain homoeostasis while accurately distinguishing pathogenic infections from harmless exposures. Recent studies have used high throughput technologies and computational techniques that presage predictive models and will be the foundation of a systems level understanding of innate immunity.

  20. User's guide for a computer program for calculating the zero-lift wave drag of complex aircraft configurations

    NASA Technical Reports Server (NTRS)

    Craidon, C. B.

    1983-01-01

    A computer program was developed to extend the geometry input capabilities of previous versions of a supersonic zero lift wave drag computer program. The arbitrary geometry input description is flexible enough to describe almost any complex aircraft concept, so that highly accurate wave drag analysis can now be performed because complex geometries can be represented accurately and do not have to be modified to meet the requirements of a restricted input format.

  1. Towards in vivo focal cortical dysplasia phenotyping using quantitative MRI.

    PubMed

    Adler, Sophie; Lorio, Sara; Jacques, Thomas S; Benova, Barbora; Gunny, Roxana; Cross, J Helen; Baldeweg, Torsten; Carmichael, David W

    2017-01-01

    Focal cortical dysplasias (FCDs) are a range of malformations of cortical development each with specific histopathological features. Conventional radiological assessment of standard structural MRI is useful for the localization of lesions but is unable to accurately predict the histopathological features. Quantitative MRI offers the possibility to probe tissue biophysical properties in vivo and may bridge the gap between radiological assessment and ex-vivo histology. This review will cover histological, genetic and radiological features of FCD following the ILAE classification and will explain how quantitative voxel- and surface-based techniques can characterise these features. We will provide an overview of the quantitative MRI measures available, their link with biophysical properties and finally the potential application of quantitative MRI to the problem of FCD subtyping. Future research linking quantitative MRI to FCD histological properties should improve clinical protocols, allow better characterisation of lesions in vivo and tailored surgical planning to the individual.

  2. Quantitative analysis to guide orphan drug development.

    PubMed

    Lesko, L J

    2012-08-01

    The development of orphan drugs for rare diseases has made impressive strides in the past 10 years. There has been a surge in orphan drug designations, but new drug approvals have not kept up. This article presents a three-pronged hierarchical strategy for quantitative analysis of data at the descriptive, mechanistic, and systems levels of the biological system that could represent a standardized and rational approach to orphan drug development. Examples are provided to illustrate the concept.

  3. Quantitative perceptual differences among over-the-counter vaginal products using a standardized methodology: implications for microbicide development☆

    PubMed Central

    Mahan, Ellen D.; Morrow, Kathleen M.; Hayes, John E.

    2015-01-01

    Background Increasing prevalence of HIV infection among women worldwide has motivated the development of female-initiated prevention methods, including gel-based microbicides. User acceptability is vital for microbicide success; however, varying cultural vaginal practices indicate multiple formulations must be developed to appeal to different populations. Perceptual attributes of microbicides have been identified as primary drivers of acceptability; however, previous studies do not allow for direct comparison of these qualities between multiple formulations. Study Design Six vaginal products were analyzed ex vivo using descriptive analysis. Perceptual attributes of samples were identified by trained participants (n=10) and rated quantitatively using scales based on a panel-developed lexicon. Data were analyzed using two-way ANOVAs for each attribute; product differences were assessed via Tukey’s honestly significant difference test. Results Significant differences were found between products for multiple attributes. Patterns were also seen for attributes across intended product usage (i.e., contraceptive, moisturizer or lubricant). For example, Options© Gynol II® (Caldwell Consumer Health, LLC) was significantly stickier and grainier than other products. Conclusions Descriptive analysis, a quantitative approach that is based on consensus lexicon usage among participants, successfully quantified perceptual differences among vaginal products. Since perceptual attributes of products can be directly compared quantitatively, this study represents a novel approach that could be used to inform rational design of microbicides. PMID:21757061

  4. Quantitative perceptual differences among over-the-counter vaginal products using a standardized methodology: implications for microbicide development.

    PubMed

    Mahan, Ellen D; Morrow, Kathleen M; Hayes, John E

    2011-08-01

    Increasing prevalence of HIV infection among women worldwide has motivated the development of female-initiated prevention methods, including gel-based microbicides. User acceptability is vital for microbicide success; however, varying cultural vaginal practices indicate multiple formulations must be developed to appeal to different populations. Perceptual attributes of microbicides have been identified as primary drivers of acceptability; however, previous studies do not allow for direct comparison of these qualities between multiple formulations. Six vaginal products were analyzed ex vivo using descriptive analysis. Perceptual attributes of samples were identified by trained participants (n=10) and rated quantitatively using scales based on a panel-developed lexicon. Data were analyzed using two-way ANOVAs for each attribute; product differences were assessed via Tukey's honestly significant difference test. Significant differences were found between products for multiple attributes. Patterns were also seen for attributes across intended product usage (i.e., contraceptive, moisturizer or lubricant). For example, Options© Gynol II® (Caldwell Consumer Health, LLC) was significantly stickier and grainier than other products. Descriptive analysis, a quantitative approach that is based on consensus lexicon usage among participants, successfully quantified perceptual differences among vaginal products. Since perceptual attributes of products can be directly compared quantitatively, this study represents a novel approach that could be used to inform rational design of microbicides. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Conformations of eight-membered cyclosiloxanes

    NASA Astrophysics Data System (ADS)

    Palyulin, V. A.; Zefirov, N. S.; Shklover, V. E.; Struchkov, Yu. T.

    1981-01-01

    Using the Cremer-Pople approach, the classification and quantitative description of the conformations of eight-membered rings have been accomplished. The conformations of eight-membered cyclosiloxanes are considered and classified on the basis of available X-ray structural data.
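
    For orientation, the generalized Cremer-Pople puckering coordinates used in such classifications are built from the out-of-plane displacements z_j of the N ring atoms (this is the standard published definition, reproduced here rather than quoted from this record); for N = 8 they give the amplitude-phase pairs (q_2, \varphi_2) and (q_3, \varphi_3) plus the single amplitude q_4:

        q_m \cos\varphi_m = \sqrt{\tfrac{2}{N}} \sum_{j=1}^{N} z_j \cos\!\left[\tfrac{2\pi m (j-1)}{N}\right], \qquad
        q_m \sin\varphi_m = -\sqrt{\tfrac{2}{N}} \sum_{j=1}^{N} z_j \sin\!\left[\tfrac{2\pi m (j-1)}{N}\right],
        \qquad m = 2, \ldots, \tfrac{N}{2}-1,

        q_{N/2} = \frac{1}{\sqrt{N}} \sum_{j=1}^{N} (-1)^{j-1} z_j .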

  6. 78 FR 40518 - Notice of Intent To Seek Approval To Extend an Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    ... data electronically. These indicators are both quantitative and descriptive and may include, for... information to continue funding of PREMs, and to evaluate the progress of the program. Estimate of Burden: 25...

  7. 78 FR 40517 - Notice of Intent To Seek Approval To Establish an Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    .... These indicators are both quantitative and descriptive and may include, for example, the characteristics... the Information: NSF will use the information to continue funding of the Centers, and to evaluate the...

  8. 78 FR 22917 - Notice of Intent To Seek Approval To Establish an Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-17

    .... These indicators are both quantitative and descriptive and may include, for example, the characteristics... information to continue funding of the Centers, and to evaluate the progress of the program. Estimate of...

  9. 75 FR 27777 - Science Advisory Board Staff Office; Notification of a Public Teleconference and Public Meeting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-18

    ... descriptive and quantitative toxicological information on human health effects that may result from chronic... Center for Environmental Assessment (NCEA) within the Office of Research and Development (ORD). NCEA's...

  10. 3D methodology for evaluating rail crossing roughness.

    DOT National Transportation Integrated Search

    2015-03-02

    Description of Research Project: The overall objective of this project is to investigate and develop a quantitative method or measure for determining the need to rehabilitate rail crossings. The scope of the project includes investigation of sensor capabi...

  11. QUANTITATIVE SOIL DESCRIPTIONS FOR ECOREGIONS OF THE UNITED STATES

    EPA Science Inventory

    Researchers have defined ecological regions of the United States based on patterns in the coincidence of terrestrial, aquatic, abiotic and biotic characteristics that are associated with spatial differences in ecosystems. Ecoregions potentially facilitate regional research, monit...

  12. 42 CFR 495.316 - State monitoring and reporting regarding activities required to receive an incentive payment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... quality measures data; and (v) A description and quantitative data on how its incentive payment program... for quality improvement, reduction of disparities, research or outreach. (ii) Capability to submit...

  13. 42 CFR 495.316 - State monitoring and reporting regarding activities required to receive an incentive payment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... quality measures data; and (v) A description and quantitative data on how its incentive payment program... for quality improvement, reduction of disparities, research or outreach. (ii) Capability to submit...

  14. Download TRIM.Risk

    EPA Pesticide Factsheets

    TRIM.Risk is used to integrate the information on exposure received from TRIM.FaTE or TRIM.Expo with that on dose-response or hazard assessment and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.

  15. The Development, Description and Appraisal of an Emergent Multimethod Research Design to Study Workforce Changes in Integrated Care Interventions

    PubMed Central

    Luijkx, Katrien; Calciolari, Stefano; González-Ortiz, Laura G.

    2017-01-01

    Introduction: In this paper, we provide a detailed and explicit description of the processes and decisions underlying and shaping the emergent multimethod research design of our study on workforce changes in integrated chronic care. Theory and methods: The study was originally planned as mixed method research consisting of a preliminary literature review and quantitative check of these findings via a Delphi panel. However, when the findings of the literature review were not appropriate for quantitative confirmation, we chose to continue our qualitative exploration of the topic via qualitative questionnaires and secondary analysis of two best practice case reports. Results: The resulting research design is schematically described as an emergent and interactive multimethod design with multiphase combination timing. In doing so, we provide other researchers with a set of theory- and experience-based options to develop their own multimethod research and provide an example for more detailed and structured reporting of emergent designs. Conclusion and discussion: We argue that the terminology developed for the description of mixed methods designs should also be used for multimethod designs such as the one presented here. PMID:29042843

  16. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
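
    As a brief illustration of the kind of model fitting the Standard describes (linear, quadratic, and exponential trends), the Python sketch below fits all three to a hypothetical time series with numpy; the data, noise level, and observation count are invented for illustration.

        import numpy as np

        t = np.arange(24, dtype=float)                          # e.g. 24 monthly points
        y = 5.0 + 0.8 * t + np.random.normal(0.0, 1.5, t.size)  # hypothetical trend data

        linear = np.polyfit(t, y, 1)      # y ~ a*t + b
        quadratic = np.polyfit(t, y, 2)   # y ~ a*t^2 + b*t + c
        # Exponential trend y ~ A*exp(k*t), fitted as a line in log space (needs y > 0).
        log_fit = np.polyfit(t, np.log(np.clip(y, 1e-9, None)), 1)
        A, k = np.exp(log_fit[1]), log_fit[0]

        print("linear:", linear)
        print("quadratic:", quadratic)
        print("exponential: A=%.2f, k=%.3f" % (A, k))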

  17. Description and Application of a Mathematical Method for the Analysis of Harmony

    PubMed Central

    Zuo, Qiting; Jin, Runfang; Ma, Junxia

    2015-01-01

    Harmony issues are widespread in human society and nature. To analyze these issues, harmony theory has been proposed as the main theoretical approach for the study of interpersonal relationships and relationships between humans and nature. Therefore, it is of great importance to study harmony theory. After briefly introducing the basic concepts of harmony theory, this paper expounds the five elements that are essential for the quantitative description of harmony issues in water resources management: harmony participant, harmony objective, harmony regulation, harmony factor, and harmony action. A basic mathematical equation for the harmony degree, that is, a quantitative expression of harmony issues, is introduced in the paper: HD = ai − bj, where a is the uniform degree, b is the difference degree, i is the harmony coefficient, and j is the disharmony coefficient. This paper also discusses harmony assessment and harmony regulation and introduces some application examples. PMID:26167535
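
    To make the equation concrete, here is a small worked illustration with invented values (the numbers below are hypothetical, not from the paper): with uniform degree a = 0.8, difference degree b = 0.2, harmony coefficient i = 1, and disharmony coefficient j = 0.5, the harmony degree is HD = 0.8 × 1 − 0.2 × 0.5 = 0.7.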

  18. Household economic modelsof gill net fishermen at Madura strait

    NASA Astrophysics Data System (ADS)

    Primyastanto, M.

    2018-04-01

    The purpose of this research was to analyze household economic models of gill net fishermen at the Madura strait. Thirty families of gill net fishermen were selected by purposive sampling. Data analysis used descriptive qualitative and quantitative methods (regression analysis). Quantitative descriptive analysis was used to analyze and compare the factors affecting the household economic models of gill net fishermen families. The results showed that, in the household economic models of gill net fishermen at the Madura strait, the production value level (fishing revenue at sea) was strongly influenced by production assets, education level, fuel, and work flow. The work flow rate of fishermen families was affected by production assets, non-fisheries work flow, and the number of male workers. Non-fishing income was strongly influenced by non-fishery business assets, the number of family members, and non-fishing work flow. Spending levels of gill net fishermen at the Madura strait were affected by fishing income, non-fishing income, the education of the fishermen's wives, and the number of family members.

  19. Application of principal component analysis (PCA) as a sensory assessment tool for fermented food products.

    PubMed

    Ghosh, Debasree; Chattopadhyay, Parimal

    2012-06-01

    The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability, and acidity, of fermented food products such as cow milk curd and soymilk curd, idli, sauerkraut, and probiotic ice cream. Principal component analysis (PCA) identified six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of the principal components using multiple least squares regression (R² = 0.8). The results from PCA were statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.
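
    As a minimal sketch of the dimensionality-reduction step (a generic PCA on a samples-by-attributes score matrix, not the authors' data or pipeline), the following uses scikit-learn with random stand-in data.

        import numpy as np
        from sklearn.decomposition import PCA

        # Hypothetical matrix: rows = product samples, columns = mean panel scores
        # for sensory attributes (color, body texture, flavor, acidity, ...).
        rng = np.random.default_rng(0)
        scores = rng.random((10, 6))

        pca = PCA(n_components=6)
        components = pca.fit_transform(scores)

        # Cumulative fraction of variance captured by each principal component;
        # in the study, six components explained more than 90% of the variance.
        print(pca.explained_variance_ratio_.cumsum())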

  20. Sensory profile and acceptability for pitanga (Eugenia uniflora L.) nectar with different sweeteners.

    PubMed

    Freitas, Mírian Luisa Faria; Dutra, Mariana Borges de Lima; Bolini, Helena Maria André

    2016-12-01

    The objective of this study was to evaluate the sensory properties and acceptability of pitanga nectar samples prepared with sucrose and different sweeteners (sucralose, aspartame, stevia with 40% rebaudioside A, stevia with 95% rebaudioside A, neotame, and a 2:1 cyclamate/saccharin blend). A total of 13 assessors participated in a quantitative descriptive analysis and evaluated the samples in relation to the descriptor terms. The acceptability test was carried out by 120 fruit juice consumers. The results of the quantitative descriptive analysis of pitanga nectar showed that samples prepared with sucralose, aspartame, and the 2:1 cyclamate/saccharin blend had sensory profiles similar to that of the sample prepared with sucrose. Consumers' most accepted samples were prepared with sucrose, sucralose, aspartame, and neotame. The sweeteners that have the greatest potential to replace sucrose in pitanga nectar are sucralose and aspartame. © The Author(s) 2016.
