Sample records for detailed quantitative analysis

  1. Quantitative Analysis of High-Quality Officer Selection by Commandant's Career-Level Education Board

    DTIC Science & Technology

    2017-03-01

due to Marines being evaluated before the end of their initial service commitment. Our research utilizes quantitative variables to analyze the … not provide detailed information why. B. LIMITATIONS: The photograph analysis in this research is strictly limited to a quantitative analysis in … NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis approved for public release; distribution is unlimited.

  2. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced in using the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
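
    The quantitative core of this method (mean rater estimates per step, then a product of step success probabilities along a processing path) can be sketched as follows; the step names and failure-rate estimates are invented for illustration, not taken from the study:

```python
# Sketch of the fault-analysis arithmetic: nurses' per-step failure-rate
# estimates are averaged, and the product of step success probabilities
# gives the chance a medication-processing path completes without failure.
from statistics import mean

step_estimates = {                 # per-rater failure-rate estimates (invented)
    "select_patient": [0.02, 0.04, 0.03],
    "enter_order":    [0.10, 0.08, 0.12],
    "confirm_dose":   [0.05, 0.05, 0.05],
}

# mean estimate per step in the simplified fault tree
step_failure = {step: mean(est) for step, est in step_estimates.items()}

# probability that every step in the path succeeds
p_path_success = 1.0
for p_fail in step_failure.values():
    p_path_success *= (1.0 - p_fail)

p_path_failure = 1.0 - p_path_success
```

    High per-step values with a low overall failure rate would mirror the abstract's observation that workarounds keep system failure rare while still costing processing time.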

  3. Review of Department of Defense Education Activity (DoDEA) Schools. Volume II: Quantitative Analysis of Educational Quality. IDA Paper.

    ERIC Educational Resources Information Center

    Anderson, Lowell Bruce; Bracken, Jerome; Bracken, Marilyn C.

    This volume compiles, and presents in integrated form, the Institute for Defense Analyses' (IDA) quantitative analysis of educational quality provided by the Department of Defense's dependent schools. It covers the quantitative aspects of volume 1 in greater detail and presents some analyses deemed too technical for that volume. The first task in…

  4. Quantitative analysis of SMEX'02 AIRSAR data for soil moisture inversion

    NASA Technical Reports Server (NTRS)

    Zyl, J. J. van; Njoku, E.; Jackson, T.

    2003-01-01

    This paper discusses in detail the characteristics of the AIRSAR data acquired, and provides an initial quantitative assessment of the accuracy of the radar inversion algorithms under these vegetated conditions.

  5. Quantitative analysis of detailed lignin monomer composition by pyrolysis-gas chromatography combined with preliminary acetylation of the samples.

    PubMed

    Sonoda, T; Ona, T; Yokoi, H; Ishida, Y; Ohtani, H; Tsuge, S

    2001-11-15

Detailed quantitative analysis of lignin monomer composition, comprising p-coumaryl, coniferyl, and sinapyl alcohols and p-coumaraldehyde, coniferaldehyde, and sinapaldehyde in plants, has remained difficult, mainly because of artifact formation during lignin isolation, partial loss of lignin components inherent in chemical degradative methods, and the difficulty of interpreting the complex spectra generally observed for lignin components. Here we propose a new method to quantify lignin monomer composition in detail by pyrolysis-gas chromatography (Py-GC) using acetylated lignin samples. Acetylation prevents the secondary formation of cinnamaldehydes from the corresponding alcohols during pyrolysis, which is otherwise unavoidable to some extent in the conventional Py-GC process. On the basis of the characteristic peaks in the pyrograms of the acetylated samples, lignin monomer compositions in various dehydrogenative polymers (DHPs), used as lignin model compounds, were determined, taking even minor components such as cinnamaldehydes into consideration. The compositions observed by Py-GC were in good agreement with the lignin monomer contents supplied during DHP synthesis. The new Py-GC method combined with sample preacetylation thus allows accurate quantitative analysis of detailed lignin monomer composition from microgram quantities of extractive-free plant samples.
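
    The quantitation step described here reduces to peak-area arithmetic: each characteristic peak area is corrected by a response factor calibrated against the DHPs, then normalized. A minimal sketch, with invented areas and response factors:

```python
# Hypothetical sketch of turning characteristic Py-GC peak areas into a
# monomer composition: divide each area by a calibration (response) factor,
# then normalize to percent. Names and numbers are illustrative only.
peak_area = {              # integrated areas of characteristic pyrolysis peaks
    "p-coumaryl":      120.0,
    "coniferyl":       560.0,
    "sinapyl":         310.0,
    "coniferaldehyde":  15.0,
}
response = {               # per-monomer response factors from DHP calibration
    "p-coumaryl":      1.00,
    "coniferyl":       0.95,
    "sinapyl":         1.10,
    "coniferaldehyde": 0.80,
}

corrected = {m: peak_area[m] / response[m] for m in peak_area}
total = sum(corrected.values())
composition = {m: 100.0 * a / total for m, a in corrected.items()}
```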

  6. Quantitative analysis of single-molecule superresolution images

    PubMed Central

    Coltharp, Carla; Yang, Xinxing; Xiao, Jie

    2014-01-01

This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10-50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006

  7. Engaging Business Students in Quantitative Skills Development

    ERIC Educational Resources Information Center

    Cronin, Anthony; Carroll, Paula

    2015-01-01

    In this paper the complex problems of developing quantitative and analytical skills in undergraduate first year, first semester business students are addressed. An action research project, detailing how first year business students perceive the relevance of data analysis and inferential statistics in light of the economic downturn and the…

  8. A Comprehensive Analysis in Terms of Molecule-Intrinsic, Quasi-Atomic Orbitals. III. The Covalent Bonding Structure of Urea.

    PubMed

    West, Aaron C; Schmidt, Michael W; Gordon, Mark S; Ruedenberg, Klaus

    2015-10-15

    The analysis of molecular electron density matrices in terms of quasi-atomic orbitals, which was developed in previous investigations, is quantitatively exemplified by a detailed application to the urea molecule. The analysis is found to identify strong and weak covalent bonding interactions as well as intramolecular charge transfers. It yields a qualitative as well as quantitative ab initio description of the bonding structure of this molecule, which raises questions regarding some traditional rationalizations.

  9. Quantitative probe of the transition metal redox in battery electrodes through soft x-ray absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Qinghao; Qiao, Ruimin; Wray, L. Andrew; Chen, Jun; Zhuo, Zengqing; Chen, Yanxue; Yan, Shishen; Pan, Feng; Hussain, Zahid; Yang, Wanli

    2016-10-01

Most battery positive electrodes operate with a 3d transition-metal (TM) reaction centre. A direct and quantitative probe of the TM states upon electrochemical cycling is valuable for understanding the detailed cycling mechanism and charge diffusion in the electrodes, which are related to many practical battery parameters. This review comprehensively summarizes our recent demonstrations of five different types of quantitative analysis of the TM states in battery electrodes based on soft x-ray absorption spectroscopy and multiplet calculations. In LiFePO4, a well-known two-phase-transformation system, the TM redox could be strictly determined through a simple linear combination of the two end members. In Mn-based compounds, the Mn states could also be quantitatively evaluated, but a set of reference spectra covering all three possible Mn valences needs to be deliberately selected and considered in the fitting. Although fluorescence signals suffer from self-absorption distortion, multiplet calculations can account for this effect, allowing a quantitative determination of the overall Ni oxidation state in the bulk. With the aid of multiplet calculations, one can also achieve a quasi-quantitative analysis of the Co redox evolution in LiCoO2 based on the energy position of the spectroscopic peak. The benefit of multiplet calculations is even greater for electrode materials with TMs of mixed spin states, as exemplified by the quantitative analysis of the mixed-spin Na2-xFe2(CN)6 system. Finally, we show that such quantitative analysis can provide valuable information for optimizing the electrochemical performance of Na0.44MnO2 electrodes for Na-ion batteries. The methodology summarized in this review could be extended to other energy-application systems with TM redox centres, for example fuel-cell and catalytic materials.
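
    The LiFePO4 case reduces to a two-end-member linear-combination fit. A minimal sketch, using synthetic Gaussian stand-ins for the measured soft-XAS reference spectra (the real analysis fits measured end-member spectra):

```python
# Two-end-member linear-combination fit: model the measured spectrum as
# x*S_charged + (1-x)*S_discharged and recover x by least squares.
import numpy as np

energy = np.linspace(705.0, 715.0, 200)        # eV grid (illustrative)
s_fe2 = np.exp(-(energy - 708.0) ** 2 / 0.5)   # stand-in Fe2+ end member
s_fe3 = np.exp(-(energy - 710.5) ** 2 / 0.5)   # stand-in Fe3+ end member

x_true = 0.3                                   # charged (Fe3+) fraction
measured = x_true * s_fe3 + (1.0 - x_true) * s_fe2

# least-squares fit of the measured spectrum to the two end members
A = np.column_stack([s_fe3, s_fe2])
w, *_ = np.linalg.lstsq(A, measured, rcond=None)
x_fit = w[0] / w.sum()                         # recovered charged fraction
```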

  10. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost-benefit analysis, such as net present value, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
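
    The two metrics named here are short formulas: net present value discounts a stream of cash flows, and levelized cost of energy divides discounted lifetime cost by discounted lifetime output. A hedged sketch with invented cash flows and outputs:

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs at time zero."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def lcoe(rate, costs, energy_output):
    """Levelized cost of energy: discounted cost per discounted unit of energy."""
    disc_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    disc_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energy_output))
    return disc_cost / disc_energy

# illustrative 3-year project: 1000 upfront, 500/year revenue, 5% discount rate
project_npv = npv(0.05, [-1000.0, 500.0, 500.0, 500.0])

# illustrative plant: 1000 capital, 50/year O&M, 400 kWh/year output
cost_per_kwh = lcoe(0.05, [1000.0, 50.0, 50.0, 50.0], [0.0, 400.0, 400.0, 400.0])
```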

  11. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

Linkage analysis is a family-based method to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis, the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.), chosen because, in addition to performing linkage analysis, it includes programs to perform data-cleaning procedures and to generate and test genetic models for a quantitative trait. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis, using output from SEGREG, which was used to determine the best-fitting statistical model for the trait.
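
    For a fully informative, phase-known marker, the single-marker LOD score has a closed form, LOD(theta) = log10 L(theta) - log10 L(0.5). A textbook sketch (not the S.A.G.E. LODLINK implementation) with invented meiosis counts:

```python
# Single-marker LOD score for counts of recombinant and non-recombinant
# meioses, evaluated at a candidate recombination fraction theta.
import math

def lod_score(recombinants, nonrecombinants, theta):
    """LOD(theta) = log10 L(theta) - log10 L(0.5) under a binomial model."""
    log_l_theta = (recombinants * math.log10(theta)
                   + nonrecombinants * math.log10(1 - theta))
    log_l_null = (recombinants + nonrecombinants) * math.log10(0.5)
    return log_l_theta - log_l_null

# e.g. 2 recombinants among 10 scored meioses, evaluated at theta = 0.2
score = lod_score(2, 8, 0.2)
```

    A LOD of 3 or more is the conventional threshold for declaring linkage.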

  12. Products of combustion of non-metallic materials

    NASA Technical Reports Server (NTRS)

    Perry, Cortes L.

    1995-01-01

    The objective of this project is to evaluate methodologies for the qualitative and quantitative determination of the gaseous products of combustion of non-metallic materials of interest to the aerospace community. The goal is to develop instrumentation and analysis procedures which qualitatively and quantitatively identify gaseous products evolved by thermal decomposition and provide NASA a detailed system operating procedure.

  13. X-ray vision of fuel sprays.

    PubMed

    Wang, Jin

    2005-03-01

    With brilliant synchrotron X-ray sources, microsecond time-resolved synchrotron X-ray radiography and tomography have been used to elucidate the detailed three-dimensional structure and dynamics of high-pressure high-speed fuel sprays in the near-nozzle region. The measurement allows quantitative determination of the fuel distribution in the optically impenetrable region owing to the multiple scattering of visible light by small atomized fuel droplets surrounding the jet. X-radiographs of the jet-induced shock waves prove that the fuel jets become supersonic under appropriate injection conditions and that the quantitative analysis of the thermodynamic properties of the shock waves can also be derived from the most direct measurement. In other situations where extremely axial-asymmetric sprays are encountered, mass deconvolution and cross-sectional fuel distribution models can be computed based on the monochromatic and time-resolved X-radiographic images collected from various rotational orientations of the sprays. Such quantitative analysis reveals the never-before-reported characteristics and most detailed near-nozzle mass distribution of highly transient fuel sprays.

  14. Investigation of Carbon Fiber Architecture in Braided Composites Using X-Ray CT Inspection

    NASA Technical Reports Server (NTRS)

    Rhoads, Daniel J.; Miller, Sandi G.; Roberts, Gary D.; Rauser, Richard W.; Golovaty, Dmitry; Wilber, J. Patrick; Espanol, Malena I.

    2017-01-01

    During the fabrication of braided carbon fiber composite materials, process variations occur which affect the fiber architecture. Quantitative measurements of local and global fiber architecture variations are needed to determine the potential effect of process variations on mechanical properties of the cured composite. Although non-destructive inspection via X-ray CT imaging is a promising approach, difficulties in quantitative analysis of the data arise due to the similar densities of the material constituents. In an effort to gain more quantitative information about features related to fiber architecture, methods have been explored to improve the details that can be captured by X-ray CT imaging. Metal-coated fibers and thin veils are used as inserts to extract detailed information about fiber orientations and inter-ply behavior from X-ray CT images.

  15. Computer-Assisted Analysis of Spontaneous Speech: Quantification of Basic Parameters in Aphasic and Unimpaired Language

    ERIC Educational Resources Information Center

    Hussmann, Katja; Grande, Marion; Meffert, Elisabeth; Christoph, Swetlana; Piefke, Martina; Willmes, Klaus; Huber, Walter

    2012-01-01

    Although generally accepted as an important part of aphasia assessment, detailed analysis of spontaneous speech is rarely carried out in clinical practice mostly due to time limitations. The Aachener Sprachanalyse (ASPA; Aachen Speech Analysis) is a computer-assisted method for the quantitative analysis of German spontaneous speech that allows for…

  16. 77 FR 71479 - Tribal Consultation Consistent With Executive Order 13175; Request for Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ... Native Communities. The research will provide policy-makers, Tribal governments, Tribal community organizations, and economic development practitioners with detailed analysis and quantitative research that can...

  17. Falcon: A Temporal Visual Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.

    2016-09-05

Flexible visual exploration of long, high-resolution time series from multiple sensor streams is a challenge in several domains. Falcon is a visual analytics approach that helps researchers acquire a deep understanding of patterns in log and imagery data. Falcon allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations with multiple levels of detail. These capabilities are applicable to the analysis of any quantitative time series.

  18. Airport Surface Traffic Control Concept Formulation Study : Volume 3. Operations Analysis of O'Hare Airport - Part 2.

    DOT National Transportation Integrated Search

    1975-07-01

    The volume presents the results of the quantitative analyses of the O'Hare ASTC System operations. The operations environments for the periods selected for detailed analysis of the ASDE films and controller communications recording are described. Fol...

  19. Rapid qualitative and quantitative analysis of proanthocyanidin oligomers and polymers by UPLC-MS/MS

    USDA-ARS?s Scientific Manuscript database

    Proanthocyanidins (PAs) are a structurally complex and bioactive group of tannins. Detailed analysis of PA concentration, composition, and structure typically requires the use of one or more time-consuming analytical methods. For example, the commonly employed thiolysis and phloroglucinolysis method...

  20. The other half of the story: effect size analysis in quantitative research.

    PubMed

    Maher, Jessica Middlemis; Markey, Jonathan C; Ebert-May, Diane

    2013-01-01

    Statistical significance testing is the cornerstone of quantitative research, but studies that fail to report measures of effect size are potentially missing a robust part of the analysis. We provide a rationale for why effect size measures should be included in quantitative discipline-based education research. Examples from both biological and educational research demonstrate the utility of effect size for evaluating practical significance. We also provide details about some effect size indices that are paired with common statistical significance tests used in educational research and offer general suggestions for interpreting effect size measures. Finally, we discuss some inherent limitations of effect size measures and provide further recommendations about reporting confidence intervals.
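
    One of the paired indices alluded to above is Cohen's d, the effect size commonly reported alongside a two-sample t-test. A minimal sketch with invented scores:

```python
# Cohen's d for two independent groups, using the pooled standard deviation.
import math
from statistics import mean, variance

def cohens_d(group1, group2):
    """Standardized mean difference: (M1 - M2) / pooled SD."""
    n1, n2 = len(group1), len(group2)
    pooled_var = ((n1 - 1) * variance(group1)
                  + (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    return (mean(group1) - mean(group2)) / math.sqrt(pooled_var)

treatment = [78.0, 82.0, 85.0, 88.0, 90.0]   # invented exam scores
control = [70.0, 74.0, 75.0, 79.0, 81.0]
d = cohens_d(treatment, control)
```

    By the common rule of thumb, d around 0.2 is small, 0.5 medium, and 0.8 or more large; the invented data above yield a large effect.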

  1. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.

  2. What Is Rotating in Exploratory Factor Analysis?

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2015-01-01

Exploratory factor analysis (EFA) is one of the most commonly reported quantitative methodologies in the social sciences, yet much of the detail regarding what happens during an EFA remains unclear. The goal of this brief technical note is to explore what "rotation" is, what exactly is rotating, and why we use rotation when performing…

  3. Practical considerations for obtaining high quality quantitative computed tomography data of the skeletal system.

    PubMed

    Troy, Karen L; Edwards, W Brent

    2018-05-01

Quantitative CT (QCT) analysis involves the calculation of specific parameters, such as bone volume and density, from CT image data, and can be a powerful tool for understanding bone quality and quantity. However, without careful attention to detail during all steps of the acquisition and analysis process, data can range from poor to unusable in quality. Good-quality QCT for research requires meticulous attention to detail and standardization of all aspects of data collection and analysis to a degree that is uncommon in a clinical setting. Here, we review the literature to summarize practical and technical considerations for obtaining high-quality QCT data and provide examples of how each recommendation affects calculated variables. We also provide an overview of the QCT analysis technique to illustrate additional opportunities to improve data reproducibility and reliability. Key recommendations include standardizing the scanner and data acquisition settings, minimizing image artifacts, selecting an appropriate reconstruction algorithm, and maximizing repeatability and objectivity during QCT analysis. The goal of these recommendations is to reduce potential sources of error throughout the analysis, from scan acquisition to the interpretation of results. Copyright © 2018 Elsevier Inc. All rights reserved.
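
    One standardization step commonly paired with QCT, converting scanner Hounsfield units to bone mineral density via a calibration phantom, can be sketched as follows; the phantom densities and HU values are invented for illustration:

```python
# Linear phantom calibration: fit HU = slope * density + intercept from rods
# of known density, then invert the line to map voxel HU values to BMD.
import numpy as np

known_density = np.array([0.0, 75.0, 150.0])   # phantom rod BMD, mg/cm^3
measured_hu = np.array([2.0, 80.0, 158.0])     # mean HU measured in each rod

slope, intercept = np.polyfit(known_density, measured_hu, 1)

def hu_to_bmd(hu):
    """Invert the calibration line to estimate BMD from a HU value."""
    return (hu - intercept) / slope

bmd = hu_to_bmd(np.array([41.0, 119.0]))       # two hypothetical voxel means
```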

  4. Visualization and Quantitative Analysis of Crack-Tip Plastic Zone in Pure Nickel

    NASA Astrophysics Data System (ADS)

    Kelton, Randall; Sola, Jalal Fathi; Meletis, Efstathios I.; Huang, Haiying

    2018-05-01

    Changes in surface morphology have long been thought to be associated with crack propagation in metallic materials. We have studied areal surface texture changes around crack tips in an attempt to understand the correlations between surface texture changes and crack growth behavior. Detailed profiling of the fatigue sample surface was carried out at short fatigue intervals. An image processing algorithm was developed to calculate the surface texture changes. Quantitative analysis of the crack-tip plastic zone, crack-arrested sites near triple points, and large surface texture changes associated with crack release from arrested locations was carried out. The results indicate that surface texture imaging enables visualization of the development of plastic deformation around a crack tip. Quantitative analysis of the surface texture changes reveals the effects of local microstructures on the crack growth behavior.
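
    The surface-texture quantification described here can be illustrated with the simplest areal parameter, Sa (arithmetic-mean height deviation), tracked between fatigue intervals. The height maps below are synthetic stand-ins for profilometry data, not the study's algorithm:

```python
# Track a surface texture change between two fatigue intervals via the
# areal arithmetic-mean roughness Sa of each height map.
import numpy as np

def sa(height_map):
    """Areal arithmetic-mean roughness: mean |deviation| from the mean plane."""
    z = np.asarray(height_map, dtype=float)
    return float(np.mean(np.abs(z - z.mean())))

rng = np.random.default_rng(0)
before = rng.normal(0.0, 0.10, size=(64, 64))  # pre-interval surface patch
after = rng.normal(0.0, 0.25, size=(64, 64))   # roughened post-interval patch

texture_change = sa(after) - sa(before)        # positive => texture increased
```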

  5. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    NASA Astrophysics Data System (ADS)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

Quantitative depth analysis of subsurface anomalies with enhanced depth resolution is a challenging task in thermographic depth estimation. Frequency modulated thermal wave imaging, introduced earlier, provides a complete depth scan of the object by stimulating it with a suitable band of frequencies and analyzing the resulting thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with limited frequency resolution and therefore yield finite depth resolution. The spectral zooming provided by the chirp z-transform affords enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide the closest possible estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first solution for quantitative depth estimation in frequency modulated thermal wave imaging.
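
    The spectral zooming idea is to sample the spectrum on a dense grid inside a narrow band; the chirp z-transform computes exactly such samples efficiently. A sketch using a direct (O(N*M)) evaluation for clarity, with an illustrative signal unrelated to the thermal data:

```python
# Zoomed spectral analysis: evaluate the DTFT on a fine frequency grid inside
# a narrow band, far finer than the FFT bin spacing for the same record.
import numpy as np

fs = 100.0                        # sampling rate, Hz (illustrative)
t = np.arange(0, 10, 1 / fs)      # 10 s record -> plain FFT bins 0.1 Hz apart
x = np.sin(2 * np.pi * 1.037 * t)

# zoom on the 0.9-1.2 Hz band with 0.001 Hz spacing (the chirp z-transform
# produces the same samples in O(N log N) instead of this O(N*M) loop)
f_zoom = np.arange(0.9, 1.2, 0.001)
n = np.arange(len(x))
dtft = np.array([np.sum(x * np.exp(-2j * np.pi * f / fs * n)) for f in f_zoom])
peak_freq = f_zoom[np.argmax(np.abs(dtft))]
```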

  6. Analysis of Radio Frequency Surveillance Systems for Air Traffic Control : Volume 1. Text.

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  7. Methodology Series Module 10: Qualitative Health Research

    PubMed Central

    Setia, Maninder Singh

    2017-01-01

    Although quantitative designs are commonly used in clinical research, some studies require qualitative methods. These designs are different from quantitative methods; thus, researchers should be aware of data collection methods and analyses for qualitative research. Qualitative methods are particularly useful to understand patient experiences with the treatment or new methods of management or to explore issues in detail. These methods are useful in social and behavioral research. In qualitative research, often, the main focus is to understand the issue in detail rather than generalizability; thus, the sampling methods commonly used are purposive sampling; quota sampling; and snowball sampling (for hard to reach groups). Data can be collected using in-depth interviews (IDIs) or focus group discussions (FGDs). IDI is a one-to-one interview with the participant. FGD is a method of group interview or discussion, in which more than one participant is interviewed at the same time and is usually led by a facilitator. The commonly used methods for data analysis are: thematic analysis; grounded theory analysis; and framework analysis. Qualitative data collection and analysis require special expertise. Hence, if the reader plans to conduct qualitative research, they should team up with a qualitative researcher. PMID:28794545

  9. Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies

    NASA Astrophysics Data System (ADS)

    Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.

    2004-08-01

The Joint Battlespace Infosphere (JBI) program is investigating global communications, data mining and warehousing, and data fusion technologies, focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of the individual and combined effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. To facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project and then a more concentrated view of the detailed communications simulation techniques used to generate the underlying support data sets.

  10. Quantitative and qualitative approaches in the study of poverty and adolescent development: separation or integration?

    PubMed

    Leung, Janet T Y; Shek, Daniel T L

    2011-01-01

This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision, and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows the illumination of new insights. With the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed-methods approach offers a way of integrating quantitative and qualitative approaches and an alternative strategy for studying the impact of economic disadvantage on family processes and adolescent development.

  11. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    PubMed

    Singh, Iqbal

    2008-01-01

To prospectively determine precise stone composition (quantitative analysis) by infrared spectroscopy in patients with urinary stone disease presenting to our clinic, and to determine an ideal method of stone analysis suitable for use in a clinical setting. After a routine and detailed metabolic workup, stone samples from 50 patients with urolithiasis satisfying the entry criteria were subjected to Fourier transform infrared spectroscopic analysis, after adequate sample homogenization, at a single testing center. A mixture of calcium oxalate monohydrate and dihydrate was most commonly encountered, in 35 patients (71%), followed by calcium phosphate, carbonate apatite, magnesium ammonium phosphate hexahydrate, and xanthine stones. Fourier transform infrared spectroscopy allows an accurate, reliable, quantitative method of stone analysis and helps in maintaining a large computerized reference library. Knowledge of precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities, which may prevent or delay stone recurrence.
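
    The reference-library matching underlying such analysis can be sketched as a normalized-correlation search over library spectra. All spectra below are synthetic Gaussian placeholders, not real FTIR data, and the compound names are purely illustrative:

```python
# Identify a stone spectrum by its best normalized correlation against a
# small library of reference spectra (synthetic stand-ins).
import numpy as np

wavenumber = np.linspace(400, 4000, 500)

def band(center, width):
    """A single Gaussian absorption band on the shared wavenumber grid."""
    return np.exp(-((wavenumber - center) ** 2) / (2 * width ** 2))

library = {
    "calcium oxalate monohydrate": band(1620, 40) + 0.6 * band(780, 30),
    "carbonate apatite":           band(1030, 60) + 0.4 * band(1450, 50),
    "uric acid":                   band(1590, 35) + 0.5 * band(1400, 30),
}

# sample: a scaled oxalate-like spectrum (scaling cancels after standardization)
sample = 0.95 * (band(1620, 40) + 0.6 * band(780, 30))

def correlation(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

best = max(library, key=lambda name: correlation(sample, library[name]))
```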

  12. [Analysis of active components of evidence materials secured in the cases of drugs abuse associated with amphetamines and cannabis products].

    PubMed

    Wachowiak, Roman; Strach, Bogna

    2006-01-01

The study takes advantage of presently available, effective physicochemical methods (isolation, crystallization, determination of melting point, TLC, GLC, and UV spectrophotometry) for objective and reliable qualitative and quantitative analysis of frequently abused drugs. The authors determined the conditions for qualitative and quantitative analysis of the active components of secured evidence materials containing amphetamine sulphate, methylamphetamine hydrochloride, and 3,4-methylenedioxymethamphetamine hydrochloride (MDMA, Ecstasy), as well as delta(9)-tetrahydrocannabinol (delta(9)-THC) as an active component of cannabis (marihuana, hashish). The usefulness of physicochemical tests of evidence materials for opinionating purposes is subject to detailed forensic toxicological interpretation.
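
    The UV-spectrophotometric quantitation step typically rests on a Beer's-law calibration line built from standards of known concentration. A hedged sketch with invented standards and absorbances:

```python
# Beer's-law calibration: fit A = slope * c + intercept from standards, then
# read an unknown concentration off the inverted line.
import numpy as np

conc_standards = np.array([5.0, 10.0, 20.0, 40.0])  # ug/mL (invented)
absorbance = np.array([0.11, 0.21, 0.42, 0.83])     # at the absorption maximum

slope, intercept = np.polyfit(conc_standards, absorbance, 1)

def quantify(a_unknown):
    """Concentration corresponding to a measured absorbance."""
    return (a_unknown - intercept) / slope

c = quantify(0.52)   # concentration of a hypothetical evidence extract, ug/mL
```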

  13. An analysis of radio frequency surveillance systems for air traffic control volume II: appendixes

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  14. Qualitative and Quantitative Analyses of Glycogen in Human Milk.

    PubMed

    Matsui-Yatsuhashi, Hiroko; Furuyashiki, Takashi; Takata, Hiroki; Ishida, Miyuki; Takumi, Hiroko; Kakutani, Ryo; Kamasaka, Hiroshi; Nagao, Saeko; Hirose, Junko; Kuriki, Takashi

    2017-02-22

    Identification as well as a detailed analysis of glycogen in human milk has not been shown yet. The present study confirmed that glycogen is contained in human milk by qualitative and quantitative analyses. High-performance anion exchange chromatography (HPAEC) and high-performance size exclusion chromatography with a multiangle laser light scattering detector (HPSEC-MALLS) were used for qualitative analysis of glycogen in human milk. Quantitative analysis was carried out by using samples obtained from the individual milks. The result revealed that the concentration of human milk glycogen varied depending on the mother's condition, such as the period postpartum and inflammation. The amounts of glycogen in human milk collected at 0 and 1-2 months postpartum were higher than in milk collected at 3-14 months postpartum. In the milk from mothers with severe mastitis, the concentration of glycogen was about 40 times higher than that in normal milk.

  15. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure

    PubMed Central

    Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng

    2014-01-01

    Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. Compared with conventional techniques, the approach dramatically improves the spatial resolution and reveals finer detail within a region of interest of a sample larger than the field of view. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and understanding the transformation of the minerals during coal processing. The method is generic and can be applied for three-dimensional compositional characterization of other materials. PMID:24763649

  16. Quantitative analysis of small molecule-nucleic acid interactions with a biosensor surface and surface plasmon resonance detection.

    PubMed

    Liu, Yang; Wilson, W David

    2010-01-01

    Surface plasmon resonance (SPR) technology with biosensor surfaces has become a widely used tool for the study of nucleic acid interactions without any labeling requirements. The method provides simultaneous kinetic and equilibrium characterization of the interactions of biomolecules as well as small molecule-biopolymer binding. SPR monitors molecular interactions in real time and provides significant advantages over optical or calorimetric methods for systems with strong binding coupled to small spectroscopic signals and/or reaction heats. A detailed and practical guide for nucleic acid interaction analysis using SPR-biosensor methods is presented. Details of the SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips, and samples, as well as extensive information on experimental design, quantitative and qualitative data analysis and presentation. A specific example of the interaction of a minor-groove-binding agent with DNA is evaluated by both kinetic and steady-state SPR methods to illustrate the technique. Since molecules that bind cooperatively to specific DNA sequences are attractive for many applications, a cooperative small molecule-DNA interaction is also presented.
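
A rough sketch of the steady-state analysis such a method involves: fitting a 1:1 Langmuir binding isotherm to equilibrium SPR responses to recover KD. All concentrations and response values below are invented for illustration, not taken from the study; scipy is assumed available.

```python
import numpy as np
from scipy.optimize import curve_fit

def steady_state_response(conc, rmax, kd):
    """1:1 Langmuir isotherm: R_eq = Rmax * C / (KD + C)."""
    return rmax * conc / (kd + conc)

# Hypothetical steady-state responses for a minor-groove binder titration
true_rmax, true_kd = 50.0, 2.0e-7                # RU, M (illustrative)
conc = np.array([1e-8, 3e-8, 1e-7, 3e-7, 1e-6, 3e-6])
resp = steady_state_response(conc, true_rmax, true_kd)

# Nonlinear least-squares fit recovers Rmax and the dissociation constant
(rmax_fit, kd_fit), _ = curve_fit(steady_state_response, conc, resp,
                                  p0=(40.0, 1.0e-7))
print(f"Rmax = {rmax_fit:.1f} RU, KD = {kd_fit:.2e} M")
```

In practice the responses carry noise and the fit is weighted accordingly; the noiseless synthetic data here simply make the isotherm's role explicit.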

  17. 3D quantitative comparative analysis of long bone diaphysis variations in microanatomy and cross-sectional geometry.

    PubMed

    Houssaye, Alexandra; Taverne, Maxime; Cornette, Raphaël

    2018-05-01

    Long bone inner structure and cross-sectional geometry display a strong functional signal, leading to convergences, and are widely analyzed in comparative anatomy at small and large taxonomic scales. Long bone microanatomical studies have essentially been conducted on transverse sections but also on a few longitudinal ones. Recent studies highlighted the interest in analyzing variations of the inner structure along the diaphysis using a qualitative as well as a quantitative approach. With the development of microtomography, it has become possible to study three-dimensional (3D) bone microanatomy and, in more detail, the form-function relationships of these features. This study focused on the selection of quantitative parameters to describe in detail the cross-sectional shape changes and distribution of the osseous tissue along the diaphysis. Two-dimensional (2D) virtual transverse sections were also performed in the two usual reference planes and results were compared with those obtained based on the whole diaphysis analysis. The sample consisted of 14 humeri and 14 femora of various mammalian taxa that are essentially terrestrial. Comparative quantitative analyses between different datasets made it possible to highlight the parameters that are strongly impacted by size and phylogeny and the redundant ones, and thus to estimate their relevance for use in form-function analyses. The analysis illustrated that results based on 2D transverse sections are similar for both sectional planes; thus, even if a strong bias exists when mixing sections from the two reference planes in the same analysis, it would not be problematic to use either one plane or the other in comparative studies. However, this may no longer hold for taxa showing a much stronger variation in bone microstructure along the diaphysis. 
Finally, the analysis demonstrated the significant contribution of the parameters describing variations along the diaphysis, and thus the interest in performing 3D analyses; this should be even more fruitful for heterogeneous diaphyses. In addition, covariation analyses showed that there is a strong interest in removing the size effect to access the differences in the microstructure of the humerus and femur. This methodological study provides a reference for future quantitative analyses on long bone inner structure and should make it possible, through a detailed knowledge of each descriptive parameter, to better interpret results from the multivariate analyses associated with these studies. This will have direct implications for studies in vertebrate anatomy, but also in paleontology and anthropology. © 2018 Anatomical Society.

  18. P53 Mutation Analysis to Predict Tumor Response in Patients Undergoing Neoadjuvant Treatment for Locally Advanced Breast Cancer

    DTIC Science & Technology

    2006-10-01

    then sequenced (for GeneChip-positive) or analyzed by SSCP (for GeneChip-negative). We have received a total of 43 core breast biopsy DNA samples from the UNC... quantitative luciferase reporter. Both reporters exploit a "rheostatable" promoter for p53 expression and utilize the "delitto perfetto" in vivo... quantitative luciferase-based assay is also being used to characterize the altered-function mutants in greater detail. Preliminary

  19. A collection of flow visualization techniques used in the Aerodynamic Research Branch

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow and a sampling of these methods is presented. It is emphasized that visualization is a precursor to thorough quantitative analysis and to subsequent physical understanding of these flow fields.

  20. Quantitative phase analysis and microstructure characterization of magnetite nanocrystals obtained by microwave assisted non-hydrolytic sol–gel synthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sciancalepore, Corrado, E-mail: corrado.sciancalepore@unimore.it; Bondioli, Federica; INSTM Consortium, Via G. Giusti 9, 51121 Firenze

    2015-02-15

    An innovative preparation procedure, based on microwave assisted non-hydrolytic sol–gel synthesis, to obtain spherical magnetite nanoparticles was reported together with a detailed quantitative phase analysis and microstructure characterization of the synthetic products. The nanoparticle growth was analyzed as a function of the synthesis time and was described in terms of crystallization degree, employing the Rietveld method on the magnetic nanostructured system for the determination of the amorphous content using hematite as internal standard. Product crystallinity increases as the microwave thermal treatment is prolonged and reaches very high percentages for synthesis times longer than 1 h. Microstructural evolution of nanocrystals was followed by the integral breadth methods to obtain information on the crystallite size-strain distribution. The results of diffraction line profile analysis were compared with the nanoparticle grain distribution estimated by dimensional analysis of the transmission electron microscopy (TEM) images. A variation both in the average grain size and in the distribution of the coherently diffracting domains is evidenced, suggesting a relationship between the two quantities. The traditional integral breadth methods have proven to be valid for a rapid assessment of the diffraction line broadening effects in the above-mentioned nanostructured systems, and the basic assumptions for the correct use of these methods are discussed as well. - Highlights: • Fe3O4 nanocrystals were obtained by MW-assisted non-hydrolytic sol–gel synthesis. • Quantitative phase analysis revealed that crystallinity up to 95% was reached. • The strategy of Rietveld refinements was discussed in detail. • Dimensional analysis showed nanoparticles ranging from 4 to 8 nm. • Results of integral breadth methods were compared with microscopic analysis.
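
The amorphous-content determination behind this kind of Rietveld analysis can be sketched with the standard internal-standard relation: Rietveld normalizes crystalline phases to 100%, so the standard appears over-represented whenever amorphous material is present. The 20/21 wt% figures below are hypothetical, chosen only so the result lands near the crystallinity range the abstract reports.

```python
def amorphous_fraction(w_std, r_std):
    """Internal-standard method: w_std is the weighed weight fraction of
    the standard (here hematite) in the mixture; r_std is the fraction the
    Rietveld refinement reports. The mixture's amorphous fraction is
    1 - w_std / r_std, then rescaled to refer to the sample alone."""
    a_mixture = 1.0 - w_std / r_std
    return a_mixture / (1.0 - w_std)

# Hypothetical refinement: 20 wt% hematite weighed in, Rietveld reports 21 wt%
a = amorphous_fraction(0.20, 0.21)
crystallinity = 100.0 * (1.0 - a)
print(f"crystallinity = {crystallinity:.1f}%")
```

With these illustrative numbers the sample comes out roughly 94% crystalline, which is the kind of value the abstract's ">1 h synthesis" products reach.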

  1. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  2. Characterization of Low-Molecular-Weight Heparins by Strong Anion-Exchange Chromatography.

    PubMed

    Sadowski, Radosław; Gadzała-Kopciuch, Renata; Kowalkowski, Tomasz; Widomski, Paweł; Jujeczka, Ludwik; Buszewski, Bogusław

    2017-11-01

    Currently, detailed structural characterization of low-molecular-weight heparin (LMWH) products is an analytical subject of great interest. In this work, we carried out a comprehensive structural analysis of LMWHs and applied a modified pharmacopeial method, as well as methods developed by other researchers, to the analysis of novel biosimilar LMWH products; and, for the first time, compared the qualitative and quantitative composition of commercially available drugs (enoxaparin, nadroparin, and dalteparin). For this purpose, we used strong anion-exchange (SAX) chromatography with spectrophotometric detection because this method is more helpful, easier, and faster than other separation techniques for the detailed disaccharide analysis of new LMWH drugs. In addition, we subjected the obtained results to statistical analysis (factor analysis, t-test, and Newman-Keuls post hoc test).

  3. Dominant Epistasis Between Two Quantitative Trait Loci Governing Sporulation Efficiency in Yeast Saccharomyces cerevisiae

    PubMed Central

    Bergman, Juraj; Mitrikeski, Petar T.

    2015-01-01

    Summary Sporulation efficiency in the yeast Saccharomyces cerevisiae is a well-established model for studying quantitative traits. A variety of genes and nucleotides causing different sporulation efficiencies in laboratory, as well as in wild strains, has already been extensively characterised (mainly by reciprocal hemizygosity analysis and nucleotide exchange methods). We applied a different strategy in order to analyze the variation in sporulation efficiency of laboratory yeast strains. Coupling classical quantitative genetic analysis with simulations of phenotypic distributions (a method we call phenotype modelling) enabled us to obtain a detailed picture of the quantitative trait loci (QTLs) relationships underlying the phenotypic variation of this trait. Using this approach, we were able to uncover a dominant epistatic inheritance of loci governing the phenotype. Moreover, a molecular analysis of known causative quantitative trait genes and nucleotides allowed for the detection of novel alleles, potentially responsible for the observed phenotypic variation. Based on the molecular data, we hypothesise that the observed dominant epistatic relationship could be caused by the interaction of multiple quantitative trait nucleotides distributed across a 60-kb QTL region located on chromosome XIV and the RME1 locus on chromosome VII. Furthermore, we propose a model of molecular pathways which possibly underlie the phenotypic variation of this trait. PMID:27904371

  4. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.

  5. Imaging and quantitative methods for studying cytoskeletal rearrangements during root development and gravitropism.

    PubMed

    Jacques, Eveline; Wells, Darren M; Bennett, Malcolm J; Vissenberg, Kris

    2015-01-01

    High-resolution imaging of cytoskeletal structures paves the way for standardized methods to quantify cytoskeletal organization. Here we provide a detailed description of the analysis performed to determine the microtubule patterns in gravistimulated roots, using the recently developed software tool MicroFilament Analyzer.

  6. The Representation of Pragmatic Knowledge in Recent ELT Textbooks

    ERIC Educational Resources Information Center

    Ren, Wei; Han, Zhengrui

    2016-01-01

    Pragmatic competence has become an increasingly crucial component of language pedagogy. This article reports on a quantitative and qualitative study of ten English language textbooks used in Chinese universities with a particular focus on their coverage of pragmatic knowledge. Detailed analysis focused specifically on the mention of pragmatic…

  7. Spiraling between qualitative and quantitative data on women's health behaviors: a double helix model for mixed methods.

    PubMed

    Mendlinger, Sheryl; Cwikel, Julie

    2008-02-01

    A double helix spiral model is presented which demonstrates how to combine qualitative and quantitative methods of inquiry in an interactive fashion over time. Using findings on women's health behaviors (e.g., menstruation, breast-feeding, coping strategies), we show how qualitative and quantitative methods highlight the theory of knowledge acquisition in women's health decisions. A rich data set of 48 semistructured, in-depth ethnographic interviews with mother-daughter dyads from six ethnic groups (Israeli, European, North African, Former Soviet Union [FSU], American/Canadian, and Ethiopian), plus seven focus groups, provided the qualitative sources for analysis. This data set formed the basis of research questions used in a quantitative telephone survey of 302 Israeli women from the ages of 25 to 42 from four ethnic groups. We employed multiple cycles of data analysis from both data sets to produce a more detailed and multidimensional picture of women's health behavior decisions through a spiraling process.

  8. Two worlds collide: Image analysis methods for quantifying structural variation in cluster molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steenbergen, K. G., E-mail: kgsteen@gmail.com; Gaston, N.

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.
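
A minimal sketch of the PCA shape analysis the abstract describes, assuming only atomic position data for a single MD frame. The synthetic point clouds below stand in for real cluster coordinates; none of the numbers come from the paper.

```python
import numpy as np

def shape_descriptor(positions):
    """PCA of atomic coordinates: the eigenvalues of the position covariance
    (gyration) tensor characterize the cluster's geometric shape at one MD
    time step. Returned in descending order; a spherical cluster gives three
    comparable eigenvalues, an elongated one a single dominant eigenvalue."""
    centered = positions - positions.mean(axis=0)
    cov = centered.T @ centered / len(positions)
    return np.sort(np.linalg.eigvalsh(cov))[::-1]

rng = np.random.default_rng(0)
sphere = rng.normal(size=(500, 3))            # roughly isotropic "cluster"
prolate = sphere * np.array([3.0, 1.0, 1.0])  # same cloud stretched along x

print(shape_descriptor(sphere))               # three similar eigenvalues
print(shape_descriptor(prolate))              # one dominant eigenvalue
```

Tracking these eigenvalues frame by frame over an MD trajectory gives the kind of quantitative stability/variation measure described, without any a priori structural input.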

  9. Pre-Steady-State Kinetic Analysis of Single-Nucleotide Incorporation by DNA Polymerases

    PubMed Central

    Su, Yan; Guengerich, F. Peter

    2016-01-01

    Pre-steady-state kinetic analysis is a powerful and widely used method to obtain multiple kinetic parameters. This protocol provides a step-by-step procedure for pre-steady-state kinetic analysis of single-nucleotide incorporation by a DNA polymerase. It describes the experimental details of DNA substrate annealing, reaction mixture preparation, handling of the RQF-3 rapid quench-flow instrument, denaturing polyacrylamide DNA gel preparation, electrophoresis, quantitation, and data analysis. The core and unique part of this protocol is the rationale for preparation of the reaction mixture (the ratio of the polymerase to the DNA substrate) and methods for conducting pre-steady-state assays on an RQF-3 rapid quench-flow instrument, as well as data interpretation after analysis. In addition, the methods for the DNA substrate annealing and DNA polyacrylamide gel preparation, electrophoresis, quantitation and analysis are suitable for use in other studies. PMID:27248785
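
Data from such rapid quench-flow single-nucleotide incorporation experiments are commonly fit to a burst equation, product = A(1 - e^(-k_obs t)) + k_ss t, to separate the fast first turnover from the slower steady-state phase. The sketch below uses synthetic, hypothetical values rather than anything from this protocol; scipy is assumed available.

```python
import numpy as np
from scipy.optimize import curve_fit

def burst(t, amp, k_obs, k_ss):
    """Burst equation: fast exponential first turnover (amplitude amp,
    rate k_obs) followed by a linear steady-state phase (rate k_ss)."""
    return amp * (1.0 - np.exp(-k_obs * t)) + k_ss * t

# Hypothetical quench-flow time points (s) and product formed (nM)
t_pts = np.linspace(0.005, 1.0, 40)
product = burst(t_pts, 95.0, 22.0, 8.0)

(amp, k_obs, k_ss), _ = curve_fit(burst, t_pts, product, p0=(50.0, 10.0, 1.0))
print(f"A = {amp:.1f} nM, k_obs = {k_obs:.1f} /s, k_ss = {k_ss:.1f} nM/s")
```

A biphasic fit of this shape (burst amplitude near the enzyme concentration) is what indicates that chemistry, not product release, limits the first turnover.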

  10. Two worlds collide: image analysis methods for quantifying structural variation in cluster molecular dynamics.

    PubMed

    Steenbergen, K G; Gaston, N

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.

  11. Comprehensive analysis of ß-lactam antibiotics including penicillins, cephalosporins, and carbapenems in poultry muscle using liquid chromatography coupled to tandem mass spectrometry.

    PubMed

    Berendsen, Bjorn J A; Gerritsen, Henk W; Wegh, Robin S; Lameris, Steven; van Sebille, Ralph; Stolker, Alida A M; Nielen, Michel W F

    2013-09-01

    A comprehensive method for the quantitative residue analysis of trace levels of 22 ß-lactam antibiotics, including penicillins, cephalosporins, and carbapenems, in poultry muscle by liquid chromatography in combination with tandem mass spectrometric detection is reported. The samples analyzed for ß-lactam residues are hydrolyzed using piperidine in order to improve compound stability and to include the total residue content of the cephalosporin ceftiofur. The reaction procedure was optimized using a full experimental design. On the basis of detailed isotope labeling and tandem mass spectrometry studies, together with exact mass measurements using high-resolution mass spectrometry, reaction schemes could be proposed for all ß-lactams studied. The main reaction occurring is the hydrolysis of the ß-lactam ring under formation of the piperidine-substituted amide. For some ß-lactams, multiple isobaric hydrolysis reaction products are obtained, in accordance with expectations, but this did not hamper quantitative analysis. The final method was fully validated as a quantitative confirmatory residue analysis method according to Commission Decision 2002/657/EC and showed satisfactory quantitative performance for all compounds with trueness between 80 and 110% and within-laboratory reproducibility below 22% at target level, except for biapenem. For biapenem, the method proved to be suitable for qualitative analysis only.
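
The 2002/657/EC performance figures quoted above reduce to simple statistics on replicate recoveries: trueness is the mean recovery relative to the spiked (nominal) level, and within-laboratory reproducibility is a coefficient of variation. A sketch with hypothetical replicate data, not the paper's:

```python
import numpy as np

def trueness_and_cv(measured, nominal):
    """Trueness (%) = mean measured / nominal * 100;
    CV (%) = sample standard deviation / mean * 100."""
    measured = np.asarray(measured, dtype=float)
    trueness = 100.0 * measured.mean() / nominal
    cv = 100.0 * measured.std(ddof=1) / measured.mean()
    return trueness, cv

# Hypothetical replicate results for one ß-lactam spiked at 50 µg/kg
reps = [47.1, 51.3, 49.8, 44.9, 50.2, 48.5]
trueness, cv = trueness_and_cv(reps, nominal=50.0)
print(f"trueness = {trueness:.1f}%, CV = {cv:.1f}%")
```

Against the criteria cited in the abstract, these illustrative replicates would pass (trueness within 80-110%, CV well below 22%).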

  12. Safety evaluation methodology for advanced coal extraction systems

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.

    1981-01-01

    Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.

  13. Quantified Energy Dissipation Rates in the Terrestrial Bow Shock. 1.; Analysis Techniques and Methodology

    NASA Technical Reports Server (NTRS)

    Wilson, L. B., III; Sibeck, D. G.; Breneman, A.W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.

    2014-01-01

    We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, (-j · E ) (minus current density times measured electric field), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (greater than 100 millivolts per meter and/or greater than 1 nanotesla) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.
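
The core comparison described, low-frequency quasi-static versus high-frequency wave contributions to the ohmic dissipation rate j · E, can be sketched by band-splitting synthetic field-aligned time series. The sample rate, the 10 Hz split frequency, and all amplitudes below are hypothetical, chosen only to mimic the qualitative result; scipy is assumed available.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 8192.0                                 # Hz, hypothetical sample rate
t = np.arange(0.0, 2.0, 1.0 / fs)

# Hypothetical aligned components: a quasi-static trend plus a
# large-amplitude high-frequency wave, with j and E in phase in each band.
e = 0.02 * np.sin(2 * np.pi * 0.5 * t) + 0.15 * np.sin(2 * np.pi * 400.0 * t)
j = 1e-6 * np.sin(2 * np.pi * 0.5 * t) + 4e-6 * np.sin(2 * np.pi * 400.0 * t)

sos_lo = butter(4, 10.0, "lowpass", fs=fs, output="sos")
sos_hi = butter(4, 10.0, "highpass", fs=fs, output="sos")

def mean_je(j_band, e_band):
    """Time-averaged ohmic rate <j . E> for one frequency band (W/m^3)."""
    return float(np.mean(j_band * e_band))

lo = mean_je(sosfiltfilt(sos_lo, j), sosfiltfilt(sos_lo, e))
hi = mean_je(sosfiltfilt(sos_hi, j), sosfiltfilt(sos_hi, e))
print(f"low-frequency: {lo:.2e} W/m^3, high-frequency: {hi:.2e} W/m^3")
```

With these illustrative amplitudes the high-frequency band dominates the dissipation budget, the qualitative conclusion the abstract reports for real shock-crossing data.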

  14. Estimation of Characteristics of Echo Envelope Using RF Echo Signal from the Liver

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Tadashi; Hachiya, Hiroyuki; Kamiyama, Naohisa; Ikeda, Kazuki; Moriyasu, Norifumi

    2001-05-01

    To realize quantitative diagnosis of liver cirrhosis, we have been analyzing the probability density function (PDF) of echo amplitude using B-mode images. However, the B-mode image is affected by the various signal and image processing techniques used in the diagnosis equipment, so a detailed and quantitative analysis is very difficult. In this paper, we analyze the PDF of echo amplitude using RF echo signal and B-mode images of normal and cirrhotic livers, and compare both results to examine the validity of the RF echo signal.
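
For fully developed speckle from many fine scatterers, the echo envelope amplitude that this PDF analysis targets follows a Rayleigh distribution, and deviation from Rayleigh statistics is the cue for tissue change. A sketch with synthetic RF data (the comparison value is the theoretical Rayleigh SD/mean ratio, not data from the study); scipy is assumed available.

```python
import numpy as np
from scipy.stats import rayleigh

# Synthetic RF signal: complex Gaussian scattering, whose amplitude
# (the echo envelope) is Rayleigh distributed.
rng = np.random.default_rng(3)
rf = rng.normal(size=20000) + 1j * rng.normal(size=20000)
envelope = np.abs(rf)

loc, scale = rayleigh.fit(envelope, floc=0)      # fit the Rayleigh scale

# For a Rayleigh PDF, SD/mean is fixed at sqrt(4/pi - 1) ~ 0.523;
# envelope statistics that depart from this value suggest non-Rayleigh
# scattering such as the structural change seen in cirrhotic liver.
ratio = envelope.std() / envelope.mean()
print(f"scale = {scale:.3f}, SD/mean = {ratio:.3f}")
```

Working from the RF signal, as the abstract argues, keeps this statistic free of the B-mode processing (compression, filtering) applied by the scanner.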

  15. Returning to Work after Cancer: Quantitative Studies and Prototypical Narratives

    PubMed Central

    Steiner, John F.; Nowels, Carolyn T.; Main, Deborah S.

    2009-01-01

    Objective A combination of quantitative data and illustrative narratives may allow cancer survivorship researchers to disseminate their research findings more broadly. We identified recent, methodologically rigorous quantitative studies on return to work after cancer, summarized the themes from these studies, and illustrated those themes with narratives of individual cancer survivors. Methods We reviewed English-language studies of return to work for adult cancer survivors through June, 2008, and identified 13 general themes from papers that met methodological criteria (population-based sampling, prospective and longitudinal assessment, detailed assessment of work, evaluation of economic impact, assessment of moderators of work return, and large sample size). We drew survivorship narratives from a prior qualitative research study to illustrate these themes. Results Nine quantitative studies met 4 or more of our 6 methodological criteria. These studies suggested that most cancer survivors could return to work without residual disabilities. Cancer site, clinical prognosis, treatment modalities, socioeconomic status, and attributes of the job itself influenced the likelihood of work return. Three narratives (a typical survivor who returned to work after treatment, an individual unable to return to work, and an inspiring survivor who returned to work despite substantial barriers) illustrated many of the themes from the quantitative literature while providing additional contextual details. Conclusion Illustrative narratives can complement the findings of cancer survivorship research if researchers are rigorous and transparent in the selection, analysis, and retelling of those stories. PMID:19507264

  16. On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment

    NASA Astrophysics Data System (ADS)

    Guterres, Rui M.

    The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. 
    Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is studied qualitatively and quantitatively. Its impact on the accuracy of the computations, as well as the incompatibility of wall drag with the theoretical model followed, is discussed. The newly developed quantitative analysis provides significantly increased accuracy: for the best cases, the computed aerodynamic drag coefficient is within one percent of the balance-measured value.
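    The core of such a wake-survey drag computation is an integration of the momentum deficit over the survey plane. A minimal sketch follows, assuming incompressible flow and neglecting the static-pressure and crossflow terms of the full wake-integral formulation; the grid, freestream values, and Gaussian deficit are hypothetical:

```python
import numpy as np

def wake_drag(u, y, z, U_inf, rho=1.225):
    """Approximate drag from a wake survey: D = integral of rho*u*(U_inf - u) dA.

    u : 2D array of streamwise velocity samples, shape (len(z), len(y))
    y, z : 1D arrays of uniformly spaced survey coordinates (m)
    """
    dy = y[1] - y[0]
    dz = z[1] - z[0]
    deficit_flux = rho * u * (U_inf - u)   # local momentum-deficit flux
    return deficit_flux.sum() * dy * dz    # Riemann-sum surface integral

# Synthetic survey plane: Gaussian velocity deficit behind a bluff body
y = np.linspace(-0.5, 0.5, 101)
z = np.linspace(-0.5, 0.5, 101)
Y, Z = np.meshgrid(y, z)
U_inf = 30.0                               # freestream velocity, m/s
u = U_inf - 5.0 * np.exp(-(Y**2 + Z**2) / (2 * 0.05**2))
drag = wake_drag(u, y, z, U_inf)           # drag force in newtons
```

    In a real survey the deficit field would come from the measured three-dimensional wake data, and the neglected terms would need the treatment derived in the work above.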

  17. The Fathering Indicators Framework: A Tool for Quantitative and Qualitative Analysis.

    ERIC Educational Resources Information Center

    Gadsden, Vivian, Ed.; Fagan, Jay, Ed.; Ray, Aisha, Ed.; Davis, James Earl, Ed.

    The Fathering Indicators Framework (FIF) is an evaluation tool designed to help researchers, practitioners, and policymakers conceptualize, examine, and measure change in fathering behaviors in relation to child and family well-being. This report provides a detailed overview of the research and theory informing the development of the FIF. The FIF…

  18. Quantitative Study of Vibrational Symmetry of Injured Vocal Folds via Digital Kymography in Excised Canine Larynges

    ERIC Educational Resources Information Center

    Krausert, Christopher R.; Ying, Di; Zhang, Yu; Jiang, Jack J.

    2011-01-01

    Purpose: Digital kymography and vocal fold curve fitting are blended with detailed symmetry analysis of kymograms to provide a comprehensive characterization of the vibratory properties of injured vocal folds. Method: Vocal fold vibration of 12 excised canine larynges was recorded under uninjured, unilaterally injured, and bilaterally injured…

  19. 76 FR 1660 - Request for Information for the 2011 Trafficking in Persons Report

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

    ... observations, or other sources of quantitative or qualitative data, details on the research or data-gathering... to Congress and Assessment of U.S. Government Activities to Combat Human Trafficking (``AG Report... analysis in the TIP report is done in addition to the AG Report, resulting in a multi-faceted self...

  20. Secondary Interstellar Oxygen in the Heliosphere: Numerical Modeling and Comparison with IBEX-Lo Data

    NASA Astrophysics Data System (ADS)

    Baliukin, I. I.; Izmodenov, V. V.; Möbius, E.; Alexashov, D. B.; Katushkina, O. A.; Kucharek, H.

    2017-12-01

    Quantitative analysis of the interstellar heavy (oxygen and neon) atom fluxes obtained by the Interstellar Boundary Explorer (IBEX) suggests the existence of the secondary interstellar oxygen component. This component is formed near the heliopause due to charge exchange of interstellar oxygen ions with hydrogen atoms, as was predicted theoretically. A detailed quantitative analysis of the fluxes of interstellar heavy atoms is only possible with a model that takes into account both the filtration of primary and the production of secondary interstellar oxygen in the boundary region of the heliosphere as well as a detailed simulation of the motion of interstellar atoms inside the heliosphere. This simulation must take into account photoionization, charge exchange with the protons of the solar wind and solar gravitational attraction. This paper presents the results of modeling interstellar oxygen and neon atoms through the heliospheric interface and inside the heliosphere based on a three-dimensional kinetic-MHD model of the solar wind interaction with the local interstellar medium and a comparison of these results with the data obtained on the IBEX spacecraft.

  1. Detailed mechanism of benzene oxidation

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1987-01-01

    A detailed quantitative mechanism for the oxidation of benzene in both argon- and nitrogen-diluted systems is presented. Computed ignition delay times for argon-diluted mixtures are in satisfactory agreement with experimental results for a wide range of initial conditions. An experimental temperature-versus-time profile for a nitrogen-diluted oxidation was accurately matched, and several concentration profiles were matched qualitatively. Application of sensitivity analysis has yielded approximate rate constant expressions for the two dominant heat-release reactions, the oxidation of C6H5 and C5H5 radicals by molecular oxygen.
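    Rate constant expressions of the kind recovered by such a sensitivity study are conventionally of modified-Arrhenius form. A small sketch of that form and of a brute-force logarithmic sensitivity coefficient follows; the coefficients below are illustrative placeholders, not the fitted values from this mechanism:

```python
import math

def arrhenius(T, A, b, Ea):
    """Modified Arrhenius rate constant k = A * T**b * exp(-Ea / (R*T)).

    T in K, Ea in J/mol, A in the units of k.
    """
    R = 8.314  # J/(mol*K)
    return A * T**b * math.exp(-Ea / (R * T))

# Illustrative coefficients for a phenyl + O2 type step (hypothetical values)
A, b, Ea = 1.0e13, 0.0, 25_000.0 * 4.184   # Ea entered as 25 kcal/mol

k_1200 = arrhenius(1200.0, A, b, Ea)
k_1500 = arrhenius(1500.0, A, b, Ea)

# Finite-difference logarithmic sensitivity of k to Ea at 1500 K;
# analytically d ln k / d ln Ea = -Ea / (R*T)
eps = 1e-6
S = (math.log(arrhenius(1500.0, A, b, Ea * (1 + eps)))
     - math.log(k_1500)) / eps
```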

  2. Developing a Multiplexed Quantitative Cross-Linking Mass Spectrometry Platform for Comparative Structural Analysis of Protein Complexes.

    PubMed

    Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan

    2016-10-18

    Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.

  3. Quantitative analysis of naphthenic acids in water by liquid chromatography-accurate mass time-of-flight mass spectrometry.

    PubMed

    Hindle, Ralph; Noestheden, Matthew; Peru, Kerry; Headley, John

    2013-04-19

    This study details the development of a routine method for quantitative analysis of oil sands naphthenic acids, which are a complex class of compounds found naturally and as contaminants in oil sands process waters from Alberta's Athabasca region. Expanding beyond classical naphthenic acids (CnH2n-zO2), those compounds conforming to the formula CnH2n-zOx (where 2 ≤ x ≤ 4) were examined in commercial naphthenic acid and environmental water samples. HPLC facilitated a five-fold reduction in ion suppression when compared to the more commonly used flow injection analysis. A comparison of 39 model naphthenic acids revealed significant variability in response factors, demonstrating the necessity of using naphthenic acid mixtures for quantitation, rather than model compounds. It was also demonstrated that naphthenic acid heterogeneity (commercial and environmental) necessitates establishing a single NA mix as the standard against which all quantitation is performed. The authors present the first ISO 17025-accredited method for the analysis of naphthenic acids in water using HPLC high-resolution accurate mass time-of-flight mass spectrometry. The method detection limit was 1 mg/L total oxy-naphthenic acids (Sigma technical mix). Copyright © 2013 Elsevier B.V. All rights reserved.
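    Species in the CnH2n-zOx classes are typically assigned by matching accurate masses of the deprotonated ions against candidate formulas. A sketch of that arithmetic follows, using standard monoisotopic masses; the tolerance value and the example ion are illustrative:

```python
# Monoisotopic atomic masses (u) and the mass of a proton
M_C, M_H, M_O = 12.0, 1.00782503, 15.99491462
M_PROTON = 1.00727646

def neutral_mass(n, z, x):
    """Monoisotopic mass of a CnH(2n-z)Ox species."""
    return n * M_C + (2 * n - z) * M_H + x * M_O

def matches(mz_observed, n, z, x, tol_ppm=5.0):
    """Does an observed [M-H]- m/z match the formula within tol_ppm?"""
    mz_calc = neutral_mass(n, z, x) - M_PROTON
    return abs(mz_observed - mz_calc) / mz_calc * 1e6 <= tol_ppm

mass_c10 = neutral_mass(10, 2, 2)   # monocyclic naphthenic acid C10H18O2
ok = matches(169.1234, 10, 2, 2)    # its [M-H]- ion near m/z 169.1234
```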

  4. Chernobyl Doses. Volume 1. Analysis of Forest Canopy Radiation Response from Multispectral Imagery and the Relationship to Doses

    DTIC Science & Technology

    1994-09-01

    Defense Nuclear Agency, Alexandria, VA 22310-3398, report DNA-TR-92-37-V1, September 1994. This volume of the report Chernobyl Doses presents details of a new, quantitative method for remotely sensing ionizing radiation dose to vegetation.

  5. Hypercuboidal renormalization in spin foam quantum gravity

    NASA Astrophysics Data System (ADS)

    Bahr, Benjamin; Steinhaus, Sebastian

    2017-06-01

    In this article, we apply background-independent renormalization group methods to spin foam quantum gravity. It is aimed at extending and elucidating the analysis of a companion paper, in which the existence of a fixed point in the truncated renormalization group flow for the model was reported. Here, we repeat the analysis with various modifications and find that both qualitative and quantitative features of the fixed point are robust in this setting. We also go into details about the various approximation schemes employed in the analysis.

  6. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full-field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide detailed qualitative and quantitative insight into archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
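    The PCA step at the core of a statistical shape model reduces aligned point coordinates to a small set of modes whose weights can then be compared across samples. A minimal sketch on synthetic 2D landmark data follows; the landmark layout and the single injected mode of variation are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# 30 aligned "shapes", each 10 landmarks in 2D, flattened to 20 values.
# Variation is injected along one direction, so one mode should dominate.
base = rng.normal(size=20)
mode = rng.normal(size=20)
shapes = np.array([base + rng.normal(scale=2.0) * mode
                   + rng.normal(scale=0.05, size=20) for _ in range(30)])

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape

# PCA via SVD: rows of Vt are the principal modes of shape variation
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)      # variance fraction per mode

# PC weights: each shape's coordinates in the mode basis
weights = centered @ Vt.T

# Reconstruct the first shape from its first mode only
recon = mean_shape + weights[0, 0] * Vt[0]
```

    Inter-sample comparison then operates on the `weights` matrix rather than on raw coordinates.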

  7. Quantitative fingerprinting by headspace--two-dimensional comprehensive gas chromatography-mass spectrometry of solid matrices: some challenging aspects of the exhaustive assessment of food volatiles.

    PubMed

    Nicolotti, Luca; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Sgorbini, Barbara; Rubiolo, Patrizia; Bicchi, Carlo

    2013-10-10

    The study proposes an investigation strategy that simultaneously provides detailed profiling and quantitative fingerprinting of food volatiles, through a "comprehensive" analytical platform that includes sample preparation by Headspace Solid Phase Microextraction (HS-SPME), separation by two-dimensional comprehensive gas chromatography coupled with mass spectrometry detection (GC×GC-MS) and data processing using advanced fingerprinting approaches. Experiments were carried out on roasted hazelnuts and on Gianduja pastes (sugar, vegetable oil, hazelnuts, cocoa, nonfat dried milk, vanilla flavorings) and demonstrated that the information potential of each analysis can better be exploited if suitable quantitation methods are applied. Quantitation approaches through Multiple Headspace Extraction and Standard Addition were compared in terms of performance parameters (linearity, precision, accuracy, Limit of Detection and Limit of Quantitation) under headspace linearity conditions. The results on 19 key analytes, potent odorants, and technological markers, and more than 300 fingerprint components, were used for further processing to obtain information concerning the effect of the matrix on volatile release, and to produce an informative chemical blueprint for use in sensomics and flavoromics. The importance of quantitation approaches in headspace analysis of solid matrices of complex composition, and the advantages of MHE, are also critically discussed. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Structure and Function of Iron-Loaded Synthetic Melanin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yiwen; Xie, Yijun; Wang, Zhao

    We describe a synthetic method for increasing and controlling the iron loading of synthetic melanin nanoparticles and use the resulting materials to perform a systematic quantitative investigation on their structure-property relationship. A comprehensive analysis by magnetometry, electron paramagnetic resonance, and nuclear magnetic relaxation dispersion reveals the complexities of their magnetic behavior and how these intraparticle magnetic interactions manifest in useful material properties such as their performance as MRI contrast agents. This analysis allows predictions of the optimal iron loading through a quantitative modeling of antiferromagnetic coupling that arises from proximal iron ions. This study provides a detailed understanding of this complex class of synthetic biomaterials and gives insight into interactions and structures prevalent in naturally occurring melanins.

  9. Quantitative analysis of backflow of reversible pump-turbine in generating mode

    NASA Astrophysics Data System (ADS)

    Liu, K. H.; Zhang, Y. N.; Li, J. W.; Xian, H. Z.

    2016-05-01

    Significant vibration and pressure fluctuations are usually observed when a pump-turbine is operated in off-design conditions, especially turbine brake and runaway. The root cause of these instability phenomena is the abnormal unsteady flow (especially the backflow) inside the pump-turbine. In the present paper, a numerical simulation method is adopted to investigate the characteristics of the flow inside the whole passage of a pump-turbine with two guide vane openings (6° and 21°) and three operating conditions (turbine, runaway, and turbine braking). A quantitative analysis of backflow is performed in both the axial and radial directions, and the generation and development of backflow in the pump-turbine are revealed in detail.

  10. Local structure in LaMnO3 and CaMnO3 perovskites: A quantitative structural refinement of Mn K -edge XANES data

    NASA Astrophysics Data System (ADS)

    Monesi, C.; Meneghini, C.; Bardelli, F.; Benfatto, M.; Mobilio, S.; Manju, U.; Sarma, D. D.

    2005-11-01

    Hole-doped perovskites such as La1-xCaxMnO3 present special magnetic and magnetotransport properties, and it is commonly accepted that the local atomic structure around the Mn ions plays a crucial role in determining these peculiar features. Experimental techniques directly probing the local atomic structure, such as x-ray absorption spectroscopy (XAS), have therefore been widely exploited to understand the physics of these compounds in depth. Quantitative XAS analysis usually concerns the extended region [extended x-ray absorption fine structure (EXAFS)] of the absorption spectra. The near-edge region [x-ray absorption near-edge spectroscopy (XANES)] of XAS spectra can provide detailed complementary information on the electronic structure and local atomic topology around the absorber. However, the complexity of XANES analysis usually prevents a quantitative understanding of the data. This work exploits the recently developed MXAN code to achieve a quantitative structural refinement of the Mn K-edge XANES of LaMnO3 and CaMnO3, the end compounds of the doped manganite series La1-xCaxMnO3. The results derived from the EXAFS and XANES analyses are in good agreement, demonstrating that a quantitative picture of the local structure can be obtained from XANES in these crystalline compounds. Moreover, the quantitative XANES analysis provides topological information not directly achievable from EXAFS data analysis. This work demonstrates that combining the analysis of the extended and near-edge regions of Mn K-edge XAS spectra can provide a complete and accurate description of the Mn local atomic environment in these compounds.

  11. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR is its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
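    The multi-analyte quantitation that qHNMR enables commonly rests on the internal-standard purity relation, in which analyte purity follows from relative integrals, proton counts, molar masses, and weighed masses. A compact sketch follows; the analyte, standard, and all numbers below are hypothetical:

```python
def qhnmr_purity(I_a, I_std, N_a, N_std, M_a, M_std, m_a, m_std, P_std):
    """Analyte purity from relative 1H integrals against an internal standard.

    I: signal integrals, N: number of protons in the quantified signal,
    M: molar masses (g/mol), m: weighed masses (mg), P_std: standard purity.
    """
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std

# Hypothetical run: a 3H analyte signal against a 9H signal of a standard
purity = qhnmr_purity(I_a=0.5, I_std=1.0, N_a=3, N_std=9,
                      M_a=300.0, M_std=150.0, m_a=30.0, m_std=5.0,
                      P_std=0.999)
```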

  12. Aerodynamic design and analysis of small horizontal axis wind turbine blades

    NASA Astrophysics Data System (ADS)

    Tang, Xinzi

    This work investigates the aerodynamic design and analysis of small horizontal axis wind turbine blades via the blade element momentum (BEM) based approach and the computational fluid dynamics (CFD) based approach. From this research, it is possible to draw a series of detailed guidelines on small wind turbine blade design and analysis. The research also provides a platform for further comprehensive study using these two approaches. The wake induction corrections and stall corrections of the BEM method were examined through a case study of the NREL/NASA Phase VI wind turbine. A hybrid stall correction model was proposed to analyse wind turbine power performance. The proposed model shows improvement in power prediction for the validation case, compared with the existing stall correction models. The effects of the key rotor parameters of a small wind turbine, as well as the blade chord and twist angle distributions, on power performance were investigated through two typical wind turbines, i.e., a fixed-pitch variable-speed (FPVS) wind turbine and a fixed-pitch fixed-speed (FPFS) wind turbine. An engineering blade design and analysis code was developed in MATLAB to accommodate aerodynamic design and analysis of the blades. The linearisation of the radial profiles of blade chord and twist angle for the FPFS wind turbine blade design was discussed. Results show that the proposed linearisation approach leads to reduced manufacturing cost and higher annual energy production (AEP), with minimal effects on low wind speed performance. Comparative studies of mesh and turbulence models in 2D and 3D CFD modelling were conducted. The CFD-predicted lift and drag coefficients of the airfoil S809 were compared with wind tunnel test data, and the 3D CFD modelling method for the NREL/NASA Phase VI wind turbine was validated against measurements. Airfoil aerodynamic characterisation and wind turbine power performance, as well as 3D flow details, were studied.
    The detailed flow characteristics from the CFD modelling are quantitatively comparable to the measurements, such as blade surface pressure distribution and integrated forces and moments. It is confirmed that the CFD approach is able to provide a more detailed qualitative and quantitative analysis for wind turbine airfoils and rotors.
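    The BEM approach referred to above solves, annulus by annulus, for the axial and tangential induction factors by fixed-point iteration. A minimal single-annulus sketch follows; the operating point and solidity are hypothetical, and a thin-airfoil lift slope stands in for tabulated airfoil data (no tip-loss or high-induction corrections are included):

```python
import math

def bem_annulus(U, omega, r, theta, sigma, Cd=0.01, relax=0.5, tol=1e-8):
    """Fixed-point BEM iteration for one blade annulus.

    Returns the axial (a) and tangential (ap) induction factors.
    U: wind speed (m/s), omega: rotor speed (rad/s), r: local radius (m),
    theta: local pitch + twist (rad), sigma: local solidity.
    """
    a, ap = 0.0, 0.0
    for _ in range(500):
        phi = math.atan2((1 - a) * U, (1 + ap) * omega * r)  # inflow angle
        alpha = phi - theta                                   # angle of attack
        Cl = 2 * math.pi * alpha                              # thin-airfoil lift
        Cn = Cl * math.cos(phi) + Cd * math.sin(phi)          # normal coeff.
        Ct = Cl * math.sin(phi) - Cd * math.cos(phi)          # tangential coeff.
        a_new = 1.0 / (4 * math.sin(phi) ** 2 / (sigma * Cn) + 1)
        ap_new = 1.0 / (4 * math.sin(phi) * math.cos(phi) / (sigma * Ct) - 1)
        if abs(a_new - a) < tol and abs(ap_new - ap) < tol:
            break
        a = relax * a_new + (1 - relax) * a       # under-relax for stability
        ap = relax * ap_new + (1 - relax) * ap
    return a, ap

# Hypothetical mid-span station of a small rotor, local speed ratio 5
a, ap = bem_annulus(U=7.0, omega=35.0, r=1.0, theta=math.radians(2.0),
                    sigma=0.05)
```

    A full code like the MATLAB tool described above repeats this per annulus with airfoil tables and the wake and stall corrections under study.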

  13. Managing Technical and Cost Uncertainties During Product Development in a Simulation-Based Design Environment

    NASA Technical Reports Server (NTRS)

    Karandikar, Harsh M.

    1997-01-01

    An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.

  14. Capital Budgeting Guidelines: How to Decide Whether to Fund a New Dorm or an Upgraded Computer Lab.

    ERIC Educational Resources Information Center

    Swiger, John; Klaus, Allen

    1996-01-01

    A process for college and university decision making and budgeting for capital outlays that focuses on evaluating the qualitative and quantitative benefits of each proposed project is described and illustrated. The process provides a means to solicit suggestions from those involved and provide detailed information for cost-benefit analysis. (MSE)

  15. Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.

    ERIC Educational Resources Information Center

    Lindahl, William H.; Gardner, James H.

    Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…

  16. Development and Evaluation of a Multimedia e-Learning Resource for Electrolyte and Acid-Base Disorders

    ERIC Educational Resources Information Center

    Davids, Mogamat Razeen; Chikte, Usuf M. E.; Halperin, Mitchell L.

    2011-01-01

    This article reports on the development and evaluation of a Web-based application that provides instruction and hands-on practice in managing electrolyte and acid-base disorders. Our teaching approach, which focuses on concepts rather than details, encourages quantitative analysis and a logical problem-solving approach. Identifying any dangers to…

  17. The Effects of Corrective Feedback on Chinese Learners' Writing Accuracy: A Quantitative Analysis in an EFL Context

    ERIC Educational Resources Information Center

    Wang, Xin

    2017-01-01

    Scholars debate whether corrective feedback contributes to improving L2 learners' grammatical accuracy in writing performance. Some researchers take a stance on the ineffectiveness of corrective feedback based on the impracticality of providing detailed corrective feedback for all L2 learners and detached grammar instruction in language…

  18. Using Mixed Methods to Study First-Year College Impact on Liberal Arts Learning Outcomes

    ERIC Educational Resources Information Center

    Seifert, Tricia A.; Goodman, Kathleen; King, Patricia M.; Baxter Magolda, Marcia B.

    2010-01-01

    This study details the collection, analysis, and interpretation of data from a national multi-institutional longitudinal mixed methods study of college impact and student development of liberal arts outcomes. The authors found three sets of practices in the quantitative data that corroborated with the themes that emerged from the qualitative data:…

  19. Analysis of Coupled Model Uncertainties in Source to Dose Modeling of Human Exposures to Ambient Air Pollution: a PM2.5 Case-Study

    EPA Science Inventory

    Quantitative assessment of human exposures and health effects due to air pollution involve detailed characterization of impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis taking into acco...

  20. Predictors of Academic Success for Maori, Pacific and Non-Maori Non-Pacific Students in Health Professional Education: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Wikaire, Erena; Curtis, Elana; Cormack, Donna; Jiang, Yannan; McMillan, Louise; Loto, Rob; Reid, Papaarangi

    2017-01-01

    Tertiary institutions internationally aim to increase student diversity; however, they are struggling to achieve equitable academic outcomes for indigenous and ethnic minority students, and detailed exploration of the factors that impact success is required. This study explored the predictive effect of admission variables on academic outcomes for health…

  1. Method for a quantitative investigation of the frozen flow hypothesis

    PubMed

    Schock; Spillar

    2000-09-01

    We present a technique to test the frozen flow hypothesis quantitatively, using data from wave-front sensors such as those found in adaptive optics systems. Detailed treatments of the theoretical background of the method and of the error analysis are presented. Analyzing data from the 1.5-m and 3.5-m telescopes at the Starfire Optical Range, we find that the frozen flow hypothesis is an accurate description of the temporal development of atmospheric turbulence on time scales of the order of 1-10 ms but that significant deviations from the frozen flow behavior are present for longer time scales.
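    Under the frozen flow hypothesis, a turbulence screen sampled at two instants differs only by a rigid translation, so the bulk shift can be recovered from the peak of a cross-correlation of wavefront-sensor frames. A toy one-dimensional sketch follows; the screen, shift, and sampling are synthetic, with a periodic wrap standing in for a long screen:

```python
import numpy as np

rng = np.random.default_rng(42)

# A 1D "phase screen" and the same screen a short time later,
# rigidly shifted by 17 samples.
screen_t0 = rng.normal(size=256)
true_shift = 17
screen_t1 = np.roll(screen_t0, true_shift)

# Circular cross-correlation via FFT; the argmax gives the translation.
ccf = np.fft.ifft(np.conj(np.fft.fft(screen_t0)) * np.fft.fft(screen_t1)).real
estimated_shift = int(np.argmax(ccf))
```

    Deviations from frozen flow show up as a decaying correlation peak at longer time lags, which is the effect quantified in the study above.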

  2. Quantitative assessment of RNA-protein interactions with high-throughput sequencing-RNA affinity profiling.

    PubMed

    Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T

    2015-08-01

    Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.

  3. X-ray diffraction and SEM study of kidney stones in Israel: quantitative analysis, crystallite size determination, and statistical characterization.

    PubMed

    Uvarov, Vladimir; Popov, Inna; Shapur, Nandakishore; Abdin, Tamer; Gofrit, Ofer N; Pode, Dov; Duvdevani, Mordechai

    2011-12-01

    Urinary calculi have been recognized as one of the most painful medical disorders. Reliable knowledge of the phase composition of the stones is very important for elucidating the underlying etiology of stone disease. We report here the results of quantitative X-ray diffraction phase analysis performed on 278 kidney stones from 275 patients treated at the Department of Urology of Hadassah Hebrew University Hospital (Jerusalem, Israel). Quantification of biominerals in multicomponent samples was performed using the normalized reference intensity ratio method. According to the observed phase compositions, all the tested stones were classified into five chemical groups: oxalates (43.2%), phosphates (7.7%), urates (10.3%), cystines (2.9%), and stones composed of a mixture of different minerals (35.9%). A detailed analysis of each chemical group is presented, along with crystallite size calculations for all the observed crystalline phases. The results are compared with published data from different geographical regions. The morphology and spatial distribution of the phases identified in the kidney stones were studied with scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDS). This is the first detailed study of the phase composition and structural characteristics of kidney stones performed in Israel.
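    The normalized reference intensity ratio quantification reduces to a short weighted normalization of phase intensities. A sketch follows; the intensities and RIR constants below are invented for illustration, not values from this study:

```python
def rir_weight_fractions(intensities, rir_values):
    """Weight fractions by the normalized reference intensity ratio method:
    w_i = (I_i / RIR_i) / sum_j (I_j / RIR_j)."""
    scaled = [I / rir for I, rir in zip(intensities, rir_values)]
    total = sum(scaled)
    return [s / total for s in scaled]

# Hypothetical three-phase stone: whewellite, weddellite, apatite
phases = ["whewellite", "weddellite", "apatite"]
I = [1000.0, 250.0, 100.0]     # strongest-line integrated intensities
RIR = [1.0, 0.8, 1.2]          # I/Icorundum values (illustrative)
w = rir_weight_fractions(I, RIR)
```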

  4. IB-LBM simulation of the haemocyte dynamics in a stenotic capillary.

    PubMed

    Yuan-Qing, Xu; Xiao-Ying, Tang; Fang-Bao, Tian; Yu-Hua, Peng; Yong, Xu; Yan-Jun, Zeng

    2014-01-01

    To study the behaviour of a haemocyte crossing a stenotic capillary, the immersed boundary-lattice Boltzmann method was used to establish a quantitative analysis model. The haemocyte was assumed to be spherical with an elastic cell membrane, which can be driven by the blood flow to become highly deformable. In the stenotic capillary, the spherical blood cell is stressed both by the flow and by the wall geometry, and the cell is forced to stretch to cross the stenosis. Our simulation investigated the crossing process in detail. The velocity and pressure fields were analysed to obtain information on how blood flows through a capillary and to estimate the degree of cell damage caused by excessive pressure. Quantitative velocity analysis demonstrated that a large haemocyte crossing a small stenosis has a noticeable effect on the blood flow, while quantitative pressure-distribution analysis indicated that the crossing process produces a distinctive pressure distribution in the cell interior and, to some extent, a sudden change between the cell interior and the surrounding plasma.

  5. Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing

    PubMed Central

    Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak

    2012-01-01

    This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early stage Osteoporosis and Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knock out studies. Furthermore, we also envision potential clinical applications including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
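    Along a single fibril axis, the 2D-FFT D-spacing measurement reduces to locating the dominant spatial frequency of the banding. A one-dimensional toy version follows; the height profile is synthetic, with the canonical ~67 nm collagen period built in, and the sampling parameters are illustrative:

```python
import numpy as np

# Synthetic height profile along a fibril axis: 1024 samples at 1 nm/px
# with a 67 nm banding period plus noise.
n, pixel_nm, d_true = 1024, 1.0, 67.0
x = np.arange(n) * pixel_nm
rng = np.random.default_rng(1)
profile = np.sin(2 * np.pi * x / d_true) + 0.2 * rng.normal(size=n)

# FFT: the strongest non-DC peak gives the banding frequency
spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
freqs = np.fft.rfftfreq(n, d=pixel_nm)        # cycles per nm
peak = 1 + np.argmax(spectrum[1:])            # skip the DC bin
d_estimated = 1.0 / freqs[peak]               # nm per cycle
```

    The paper's point that the D-spacing *distribution* matters would correspond to repeating this per fibril and examining the spread of `d_estimated` values, not just their mean.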

  6. Nanoscale morphological analysis of soft matter aggregates with fractal dimension ranging from 1 to 3.

    PubMed

    Valle, Francesco; Brucale, Marco; Chiodini, Stefano; Bystrenova, Eva; Albonetti, Cristiano

    2017-09-01

    While the widespread emergence of nanoscience and nanotechnology can be dated back to the early eighties, the last decade has witnessed a true coming of age of this research field, with novel nanomaterials constantly finding their way into marketed products. The performance of nanomaterials being dominated by their nanoscale morphology, their quantitative characterization with respect to a number of properties is often crucial. In this context, those imaging techniques able to resolve nanometer scale details are clearly key players. In particular, atomic force microscopy can yield a fully quantitative tridimensional (3D) topography at the nanoscale. Herein, we will review a set of morphological analysis based on the scaling approach, which give access to important quantitative parameters for describing nanomaterial samples. To generalize the use of such morphological analysis on all D-dimensions (1D, 2D and 3D), the review will focus on specific soft matter aggregates with fractal dimension ranging from just above 1 to just below 3. Copyright © 2017 Elsevier Ltd. All rights reserved.
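    Fractal dimension over that 1–3 range is commonly estimated by box counting on the imaged aggregate. A two-dimensional sketch on a synthetic Sierpinski aggregate follows (chaos-game construction; all parameters are illustrative), whose expected dimension is log 3 / log 2 ≈ 1.585:

```python
import numpy as np

rng = np.random.default_rng(7)

# Chaos game: 200k points on a Sierpinski triangle, rastered to a grid
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
p = np.array([0.1, 0.1])
pts = np.empty((200_000, 2))
for i in range(len(pts)):
    p = (p + verts[rng.integers(3)]) / 2.0
    pts[i] = p

size = 512
img = np.zeros((size, size), dtype=bool)
idx = np.clip((pts * (size - 1)).astype(int), 0, size - 1)
img[idx[:, 1], idx[:, 0]] = True

def box_count(img, s):
    """Number of s-by-s boxes containing at least one occupied pixel."""
    h, w = img.shape
    view = img[: h - h % s, : w - w % s].reshape(h // s, s, w // s, s)
    return int(view.any(axis=(1, 3)).sum())

scales = np.array([4, 8, 16, 32, 64])
counts = np.array([box_count(img, s) for s in scales])
# Slope of log N versus log(1/s) estimates the box-counting dimension
D, _ = np.polyfit(np.log(1.0 / scales), np.log(counts), 1)
```

    For AFM data the boolean image would come from thresholding the measured topography rather than from a synthetic point set.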

  7. High-throughput real-time quantitative reverse transcription PCR.

    PubMed

    Bookout, Angie L; Cummins, Carolyn L; Mangelsdorf, David J; Pesola, Jean M; Kramer, Martha F

    2006-02-01

    Extensive detail on the application of the real-time quantitative polymerase chain reaction (QPCR) for the analysis of gene expression is provided in this unit. The protocols are designed for high-throughput, 384-well-format instruments, such as the Applied Biosystems 7900HT, but may be modified to suit any real-time PCR instrument. QPCR primer and probe design and validation are discussed, and three relative quantitation methods are described: the standard curve method, the efficiency-corrected DeltaCt method, and the comparative cycle time, or DeltaDeltaCt method. In addition, a method is provided for absolute quantification of RNA in unknown samples. RNA standards are subjected to RT-PCR in the same manner as the experimental samples, thus accounting for the reaction efficiencies of both procedures. This protocol describes the production and quantitation of synthetic RNA molecules for real-time and non-real-time RT-PCR applications.
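The comparative cycle time (DeltaDeltaCt) arithmetic mentioned in the abstract can be sketched as follows, assuming near-100% amplification efficiency for both primer pairs (the efficiency-corrected method relaxes this assumption):

```python
def fold_change(ct_target_exp, ct_ref_exp, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the comparative Ct (DeltaDeltaCt) method,
    assuming ~100% amplification efficiency for both primer pairs."""
    delta_exp = ct_target_exp - ct_ref_exp      # normalize to reference gene
    delta_ctrl = ct_target_ctrl - ct_ref_ctrl
    ddct = delta_exp - delta_ctrl
    return 2.0 ** (-ddct)

# Treated sample crosses threshold 2 cycles earlier (relative to the
# reference gene) than control -> 4-fold up-regulation.
print(fold_change(25.0, 20.0, 27.0, 20.0))  # -> 4.0
```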

  8. High Definition Confocal Imaging Modalities for the Characterization of Tissue-Engineered Substitutes.

    PubMed

    Mayrand, Dominique; Fradette, Julie

    2018-01-01

    Optimal imaging methods are necessary in order to perform a detailed characterization of thick tissue samples from either native or engineered tissues. Tissue-engineered substitutes feature increasing complexity, including multiple cell types and capillary-like networks. Therefore, technical approaches allowing the visualization of the inner structural organization and cellular composition of tissues are needed. This chapter describes an optical clearing technique which facilitates the detailed characterization of whole-mount samples from skin and adipose tissues (ex vivo tissues and in vitro tissue-engineered substitutes) when combined with spectral confocal microscopy and quantitative analysis on image renderings.

  9. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model.

    PubMed

    Yoshioka, S; Matsuhana, B; Tanaka, S; Inouye, Y; Oshima, N; Kinoshita, S

    2011-01-06

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model.
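The multilayer interference at the heart of the Venetian blind model can be illustrated with the first-order constructive-interference condition for an ideal two-component stack at normal incidence, lambda = 2(n_p d_p + n_g d_g). The sketch below uses illustrative guanine-like and cytoplasm-like values (not numbers from the paper) and models platelet tilt as a cosine reduction of the effective gap thickness, which is the qualitative essence of the model.

```python
import math

def reflection_peak_nm(n_plate, d_plate_nm, n_gap, d_gap_nm, tilt_deg=0.0):
    """First-order constructive-interference wavelength for an ideal
    two-component multilayer at normal incidence. Tilting the platelets
    (Venetian blind model) shrinks the effective gap thickness."""
    effective_gap = d_gap_nm * math.cos(math.radians(tilt_deg))
    return 2.0 * (n_plate * d_plate_nm + n_gap * effective_gap)

# Illustrative values only: thin high-index platelets in cytoplasm.
print(reflection_peak_nm(1.83, 5.0, 1.33, 170.0))        # untilted stack
print(reflection_peak_nm(1.83, 5.0, 1.33, 170.0, 30.0))  # tilted -> blue shift
```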

  10. COMPASS: a suite of pre- and post-search proteomics software tools for OMSSA

    PubMed Central

    Wenger, Craig D.; Phanstiel, Douglas H.; Lee, M. Violet; Bailey, Derek J.; Coon, Joshua J.

    2011-01-01

    Here we present the Coon OMSSA Proteomic Analysis Software Suite (COMPASS): a free and open-source software pipeline for high-throughput analysis of proteomics data, designed around the Open Mass Spectrometry Search Algorithm. We detail a synergistic set of tools for protein database generation, spectral reduction, peptide false discovery rate analysis, peptide quantitation via isobaric labeling, protein parsimony and protein false discovery rate analysis, and protein quantitation. We strive for maximum ease of use, utilizing graphical user interfaces and working with data files in the original instrument vendor format. Results are stored in plain text comma-separated values files, which are easy to view and manipulate with a text editor or spreadsheet program. We illustrate the operation and efficacy of COMPASS through the use of two LC–MS/MS datasets. The first is a dataset of a highly annotated mixture of standard proteins and manually validated contaminants that exhibits the identification workflow. The second is a dataset of yeast peptides, labeled with isobaric stable isotope tags and mixed in known ratios, to demonstrate the quantitative workflow. For these two datasets, COMPASS performs equivalently or better than the current de facto standard, the Trans-Proteomic Pipeline. PMID:21298793
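The peptide false discovery rate step mentioned above is conventionally done by target-decoy competition. The sketch below is a simplified version of that general procedure (FDR estimated as #decoys/#targets down the score-sorted list), not COMPASS's actual implementation.

```python
def accept_at_fdr(psms, alpha):
    """Return target PSMs accepted at FDR <= alpha, where each PSM is a
    (score, label) pair and FDR is estimated as decoys/targets over the
    score-sorted prefix (a simplified target-decoy sketch)."""
    ranked = sorted(psms, key=lambda p: p[0], reverse=True)
    targets, decoys, best_cut = 0, 0, 0
    for i, (score, label) in enumerate(ranked, start=1):
        if label == "decoy":
            decoys += 1
        else:
            targets += 1
        if targets and decoys / targets <= alpha:
            best_cut = i  # largest prefix still under the FDR threshold
    return [p for p in ranked[:best_cut] if p[1] == "target"]

psms = [(10, "target"), (9, "target"), (8.5, "decoy"), (8, "target"),
        (7, "target"), (5, "decoy"), (4, "target")]
print(len(accept_at_fdr(psms, 0.3)))  # -> 4
```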

  11. The application of drug dose equivalence in the quantitative analysis of receptor occupation and drug combinations

    PubMed Central

    Tallarida, Ronald J.; Raffa, Robert B.

    2014-01-01

    In this review we show that the concept of dose equivalence for two drugs, the theoretical basis of the isobologram, has a wider use in the analysis of pharmacological data derived from single and combination drug use. In both its application to drug combination analysis with isoboles and certain other actions, listed below, the determination of doses, or receptor occupancies, that yield equal effects provides useful metrics that can be used to obtain quantitative information on drug actions without postulating any intimate mechanism of action. These other drug actions discussed here include (1) combinations of agonists that produce opposite effects, (2) analysis of inverted U-shaped dose effect curves of single agents, (3) analysis on the effect scale as an alternative to isoboles, and (4) the use of occupation isoboles to examine competitive antagonism in the dual receptor case. New formulas derived to assess the statistical variance for additive combinations are included, and the more detailed mathematical topics are included in the appendix. PMID:20546783
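The dose-equivalence relation underlying the isobologram can be stated directly: if A and B are the doses of each drug alone that produce a chosen effect level, a combination (a, b) is additive when a/A + b/B = 1. A minimal sketch of that arithmetic:

```python
def additive_partner_dose(a, A, B):
    """Dose b of drug 2 that, combined with dose a of drug 1, lies on the
    additive isobole a/A + b/B = 1, where A and B are the doses of each
    drug alone that produce the chosen effect level."""
    if not 0 <= a <= A:
        raise ValueError("dose a must lie between 0 and A")
    return B * (1.0 - a / A)

def interaction_index(a, b, A, B):
    """a/A + b/B: 1 for additivity, <1 for synergy, >1 for sub-additivity."""
    return a / A + b / B

# Drug 1 alone needs 10 units, drug 2 alone 20 units, for the effect.
print(additive_partner_dose(5.0, 10.0, 20.0))    # -> 10.0
print(interaction_index(5.0, 10.0, 10.0, 20.0))  # -> 1.0
```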

  12. Predictive and mechanistic multivariate linear regression models for reaction development

    PubMed Central

    Santiago, Celine B.; Guo, Jing-Yao

    2018-01-01

    Multivariate Linear Regression (MLR) models utilizing computationally-derived and empirically-derived physical organic molecular descriptors are described in this review. Several reports demonstrating the effectiveness of this methodological approach towards reaction optimization and mechanistic interrogation are discussed. A detailed protocol to access quantitative and predictive MLR models is provided as a guide for model development and parameter analysis. PMID:29719711
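A minimal sketch of fitting such an MLR model from molecular descriptors, using ordinary least squares via the normal equations (synthetic data and pure Python; real workflows would use a statistics package and cross-validation):

```python
def fit_mlr(X, y):
    """Ordinary least squares for y ~ b0 + b1*x1 + b2*x2 + ... via the
    normal equations; adequate for a handful of descriptors."""
    rows = [[1.0] + list(x) for x in X]  # prepend intercept column
    p = len(rows[0])
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    # Solve (X^T X) beta = X^T y by Gaussian elimination with pivoting.
    a = [xtx[i][:] + [xty[i]] for i in range(p)]
    for col in range(p):
        pivot = max(range(col, p), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for r in range(col + 1, p):
            f = a[r][col] / a[col][col]
            for c in range(col, p + 1):
                a[r][c] -= f * a[col][c]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (a[r][p] - sum(a[r][c] * beta[c]
                                 for c in range(r + 1, p))) / a[r][r]
    return beta

# Exact synthetic relationship: y = 1 + 2*x1 - 3*x2 (descriptor values made up).
X = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1), (1, 2)]
y = [1 + 2 * x1 - 3 * x2 for x1, x2 in X]
print([round(b, 6) for b in fit_mlr(X, y)])  # -> [1.0, 2.0, -3.0]
```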

  13. An analysis of the multiple model adaptive control algorithm. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Greene, C. S.

    1978-01-01

    Qualitative and quantitative aspects of the multiple model adaptive control (MMAC) method are detailed. The method consists of a cascade of a maximum a posteriori probability identifier (basically a bank of Kalman filters) and a bank of linear quadratic regulators. Major qualitative properties of the MMAC method are examined and the principal reasons for unacceptable behavior are explored.

  14. The Use of Cohesive Devices in Argumentative Writing by Chinese EFL Learners at Different Proficiency Levels

    ERIC Educational Resources Information Center

    Yang, Wenxing; Sun, Ying

    2012-01-01

    This article reports on a study that comparatively investigated the differences and similarities in the (incorrect) use of cohesive devices by second-year and fourth-year undergraduate Chinese EFL learners in their argumentative writings. Via detailed analysis of the quantitative and qualitative data, this study seeks to reveal if the patterns of…

  15. INFRARED SPECTROSCOPY: A TOOL FOR DETERMINATION OF THE DEGREE OF CONVERSION IN DENTAL COMPOSITES

    PubMed Central

    Moraes, Luciene Gonçalves Palmeira; Rocha, Renata Sanches Ferreira; Menegazzo, Lívia Maluf; de AraÚjo, Eudes Borges; Yukimitu, Keizo; Moraes, João Carlos Silos

    2008-01-01

    Infrared spectroscopy is one of the most widely used techniques for measurement of the degree of conversion in dental composites. However, to obtain good-quality spectra and quantitative analysis from spectral data, appropriate expertise and knowledge of the technique are mandatory. This paper presents important details for using infrared spectroscopy to determine the degree of conversion. PMID:19089207

  16. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework.

    PubMed

    Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. 
The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.

  17. Deformation analysis of MEMS structures by modified digital moiré methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhanwei; Lou, Xinhao; Gao, Jianxin

    2010-11-01

    Quantitative deformation analysis of micro-fabricated electromechanical systems is important for the design and functional control of microsystems. In this paper, two modified digital moiré processing methods, a Gaussian blurring algorithm combined with digital phase shifting and a geometrical phase analysis (GPA) technique based on the digital moiré method, are developed to quantitatively analyse the deformation behaviour of micro-electro-mechanical system (MEMS) structures. The measuring principles and experimental procedures of the two methods are described in detail. A digital moiré fringe pattern is generated by superimposing a specimen grating etched directly on a microstructure surface with a digital reference grating (DRG). Most of the grating noise is removed from the digital moiré fringes, which enables the phase distribution of the moiré fringes to be obtained directly. Strain measurement results for a MEMS structure demonstrate the feasibility of the two methods.

  18. Chromatographic background drift correction coupled with parallel factor analysis to resolve coelution problems in three-dimensional chromatographic data: quantification of eleven antibiotics in tap water samples by high-performance liquid chromatography coupled with a diode array detector.

    PubMed

    Yu, Yong-Jie; Wu, Hai-Long; Fu, Hai-Yan; Zhao, Juan; Li, Yuan-Na; Li, Shu-Fang; Kang, Chao; Yu, Ru-Qin

    2013-08-09

    Chromatographic background drift correction has been an important field of research in chromatographic analysis. In the present work, orthogonal spectral space projection for background drift correction of three-dimensional chromatographic data was described in detail and combined with parallel factor analysis (PARAFAC) to resolve overlapped chromatographic peaks and obtain the second-order advantage. This strategy was verified by simulated chromatographic data and afforded significant improvement in quantitative results. Finally, this strategy was successfully utilized to quantify eleven antibiotics in tap water samples. Compared with the traditional methodology of introducing excessive factors for the PARAFAC model to eliminate the effect of background drift, clear improvement in the quantitative performance of PARAFAC was observed after background drift correction by orthogonal spectral space projection. Copyright © 2013 Elsevier B.V. All rights reserved.
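The orthogonal projection step can be illustrated in its simplest one-vector form: the measured spectrum is projected onto the orthogonal complement of the background direction, so the corrected spectrum carries no residual background component. (The actual method projects onto the complement of a background drift subspace estimated from the data; this sketch is illustrative only.)

```python
def project_out(signal, background):
    """Remove the component of `signal` lying along `background` by
    orthogonal projection: s - (<s,b>/<b,b>) b."""
    bb = sum(b * b for b in background)
    coeff = sum(s * b for s, b in zip(signal, background)) / bb
    return [s - coeff * b for s, b in zip(signal, background)]

# An analyte spectrum contaminated by a scaled background drift.
background = [1.0, 2.0, 3.0, 4.0]
analyte = [0.0, 1.0, 0.0, 0.0]
measured = [a + 0.5 * b for a, b in zip(analyte, background)]
corrected = project_out(measured, background)
# The corrected spectrum is orthogonal to the background direction:
print(sum(c * b for c, b in zip(corrected, background)))  # ≈ 0.0
```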

  19. New insights on ion track morphology in pyrochlores by aberration corrected scanning transmission electron microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sachan, Ritesh; Zhang, Yanwen; Ou, Xin

    Here we demonstrate the enhanced imaging capabilities of an aberration-corrected scanning transmission electron microscope to advance the understanding of ion track structure in pyrochlore-structured materials (i.e., Gd2Ti2O7 and Gd2TiZrO7). Track formation occurs due to the inelastic transfer of energy from incident ions to electrons, and atomic-level details of track morphology as a function of energy loss are revealed in the present work. A comparison of imaging details obtained by varying the collection angles of the detectors is also discussed. A quantitative analysis of phase identification using high-angle annular dark field imaging is performed on the ion tracks. Finally, a novel 3-dimensional track reconstruction method is provided that is based on depth-dependent imaging of the ion tracks. The technique is used to extract the atomic-level details of nanoscale features, such as the disordered ion tracks, which are embedded in a relatively thick matrix. The relevance of the method is further shown by measuring the tilt of the ion tracks relative to the electron beam incidence, which helps establish the structure and geometry of the ion tracks quantitatively.

  20. New insights on ion track morphology in pyrochlores by aberration corrected scanning transmission electron microscopy

    DOE PAGES

    Sachan, Ritesh; Zhang, Yanwen; Ou, Xin; ...

    2016-12-13

    Here we demonstrate the enhanced imaging capabilities of an aberration-corrected scanning transmission electron microscope to advance the understanding of ion track structure in pyrochlore-structured materials (i.e., Gd2Ti2O7 and Gd2TiZrO7). Track formation occurs due to the inelastic transfer of energy from incident ions to electrons, and atomic-level details of track morphology as a function of energy loss are revealed in the present work. A comparison of imaging details obtained by varying the collection angles of the detectors is also discussed. A quantitative analysis of phase identification using high-angle annular dark field imaging is performed on the ion tracks. Finally, a novel 3-dimensional track reconstruction method is provided that is based on depth-dependent imaging of the ion tracks. The technique is used to extract the atomic-level details of nanoscale features, such as the disordered ion tracks, which are embedded in a relatively thick matrix. The relevance of the method is further shown by measuring the tilt of the ion tracks relative to the electron beam incidence, which helps establish the structure and geometry of the ion tracks quantitatively.

  1. Analysis of metalaxyl racemate using high performance liquid chromatography coupled with four kinds of detectors.

    PubMed

    Chen, Tao; Fan, Jun; Gao, Ruiqi; Wang, Tai; Yu, Ying; Zhang, Weiguang

    2016-10-07

    Chiral stationary phase high-performance liquid chromatography coupled with various detectors has been one of the most commonly used methods for the analysis and separation of chiral compounds over the past decades. Various detectors exhibit different characteristics in qualitative and quantitative studies under different chromatographic conditions. Herein, a comparative evaluation of HPLC coupled with ultraviolet, optical rotation, refractive index, and evaporative light scattering detectors has been conducted for qualitative and quantitative analyses of metalaxyl racemate. The effects of separation conditions on the peak area ratio between the two enantiomers, including sample concentration, column temperature, mobile phase composition, and flow rate, have been investigated in detail. In addition, the limits of detection, the limits of quantitation, the quantitative range, and the precision for the two enantiomers with the four detectors have also been studied. As indicated, the chromatographic separation conditions had only slight effects on the ultraviolet and refractive index detections, for which the peak area ratio between the two enantiomers remained almost unchanged, whereas the evaporative light scattering detection was significantly affected by the above-mentioned chromatographic conditions and the corresponding peak area ratios varied greatly. Moreover, the limits of detection, the limits of quantitation, and the quantitative ranges of the two enantiomers with UV detection were lower by one to two orders of magnitude than those with the other detectors. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2015-04-01

    Modern society is increasingly dependent on infrastructures to maintain its function, and disruption in one of the infrastructure systems may have severe consequences. The Norwegian municipalities have, according to legislation, a duty to carry out a risk and vulnerability analysis and to plan and prepare for emergencies in both short- and long-term perspectives. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with an increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, which consists of a semi-quantitative analysis. The purpose of this analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures identified in the level 1 analysis and to investigate the need for further analyses, i.e. level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure and the importance of the infrastructure, as well as interdependencies between society and infrastructure affecting the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure.
Case studies for two Norwegian municipalities are presented, where risk to primary road, water supply and power network threatened by storm and landslide is assessed. The application examples show that the proposed model provides a useful tool for screening of undesirable events, with the ultimate goal to reduce the societal vulnerability.
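The level 2 aggregation described above might be sketched as follows. The indicator names and the simple multiplicative weighting are illustrative assumptions for demonstration, not the paper's actual scheme.

```python
def screening_risk(frequency_class, indicators, users):
    """Semi-quantitative screening sketch: combine a hazard frequency
    class (1-5), vulnerability indicators each ranked 1-5, and the
    number of infrastructure users into a relative risk score.
    The multiplicative weighting here is illustrative only."""
    for name, rank in indicators.items():
        if not 1 <= rank <= 5:
            raise ValueError(f"indicator {name} must be ranked 1-5")
    vulnerability = sum(indicators.values()) / len(indicators)
    return frequency_class * vulnerability * users

# Hypothetical ranking of a primary road threatened by landslide.
road = {"robustness": 4, "importance": 5, "interdependency": 3}
print(screening_risk(3, road, users=12000))  # relative score for ranking
```

Because the output is only a relative score, it serves to rank scenarios and flag which ones warrant a level 3 quantitative analysis, not to state absolute risk.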

  3. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. This study analyzed 300 OCT images acquired by an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was presented. Subsequently, two kinds of quantitative methods based on geometric features and morphological features were proposed. The paper puts forward a retinal abnormality grading decision-making method, which was used in the actual analysis and evaluation of multiple OCT images. The detailed analysis process is shown for four retinal OCT images with different degrees of abnormality. The final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation on the 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. This paper focuses on an automatic retinal status analysis method based on feature extraction and quantitative grading in OCT images. The proposed method can obtain parameters and features associated with retinal morphology. Quantitative analysis and evaluation of these features, combined with the reference model, can realize abnormality judgment of the target image and provide a reference for disease diagnosis.

  4. Assessing agreement between preclinical magnetic resonance imaging and histology: An evaluation of their image qualities and quantitative results

    PubMed Central

    Elschner, Cindy; Korn, Paula; Hauptstock, Maria; Schulz, Matthias C.; Range, Ursula; Jünger, Diana; Scheler, Ulrich

    2017-01-01

    One consequence of demographic change is the increasing demand for biocompatible materials for use in implants and prostheses. This is accompanied by a growing number of experimental animals, because the interactions between new biomaterials and their host tissue have to be investigated. To evaluate novel materials and engineered tissues, the use of non-destructive imaging modalities has been identified as a strategic priority. This provides the opportunity to study interactions repeatedly in individual animals, along with the advantages of reduced biological variability and a decreased number of laboratory animals. However, histological techniques are still the gold standard in preclinical biomaterial research. The present article demonstrates a detailed method comparison between histology and magnetic resonance imaging, including a presentation of their image qualities as well as the detailed statistical analysis for assessing agreement between quantitative measures. As an example, the bony ingrowth of tissue-engineered bone substitutes for the treatment of a cleft-like maxillary bone defect has been evaluated. Using a graphical concordance analysis, the mean difference between MRI results and histomorphometrical measures was examined. The analysis revealed a slight but significant histology-MRI bias for the bone volume (bias = 2.40 %, p < 0.005) and a clearly significant deviation for the remaining defect width (bias = -6.73 %, p << 0.005). However, the study also showed a considerable effect of the analyzed section position on the quantitative result. It could be shown that the bias in the data sets originated less from the imaging modalities than from the evaluation of different slice positions. The article demonstrates that such method comparisons do not always require an additional, independent animal study. PMID:28666026

  5. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography.

    PubMed

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-09-01

    Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multiresolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low-resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artifacts in regions where no significant correlation exists between anatomical and functional details. A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artifacts and significantly improving quality of the corrected images and their quantitative accuracy. A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multiresolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information.
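The core MMA idea of merging anatomical detail into a functional image can be sketched in 1D with a one-level Haar transform: keep the low-resolution approximation of the functional image, but borrow weighted detail coefficients from the co-registered anatomical image. This is a toy analogue only; the actual method works on 2D/3D images with multilevel decompositions and (in the new model) a local correlation analysis.

```python
def haar_decompose(x):
    """One-level Haar transform: (approximation, detail) coefficients."""
    approx = [(x[2 * i] + x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

def inject_details(functional, anatomical, weight=1.0):
    """Toy MMA sketch: functional approximation + (weighted) anatomical
    high-resolution detail coefficients."""
    f_approx, _ = haar_decompose(functional)
    _, a_detail = haar_decompose(anatomical)
    return haar_reconstruct(f_approx, [weight * d for d in a_detail])

pet = [5.0, 5.0, 9.0, 9.0]   # blurred uptake, no edge structure
mri = [3.0, 5.0, 10.0, 6.0]  # sharp anatomical detail
print(inject_details(pet, mri))  # -> [4.0, 6.0, 11.0, 7.0]
```

Note that the pairwise means of the output (5.0 and 9.0) match the PET values, while the edge structure comes from the MRI details; the artifacts discussed in the abstract arise precisely when that anatomical detail does not correlate with the true functional signal.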

  6. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography

    PubMed Central

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-01-01

    Purpose Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present the new model outperformed the 2D global approach, avoiding artefacts and significantly improving quality of the corrected images and their quantitative accuracy. Conclusions A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037

  7. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    PubMed

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative and quantitative methods to relate brain structure with neuropsychological outcome, and these are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder, and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered, including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  8. Quantitative analysis of backbone dynamics in a crystalline protein from nitrogen-15 spin-lattice relaxation.

    PubMed

    Giraud, Nicolas; Blackledge, Martin; Goldman, Maurice; Böckmann, Anja; Lesage, Anne; Penin, François; Emsley, Lyndon

    2005-12-28

    A detailed analysis of nitrogen-15 longitudinal relaxation times in microcrystalline proteins is presented. A theoretical model to quantitatively interpret relaxation times is developed in terms of motional amplitude and characteristic time scale. Different averaging schemes are examined in order to propose an analysis of relaxation curves that takes into account the specificity of MAS experiments. In particular, it is shown that magic angle spinning averages the relaxation rate experienced by a single spin over one rotor period, resulting in individual relaxation curves that are dependent on the orientation of their corresponding carousel with respect to the rotor axis. Powder averaging thus leads to a nonexponential behavior in the observed decay curves. We extract dynamic information from experimental decay curves, using a diffusion in a cone model. We apply this study to the analysis of spin-lattice relaxation rates of the microcrystalline protein Crh at two different fields and determine differential dynamic parameters for several residues in the protein.

  9. Quantitative Proteomics Analysis of Streptomyces coelicolor Development Demonstrates That Onset of Secondary Metabolism Coincides with Hypha Differentiation*

    PubMed Central

    Manteca, Angel; Sanchez, Jesus; Jung, Hye R.; Schwämmle, Veit; Jensen, Ole N.

    2010-01-01

    Streptomyces species produce many clinically important secondary metabolites, including antibiotics and antitumorals. They have a complex developmental cycle, including programmed cell death phenomena, that makes this bacterium a multicellular prokaryotic model. There are two differentiated mycelial stages: an early compartmentalized vegetative mycelium (first mycelium) and a multinucleated reproductive mycelium (second mycelium) arising after programmed cell death processes. In the present study, we made a detailed proteomics analysis of the distinct developmental stages of solid confluent Streptomyces coelicolor cultures using iTRAQ (isobaric tags for relative and absolute quantitation) labeling and LC-MS/MS. A new experimental approach was developed to obtain homogeneous samples at each developmental stage (temporal protein analysis) and also to obtain membrane and cytosolic protein fractions (spatial protein analysis). A total of 345 proteins were quantified in two biological replicates. Comparative bioinformatics analyses revealed the switch from primary to secondary metabolism between the initial compartmentalized mycelium and the multinucleated hyphae. PMID:20224110
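
    As a minimal illustration of the iTRAQ-style relative quantitation underlying such an analysis (the reporter-ion channels and intensities below are invented, not the study's data), a protein's stage-to-stage change is typically expressed as a log2 ratio of reporter-ion intensities against a reference channel:

```python
import math

# Toy reporter-ion intensities for one peptide across four iTRAQ channels,
# e.g. four developmental time points; all values are illustrative.
reporters = {"114": 1.8e5, "115": 2.1e5, "116": 5.9e5, "117": 6.4e5}

reference = reporters["114"]  # channel chosen as the reference stage
log2_ratios = {ch: math.log2(v / reference) for ch, v in reporters.items()}
# Channels 116 and 117 sit well above the reference, the kind of shift that
# would flag a protein as upregulated in the later stages.
```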

  10. A Computational Observer For Performing Contrast-Detail Analysis Of Ultrasound Images

    NASA Astrophysics Data System (ADS)

    Lopez, H.; Loew, M. H.

    1988-06-01

    Contrast-Detail (C/D) analysis allows the quantitative determination of an imaging system's ability to display a range of varying-size targets as a function of contrast. Using this technique, a contrast-detail plot is obtained which can, in theory, be used to compare image quality from one imaging system to another. The C/D plot, however, is usually obtained by using data from human observer readings. We have shown earlier(7) that the performance of human observers in the task of threshold detection of simulated lesions embedded in random ultrasound noise is highly inaccurate and non-reproducible for untrained observers. We present an objective, computational method for the determination of the C/D curve for ultrasound images. This method utilizes digital images of the C/D phantom developed at CDRH, and lesion-detection algorithms that simulate the Bayesian approach using the likelihood function for an ideal observer. We present the results of this method, and discuss the relationship to the human observer and to the comparability of image quality between systems.
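
    The ideal-observer idea can be sketched in a few lines. For a known target in additive white Gaussian noise, the likelihood-ratio test reduces to a matched filter; the simulation below is a simplification (real ultrasound speckle is not Gaussian, and the sizes and contrasts are invented) that scores lesion-present and lesion-absent images and summarizes their separability as a detectability index:

```python
import numpy as np

rng = np.random.default_rng(0)

def disk_lesion(n=32, radius=6, contrast=0.5):
    # Low-contrast disk target on an n x n grid, centered in the image.
    y, x = np.mgrid[:n, :n] - n // 2
    return contrast * ((x ** 2 + y ** 2) <= radius ** 2)

def observer_scores(signal, n_trials=400, sigma=1.0):
    # Matched-filter score <image, signal> for noise-only and signal+noise images.
    absent, present = [], []
    for _ in range(n_trials):
        noise = rng.normal(0.0, sigma, signal.shape)
        absent.append(np.sum(noise * signal))
        present.append(np.sum((noise + signal) * signal))
    return np.array(absent), np.array(present)

sig = disk_lesion()
absent, present = observer_scores(sig)
# Detectability index d': separation of the two score distributions.
d_prime = (present.mean() - absent.mean()) / np.sqrt(0.5 * (present.var() + absent.var()))
```

    Sweeping `radius` and `contrast` and finding where d' crosses a chosen threshold traces out a contrast-detail curve without any human readings.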

  11. Soldier as a System Value Analysis

    DTIC Science & Technology

    2008-09-01

    effort is to use established quantitative methods in the development of the framework and explore possible metrics for the assessment of applicable...and ease of use is an important part of the development of the holistic Soldier system. Small details can make a big difference to a Soldier in harsh...important (Friedl and Allan, 2004). Different kinds of environmental stressors include heat, cold, hypobaric hypoxia, physical work, energy

  12. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
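
    A Monte Carlo prediction of a satisfaction index can be sketched as follows; the driver names, weights, and distributions are invented stand-ins, not the ACSI model's estimated coefficients:

```python
import random

random.seed(42)

# Illustrative driver weights for a weighted satisfaction index; in the ACSI
# framework these would be estimated from survey data, not assumed.
WEIGHTS = {"quality": 0.5, "expectations": 0.2, "value": 0.3}

def simulate_index(n_runs=10_000):
    # Each run draws driver scores on a 0-100 scale from assumed triangular
    # distributions (low, high, mode) and combines them into one index score.
    results = []
    for _ in range(n_runs):
        drivers = {
            "quality": random.triangular(60, 95, 80),
            "expectations": random.triangular(50, 90, 75),
            "value": random.triangular(55, 90, 70),
        }
        results.append(sum(WEIGHTS[k] * v for k, v in drivers.items()))
    return results

scores = sorted(simulate_index())
baseline = sum(scores) / len(scores)                            # predicted index
lo, hi = scores[len(scores) // 20], scores[-len(scores) // 20]  # ~90% interval
```

    Varying one driver's distribution while holding the others fixed then gives a simple sensitivity analysis on the predicted index.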

  13. Analysing neutron scattering data using McStas virtual experiments

    NASA Astrophysics Data System (ADS)

    Udby, L.; Willendrup, P. K.; Knudsen, E.; Niedermayer, Ch.; Filges, U.; Christensen, N. B.; Farhi, E.; Wells, B. O.; Lefmann, K.

    2011-04-01

    With the intention of developing a new data analysis method using virtual experiments we have built a detailed virtual model of the cold triple-axis spectrometer RITA-II at PSI, Switzerland, using the McStas neutron ray-tracing package. The parameters characterising the virtual instrument were carefully tuned against real experiments. In the present paper we show that virtual experiments reproduce experimentally observed linewidths within 1-3% for a variety of samples. Furthermore we show that the detailed knowledge of the instrumental resolution found from virtual experiments, including sample mosaicity, can be used for quantitative estimates of linewidth broadening resulting from, e.g., finite domain sizes in single-crystal samples.
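
    The use of instrumental resolution to isolate sample-induced broadening can be illustrated with the usual Gaussian quadrature-subtraction approximation (a generic sketch, not the McStas virtual-experiment procedure itself; the numbers are invented):

```python
import math

def intrinsic_width(observed, resolution):
    # If both the instrumental resolution function and the intrinsic sample
    # broadening are roughly Gaussian, their widths add in quadrature, so the
    # sample contribution is the quadrature difference.
    if observed < resolution:
        raise ValueError("observed width below instrumental resolution")
    return math.sqrt(observed ** 2 - resolution ** 2)

# Example: an observed linewidth of 0.50 (arb. units) with an instrumental
# resolution of 0.40 leaves an intrinsic broadening of 0.30.
w = intrinsic_width(0.50, 0.40)
```

    The advantage of virtual experiments is precisely that `resolution` can be determined per instrument configuration, including sample mosaicity, rather than approximated analytically.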

  14. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
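
    The core statistical concern can be demonstrated directly: with a small sample, even the simplest item statistic is several times noisier than with a conventionally sized one. The simulation below uses proportion-correct as a stand-in for a fitted item parameter; all numbers are illustrative:

```python
import random
import statistics

random.seed(7)

def estimated_difficulty(n_respondents, true_p=0.6):
    # Proportion-correct estimate of an item's easiness from one sample.
    correct = sum(random.random() < true_p for _ in range(n_respondents))
    return correct / n_respondents

def sampling_sd(n, reps=2000):
    # Empirical standard deviation of the estimate across repeated samples.
    return statistics.stdev(estimated_difficulty(n) for _ in range(reps))

sd_small, sd_large = sampling_sd(30), sampling_sd(500)
# The n = 30 estimate is roughly four times noisier than the n = 500 one
# (theory: sqrt(p(1-p)/n) gives about 0.089 versus 0.022).
```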

  15. Proteomic and Lipidomic Analysis of Nanoparticle Corona upon Contact with Lung Surfactant Reveals Differences in Protein, but Not Lipid Composition.

    PubMed

    Raesch, Simon Sebastian; Tenzer, Stefan; Storck, Wiebke; Rurainski, Alexander; Selzer, Dominik; Ruge, Christian Arnold; Perez-Gil, Jesus; Schaefer, Ulrich Friedrich; Lehr, Claus-Michael

    2015-12-22

    Pulmonary surfactant (PS) constitutes the first line of host defense in the deep lung. Because of its high content of phospholipids and surfactant-specific proteins, the interaction of inhaled nanoparticles (NPs) with the pulmonary surfactant layer is likely to form a corona that is different to the one formed in plasma. Here we present a detailed lipidomic and proteomic analysis of NP corona formation using native porcine surfactant as a model. We analyzed the adsorbed biomolecules in the corona of three NPs with different surface properties (PEG-, PLGA-, and Lipid-NP) after incubation with native porcine surfactant. Using label-free shotgun analysis for protein and LC-MS for lipid analysis, we quantitatively determined the corona composition. Our results show a conserved lipid composition in the coronas of all investigated NPs regardless of their surface properties, with only hydrophilic PEG-NPs adsorbing fewer lipids in total. In contrast, the analyzed NPs displayed a marked difference in the protein corona, consisting of up to 417 different proteins. Among the proteins showing significant differences between the NP coronas, there was a striking prevalence of molecules with notoriously high lipid and surface binding, such as SP-A, SP-D, DMBT1. Our data indicate that the selective adsorption of proteins mediates the relatively similar lipid pattern in the coronas of different NPs. On the basis of our lipidomic and proteomic analysis, we provide a detailed set of quantitative data on the composition of the surfactant corona formed upon NP inhalation, which is unique and markedly different to the plasma corona.

  16. Large explosive basaltic eruptions at Katla volcano, Iceland: Fragmentation, grain size and eruption dynamics

    NASA Astrophysics Data System (ADS)

    Schmith, Johanne; Höskuldsson, Ármann; Holm, Paul Martin; Larsen, Guðrún

    2018-04-01

    Katla volcano in Iceland produces hazardous large explosive basaltic eruptions on a regular basis, but very little quantitative data for future hazard assessments exist. Here details on fragmentation mechanism and eruption dynamics are derived from a study of deposit stratigraphy with detailed granulometry and grain morphology analysis, granulometric modeling, componentry and the new quantitative regularity index model of fragmentation mechanism. We show that magma/water interaction is important in the ash generation process, but to a variable extent. By investigating the large explosive basaltic eruptions from 1755 and 1625, we document that eruptions of similar size and magma geochemistry can have very different fragmentation dynamics. Our models show that fragmentation in the 1755 eruption was a combination of magmatic degassing and magma/water interaction, with the most magma/water interaction at the beginning of the eruption. The fragmentation of the 1625 eruption was initially also a combination of both magmatic and phreatomagmatic processes, but magma/water interaction diminished progressively during the later stages of the eruption. However, intense magma/water interaction was reintroduced during the final stages, dominating the fine fragmentation at the end. This detailed study of fragmentation changes documents that subglacial eruptions interact with meltwater to a highly variable extent, and that the amount of and access to meltwater change significantly during an eruption. While it is often difficult to reconstruct the progression of eruptions that have no quantitative observational record, this study shows that integrating field observations and granulometry with the new regularity index can form a coherent model of eruption evolution.

  17. Precise quantitation of 136 urinary proteins by LC/MRM-MS using stable isotope labeled peptides as internal standards for biomarker discovery and/or verification studies.

    PubMed

    Percy, Andrew J; Yang, Juncong; Hardie, Darryl B; Chambers, Andrew G; Tamura-Wells, Jessica; Borchers, Christoph H

    2015-06-15

    Spurred on by the growing demand for panels of validated disease biomarkers, increasing efforts have focused on advancing qualitative and quantitative tools for more highly multiplexed and sensitive analyses of a multitude of analytes in various human biofluids. In quantitative proteomics, evolving strategies involve the use of the targeted multiple reaction monitoring (MRM) mode of mass spectrometry (MS) with stable isotope-labeled standards (SIS) used for internal normalization. Using that preferred approach with non-invasive urine samples, we have systematically advanced and rigorously assessed the methodology toward the precise quantitation of the largest multiplexed panel of candidate protein biomarkers in human urine to date. The concentrations of the 136 proteins span >5 orders of magnitude (from 8.6 μg/mL to 25 pg/mL), with average CVs of 8.6% across process triplicates. Detailed here is our quantitative method, the analysis strategy, a feasibility application to prostate cancer samples, and a discussion of the utility of this method in translational studies. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Mathematics of quantitative kinetic PCR and the application of standard curves.

    PubMed

    Rutledge, R G; Côté, C

    2003-08-15

    Fluorescent monitoring of DNA amplification is the basis of real-time PCR, from which target DNA concentration can be determined from the fractional cycle at which a threshold amount of amplicon DNA is produced. Absolute quantification can be achieved using a standard curve constructed by amplifying known amounts of target DNA. In this study, the mathematics of quantitative PCR are examined in detail, from which several fundamental aspects of the threshold method and the application of standard curves are illustrated. The construction of five replicate standard curves for two pairs of nested primers was used to examine the reproducibility and degree of quantitative variation using SYBR Green I fluorescence. Based upon this analysis, the application of a single, well-constructed standard curve could provide an estimated precision of +/-6-21%, depending on the number of cycles required to reach threshold. A simplified method for absolute quantification is also proposed, in which quantitative scale is determined by DNA mass at threshold.
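
    The threshold mathematics above reduce to a linear regression of Ct against log10 of input amount. A minimal sketch with an invented calibration series (not data from the study):

```python
import math

# Synthetic dilution series: known template amounts (arbitrary mass units)
# and their measured threshold cycles (Ct); values are illustrative.
amounts = [100_000, 10_000, 1_000, 100, 10]
ct = [12.1, 15.4, 18.8, 22.2, 25.5]

# Least-squares fit of Ct = m * log10(amount) + b.
x = [math.log10(a) for a in amounts]
n = len(x)
mx, my = sum(x) / n, sum(ct) / n
m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, ct)) / sum((xi - mx) ** 2 for xi in x)
b = my - m * mx

# Amplification efficiency: a slope of about -3.32 corresponds to 100%
# efficiency (doubling each cycle), since E = 10**(-1/m) - 1.
efficiency = 10 ** (-1 / m) - 1

def quantify(ct_unknown):
    # Invert the standard curve to estimate the starting template amount.
    return 10 ** ((ct_unknown - b) / m)
```

    An unknown reaching threshold at, say, Ct = 20 is then read off the curve with `quantify(20)`.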

  19. Composition-driven Cu-speciation and reducibility in Cu-CHA zeolite catalysts: a multivariate XAS/FTIR approach to complexity

    PubMed Central

    Martini, A.; Lomachenko, K. A.; Pankin, I. A.; Negri, C.; Berlier, G.; Beato, P.; Falsig, H.; Bordiga, S.; Lamberti, C.

    2017-01-01

    The small pore Cu-CHA zeolite is attracting increasing attention as a versatile platform to design novel single-site catalysts for deNOx applications and for the direct conversion of methane to methanol. Understanding at the atomic scale how the catalyst composition influences the Cu-species formed during thermal activation is a key step to unveil the relevant composition–activity relationships. Herein, we explore by in situ XAS the impact of Cu-CHA catalyst composition on temperature-dependent Cu-speciation and reducibility. Advanced multivariate analysis of in situ XANES in combination with DFT-assisted simulation of XANES spectra and multi-component EXAFS fits as well as in situ FTIR spectroscopy of adsorbed N2 allow us to obtain unprecedented quantitative structural information on the complex dynamics during the speciation of Cu-sites inside the framework of the CHA zeolite. PMID:29147509

  20. Quantitative analysis of woodpecker habitat using high-resolution airborne LiDAR estimates of forest structure and composition

    Treesearch

    James E. Garabedian; Robert J. McGaughey; Stephen E. Reutebuch; Bernard R. Parresol; John C. Kilgo; Christopher E. Moorman; M. Nils. Peterson

    2014-01-01

    Light detection and ranging (LiDAR) technology has the potential to radically alter the way researchers and managers collect data on wildlife–habitat relationships. To date, the technology has fostered several novel approaches to characterizing avian habitat, but has been limited by the lack of detailed LiDAR-habitat attributes relevant to species across a continuum of...

  1. Price Analysis on Commercial Item Purchases within the Department of the Navy

    DTIC Science & Technology

    2014-06-01

    auditors determined that, on average, pricing was 28% higher than previous contract prices when adjusted for inflation. The audit recommended the ...greater use of alternative contracting approaches , which offer the benefits of improved efficiency and timeliness for acquiring goods and services...workforce is one of the areas that the panel discussed in detail. The panel noted that a qualified workforce should also have the quantitative skills

  2. The Role of Excitons on Light Amplification in Lead Halide Perovskites.

    PubMed

    Lü, Quan; Wei, Haohan; Sun, Wenzhao; Wang, Kaiyang; Gu, Zhiyuan; Li, Jiankai; Liu, Shuai; Xiao, Shumin; Song, Qinghai

    2016-12-01

    The role of excitons in light amplification in lead halide perovskites has been explored. Unlike the photoluminescence, the intensity of amplified spontaneous emission is partially suppressed at low temperature. Detailed analysis and experiments show that this inhibition is attributable to the existence of excitons, and a quantitative model has been built to explain the experimental observations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Spin-polarized scanning tunneling microscopy with quantitative insights into magnetic probes

    NASA Astrophysics Data System (ADS)

    Phark, Soo-hyon; Sander, Dirk

    2017-04-01

    Spin-polarized scanning tunneling microscopy and spectroscopy (spin-STM/S) have been successfully applied to magnetic characterizations of individual nanostructures. Spin-STM/S is often performed in magnetic fields of up to some Tesla, which may strongly influence the tip state. In spite of the pivotal role of the tip in spin-STM/S, the contribution of the tip to the differential conductance dI/dV signal in an external field has rarely been investigated in detail. In this review, an advanced analysis of spin-STM/S data measured on magnetic nanoislands, which relies on a quantitative magnetic characterization of tips, is discussed. Taking advantage of the uniaxial out-of-plane magnetic anisotropy of Co bilayer nanoislands on Cu(111), in-field spin-STM on this system has enabled a quantitative determination, and thereby a categorization, of the magnetic states of the tips. The resulting in-depth and conclusive analysis of the magnetic characterization of the tip opens new avenues for clear-cut, sub-nanometer-scale studies of the spin ordering and spin-dependent electronic structure of the non-collinear magnetic state in bilayer Fe nanoislands on Cu(111).

  4. Data set for the proteomic inventory and quantitative analysis of chicken eggshell matrix proteins during the primary events of eggshell mineralization and the active growth phase of calcification.

    PubMed

    Marie, Pauline; Labas, Valérie; Brionne, Aurélien; Harichaux, Grégoire; Hennequet-Antier, Christelle; Rodriguez-Navarro, Alejandro B; Nys, Yves; Gautron, Joël

    2015-09-01

    Chicken eggshell is a biomineral composed of 95% calcite calcium carbonate mineral and of 3.5% organic matrix proteins. The assembly of mineral and its structural organization is controlled by its organic matrix. In a recent study [1], we have used quantitative proteomic, bioinformatic and functional analyses to explore the distribution of 216 eggshell matrix proteins at four key stages of shell mineralization defined as: (1) widespread deposition of amorphous calcium carbonate (ACC), (2) ACC transformation into crystalline calcite aggregates, (3) formation of larger calcite crystal units and (4) rapid growth of calcite as columnar structure with preferential crystal orientation. The current article detailed the quantitative analysis performed at the four stages of shell mineralization to determine the proteins which are the most abundant. Additionally, we reported the enriched GO terms and described the presence of 35 antimicrobial proteins equally distributed at all stages to keep the egg free of bacteria and of 81 proteins, the function of which could not be ascribed.

  5. Data set for the proteomic inventory and quantitative analysis of chicken eggshell matrix proteins during the primary events of eggshell mineralization and the active growth phase of calcification

    PubMed Central

    Marie, Pauline; Labas, Valérie; Brionne, Aurélien; Harichaux, Grégoire; Hennequet-Antier, Christelle; Rodriguez-Navarro, Alejandro B.; Nys, Yves; Gautron, Joël

    2015-01-01

    Chicken eggshell is a biomineral composed of 95% calcite calcium carbonate mineral and of 3.5% organic matrix proteins. The assembly of mineral and its structural organization is controlled by its organic matrix. In a recent study [1], we have used quantitative proteomic, bioinformatic and functional analyses to explore the distribution of 216 eggshell matrix proteins at four key stages of shell mineralization defined as: (1) widespread deposition of amorphous calcium carbonate (ACC), (2) ACC transformation into crystalline calcite aggregates, (3) formation of larger calcite crystal units and (4) rapid growth of calcite as columnar structure with preferential crystal orientation. The current article detailed the quantitative analysis performed at the four stages of shell mineralization to determine the proteins which are the most abundant. Additionally, we reported the enriched GO terms and described the presence of 35 antimicrobial proteins equally distributed at all stages to keep the egg free of bacteria and of 81 proteins, the function of which could not be ascribed. PMID:26306314

  6. T1, diffusion tensor, and quantitative magnetization transfer imaging of the hippocampus in an Alzheimer's disease mouse model.

    PubMed

    Whittaker, Heather T; Zhu, Shenghua; Di Curzio, Domenico L; Buist, Richard; Li, Xin-Min; Noy, Suzanna; Wiseman, Frances K; Thiessen, Jonathan D; Martin, Melanie

    2018-07-01

    Alzheimer's disease (AD) pathology causes microstructural changes in the brain. These changes, if quantified with magnetic resonance imaging (MRI), could be studied for use as an early biomarker for AD. The aim of our study was to determine if T1 relaxation, diffusion tensor imaging (DTI), and quantitative magnetization transfer imaging (qMTI) metrics could reveal changes within the hippocampus and surrounding white matter structures in ex vivo transgenic mouse brains overexpressing human amyloid precursor protein with the Swedish mutation. Delineation of hippocampal cell layers using DTI color maps allows more detailed analysis of T1-weighted imaging, DTI, and qMTI metrics, compared with segmentation of gross anatomy based on relaxation images, and with analysis of DTI or qMTI metrics alone. These alterations are observed in the absence of robust intracellular Aβ accumulation or plaque deposition as revealed by histology. This work demonstrates that multiparametric quantitative MRI methods are useful for characterizing changes within the hippocampal substructures and surrounding white matter tracts of mouse models of AD. Copyright © 2018. Published by Elsevier Inc.

  7. Interdisciplinary study of atmospheric processes and constituents of the mid-Atlantic coastal region.. [air pollution control studies in Virginia

    NASA Technical Reports Server (NTRS)

    Kindle, E. C.; Bandy, E. C.; Copeland, G.; Blais, R.; Levy, G.; Sonenshine, D.

    1975-01-01

    Past research projects for the year 1974-1975 are listed along with future research programs in the area of air pollution control, remote sensor analysis of smoke plumes, the biosphere component, and field experiments. A detailed budget analysis is presented. Attachments are included on the following topics: mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques, and use of LARS system for the quantitative determination of smoke plume lateral diffusion coefficients from ERTS images of Virginia.

  8. An implementation and analysis of the Abstract Syntax Notation One and the basic encoding rules

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The details of abstract syntax notation one standard (ASN.1) and the basic encoding rules standard (BER) that collectively solve the problem of data transfer across incompatible host environments are presented, and a compiler that was built to automate their use is described. Experiences with this compiler are also discussed which provide a quantitative analysis of the performance costs associated with the application of these standards. An evaluation is offered as to how well suited ASN.1 and BER are in solving the common data representation problem.

  9. Nonlocal means-based speckle filtering for ultrasound images

    PubMed Central

    Coupé, Pierrick; Hellier, Pierre; Kervrann, Charles; Barillot, Christian

    2009-01-01

    In image processing, restoration is expected to improve the qualitative inspection of the image and the performance of quantitative image analysis techniques. In this paper, an adaptation of the Non Local (NL-) means filter is proposed for speckle reduction in ultrasound (US) images. Originally developed for additive white Gaussian noise, the NL-means filter is here derived within a Bayesian framework adapted to a relevant ultrasound noise model. Quantitative results on synthetic data show the performance of the proposed method compared to well-established and state-of-the-art methods. Results on real images demonstrate that the proposed method accurately preserves edges and structural details of the image. PMID:19482578
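
    The classic NL-means computation that the paper adapts can be sketched as follows. This version keeps the original Gaussian-noise weights rather than the paper's Bayesian speckle-adapted ones, and the image, patch sizes, and smoothing parameter are all illustrative:

```python
import numpy as np

def nl_means(img, patch=3, search=7, h=0.5):
    # Each pixel is replaced by a weighted average of pixels in a search
    # window, weighted by the similarity of their surrounding patches.
    pad = patch // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            p_ref = padded[i:i + patch, j:j + patch]  # patch around (i, j)
            weights, values = [], []
            for di in range(-(search // 2), search // 2 + 1):
                for dj in range(-(search // 2), search // 2 + 1):
                    ii = min(max(i + di, 0), rows - 1)
                    jj = min(max(j + dj, 0), cols - 1)
                    p = padded[ii:ii + patch, jj:jj + patch]
                    dist = np.mean((p_ref - p) ** 2)
                    weights.append(np.exp(-dist / h ** 2))
                    values.append(img[ii, jj])
            out[i, j] = np.average(values, weights=weights)
    return out

rng = np.random.default_rng(1)
clean = np.zeros((16, 16))
clean[:, 8:] = 1.0                              # a step edge
noisy = clean + rng.normal(0.0, 0.2, clean.shape)
denoised = nl_means(noisy)
# Noise in the flat regions is reduced while the edge survives, because
# patches from the other side of the edge receive near-zero weights.
```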

  10. A benchmark for comparison of dental radiography analysis algorithms.

    PubMed

    Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia

    2016-07-01

    Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made on developing computerized dental X-ray image analysis systems for clinical usages. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray image and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  11. Quantitative proteomics and systems analysis of cultured H9C2 cardiomyoblasts during differentiation over time supports a 'function follows form' model of differentiation.

    PubMed

    Kankeu, Cynthia; Clarke, Kylie; Van Haver, Delphi; Gevaert, Kris; Impens, Francis; Dittrich, Anna; Roderick, H Llewelyn; Passante, Egle; Huber, Heinrich J

    2018-05-17

    The rat cardiomyoblast cell line H9C2 has emerged as a valuable tool for studying cardiac development, mechanisms of disease and toxicology. We present here a rigorous proteomic analysis that monitored the changes in protein expression during differentiation of H9C2 cells into cardiomyocyte-like cells over time. Quantitative mass spectrometry followed by gene ontology (GO) enrichment analysis revealed that early changes in H9C2 differentiation are related to protein pathways of cardiac muscle morphogenesis and sphingolipid synthesis. These changes in the proteome were followed later in the differentiation time-course by alterations in the expression of proteins involved in cation transport and beta-oxidation. Studying the temporal profile of the H9C2 proteome during differentiation in further detail revealed eight clusters of co-regulated proteins that can be associated with early, late, continuous and transient up- and downregulation. Subsequent reactome pathway analysis based on these eight clusters further corroborated and detailed the results of the GO analysis. Specifically, this analysis confirmed that proteins related to pathways in muscle contraction are upregulated early and transiently, and proteins relevant to extracellular matrix organization are downregulated early. In contrast, upregulation of proteins related to cardiac metabolism occurs at later time points. Finally, independent validation of the proteomics results by immunoblotting confirmed hereto unknown regulators of cardiac structure and ionic metabolism. Our results are consistent with a 'function follows form' model of differentiation, whereby early and transient alterations of structural proteins enable subsequent changes that are relevant to the characteristic physiology of cardiomyocytes.

  12. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a six-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
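
    The regression-based quantitation step can be sketched in a few lines; the concentrations and peak-area ratios below are invented for illustration, not values from the chapter:

```python
# Hypothetical calibration data for one peptide: known spiked concentrations
# (fmol/uL) against the measured peak-area ratio of the natural peptide to
# its stable isotope-labeled (SIS) internal standard.
conc = [0.1, 0.5, 1.0, 5.0, 10.0]
ratio = [0.021, 0.098, 0.20, 1.01, 1.99]

# Least-squares fit of ratio = slope * concentration + intercept.
n = len(conc)
mx, my = sum(conc) / n, sum(ratio) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, ratio)) / sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx

def concentration(area_ratio):
    # Read an unknown sample's concentration off the standard curve.
    return (area_ratio - intercept) / slope
```

    Normalizing to the co-eluting SIS peptide is what makes the curve robust to run-to-run variation in ionization and instrument response.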

  13. Determining absolute protein numbers by quantitative fluorescence microscopy.

    PubMed

    Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry

    2014-01-01

    Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers--fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.
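    Of the three methods listed, ratiometric comparison is the simplest to state: divide the unknown's integrated fluorescence by the per-molecule brightness of a standard carrying a known copy number. A sketch with illustrative intensities; the 32-copy standard is hypothetical, not necessarily the chapter's:

```python
def count_by_ratio(intensity_unknown, intensity_standard, n_standard):
    """Ratiometric counting: scale the unknown's fluorescence intensity
    by the per-molecule brightness of a known-copy-number standard."""
    per_molecule = intensity_standard / n_standard
    return intensity_unknown / per_molecule

# e.g. an unknown protein cluster compared against a 32-copy standard
n = count_by_ratio(intensity_unknown=4800.0,
                   intensity_standard=1600.0,
                   n_standard=32)
# n == 96.0 copies
```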

  14. Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography

    NASA Astrophysics Data System (ADS)

    Revel, G. M.; Pandarese, G.; Cavuto, A.

    2012-06-01

    The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies on the order of 20% due to uncertainty effects. This step is important for determining the real applicability of air-coupled probes to complex problems and for simulating the whole inspection procedure, even while the component is still being designed, so as to virtually verify its inspectability.

  15. A Meta-analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    NASA Astrophysics Data System (ADS)

    Zhang, Lin

    2014-02-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses emerging issues, such as: How can learning effectiveness be understood in relation to different technology features? How can pieces of qualitative and quantitative results be integrated to achieve a broader understanding of technology designs? To address these issues, this paper proposes a meta-analysis method. Detailed explanations about the structure of the methodology and its scientific mechanism are provided for discussion and suggestions. This paper ends with an in-depth discussion of the concerns and questions that educational researchers might raise, such as how this methodology takes care of learning contexts.

  16. Combinational pixel-by-pixel and object-level classifying, segmenting, and agglomerating in performing quantitative image analysis that distinguishes between healthy non-cancerous and cancerous cell nuclei and delineates nuclear, cytoplasm, and stromal material objects from stained biological tissue materials

    DOEpatents

    Boucheron, Laura E

    2013-07-16

    Quantitative object and spatial arrangement-level analysis of tissue are detailed using expert (pathologist) input to guide the classification process. A two-step method is disclosed for imaging tissue, by classifying one or more biological materials, e.g. nuclei, cytoplasm, and stroma, in the tissue into one or more identified classes on a pixel-by-pixel basis, and segmenting the identified classes to agglomerate one or more sets of identified pixels into segmented regions. Typically, the one or more biological materials comprises nuclear material, cytoplasm material, and stromal material. The method further allows a user to mark up the image subsequent to the classification to re-classify said materials. The markup is performed via a graphic user interface to edit designated regions in the image.
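    The two-step structure of the claim, pixel-level classification followed by agglomeration of like-labeled pixels into segmented regions, can be sketched as follows. A simple intensity threshold stands in for the patent's expert-trained classifier, and the class names are placeholders:

```python
def classify(image, cut):
    """Step 1: toy pixel-by-pixel classification. Intensities below
    `cut` become class 'nucleus', the rest 'background'."""
    return [[('nucleus' if v < cut else 'background') for v in row]
            for row in image]

def agglomerate(labels, cls):
    """Step 2: segment by agglomerating 4-connected pixels of one class
    into regions (iterative flood fill)."""
    h, w = len(labels), len(labels[0])
    seen, regions = set(), []
    for i in range(h):
        for j in range(w):
            if labels[i][j] == cls and (i, j) not in seen:
                stack, region = [(i, j)], []
                seen.add((i, j))
                while stack:
                    y, x = stack.pop()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and (ny, nx) not in seen
                                and labels[ny][nx] == cls):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                regions.append(region)
    return regions

img = [[10, 200, 200],
       [10, 200,  10],
       [200, 200, 10]]
regions = agglomerate(classify(img, cut=100), 'nucleus')
# two segmented 'nucleus' regions of two pixels each
```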

  17. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advancing Molecular Imaging Tools.

    PubMed

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

    Processing and interpretation of biological images may provide invaluable insights on complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach in understanding living objects. Molecular imaging tools for system biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview on advances in the computational technology and different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open source software and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight the predictable future trends regarding methods for automatically analyzing biological data. Such tools will be very useful to understand the detailed biological and mathematical expressions under in-silico system biology processes with modeling properties.

  18. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework

    PubMed Central

    Anguera, M. Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. 
The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts. PMID:29441028

  19. Application of Particle Image Velocimetry and Reference Image Topography to jet shock cells using the hydraulic analogy

    NASA Astrophysics Data System (ADS)

    Kumar, Vaibhav; Ng, Ivan; Sheard, Gregory J.; Brocher, Eric; Hourigan, Kerry; Fouras, Andreas

    2011-08-01

    This paper examines the shock cell structure, vorticity and velocity field at the exit of an underexpanded jet nozzle using a hydraulic analogy and the Reference Image Topography technique. Understanding the flow in this region is important for the mitigation of screech, an aeroacoustic problem harmful to aircraft structures. Experiments are conducted on a water table, allowing detailed quantitative investigation of this important flow regime at a greatly reduced expense. Conventional Particle Image Velocimetry is employed to determine the velocity and vorticity fields of the nozzle exit region. Applying Reference Image Topography, the wavy water surface is reconstructed and when combined with the hydraulic analogy, provides a pressure map of the region. With this approach subtraction of surfaces is used to highlight the unsteady regions of the flow, which is not as convenient or quantitative with conventional Schlieren techniques. This allows a detailed analysis of the shock cell structures and their interaction with flow instabilities in the shear layer that are the underlying cause of jet screech.
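    Two steps in the approach above are simple to sketch: subtracting a reference surface to highlight unsteady regions, and the hydraulic analogy's depth-to-pressure mapping. In the classical shallow-water analogy to a gas with specific-heat ratio 2, the pressure ratio goes as the square of the depth ratio; all grid values below are toy numbers, not water-table data:

```python
def unsteady_map(surface, reference):
    """Subtract a reference (e.g. time-averaged) surface from an
    instantaneous reconstructed surface; nonzero residuals flag
    unsteady regions (2D lists of water-surface heights)."""
    return [[s - r for s, r in zip(srow, rrow)]
            for srow, rrow in zip(surface, reference)]

def pressure_ratio(h, h0):
    """Hydraulic analogy (shallow water <-> gamma = 2 gas):
    p/p0 = (h/h0)**2 for water depth h and reference depth h0."""
    return (h / h0) ** 2

ref  = [[1.0, 1.0], [1.0, 1.0]]
inst = [[1.0, 1.3], [0.8, 1.0]]
resid = unsteady_map(inst, ref)   # nonzero where the flow is unsteady
```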

  20. A sedimentological approach to hydrologic characterization: A detailed three-dimensional study of an outcrop of the Sierra Ladrones Formation, Albuquerque basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lohmann, R.C.

    1992-01-01

    Three-dimensional geologic outcrop studies which quantitatively describe the geologic architecture of deposits of a specific depositional environment are a necessary requirement for characterization of the permeability structure of an aquifer. The objective of this study is to address this need for quantitative, three-dimensional outcrop studies. For this study, a 10,000 m² by 25 m high outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation located near Belen, New Mexico was mapped in detail, and the geologic architecture was quantified using geostatistical variogram analysis. In general, the information contained in this study should be useful for hydrologists working on the characterization of aquifers from similar depositional environments such as this one. However, for the permeability correlation study to be truly useful, the within-element correlation structure needs to be superimposed on the elements themselves instead of using mean log(k) values, as was done for this study. Such information is derived from outcrop permeability sampling such as the work of Davis (1990) and Goggin et al. (1988).
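    The geostatistical variogram analysis mentioned above rests on the empirical semivariogram, half the mean squared difference between sample values as a function of separation distance. A minimal 1D version with an invented transect, not the outcrop data:

```python
def semivariogram(samples, lags, tol=0.5):
    """Empirical semivariogram gamma(h): half the mean squared
    difference over point pairs whose separation falls within tol
    of each lag h. `samples` is a list of (position, value) pairs."""
    out = []
    for h in lags:
        sq = [(v1 - v2) ** 2
              for (x1, v1) in samples for (x2, v2) in samples
              if x1 < x2 and abs(abs(x1 - x2) - h) <= tol]
        out.append(sum(sq) / (2 * len(sq)) if sq else None)
    return out

# toy transect of log-permeability values at 1 m spacing
pts = [(0, 1.0), (1, 1.2), (2, 0.9), (3, 1.1), (4, 1.0)]
gamma = semivariogram(pts, lags=[1, 2])
```

Plotting gamma against lag and fitting a model (spherical, exponential, etc.) is then how correlation lengths of the architecture are quantified.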

  1. The Development, Description and Appraisal of an Emergent Multimethod Research Design to Study Workforce Changes in Integrated Care Interventions.

    PubMed

    Busetto, Loraine; Luijkx, Katrien; Calciolari, Stefano; González-Ortiz, Laura G; Vrijhoef, Hubertus J M

    2017-03-08

    In this paper, we provide a detailed and explicit description of the processes and decisions underlying and shaping the emergent multimethod research design of our study on workforce changes in integrated chronic care. The study was originally planned as mixed method research consisting of a preliminary literature review and quantitative check of these findings via a Delphi panel. However, when the findings of the literature review were not appropriate for quantitative confirmation, we chose to continue our qualitative exploration of the topic via qualitative questionnaires and secondary analysis of two best practice case reports. The resulting research design is schematically described as an emergent and interactive multimethod design with multiphase combination timing. In doing so, we provide other researchers with a set of theory- and experience-based options to develop their own multimethod research and provide an example for more detailed and structured reporting of emergent designs. We argue that the terminology developed for the description of mixed methods designs should also be used for multimethod designs such as the one presented here.

  2. Ultrafast X-ray Imaging of Fuel Sprays

    NASA Astrophysics Data System (ADS)

    Wang, Jin

    2007-01-01

    Detailed analysis of fuel sprays has been well recognized as an important step for optimizing the operation of internal combustion engines to improve efficiency and reduce emissions. Ultrafast radiographic and tomographic techniques have been developed for probing the fuel distribution close to the nozzles of direct-injection diesel and gasoline injectors. The measurement was made using x-ray absorption of monochromatic synchrotron-generated radiation, allowing quantitative determination of the fuel distribution in this optically impenetrable region with a time resolution on the order of 1 μs. Furthermore, an accurate 3-dimensional fuel-density distribution, in the form of fuel volume fraction, was obtained by the time-resolved computed tomography. These quantitative measurements constitute the most detailed near-nozzle study of a fuel spray to date. With high-energy and high-brilliance x-ray beams available at the Advanced Photon Source, propagation-based phase-enhanced imaging was developed as a unique metrology technique to visualize the interior of an injection nozzle through a 3-mm-thick steel with a 10-μs temporal resolution, which is virtually impossible by any other means.

  3. Affective outcomes of virtual reality exposure therapy for anxiety and specific phobias: a meta-analysis.

    PubMed

    Parsons, Thomas D; Rizzo, Albert A

    2008-09-01

    Virtual reality exposure therapy (VRET) is an increasingly common treatment for anxiety and specific phobias. Lacking is a quantitative meta-analysis that enhances understanding of the variability and clinical significance of anxiety reduction outcomes after VRET. Searches of electronic databases yielded 52 studies, and of these, 21 studies (300 subjects) met inclusion criteria. Although meta-analysis revealed large declines in anxiety symptoms following VRET, moderator analyses were limited due to inconsistent reporting in the VRET literature. This highlights the need for future research studies that report uniform and detailed information regarding presence, immersion, anxiety and/or phobia duration, and demographics.
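    The core computation behind such a pooled estimate of symptom decline is inverse-variance weighting of per-study effect sizes. A fixed-effect sketch with invented study values; the paper's actual effect sizes are not reproduced here:

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling: weight each study's
    effect size by 1/variance, return the pooled estimate and its
    standard error."""
    w = [1.0 / v for v in variances]
    est = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return est, se

# three hypothetical studies: effect sizes and their variances
est, se = pooled_effect([0.9, 1.1, 1.0], [0.04, 0.04, 0.02])
```

The moderator analyses the abstract calls for would then regress these effects on study-level covariates (presence, immersion, phobia duration), which is exactly why uniform reporting matters.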

  4. The Evolution of 3D Microimaging Techniques in Geosciences

    NASA Astrophysics Data System (ADS)

    Sahagian, D.; Proussevitch, A.

    2009-05-01

    In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices. This technique typically completely destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks. For this, voids were filled with plastic and the rock dissolved away with HF to reveal plastic casts of internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions would need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equivalent with the resolution of the X-ray to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc.
CXT can be done at three basic levels of resolution, with "normal" x-rays providing tens of microns resolution, synchrotron sources providing single to few microns, and emerging XuM techniques providing a practical 300 nm and theoretical 60 nm. The main challenges in CXT imaging have been in segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. Additional new techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in x-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
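    The analytic fact cited above, that a random plane is most likely to cut a sphere near its equator, is easy to confirm numerically. A plane at height z (uniform in [-R, R]) leaves a circle of radius sqrt(R² - z²), so large circles dominate:

```python
import math
import random

def section_radii(n, R=1.0, seed=1):
    """Monte Carlo sphere sectioning: sample the cutting height z
    uniformly in [-R, R] and return the resulting circle radii
    sqrt(R^2 - z^2)."""
    rng = random.Random(seed)
    return [math.sqrt(R * R - rng.uniform(-R, R) ** 2) for _ in range(n)]

radii = section_radii(100_000)
big = sum(r > 0.8 for r in radii) / len(radii)    # near-equator cuts
small = sum(r < 0.2 for r in radii) / len(radii)  # near-pole cuts
# analytically, P(r > 0.8) = 0.6 while P(r < 0.2) is only about 0.02
```

Inverting this distribution to infer true size distributions from observed 2D cut radii is exactly the stereological "unfolding" problem the passage describes.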

  5. A Primer on Disseminating Applied Quantitative Research

    ERIC Educational Resources Information Center

    Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.

    2010-01-01

    Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research are often lacking in much-needed procedural information. In an effort to promote researchers dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…

  6. In or out? Methodological considerations for including and excluding findings from a meta-analysis of predictors of antiretroviral adherence in HIV-positive women

    PubMed Central

    Voils, Corrine I.; Barroso, Julie; Hasselblad, Victor; Sandelowski, Margarete

    2008-01-01

    Aim This paper is a discussion detailing the decisions concerning whether to include or exclude findings from a meta-analysis of reports of quantitative studies of antiretroviral adherence in HIV-positive women. Background Publication constraints and the absence of reflexivity as a criterion for validity in, and reporting of, quantitative research preclude detailing the many judgements made in the course of a meta-analysis. Yet, such an accounting would assist researchers better to address the unique challenges to meta-analysis presented by the bodies of research they have targeted for review, and to show the subjectivity, albeit disciplined, that characterizes the meta-analytic process. Data sources Data were 29 published and unpublished studies on antiretroviral adherence in HIV-positive women of any race/ethnicity, class, or nationality living in the United States of America. The studies were retrieved between June 2005 and January 2006 using 40 databases. Review methods Findings were included if they met the statistical assumptions of meta-analysis, including: (1) normal distribution of observations; (2) homogeneity of variances; and (3) independence of observations. Results Relevant studies and findings were excluded because of issues related to differences in study design, different operationalizations of dependent and independent variables, multiple cuts from common longitudinal data sets, and presentation of unadjusted and adjusted findings. These reasons led to the exclusion of 73% of unadjusted relationships and 87% of adjusted relationships from our data set, leaving few findings to synthesize. Conclusion Decisions made during research synthesis studies may result in more information losses than gains, thereby obliging researchers to find ways to preserve findings that are potentially valuable for practice. PMID:17543011

  7. Functional Module Search in Protein Networks based on Semantic Similarity Improves the Analysis of Proteomics Data*

    PubMed Central

    Boyanova, Desislava; Nilla, Santosh; Klau, Gunnar W.; Dandekar, Thomas; Müller, Tobias; Dittrich, Marcus

    2014-01-01

    The continuously evolving field of proteomics produces increasing amounts of data while improving the quality of protein identifications. Although quantitative measurements are becoming more popular, many proteomic studies are still based on non-quantitative methods for protein identification. These studies result in potentially large sets of identified proteins, where the biological interpretation of proteins can be challenging. Systems biology develops innovative network-based methods, which allow an integrated analysis of these data. Here we present a novel approach, which combines prior knowledge of protein-protein interactions (PPI) with proteomics data using functional similarity measurements of interacting proteins. This integrated network analysis exactly identifies network modules with a maximal consistent functional similarity reflecting biological processes of the investigated cells. We validated our approach on small (H9N2 virus-infected gastric cells) and large (blood constituents) proteomic data sets. Using this novel algorithm, we identified characteristic functional modules in virus-infected cells, comprising key signaling proteins (e.g. the stress-related kinase RAF1) and demonstrate that this method allows a module-based functional characterization of cell types. Analysis of a large proteome data set of blood constituents resulted in clear separation of blood cells according to their developmental origin. A detailed investigation of the T-cell proteome further illustrates how the algorithm partitions large networks into functional subnetworks each representing specific cellular functions. These results demonstrate that the integrated network approach not only allows a detailed analysis of proteome networks but also yields a functional decomposition of complex proteomic data sets and thereby provides deeper insights into the underlying cellular processes of the investigated system. PMID:24807868
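    A minimal sketch of functional-similarity scoring on a PPI network: Jaccard overlap of GO annotations stands in for the semantic-similarity measure the paper actually uses, and the annotation sets and the MAP2K1/ALB nodes are hypothetical (only RAF1 is named in the abstract):

```python
def jaccard(a, b):
    """Jaccard index of two GO-annotation sets; a simple stand-in for
    a semantic-similarity score."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def score_edges(ppi, annotations):
    """Weight each interaction by the functional similarity of its
    endpoints; connected sets of high-scoring edges approximate the
    functional modules described in the abstract."""
    return {(u, v): jaccard(annotations.get(u, ()), annotations.get(v, ()))
            for u, v in ppi}

ann = {"RAF1":   {"GO:A", "GO:B"},
       "MAP2K1": {"GO:A", "GO:B", "GO:C"},
       "ALB":    {"GO:D"}}
scores = score_edges([("RAF1", "MAP2K1"), ("RAF1", "ALB")], ann)
# the functionally coherent edge scores 2/3, the incoherent one 0
```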

  8. Detailed Modeling and Analysis of the CPFM Dataset

    NASA Technical Reports Server (NTRS)

    Swartz, William H.; Lloyd, Steven A.; DeMajistre, Robert

    2004-01-01

    A quantitative understanding of photolysis rate coefficients (or "j-values") is essential to determining the photochemical reaction rates that define ozone loss and other crucial processes in the atmosphere. j-Values can be calculated with radiative transfer models, derived from actinic flux observations, or inferred from trace gas measurements. The principal objective of this study is to cross-validate j-values from the Composition and Photodissociative Flux Measurement (CPFM) instrument during the Photochemistry of Ozone Loss in the Arctic Region in Summer (POLARIS) and SAGE III Ozone Loss and Validation Experiment (SOLVE) field campaigns with model calculations and other measurements and to use this detailed analysis to improve our ability to determine j-values. Another objective is to analyze the spectral flux from the CPFM (not just the j-values) and, using a multi-wavelength/multi-species spectral fitting technique, determine atmospheric composition.
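    Deriving a j-value from actinic flux observations amounts to integrating, over wavelength, the flux times the species' absorption cross-section times its quantum yield. A trapezoidal-rule sketch on a coarse grid of placeholder numbers (not CPFM spectra, and with units left arbitrary):

```python
def j_value(wavelengths, flux, xsec, qyield):
    """j = integral over lambda of F(lambda) * sigma(lambda) * phi(lambda),
    approximated by the trapezoidal rule on a discrete grid."""
    j = 0.0
    for i in range(len(wavelengths) - 1):
        dw = wavelengths[i + 1] - wavelengths[i]
        f0 = flux[i] * xsec[i] * qyield[i]
        f1 = flux[i + 1] * xsec[i + 1] * qyield[i + 1]
        j += 0.5 * (f0 + f1) * dw
    return j

wl  = [300, 310, 320]   # nm
F   = [1.0, 2.0, 3.0]   # actinic flux (arbitrary units)
sig = [2.0, 1.0, 0.5]   # absorption cross-section (arbitrary units)
phi = [1.0, 1.0, 1.0]   # quantum yield
j = j_value(wl, F, sig, phi)   # 0.5*(2+2)*10 + 0.5*(2+1.5)*10 = 37.5
```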

  9. Carbothermic Synthesis of 820 μm UN Kernels: Literature Review, Thermodynamics, Analysis, and Related Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindemer, Terrence; Voit, Stewart L; Silva, Chinthaka M

    2014-01-01

    The U.S. Department of Energy is considering a new nuclear fuel that would be less susceptible to ruptures during a loss-of-coolant accident. The fuel would consist of tristructural isotropic coated particles with large, dense uranium nitride (UN) kernels. This effort explores many factors involved in using gel-derived uranium oxide-carbon microspheres to make large UN kernels. Analysis of recent studies with sufficient experimental details is provided. Extensive thermodynamic calculations are used to predict carbon monoxide and other pressures for several different reactions that may be involved in conversion of uranium oxides and carbides to UN. Experimentally, the method for making the gel-derived microspheres is described. These were used in a microbalance with an attached mass spectrometer to determine details of carbothermic conversion in argon, nitrogen, or vacuum. A quantitative model is derived from experiments for vacuum conversion to a uranium oxide-carbide kernel.

  10. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  11. Systems toxicology: from basic research to risk assessment.

    PubMed

    Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C

    2014-03-17

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.

  12. Quantitative analysis of fracture surface by roughness and fractal method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.W.; Tian, J.F.; Kang, Y.

    1995-09-01

    In recent years there has been extensive research and great development in Quantitative Fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means for characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. But, as the case stands, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120° to each other, which can be expressed as R_S = Ψ·R̄_L, where R̄_L is the mean profile roughness over the sections and Ψ is the profile structure factor. This method is based on classical stereological principles and was verified with the aid of computer simulations for some ruled surfaces. The results are considered to be applicable to fracture surfaces of arbitrary complexity and anisotropy. In order to extend the applications of this method in quantitative fractography, the authors made a study of roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.
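    The two roughness indexes can be sketched directly: R_L as the arc length of a digitized profile over its projected length, and the Gokhale-Underwood surface estimate as the profile structure factor times the mean R_L from several vertical sections. All numbers below are toy values, and Ψ is taken as given rather than computed from the angular distribution:

```python
import math

def profile_roughness(xs, ys):
    """R_L: true (arc) length of a digitized fracture profile divided
    by its projected length."""
    pts = list(zip(xs, ys))
    arc = sum(math.hypot(x1 - x0, y1 - y0)
              for (x0, y0), (x1, y1) in zip(pts, pts[1:]))
    return arc / (xs[-1] - xs[0])

def surface_roughness(rl_values, psi):
    """Gokhale-Underwood estimate R_S = psi * mean(R_L), with R_L
    measured on vertical sections cut 120 degrees apart."""
    return psi * sum(rl_values) / len(rl_values)

rl_flat = profile_roughness([0, 1, 2], [0, 0, 0])  # 1.0 for a flat profile
rl_saw  = profile_roughness([0, 1, 2], [0, 1, 0])  # sqrt(2) for a sawtooth
rs = surface_roughness([1.0, 1.2, 1.4], psi=1.1)
```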

  13. Publications - GMC 429 | Alaska Division of Geological & Geophysical

    Science.gov Websites

    DGGS GMC 429 Publication Details. Title: Quantitative Fluorescence Technology - Dual Wavelength (QFT2). Bibliographic Reference: Canrig Drilling Technology Ltd., 2014, Quantitative Fluorescence Technology - Dual Wavelength (QFT2).

  14. Measuring the Internal Structure and Physical Conditions in Star and Planet Forming Clouds Cores: Towards a Quantitative Description of Cloud Evolution

    NASA Technical Reports Server (NTRS)

    Lada, Charles J.

    2004-01-01

    This grant funds a research program to use infrared extinction measurements to probe the detailed structure of dark molecular cloud cores and investigate the physical conditions which give rise to star and planet formation. The goals of this program are to acquire, reduce and analyze deep infrared and molecular-line observations of a carefully selected sample of nearby dark clouds in order to determine the detailed initial conditions for star formation from quantitative measurements of the internal structure of starless cloud cores and to quantitatively investigate the evolution of such structure through the star and planet formation process.

  15. The RAMANITA © method for non-destructive and in situ semi-quantitative chemical analysis of mineral solid-solutions by multidimensional calibration of Raman wavenumber shifts

    NASA Astrophysics Data System (ADS)

    Smith, David C.

    2005-08-01

    The "RAMANITA ©" method, for semi-quantitative chemical analysis of mineral solid-solutions by multidimensional calibration of Raman wavenumber shifts and mathematical calculation by simultaneous equations, is published here in detail in English for the first time. It was conceived by the present writer 20 years ago for binary and ternary pyroxene and garnet systems. The mathematical description was set out in 1989, but only in an abstract in an obscure French special publication. Detailed "step-by-step" calibration of two garnet ternaries, followed by their linking, by M. Pinet and D.C. Smith in the early 1990s provided a hexary garnet database. Much later, using this garnet database, which forms part of his personal database called RAMANITA ©, the present writer began to develop the method further by improving the terminology, automating the calculations, discussing problems and experimenting with different real chemical problems in archaeometry. Although this RAMANITA © method has been very briefly mentioned in two recent books, the necessary full mathematical explanation is given only here. The method will find application in any study that requires a non-destructive semi-quantitative chemical analysis of mineral solid-solutions that cannot be analysed by any destructive analytical method, in particular in archaeological, geological or extraterrestrial research projects: e.g. on gemstones or other crystalline artworks of the cultural heritage (especially by Mobile Raman Microscopy (MRM)) in situ in museums or at archaeological sites, including under water for subaquatic archaeometry; on scientifically precious mineral microinclusions (such as garnet or pyroxene within diamond); or on minerals in rocks analysed in situ on planetary bodies by a rover (especially "at distance" by telescopy).
Recently, some other workers have begun deducing chemical compositions from Raman wavenumber shifts in multivariate chemical space, but their philosophical approach is quite different.
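    The simultaneous-equation step of such a calibration can be illustrated as follows. This Python sketch assumes a purely hypothetical linear calibration of two Raman band positions against three garnet end-members (the numbers are invented for illustration and are not the RAMANITA © database); the composition is recovered by solving the band equations together with the closure condition that the molar fractions sum to one.

```python
import numpy as np

# Hypothetical linear calibration: positions (cm^-1) of two Raman bands for
# three garnet end-members.  These numbers are invented for illustration.
end_member_bands = np.array([
    [917.0, 552.0],   # pyrope:     band 1, band 2
    [905.0, 556.0],   # almandine:  band 1, band 2
    [910.0, 550.0],   # grossular:  band 1, band 2
])

def composition_from_bands(measured, bands=end_member_bands):
    """Solve for molar fractions X assuming each measured band position is a
    linear mix of the end-member positions, with the closure sum(X) = 1."""
    A = np.vstack([bands.T, np.ones(bands.shape[0])])  # band equations + closure
    b = np.append(measured, 1.0)
    X, *_ = np.linalg.lstsq(A, b, rcond=None)
    return X

X = composition_from_bands(np.array([910.0, 553.0]))
print(X)
```

With two bands and the closure condition, a ternary system is exactly determined; more bands would make the system overdetermined, which the least-squares solver handles in the same way.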

  16. SDAR 1.0: A New Quantitative Toolkit to Analyze Stratigraphic Data

    NASA Astrophysics Data System (ADS)

    Ortiz, John; Moreno, Carlos; Cardenas, Andres; Jaramillo, Carlos

    2015-04-01

    Since the foundation of stratigraphy, geoscientists have recognized that data obtained from stratigraphic columns (SCs), two-dimensional schemes recording descriptions of both geological and paleontological features (e.g., thickness of rock packages, grain size, fossil and lithological components, and sedimentary structures), are key elements for establishing reliable hypotheses about the distribution in space and time of rock sequences and about ancient sedimentary-environmental and paleobiological dynamics. Despite the tremendous advances in the way geoscientists store, plot, and quantitatively analyze sedimentological and paleontological data (e.g., Macrostrat [http://www.macrostrat.org/] and the Paleobiology Database [http://www.paleodb.org/], respectively), there is still a lack of computational methodologies designed to quantitatively examine data from highly detailed SCs. Moreover, stratigraphic information is frequently plotted "manually" using vector graphics editors (e.g., Corel Draw, Illustrator); although this information is stored in a digital format, it cannot readily be used for any quantitative analysis. Therefore, any attempt to examine the stratigraphic data in an analytical fashion necessarily takes further steps. Given these issues, we have developed the software 'Stratigraphic Data Analysis in R' (SDAR), which stores in a database all sedimentological, stratigraphic, and paleontological information collected from an SC, allowing users to generate high-quality graphic plots (including one or multiple features stored in the database). SDAR also encompasses quantitative analyses helping users to quantify stratigraphic information (e.g., grain size, sorting and rounding, proportion of sand/shale).
Finally, given that the SDAR analysis module has been written in the open-source, high-level "R graphics/statistics language" [R Development Core Team, 2014], it is already loaded with many of the features required to accomplish basic and complex statistical analysis tasks (the R language provides more than a hundred spatial libraries that allow users to explore geostatistics and spatial analysis). Consequently, SDAR allows a deeper exploration of the stratigraphic data collected in the field, and it will allow the geoscientific community in the near future to develop complex analyses of the distribution in space and time of rock sequences, such as lithofacies correlations, through multivariate comparison between empirical SCs and quantitative lithofacies models established from modern sedimentary environments.
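    The kind of quantitative column summary such a toolkit automates can be sketched in a few lines. The snippet below uses Python as a stand-in for the package's R implementation, and the bed records are invented; it computes lithology proportions and a sand/shale ratio from (thickness, lithology) records.

```python
# Bed records from a hypothetical stratigraphic column: (thickness in m, lithology)
beds = [
    (2.0, "sandstone"), (0.5, "shale"), (1.2, "sandstone"),
    (3.1, "shale"), (0.8, "siltstone"), (1.4, "sandstone"),
]

total = sum(t for t, _ in beds)
by_lith = {}
for t, lith in beds:
    by_lith[lith] = by_lith.get(lith, 0.0) + t

# Fraction of column thickness per lithology, and the sand/shale ratio
proportions = {k: v / total for k, v in by_lith.items()}
sand_shale = by_lith["sandstone"] / by_lith["shale"]
print(proportions, sand_shale)
```

Storing beds as structured records rather than as vector-graphics drawings is what makes summaries like these a one-liner instead of a re-measurement exercise.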

  17. STEM_CELL: a software tool for electron microscopy: part 2--analysis of crystalline materials.

    PubMed

    Grillo, Vincenzo; Rossi, Francesca

    2013-02-01

    A new graphical software tool (STEM_CELL) for the analysis of HRTEM and STEM-HAADF images is introduced here in detail. The advantage of the software, beyond its graphical interface, is that it combines different analysis algorithms and simulation (described in an associated article) to produce novel analysis methodologies. Several implementations of, and improvements to, state-of-the-art approaches are reported for image analysis, filtering, normalization, and background subtraction. In particular, two important methodological results are highlighted: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images, and (ii) the extension of geometric phase analysis to large regions, potentially up to 1 μm, through the use of undersampled images with aliasing effects. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Ecological Change, Sliding Baselines and the Importance of Historical Data: Lessons from Combining Observational and Quantitative Data on a Temperate Reef Over 70 Years

    PubMed Central

    Gatti, Giulia; Bianchi, Carlo Nike; Parravicini, Valeriano; Rovere, Alessio; Peirano, Andrea; Montefalcone, Monica; Massa, Francesco; Morri, Carla

    2015-01-01

    Understanding the effects of environmental change on ecosystems requires the identification of baselines that may act as reference conditions. However, the continuous change of these references challenges our ability to define the true natural status of ecosystems. The so-called sliding baseline syndrome can be overcome through the analysis of quantitative time series, which are, however, extremely rare. Here we show how combining historical quantitative data with descriptive ‘naturalistic’ information arranged in a chronological chain makes it possible to highlight long-term trends and can be used to inform present conservation schemes. We analysed the long-term change of a coralligenous reef, a marine habitat endemic to the Mediterranean Sea. The coralligenous assemblages of Mesco Reef (Ligurian Sea, NW Mediterranean) have been studied, although discontinuously, since 1937, making available both detailed descriptive information and scanty quantitative data: while the former was useful for understanding the natural history of the ecosystem, the analysis of the latter was of paramount importance in providing a formal measure of change over time. Epibenthic assemblages remained comparatively stable until the 1990s, when species replacement, invasion by alien algae, and biotic homogenisation occurred within a few years, leading to a new and completely different ecosystem state. The shift experienced by the coralligenous assemblages of Mesco Reef was probably induced by a combination of seawater warming and local human pressures, the latter mainly resulting in increased water turbidity; in turn, cumulative stress may have favoured the establishment of alien species. This study showed that the combined analysis of quantitative and descriptive historical data represents precious knowledge for understanding ecosystem trends over time and helps identify baselines for ecological management. PMID:25714413

  19. Passive Fourier-transform infrared spectroscopy of chemical plumes: an algorithm for quantitative interpretation and real-time background removal

    NASA Astrophysics Data System (ADS)

    Polak, Mark L.; Hall, Jeffrey L.; Herr, Kenneth C.

    1995-08-01

    We present a ratioing algorithm for quantitative analysis of the passive Fourier-transform infrared spectrum of a chemical plume. We show that the transmission of a near-field plume is given by τ_plume = (L_obsd − L_bb-plume)/(L_bkgd − L_bb-plume), where τ_plume is the frequency-dependent transmission of the plume, L_obsd is the spectral radiance of the scene that contains the plume, L_bkgd is the spectral radiance of the same scene without the plume, and L_bb-plume is the spectral radiance of a blackbody at the plume temperature. The algorithm simultaneously achieves background removal, elimination of the spectrometer's internal signature, and quantification of the plume spectral transmission. It has applications both to real-time processing for plume visualization and to quantitative measurement of plume column densities. The plume temperature, which determines L_bb-plume and is not always precisely known, can have a profound effect on the quantitative interpretation of the algorithm and is discussed in detail. Finally, we provide an illustrative example of the use of the algorithm on a trichloroethylene and acetone plume.
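    The ratioing algorithm can be checked with synthetic radiances. The sketch below (illustrative temperatures and wavenumbers only) builds the observed scene radiance from a simple radiative-transfer model of a plume of known transmission over a warmer background, then recovers that transmission with the ratio formula.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(nu_cm, T):
    """Blackbody spectral radiance at wavenumber nu_cm (cm^-1), temperature T (K)."""
    nu = nu_cm * 100.0  # convert to m^-1
    return 2.0 * H * C**2 * nu**3 / np.expm1(H * C * nu / (KB * T))

def plume_transmission(L_obsd, L_bkgd, T_plume, nu_cm):
    """Ratioing algorithm: tau = (L_obsd - L_bb) / (L_bkgd - L_bb)."""
    L_bb = planck_radiance(nu_cm, T_plume)
    return (L_obsd - L_bb) / (L_bkgd - L_bb)

# Synthetic check: a plume of true transmission 0.8 in front of a 300 K
# background, plume at 290 K -- illustrative numbers only.
nu = np.array([900.0, 950.0, 1000.0])
tau_true = 0.8
L_bkgd = planck_radiance(nu, 300.0)
L_bb = planck_radiance(nu, 290.0)
L_obsd = tau_true * L_bkgd + (1.0 - tau_true) * L_bb  # plume radiative model
tau = plume_transmission(L_obsd, L_bkgd, 290.0, nu)
print(tau)
```

Because the forward model and the ratio formula are algebraic inverses, the recovered transmission matches the true value exactly; with real data, errors in the assumed plume temperature propagate directly into τ, which is the sensitivity the abstract highlights.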

  20. Systematic comparison of the behaviors produced by computational models of epileptic neocortex.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warlaumont, A. S.; Lee, H. C.; Benayoun, M.

    2010-12-01

    Two existing models of brain dynamics in epilepsy, one detailed (i.e., realistic) and one abstract (i.e., simplified), are compared in terms of behavioral range and match to in vitro mouse recordings. A new method is introduced for comparing across computational models that may have very different forms. First, high-level metrics were extracted from model and in vitro output time series. A principal components analysis was then performed over these metrics to obtain a reduced set of derived features. These features define a low-dimensional behavior space in which quantitative measures of behavioral range and degree of match to real data can be obtained. The detailed and abstract models and the mouse recordings overlapped considerably in behavior space. Both the range of behaviors and the similarity to mouse data were similar between the detailed and abstract models. When no high-level metrics were used and principal components analysis was computed over raw time series, the models overlapped minimally with the mouse recordings. The method introduced here is suitable for comparing across different kinds of model data and across real brain recordings. It appears that, despite differences in form and computational expense, detailed and abstract models do not necessarily differ in their behaviors.
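    The metric-extraction and PCA steps can be sketched as follows; this is a minimal illustration on synthetic time series, and the four summary metrics are arbitrary choices, not those of the study.

```python
import numpy as np

def metrics(ts):
    """High-level summary metrics of a time series (illustrative choices):
    mean, spread, mean absolute step, and lag-1 autocorrelation."""
    m = ts.mean()
    return np.array([m, ts.std(),
                     np.abs(np.diff(ts)).mean(),
                     ((ts[:-1] - m) * (ts[1:] - m)).mean() / ts.var()])

rng = np.random.default_rng(0)
series = [rng.standard_normal(500).cumsum() for _ in range(20)]  # stand-in model outputs
M = np.vstack([metrics(s) for s in series])

# PCA over z-scored metrics -> a low-dimensional "behavior space"
Z = (M - M.mean(axis=0)) / M.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T  # coordinates of each run on the first two components
print(scores.shape)
```

Runs from different models and from recordings can all be projected into the same score space, so overlap (or its absence) becomes a simple geometric comparison.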

  1. Surface Passivation of InAs(001) With Thioacetamide

    DTIC Science & Technology

    2005-01-01

    Fitting and quantitative analysis of the In 3d data are described in detail elsewhere [9]. Oxidation of As requires breaching the top two layers of the S... (ACS or HPLC grade, used without additional purification). The average S coverage after these exposures was reduced by ᝿% compared to as-passivated... embedding one-half of an InAs sample in freshly cast polydimethylsiloxane (PDMS) followed by hardening the PDMS overnight at room temperature. After 30

  2. A New Approach for Quantitative Evaluation of Ultrasonic Wave Attenuation in Composites

    NASA Astrophysics Data System (ADS)

    Ni, Qing-Qing; Li, Ran; Xia, Hong

    2017-02-01

    When ultrasonic waves propagate in composite materials, the propagation behavior results from the combined effects of various factors, such as material anisotropy and viscoelastic properties, internal microstructure and defects, incident-wave characteristics, and the interface condition between composite components. It is essential to clarify how these factors affect the ultrasonic wave propagation and attenuation characteristics, and how they interact with each other. In the present paper, based on a newly developed time-domain finite element analysis code, PZFlex, a unique approach is proposed for clarifying the detailed influence mechanisms of the aforementioned factors, in which each attenuation component can be extracted from the overall attenuation and analyzed separately. By taking into consideration the interrelation between the individual attenuation components, the variation of each component and the internal dynamic stress distribution against material anisotropy and matrix viscosity are separately and quantitatively evaluated. The detailed analysis of each attenuation component shows that energy dissipation at the interface is a major component of the ultrasonic wave attenuation, providing a contribution of up to 68.2% to the overall attenuation, and that each attenuation component is closely related to the material anisotropy and viscoelasticity. The results clarify the correlation between ultrasonic wave propagation characteristics and material viscoelastic properties, which will be useful in the further development of ultrasonic technology for defect detection.
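    Once the attenuation components have been separated, converting them into per-component contribution rates reduces to a simple normalization, sketched below with invented component values (not the paper's measurements).

```python
# Invented attenuation components (dB/mm); not the paper's measurements.
components = {"interface": 0.75, "matrix_viscoelastic": 0.22, "scattering": 0.13}

total = sum(components.values())
contribution = {k: 100.0 * v / total for k, v in components.items()}  # percent
print(contribution)
```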

  3. Application of sequential extraction analysis to Pb(II) recovery by zerovalent iron-based particles.

    PubMed

    Zhu, Neng-Min; Xu, Yan-Sheng; Dai, Lichun; Zhang, Yun-Fei; Hu, Guo-Quan

    2018-06-05

    Zerovalent iron (ZVI) is an environmentally friendly reactive reagent for recovering heavy metals. However, the detailed recovery mechanism remains unclear due to a lack of quantitative analysis of the recovery products. Herein, microscale ZVI (m-ZVI), nanoscale ZVI (n-ZVI) and Ni/Fe nanoparticles were used to recover Pb(II) from aqueous solution, and a sequential extraction procedure (SEP) was applied to determine the lead species formed quantitatively. At a high initial Pb(II) concentration (500 mg L⁻¹), more than 99.5% of the Pb(II) was immobilized by Ni/Fe and n-ZVI, whereas m-ZVI gave an inferior recovery efficiency (<25%). XRD and XPS results revealed that Pb(II) was reduced to Pb(0) prior to the formation of metal hydroxides as the external shell of the ZVI. SEP results showed that the fraction bound to carbonates (PbO), the fraction bound to iron oxides, and the exchangeable fraction were the main lead species produced by Ni/Fe, n-ZVI and m-ZVI, respectively. Consequently, (co-)precipitation and specific adsorption dominated Pb(II) recovery by Ni/Fe and n-ZVI, whereas m-ZVI recovered Pb(II) mainly via weak adsorption. The reactivity of ZVI toward Pb(II) followed the increasing order m-ZVI < n-ZVI ≤ Ni/Fe. Detailed mechanisms of Pb(II) recovery by the different ZVI materials are proposed. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Participation in environmental enhancement and conservation activities for health and well-being in adults: a review of quantitative and qualitative evidence.

    PubMed

    Husk, Kerryn; Lovell, Rebecca; Cooper, Chris; Stahl-Timmins, Will; Garside, Ruth

    2016-05-21

    There is growing research and policy interest in the potential for using the natural environment to enhance human health and well-being. This resource may be underused as a health promotion tool to address the increasing burden of common health problems such as increased chronic diseases and mental health concerns. Outdoor environmental enhancement and conservation activities (EECA) (for instance unpaid litter picking, tree planting or path maintenance) offer opportunities for physical activity alongside greater connectedness with local environments, enhanced social connections within communities and improved self-esteem through activities that improve the locality which may, in turn, further improve well-being. To assess the health and well-being impacts on adults following participation in environmental enhancement and conservation activities. We contacted or searched the websites of more than 250 EECA organisations to identify grey literature. Resource limitations meant the majority of the websites were from UK, USA, Canada and Australia. We searched the following databases (initially in October 2012, updated October 2014, except CAB Direct, OpenGrey, SPORTDiscus, and TRIP Database), using a search strategy developed with our project advisory groups (predominantly leaders of EECA-type activities and methodological experts): ASSIA; BIOSIS; British Education Index; British Nursing Index; CAB Abstracts; Campbell Collaboration; Cochrane Public Health Specialized Register; DOPHER; EMBASE; ERIC; Global Health; GreenFILE; HMIC; MEDLINE-in-Process; MEDLINE; OpenGrey; PsychINFO; Social Policy and Practice; SPORTDiscus; TRoPHI; Social Services Abstracts; Sociological Abstracts; The Cochrane Library; TRIP database; and Web of Science. Citation and related article chasing was used. Searches were limited to studies in English published after 1990. Two review authors independently screened studies. 
Included studies examined the impact of EECA on adult health and well-being. Eligible interventions needed to include each of the following: intended to improve the outdoor natural or built environment at either a local or wider level; took place in urban or rural locations in any country; involved active participation; and were NOT experienced through paid employment. We included quantitative and qualitative research. Includable quantitative study designs were: randomised controlled trials (RCTs), cluster RCTs, quasi-RCTs, cluster quasi-RCTs, controlled before-and-after studies, interrupted time series, cohort studies (prospective or retrospective), case-control studies and uncontrolled before-and-after studies (uBA). We included qualitative research if it used recognised qualitative methods of data collection and analysis. One reviewer extracted data, and another reviewer checked the data. Two review authors independently appraised study quality using the Effective Public Health Practice Project tool (for quantitative studies) or Wallace criteria (for qualitative studies). Heterogeneity of outcome measures and poor reporting of intervention specifics prevented meta-analysis, so we synthesised the results narratively. We synthesised qualitative research findings using thematic analysis. Database searches identified 21,420 records, with 21,304 excluded at title/abstract. Grey literature searches identified 211 records. We screened 327 full-text articles from which we included 21 studies (reported in 28 publications): two case-studies (which were not included in the synthesis due to inadequate robustness), one case-control, one retrospective cohort, five uBA, three mixed-method (uBA, qualitative), and nine qualitative studies.
The 19 studies included in the synthesis detailed the impacts on a total of 3,603 participants: 647 from quantitative intervention studies, 2,630 from a retrospective cohort study, and 326 from qualitative studies (one not reporting sample size). Included studies shared the key elements of EECA defined above, but the range of activities varied considerably. Quantitative evaluation methods were heterogeneous. The designs or reporting of quantitative studies, or both, were rated as 'weak' quality with high risk of bias due to one or more of the following: inadequate study design, intervention detail, participant selection, outcome reporting and blinding. Participants' characteristics were poorly reported; eight studies did not report gender or age and none reported socio-economic status. Three quantitative studies reported that participants were referred through health or social services, or due to mental ill health (five quantitative studies); however, participants' engagement routes were often not clear. Whilst the majority of quantitative studies (n = 8) reported no effect on one or more outcomes, positive effects were reported in six quantitative studies relating to short-term physiological, mental/emotional health, and quality-of-life outcomes. Negative effects were reported in two quantitative studies; one study reported higher levels of anxiety amongst participants, another reported increased mental health stress. The design or reporting, or both, of the qualitative studies was rated as good in three studies and poor in nine, mainly due to missing detail about participants, methods and interventions. Included qualitative evidence provided rich data about the experience of participation.
Thematic analysis identified eight themes supported by at least one good quality study, regarding participants' positive experiences, related to personal/social identity, physical activity, developing knowledge, spirituality, benefits of place, personal achievement, psychological benefits and social contact. There was one report of negative experiences. There is little quantitative evidence of positive or negative health and well-being benefits from participating in EECA. However, the qualitative research showed high levels of perceived benefit among participants. The quantitative evidence came from study designs with a high risk of bias, and the qualitative evidence lacked reporting detail. The majority of included studies were programme evaluations, conducted internally or funded by the provider. The conceptual framework illustrates the range of interlinked mechanisms through which people believe they potentially achieve health and well-being benefits, such as opportunities for social contact. It also considers potential moderators and mediators of effect. One main finding of the review is the inherent difficulty associated with generating robust evidence of effectiveness for complex interventions. We developed the conceptual framework to illustrate how people believed they benefited. Investigating such mechanisms in a subsequent theory-led review might be one way of examining evidence of effect for these activities. The conceptual framework needs further refinement through linked reviews and more reliable evidence. Future research should use more robust study designs and report key intervention and participant detail.

  5. Qualitative and quantitative processing of side-scan sonar data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dwan, F.S.; Anderson, A.L.; Hilde, T.W.C.

    1990-06-01

    Modern side-scan sonar systems allow vast areas of seafloor to be rapidly imaged and quantitatively mapped in detail. Remote-sensing image processing techniques can be used to correct for various distortions inherent in raw sonography. Corrections are possible for water column, slant range, aspect ratio, speckle and striping noise, multiple returns, and power drop-off, and for georeferencing. The final products reveal seafloor features and patterns that are geometrically correct, georeferenced, and have an improved signal/noise ratio. These products can be merged with other georeferenced databases for further database management and information extraction. In order to compare data collected by different systems from a common area, and to ground-truth measurements and geoacoustic models, quantitative corrections must be made for calibrated sonar system and bathymetry effects. Such data inversion must account for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified area, and grazing-angle effects. Seafloor classification can then be performed on the calculated backscattering strength using Lambert's law and regression analysis. Examples are given using both approaches: image analysis, and inversion of data based on the sonar equation.
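    The Lambert's-law regression step can be illustrated with synthetic data. Assuming the common sonar form BS = BS₀ + 10·log₁₀(sin²θ_g), where θ_g is the grazing angle, a linear fit recovers BS₀; all numbers below are invented.

```python
import numpy as np

# Synthetic backscatter observations following Lambert's law,
# BS = BS0 + 10*log10(sin^2(grazing angle)), plus measurement noise.
rng = np.random.default_rng(1)
graze = np.deg2rad(np.linspace(20.0, 80.0, 30))  # grazing angles
BS0_true = -27.0                                 # dB (invented value)
bs = BS0_true + 10 * np.log10(np.sin(graze) ** 2) + rng.normal(0.0, 0.3, graze.size)

# Regressing BS against the Lambert term recovers BS0 as the intercept;
# a slope near 1 indicates Lambert-like angular dependence.
x = 10 * np.log10(np.sin(graze) ** 2)
slope, intercept = np.polyfit(x, bs, 1)
print(slope, intercept)
```

The fitted intercept is the angle-independent backscatter level on which seafloor classification can be based.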

  6. Mastitomics, the integrated omics of bovine milk in an experimental model of Streptococcus uberis mastitis: 2. Label-free relative quantitative proteomics.

    PubMed

    Mudaliar, Manikhandan; Tassi, Riccardo; Thomas, Funmilola C; McNeilly, Tom N; Weidt, Stefan K; McLaughlin, Mark; Wilson, David; Burchmore, Richard; Herzyk, Pawel; Eckersall, P David; Zadoks, Ruth N

    2016-08-16

    Mastitis, inflammation of the mammary gland, is the most common and costly disease of dairy cattle in the western world. It is primarily caused by bacteria, with Streptococcus uberis among the most prevalent causative agents. To characterize the proteome during Streptococcus uberis mastitis, an experimentally induced model of intramammary infection was used. Milk whey samples obtained from 6 cows at 6 time points were processed using label-free relative quantitative proteomics. This proteomic analysis complements clinical, bacteriological and immunological studies, as well as peptidomic and metabolomic analyses, of the same challenge model. A total of 2552 non-redundant bovine peptides were identified, and from these, 570 bovine proteins were quantified. Hierarchical cluster analysis and principal component analysis showed clear clustering of results by stage of infection, with similarities between the pre-infection and resolution stages (0 and 312 h post challenge), the early infection stages (36 and 42 h post challenge) and the late infection stages (57 and 81 h post challenge). Ingenuity pathway analysis identified upregulation of acute phase protein pathways over the course of infection, with different acute phase proteins dominating at different time points based on differential expression analysis. Antimicrobial peptides, notably cathelicidins and peptidoglycan recognition protein, were upregulated at all time points post challenge and peaked at 57 h, which coincided with a 10,000-fold decrease in average bacterial counts. The integration of clinical, bacteriological, immunological, quantitative proteomic and other omic data provides a more detailed systems-level view of the host response to mastitis than has been achieved previously.

  7. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry.

    PubMed

    Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.

  8. An eQTL Analysis of Partial Resistance to Puccinia hordei in Barley

    PubMed Central

    Chen, Xinwei; Hackett, Christine A.; Niks, Rients E.; Hedley, Peter E.; Booth, Clare; Druka, Arnis; Marcel, Thierry C.; Vels, Anton; Bayer, Micha; Milne, Iain; Morris, Jenny; Ramsay, Luke; Marshall, David; Cardle, Linda; Waugh, Robbie

    2010-01-01

    Background Genetic resistance to barley leaf rust caused by Puccinia hordei involves both R genes and quantitative trait loci. The R genes provide higher but less durable resistance than the quantitative trait loci. Consequently, exploiting quantitative or partial resistance has become a favorable alternative for controlling the disease. Four quantitative trait loci for partial resistance to leaf rust have been identified in the doubled haploid Steptoe (St)/Morex (Mx) mapping population. Further investigations are required to study the molecular mechanisms underpinning partial resistance and ultimately to identify the causal genes. Methodology/Principal Findings We explored partial resistance to barley leaf rust using a genetical genomics approach. We recorded RNA transcript abundance corresponding to each probe on a 15K Agilent custom barley microarray in seedlings from St, Mx and 144 doubled haploid lines of the St/Mx population. Totals of 1154 and 1037 genes were identified, respectively, as P. hordei-responsive in St and Mx, and as differentially expressed between P. hordei-infected St and Mx. Normalized ratios from 72 distant-pair hybridisations were used to map the genetic determinants of variation in transcript abundance by expression quantitative trait locus (eQTL) mapping, generating 15,685 eQTL from 9,557 genes. Correlation analysis identified 128 genes that were correlated with resistance, of which 89 had eQTL co-locating with the phenotypic quantitative trait loci (pQTL). Transcript abundance in the parents and conservation of synteny with rice allowed us to prioritise six genes as candidates for Rphq11, the pQTL of largest effect, and to highlight one, a phospholipid hydroperoxide glutathione peroxidase (HvPHGPx), for detailed analysis.
Conclusions/Significance The eQTL approach yielded information that led to the identification of strong candidate genes underlying pQTL for resistance to leaf rust in barley, and shed light on the general pathogen response pathway. The dataset will facilitate a systems-level appraisal of this host-pathogen interaction and, potentially, of other traits measured in this population. PMID:20066049
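    The correlation step linking transcript abundance to the resistance phenotype can be sketched on synthetic data; the numbers of lines mirror the study's 144 doubled haploid lines, but the expression matrix and trait values below are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)
n_lines, n_genes = 144, 200   # 144 doubled haploid lines, as in the population
expr = rng.standard_normal((n_lines, n_genes))  # simulated transcript abundances
resistance = 0.8 * expr[:, 0] + 0.6 * rng.standard_normal(n_lines)  # gene 0 drives trait

# Pearson correlation of every transcript with the resistance phenotype
ez = (expr - expr.mean(axis=0)) / expr.std(axis=0)
rz = (resistance - resistance.mean()) / resistance.std()
r = ez.T @ rz / n_lines
hits = np.flatnonzero(np.abs(r) > 0.5)  # candidate resistance-correlated genes
print(hits)
```

Genes passing such a correlation screen would then be cross-checked against eQTL that co-locate with the phenotypic QTL, as in the study.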

  9. Chemical purity using quantitative 1H-nuclear magnetic resonance: a hierarchical Bayesian approach for traceable calibrations

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Lippa, Katrice A.

    2016-10-01

    Chemical purity assessment using quantitative 1H-nuclear magnetic resonance spectroscopy is a method based on the ratios of the mass and signal intensity of the analyte species to those of chemical standards of known purity. As such, it is an example of a calculation using a known measurement equation with multiple inputs. Though multiple samples are often analyzed during purity evaluations in order to assess measurement repeatability, the uncertainty evaluation must also account for contributions from the inputs to the measurement equation. Furthermore, there may be other uncertainty components inherent in the experimental design, such as the independent implementation of multiple calibration standards. The uncertainty evaluation is therefore neither purely bottom-up (based on the measurement equation) nor purely top-down (based on the experimental design), but inherently contains elements of both. This hybrid form of uncertainty analysis is readily implemented with Bayesian statistical analysis. In this article we describe this type of analysis in detail and illustrate it using data from an evaluation of the chemical purity, and its uncertainty, of a folic acid material.
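    The measurement equation in question is the standard qHNMR purity relation, P_a = (I_a/I_s)·(N_s/N_a)·(M_a/M_s)·(m_s/m_a)·P_s. The Bayesian uncertainty treatment is beyond a short sketch, but the point estimate can be evaluated directly; all input values below are invented for illustration.

```python
def qhnmr_purity(I_a, I_s, N_a, N_s, M_a, M_s, m_a, m_s, P_s):
    """Mass-fraction purity from the standard qHNMR measurement equation:
    I = signal integral, N = number of contributing protons, M = molar mass,
    m = weighed mass; subscript a = analyte, s = internal standard."""
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s

# All input values below are invented for illustration.
P = qhnmr_purity(I_a=1.000, I_s=2.105, N_a=5, N_s=2,
                 M_a=122.12, M_s=116.07, m_a=5.0e-3, m_s=25.0e-3, P_s=0.9998)
print(P)
```

In a hierarchical Bayesian treatment, each input would carry a prior (and the standards their certified uncertainties), and the posterior on P would combine the bottom-up equation with the top-down between-sample variability.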

  10. Analysis of transport eco-efficiency scenarios to support sustainability assessment: a study on Dhaka City, Bangladesh.

    PubMed

    Iqbal, Asif; Allan, Andrew; Afroze, Shirina

    2017-08-01

    The study assessed the level of efficiency, in both emissions and service quality, that can be achieved for the transport system of Dhaka City, Bangladesh. The assessment technique quantified the extent of eco-efficiency achievable through system modifications arising from planning or strategy. The eco-efficiency analysis drew on detailed survey data on the Dhaka City transport system collected over 9 months in 2012-2013. Line source modelling (CALINE4) was used to estimate on-road emission concentrations. The eco-efficiency of the transport systems was assessed with the multi-criteria analysis (MCA) technique, which enabled valuation of the systems' qualitative and quantitative parameters. The analysis indicated that curbing on-road driving indiscipline alone could reduce emissions by about 47%; driving indiscipline and the number of private vehicles were the main stressors restricting eco-efficiency in Dhaka City. Detailed analysis of the transport system, together with potential transport system scenarios, offers policy makers a checklist for identifying actions that deliver greater service to residents with lower emissions, and thus a more sustainable system.
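    The general shape of a weighted-sum MCA scenario comparison can be sketched as follows; the study's actual criteria, weights, and scenario scores are not given in the abstract, so everything numeric here is hypothetical.

```python
# Generic weighted-sum multi-criteria analysis (MCA) for transport scenarios.
# Criteria, weights, and scores are hypothetical placeholders.

def mca_score(criteria, weights):
    """Weighted sum of criteria already normalised to a 0-1 scale (higher = better)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * criteria[k] for k in weights)

weights = {"service_quality": 0.4, "emissions": 0.35, "cost": 0.25}

# Emissions scores are pre-inverted so that higher means lower emissions.
baseline = {"service_quality": 0.45, "emissions": 0.30, "cost": 0.60}
disciplined_driving = {"service_quality": 0.55, "emissions": 0.65, "cost": 0.55}

for name, sc in [("baseline", baseline),
                 ("disciplined driving", disciplined_driving)]:
    print(name, round(mca_score(sc, weights), 3))
```

    A scenario that scores higher on the weighted sum is the more eco-efficient option under the chosen weighting; sensitivity of the ranking to the weights is usually checked before drawing policy conclusions.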

  11. Combined LC/MS-platform for analysis of all major stratum corneum lipids, and the profiling of skin substitutes.

    PubMed

    van Smeden, Jeroen; Boiten, Walter A; Hankemeier, Thomas; Rissmann, Robert; Bouwstra, Joke A; Vreeken, Rob J

    2014-01-01

    Ceramides (CERs), cholesterol, and free fatty acids (FFAs) are the main lipid classes in human stratum corneum (SC, outermost skin layer), but no studies report on the detailed analysis of these classes in a single platform. The primary aims of this study were to 1) develop an LC/MS method for (semi-)quantitative analysis of all main lipid classes present in human SC; and 2) use this method to study in detail the lipid profiles of human skin substitutes and compare them to human SC lipids. By applying two injections of 10μl, the developed method detects all major SC lipids using RPLC and negative ion mode APCI-MS for detection of FFAs, and NPLC using positive ion mode APCI-MS to analyze CERs and cholesterol. Validation showed this lipid platform to be robust, reproducible, sensitive, and fast. The method was successfully applied on ex vivo human SC, human SC obtained from tape strips and human skin substitutes (porcine SC and human skin equivalents). In conjunction with FFA profiles, clear differences in CER profiles were observed between these different SC sources. Human skin equivalents more closely mimic the lipid composition of human stratum corneum than porcine skin does, although noticeable differences are still present. These differences gave biologically relevant information on some of the enzymes that are probably involved in SC lipid processing. For future research, this provides an excellent method for (semi-)quantitative, 'high-throughput' profiling of SC lipids and can be used to advance the understanding of skin lipids and the biological processes involved. © 2013.

  12. Lost Opportunity: The High Quality, Reduced Military Force of the 1990’s: Is there a Role for the Nation’s Disadvantaged Youth?

    DTIC Science & Technology

    1990-12-01

    others based on detailed, quantitative "manpower engineering " techniques (White and Hosek, in Scowcroft, 1982, p. 51). 3. The Military Force and...complex technology does not involve a corresponding increase in more highly trained personnel. Trends in industry suggest that job titles are often...theories of motivation, two of which are relevant to this analysis and will be briefly described below. Expectancy theory, with roots in industrial

  13. A method for the assessment of specific energy distribution in a model tumor system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noska, M.A.

    1996-12-31

    Due to the short range of alpha particles in tissue, the calculation of dose from internally deposited alpha emitters requires a detailed analysis of the microscopic distribution of the radionuclide in order to determine the spatial distribution of energy emission events and, from this, the spatial distribution of dose. In the present study, the authors used quantitative autoradiography (QAR) to assess the microdistribution of a radiolabeled monoclonal antibody (MAb) fragment in human glioma xenografts in mice.

  14. Assessment of the Technical Training Received by Source Selection Participants in Air Force Systems Command.

    DTIC Science & Technology

    1986-09-01

    60 48.4 Systems 200/400 15 12.1 Contract Administration (PPM 152) 13 10.5 Logistics Management (Log 224) 2 1.6 Government Contract Law (PPM 302) 20...detail. 1. Systems 200/400 2. Contract Administration (PPM 152) 3. Logistics Management (LOG 224) 4. Government Contract Law (PPM 302) 5. Technical...152 Contract Administration o Log 224 Logistics Management o PPM 302 Government Contract Law o QMT 345 Quantitative Technical, Cost, and Price Analysis

  15. Absolute Quantification of Middle- to High-Abundant Plasma Proteins via Targeted Proteomics.

    PubMed

    Dittrich, Julia; Ceglarek, Uta

    2017-01-01

    The increasing number of peptide and protein biomarker candidates requires expeditious and reliable quantification strategies. The utilization of liquid chromatography coupled to quadrupole tandem mass spectrometry (LC-MS/MS) for the absolute quantitation of plasma proteins and peptides facilitates the multiplexed verification of tens to hundreds of biomarkers from smallest sample quantities. Targeted proteomics assays derived from bottom-up proteomics principles rely on the identification and analysis of proteotypic peptides formed in an enzymatic digestion of the target protein. This protocol proposes a procedure for establishing a targeted absolute quantitation method for middle- to high-abundant plasma proteins without depletion or enrichment steps. Essential topics such as proteotypic peptide identification and LC-MS/MS method development, as well as sample preparation and calibration strategies, are described in detail.

  16. Quantitative Morphology Measures in Galaxies: Ground-Truthing from Simulations

    NASA Astrophysics Data System (ADS)

    Narayanan, Desika T.; Abruzzo, Matthew W.; Dave, Romeel; Thompson, Robert

    2017-01-01

    The process of galaxy assembly is a prevalent question in astronomy; there are a variety of potentially important effects, including baryonic accretion from the intergalactic medium, as well as major galaxy mergers. Recent years have ushered in the development of quantitative measures of morphology such as the Gini coefficient (G), the second-order moment of the brightest quintile of a galaxy’s light (M20), and the concentration (C), asymmetry (A), and clumpiness (S) of galaxies. To investigate the efficacy of these observational methods at identifying major mergers, we have run a series of very high resolution cosmological zoom simulations, and coupled these with 3D Monte Carlo dust radiative transfer. Our methodology is powerful in that it allows us to “observe” the simulation as an observer would, while maintaining detailed knowledge of the true merger history of the galaxy. In this presentation, we will present our main results from our analysis of these quantitative morphology measures, with a particular focus on high-redshift (z>2) systems.
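    Of the quantitative morphology measures named above, the Gini coefficient is the simplest to state: it quantifies how unequally a galaxy's light is distributed over its pixels (0 when every pixel is equally bright, approaching 1 when one pixel holds all the light), computed over sorted absolute flux values as is common in the galaxy-morphology literature. A toy sketch:

```python
# Gini coefficient of a galaxy's pixel fluxes, computed over sorted
# absolute values as in the galaxy-morphology literature. Toy data only.

def gini(fluxes):
    """Gini coefficient of a flat list of pixel flux values."""
    x = sorted(abs(f) for f in fluxes)
    n = len(x)
    mean = sum(x) / n
    return sum((2 * i - n - 1) * xi
               for i, xi in enumerate(x, start=1)) / (mean * n * (n - 1))

print(gini([1, 1, 1, 1]))   # perfectly uniform light -> 0.0
print(gini([0, 0, 0, 10]))  # all light in one pixel -> 1.0
```

    Merger candidates are typically flagged by where a galaxy falls in the Gini-M20 plane rather than by Gini alone, which is exactly the kind of classification the simulated "observations" described above can ground-truth against the known merger history.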

  17. Numerical analysis of quantitative measurement of hydroxyl radical concentration using laser-induced fluorescence in flame

    NASA Astrophysics Data System (ADS)

    Shuang, Chen; Tie, Su; Yao-Bang, Zheng; Li, Chen; Ting-Xu, Liu; Ren-Bing, Li; Fu-Rong, Yang

    2016-06-01

    The aim of the present work is to quantitatively measure the hydroxyl radical concentration by using LIF (laser-induced fluorescence) in flame. The detailed physical models of spectral absorption lineshape broadening, collisional transition and quenching at elevated pressure are built. The fine energy-level structure of the OH molecule is illustrated to explain the laser-induced fluorescence emission process alongside the non-radiative processes, which include collisional quenching, rotational energy transfer (RET), and vibrational energy transfer (VET). Based on these, some numerical results are achieved by simulations in order to evaluate the fluorescence yield at elevated pressure. These results are useful for understanding the real physical processes in the OH-LIF technique and finding a way to calibrate the signal for quantitative measurement of OH concentration in a practical combustor. Project supported by the National Natural Science Foundation of China (Grant No. 11272338) and the Fund from the Science and Technology on Scramjet Key Laboratory, China (Grant No. STSKFKT2013004).
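    The pressure dependence that makes this calibration hard can be sketched with the usual fluorescence quantum yield expression, yield = A / (A + Q), where A is the spontaneous emission rate and Q the collisional quenching rate, which grows roughly linearly with pressure. The rate constants below are placeholders, not OH-specific data from this work.

```python
# Fluorescence quantum yield in LIF: yield = A / (A + Q), with quenching
# rate Q assumed proportional to pressure. Rate constants are placeholders.

A21 = 1.4e6  # spontaneous emission rate, 1/s (placeholder value)

def quench_rate(pressure_atm, q_1atm=5.0e8):
    """Collisional quenching rate, assumed linear in pressure (placeholder)."""
    return q_1atm * pressure_atm

def fluorescence_yield(pressure_atm):
    Q = quench_rate(pressure_atm)
    return A21 / (A21 + Q)

for p in (1, 5, 10):
    print(p, "atm:", f"{fluorescence_yield(p):.4f}")
```

    Because the yield falls steeply with pressure, an uncalibrated LIF signal at elevated pressure badly underestimates OH concentration, which is why the models of lineshape broadening and quenching above are needed.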

  18. A thorough experimental study of CH/π interactions in water: quantitative structure-stability relationships for carbohydrate/aromatic complexes.

    PubMed

    Jiménez-Moreno, Ester; Jiménez-Osés, Gonzalo; Gómez, Ana M; Santana, Andrés G; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesus; Asensio, Juan Luis

    2015-11-13

    CH/π interactions play a key role in a large variety of molecular recognition processes of biological relevance. However, their origins and structural determinants in water remain poorly understood. In order to improve our comprehension of these important interaction modes, we have performed a quantitative experimental analysis of a large data set comprising 117 chemically diverse carbohydrate/aromatic stacking complexes, prepared through a dynamic combinatorial approach recently developed by our group. The obtained free energies provide a detailed picture of the structure-stability relationships that govern the association process, opening the door to the rational design of improved carbohydrate-based ligands or carbohydrate receptors. Moreover, this experimental data set, supported by quantum mechanical calculations, has contributed to the understanding of the main driving forces that promote complex formation, underlining the key role played by coulombic and solvophobic forces on the stabilization of these complexes. This represents the most quantitative and extensive experimental study reported so far for CH/π complexes in water.

  19. Measuring Black Men’s Police-Based Discrimination Experiences: Development and Validation of the Police and Law Enforcement (PLE) Scale

    PubMed Central

    English, Devin; Bowleg, Lisa; del Río-González, Ana Maria; Tschann, Jeanne M.; Agans, Robert; Malebranche, David J

    2017-01-01

    Objectives Although social science research has examined police and law enforcement-perpetrated discrimination against Black men using policing statistics and implicit bias studies, there is little quantitative evidence detailing this phenomenon from the perspective of Black men. Consequently, there is a dearth of research detailing how Black men’s perspectives on police and law enforcement-related stress predict negative physiological and psychological health outcomes. This study addresses these gaps with the qualitative development and quantitative test of the Police and Law Enforcement (PLE) scale. Methods In Study 1, we employed thematic analysis on transcripts of individual qualitative interviews with 90 Black men to assess key themes and concepts and develop quantitative items. In Study 2, we used 2 focus groups comprised of 5 Black men each (n=10), intensive cognitive interviewing with a separate sample of Black men (n=15), and piloting with another sample of Black men (n=13) to assess the ecological validity of the quantitative items. For Study 3, we analyzed data from a sample of 633 Black men between the ages of 18 and 65 to test the factor structure of the PLE, as well as its concurrent validity and convergent/discriminant validity. Results Qualitative analyses and confirmatory factor analyses suggested that a 5-item, 1-factor measure appropriately represented respondents’ experiences of police/law enforcement discrimination. As hypothesized, the PLE was positively associated with measures of racial discrimination and depressive symptoms. Conclusions Preliminary evidence suggests that the PLE is a reliable and valid measure of Black men’s experiences of discrimination with police/law enforcement. PMID:28080104

  20. Measuring Black men's police-based discrimination experiences: Development and validation of the Police and Law Enforcement (PLE) Scale.

    PubMed

    English, Devin; Bowleg, Lisa; Del Río-González, Ana Maria; Tschann, Jeanne M; Agans, Robert P; Malebranche, David J

    2017-04-01

    Although social science research has examined police and law enforcement-perpetrated discrimination against Black men using policing statistics and implicit bias studies, there is little quantitative evidence detailing this phenomenon from the perspective of Black men. Consequently, there is a dearth of research detailing how Black men's perspectives on police and law enforcement-related stress predict negative physiological and psychological health outcomes. This study addresses these gaps with the qualitative development and quantitative test of the Police and Law Enforcement (PLE) Scale. In Study 1, we used thematic analysis on transcripts of individual qualitative interviews with 90 Black men to assess key themes and concepts and develop quantitative items. In Study 2, we used 2 focus groups comprised of 5 Black men each (n = 10), intensive cognitive interviewing with a separate sample of Black men (n = 15), and piloting with another sample of Black men (n = 13) to assess the ecological validity of the quantitative items. For Study 3, we analyzed data from a sample of 633 Black men between the ages of 18 and 65 to test the factor structure of the PLE, as well as its concurrent validity and convergent/discriminant validity. Qualitative analyses and confirmatory factor analyses suggested that a 5-item, 1-factor measure appropriately represented respondents' experiences of police/law enforcement discrimination. As hypothesized, the PLE was positively associated with measures of racial discrimination and depressive symptoms. Preliminary evidence suggests that the PLE is a reliable and valid measure of Black men's experiences of discrimination with police/law enforcement. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Applications of mass spectrometry for quantitative protein analysis in formalin-fixed paraffin-embedded tissues

    PubMed Central

    Steiner, Carine; Ducret, Axel; Tille, Jean-Christophe; Thomas, Marlene; McKee, Thomas A; Rubbia-Brandt, Laura A; Scherl, Alexander; Lescuyer, Pierre; Cutler, Paul

    2014-01-01

    Proteomic analysis of tissues has advanced in recent years as instruments and methodologies have evolved. The ability to retrieve peptides from formalin-fixed paraffin-embedded tissues followed by shotgun or targeted proteomic analysis is offering new opportunities in biomedical research. In particular, access to large collections of clinically annotated samples should enable the detailed analysis of pathologically relevant tissues in a manner previously considered unfeasible. In this paper, we review the current status of proteomic analysis of formalin-fixed paraffin-embedded tissues with a particular focus on targeted approaches and the potential for this technique to be used in clinical research and clinical diagnosis. We also discuss the limitations and perspectives of the technique, particularly with regard to application in clinical diagnosis and drug discovery. PMID:24339433

  2. Quantitative genetic analysis of injury liability in infants and toddlers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.
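    The logic of reading dominance out of twin correlations can be sketched with the classical Falconer decomposition; this is a simplified sketch, not the paper's threshold-liability model, and the correlations below are hypothetical. A DZ correlation well under half the MZ correlation (yielding a negative shared-environment estimate) is the classic signature of non-additive genetic effects.

```python
# Falconer-style variance decomposition from MZ/DZ twin correlations.
# A sketch with hypothetical correlations, not the study's fitted model.

def falconer(r_mz, r_dz):
    """Return (additive genetic, shared environment, non-shared environment)."""
    a2 = 2 * (r_mz - r_dz)   # additive genetic variance (A)
    c2 = 2 * r_dz - r_mz     # shared environment (C); negative hints dominance
    e2 = 1 - r_mz            # non-shared environment (E)
    return a2, c2, e2

a2, c2, e2 = falconer(r_mz=0.6, r_dz=0.1)  # hypothetical twin correlations
print(round(a2, 2), round(c2, 2), round(e2, 2))
```

    When c2 comes out negative, as here, the additive-plus-shared-environment model is misspecified and a model with dominance (an ADE model) is fitted instead, which is the situation the abstract describes.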

  3. Quantitative analysis of autophagic flux by confocal pH-imaging of autophagic intermediates

    PubMed Central

    Maulucci, Giuseppe; Chiarpotto, Michela; Papi, Massimiliano; Samengo, Daniela; Pani, Giovambattista; De Spirito, Marco

    2015-01-01

    Although numerous techniques have been developed to monitor autophagy and to probe its cellular functions, these methods cannot evaluate in sufficient detail the autophagy process, and suffer limitations from complex experimental setups and/or systematic errors. Here we developed a method to image, contextually, the number and pH of autophagic intermediates by using the probe mRFP-GFP-LC3B as a ratiometric pH sensor. This information is expressed functionally by AIPD, the pH distribution of the number of autophagic intermediates per cell. AIPD analysis reveals how intermediates are characterized by a continuous pH distribution, in the range 4.5–6.5, and therefore can be described by a more complex set of states rather than the usual biphasic one (autophagosomes and autolysosomes). AIPD shape and amplitude are sensitive to alterations in the autophagy pathway induced by drugs or environmental states, and allow a quantitative estimation of autophagic flux by retrieving the concentrations of autophagic intermediates. PMID:26506895

  4. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A Hazardous and Operability study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  5. Recent developments in qualitative and quantitative analysis of phytochemical constituents and their metabolites using liquid chromatography-mass spectrometry.

    PubMed

    Wu, Haifeng; Guo, Jian; Chen, Shilin; Liu, Xin; Zhou, Yan; Zhang, Xiaopo; Xu, Xudong

    2013-01-01

    Over the past few years, the applications of liquid chromatography coupled with mass spectrometry (LC-MS) in natural product analysis have been dramatically growing because of the increasingly improved separation and detection capabilities of LC-MS instruments. In particular, novel high-resolution hybrid instruments linked to ultra-high-performance LC and the hyphenations of LC-MS with other separation or analytical techniques greatly aid unequivocal identification and highly sensitive quantification of natural products at trace concentrations in complex matrices. With the aim of providing an up-to-date overview of LC-MS applications on the analysis of plant-derived compounds, papers published within the latest years (2007-2012) involving qualitative and quantitative analysis of phytochemical constituents and their metabolites are summarized in the present review. After briefly describing the general characteristics of natural products analysis, the most remarkable features of LC-MS and sample preparation techniques, the present paper mainly focuses on screening and characterization of phenols (including flavonoids), alkaloids, terpenoids, steroids, coumarins, lignans, and miscellaneous compounds in respective herbs and biological samples, as well as traditional Chinese medicine (TCM) prescriptions using tandem mass spectrometer. Chemical fingerprinting analysis using LC-MS is also described. Meanwhile, instrumental peculiarities and methodological details are accentuated. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Evoked and event-related potentials in disorders of consciousness: A quantitative review.

    PubMed

    Kotchoubey, Boris

    2017-09-01

    Sixty-one publications about evoked and event-related potentials (EP and ERP, respectively) in patients with severe Disorders of Consciousness (DoC) were found and analyzed from a quantitative point of view. Most studies are strongly underpowered, resulting in very broad confidence intervals (CI). Results of such studies cannot be correctly interpreted because, for example, a CI wider than 1 (in terms of Cohen's d) indicates that the real effect may be very strong, very weak, or even opposite to the reported effect. Furthermore, strong negative correlations were obtained between sample size and effect size, indicating a possible publication bias. These correlations characterized not only the total data set, but also each thematically selected subset. Minimal criteria for a strong EP/ERP study in DoC are proposed: at least 25 patients in each patient group; as reliable a diagnosis as possible; the complete report of all methodological details and all details of results (including negative results); and the use of appropriate methods of data analysis. Only three of the detected 60 studies (5%) satisfy these criteria. The limitations of the current approach are also discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
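    The power problem described above is easy to make concrete: under the usual normal-theory approximation, the 95% CI of Cohen's d widens rapidly as group size shrinks. This sketch is a generic illustration, not the review's exact meta-analytic method.

```python
# Approximate 95% CI width of Cohen's d as a function of group size
# (normal-theory approximation for two independent groups).
import math

def d_ci_width(d, n1, n2):
    """Full width of the approximate 95% CI of Cohen's d."""
    se = math.sqrt((n1 + n2) / (n1 * n2) + d * d / (2 * (n1 + n2)))
    return 2 * 1.96 * se

for n in (8, 25, 100):  # patients per group
    print(n, "per group:", round(d_ci_width(0.5, n, n), 2))
```

    With 8 patients per group the CI around a medium effect (d = 0.5) spans roughly two d units — wider than 1, the threshold the review flags as uninterpretable — whereas the proposed minimum of 25 per group roughly halves it.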

  7. Luminous Phenomena in the Atmosphere. A New Frontier of New Physics?

    NASA Astrophysics Data System (ADS)

    Teodorani, M.

    1999-03-01

    A main geographic list of anomalous atmospheric light phenomena which are recurring in several areas of the world is presented. In particular, the Norwegian light-phenomenon occurring in Hessdalen, a prototypical event of this class, is described in great detail. Results obtained in 1984 by the Norwegian scientific organization named 'Project Hessdalen' are discussed. Moreover, the present status and future projects of this organization are presented. It is also shown how the philosophy of research of Project Hessdalen can be adapted to the quantitative investigation of similar light phenomena in other parts of the world. Subsequently, the numerical analysis carried out by the author on the Project Hessdalen 1984 data is shown in detail. After illustrating the several physical theories which have been proposed so far to explain the light phenomenon, a strong emphasis is placed on the quantitative definition of instrumental prerequisites and measurable physical parameters. A strategy aimed at defining the investigation methodology and instrumented monitoring in Italian areas of recurrence of the light phenomenon is presented. An introduction is also given on the documented effects of interaction of the electromagnetic field produced by the light phenomenon with the brain electrical activity of people, by suggesting possible biophysical causes.

  8. The Development, Description and Appraisal of an Emergent Multimethod Research Design to Study Workforce Changes in Integrated Care Interventions

    PubMed Central

    Luijkx, Katrien; Calciolari, Stefano; González-Ortiz, Laura G.

    2017-01-01

    Introduction: In this paper, we provide a detailed and explicit description of the processes and decisions underlying and shaping the emergent multimethod research design of our study on workforce changes in integrated chronic care. Theory and methods: The study was originally planned as mixed method research consisting of a preliminary literature review and quantitative check of these findings via a Delphi panel. However, when the findings of the literature review were not appropriate for quantitative confirmation, we chose to continue our qualitative exploration of the topic via qualitative questionnaires and secondary analysis of two best practice case reports. Results: The resulting research design is schematically described as an emergent and interactive multimethod design with multiphase combination timing. In doing so, we provide other researchers with a set of theory- and experience-based options to develop their own multimethod research and provide an example for more detailed and structured reporting of emergent designs. Conclusion and discussion: We argue that the terminology developed for the description of mixed methods designs should also be used for multimethod designs such as the one presented here. PMID:29042843

  9. An importance-performance analysis of hospital information system attributes: A nurses' perspective.

    PubMed

    Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J

    2016-02-01

    Health workers have numerous concerns about hospital information system (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and therefore should be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
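    The core mechanic of importance-performance analysis is a quadrant split: attributes whose importance is above average but whose performance is below average become intervention priorities. A minimal sketch with hypothetical attribute scores (not the study's data):

```python
# Importance-performance analysis (IPA) quadrant classification.
# Attribute names echo the study; the scores are hypothetical placeholders.
from statistics import mean

attrs = {                     # (importance, performance), e.g. on a 1-7 scale
    "responsiveness":    (6.5, 3.1),
    "ease of learning":  (6.0, 3.4),
    "records complete":  (6.8, 3.0),
    "screen aesthetics": (3.2, 5.5),
    "export formats":    (3.0, 2.8),
}

imp_mean = mean(i for i, _ in attrs.values())
perf_mean = mean(p for _, p in attrs.values())

# High importance + low performance = "concentrate here" quadrant.
priorities = sorted(name for name, (i, p) in attrs.items()
                    if i > imp_mean and p < perf_mean)
print(priorities)
```

    Attributes with low importance and low performance ("export formats" here) fall in the low-priority quadrant even though they underperform, which is what distinguishes IPA from ranking by performance alone.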

  10. Digital storage and analysis of color Doppler echocardiograms

    NASA Technical Reports Server (NTRS)

    Chandra, S.; Thomas, J. D.

    1997-01-01

    Color Doppler flow mapping has played an important role in clinical echocardiography. Most of the clinical work, however, has been primarily qualitative. Although qualitative information is very valuable, there is considerable quantitative information stored within the velocity map that has not been extensively exploited so far. Recently, many researchers have shown interest in using the encoded velocities to address clinical problems such as quantification of valvular regurgitation, calculation of cardiac output, and characterization of ventricular filling. In this article, we review some basic physics and engineering aspects of color Doppler echocardiography, as well as drawbacks of trying to retrieve velocities from video tape data. Digital storage, which plays a critical role in performing quantitative analysis, is discussed in some detail with special attention to velocity encoding in DICOM 3.0 (medical image storage standard) and the use of digital compression. Lossy compression can considerably reduce file size with minimal loss of information (mostly redundant); this is critical for digital storage because of the enormous amount of data generated (a 10-minute study could require 18 Gigabytes of storage capacity). Lossy JPEG compression and its impact on quantitative analysis has been studied, showing that images compressed at 27:1 using the JPEG algorithm compares favorably with directly digitized video images, the current gold standard. Some potential applications of these velocities in analyzing the proximal convergence zones, mitral inflow, and some areas of future development are also discussed in the article.
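    The storage figure quoted above is easy to sanity-check with back-of-envelope arithmetic; the video parameters below (640x480, 24-bit, 30 frames/s) are assumptions for illustration, not taken from the article.

```python
# Back-of-envelope check of the "10-minute study -> ~18 GB" storage figure,
# assuming uncompressed 640x480 24-bit video at 30 frames/s.
bytes_total = 640 * 480 * 3 * 30 * 60 * 10  # w * h * bytes/px * fps * s

print(round(bytes_total / 1e9, 1), "GB uncompressed")        # same order as 18 GB
print(round(bytes_total / 27 / 1e9, 2), "GB at 27:1 JPEG")   # why 27:1 matters
```

    Uncompressed storage lands in the mid-teens of gigabytes, the same order as the 18 GB figure quoted, and 27:1 compression brings a study under a gigabyte, which is what makes digital archiving of quantitative color Doppler practical.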

  11. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  13. A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images

    PubMed Central

    Robb, R.A.; Heffeman, P.B.; Camp, J.J.; Hanson, D.P.

    1986-01-01

    The capability to extract objective and quantitatively accurate information from 3-D radiographic biomedical images has not kept pace with the capabilities to produce the images themselves. This is rather an ironic paradox, since on the one hand the new 3-D and 4-D imaging capabilities promise significant potential for providing greater specificity and sensitivity (i.e., precise objective discrimination and accurate quantitative measurement of body tissue characteristics and function) in clinical diagnostic and basic investigative imaging procedures than ever possible before, but on the other hand, the momentous advances in computer and associated electronic imaging technology which have made these 3-D imaging capabilities possible have not been concomitantly developed for full exploitation of these capabilities. Therefore, we have developed a powerful new microcomputer-based system which permits detailed investigations and evaluation of 3-D and 4-D (dynamic 3-D) biomedical images. The system comprises a special workstation to which all the information in a large 3-D image data base is accessible for rapid display, manipulation, and measurement. The system provides important capabilities for simultaneously representing and analyzing both structural and functional data and their relationships in various organs of the body. This paper provides a detailed description of this system, as well as some of the rationale, background, theoretical concepts, and practical considerations related to system implementation.

  14. Analysis of Particulate Contamination During Launch of MMS Mission

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos; Barrie, Alexander; Hughes, David; Errigo, Therese

    2010-01-01

    NASA's Magnetospheric MultiScale (MMS) is an unmanned constellation of four identical spacecraft designed to investigate magnetic reconnection by obtaining detailed measurements of plasma properties in Earth's magnetopause and magnetotail. Each of the four identical satellites carries a suite of instruments which characterize the ambient ion and electron energy spectrum and composition. Some of these instruments utilize microchannel plates and are sensitive to particulate contamination. In this paper, we analyze the transport of particulates during pre-launch, launch and ascent events, and use the analysis to obtain quantitative predictions of contamination impact on the instruments. Viewfactor calculation is performed by considering the gravitational and aerodynamic forces acting on the particles.

  15. PET kinetic analysis --pitfalls and a solution for the Logan plot.

    PubMed

    Kimura, Yuichi; Naganawa, Mika; Shidahara, Miho; Ikoma, Yoko; Watabe, Hiroshi

    2007-01-01

    The Logan plot is a widely used algorithm for the quantitative analysis of neuroreceptors using PET because it is easy to use and simple to implement. The Logan plot is also suitable for receptor imaging because its algorithm is fast. However, use of the Logan plot and interpretation of the resulting receptor images require caution, because noise in PET data causes bias in the Logan plot estimates. In this paper, we describe the basic concept of the Logan plot in detail and introduce three algorithms for the Logan plot. By comparing these algorithms, we demonstrate the pitfalls of the Logan plot and discuss a solution.
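The graphical estimate at the heart of the Logan plot is easy to sketch. The following is a minimal illustration, not the authors' code: a one-tissue compartment model, for which the plot of the integrated tissue curve over the instantaneous tissue curve versus the integrated plasma input over the tissue curve is exactly linear, with slope equal to the distribution volume V_T = K1/k2. The input function and rate constants are invented for illustration.

```python
import numpy as np

# Minimal Logan graphical analysis sketch for a one-tissue compartment
# model (1TCM), where the plot is exactly linear and the fitted slope
# recovers V_T = K1 / k2. All parameters are illustrative.
dt = 0.05
t = np.arange(0.0, 120.0 + dt, dt)            # minutes
Cp = t * np.exp(-t / 20.0)                    # hypothetical plasma input
K1, k2 = 0.1, 0.05                            # 1TCM rate constants -> V_T = 2.0

# Tissue curve: C_T(t) = K1 * (exp(-k2 t) convolved with Cp)
kernel = np.exp(-k2 * t)
C_T = K1 * np.convolve(Cp, kernel)[: t.size] * dt

# Logan transform (skip t = 0 to avoid division by zero)
int_Cp = np.cumsum((Cp[1:] + Cp[:-1]) / 2.0) * dt   # cumulative trapezoid
int_CT = np.cumsum((C_T[1:] + C_T[:-1]) / 2.0) * dt
late = t[1:] > 30.0                            # fit only the late, linear portion
x = int_Cp[late] / C_T[1:][late]
y = int_CT[late] / C_T[1:][late]

slope, intercept = np.polyfit(x, y, 1)         # slope estimates V_T (close to 2.0)
```

With noise-free data the regression recovers V_T almost exactly; the bias the authors discuss appears once noise in C_T(t) enters the denominator of both regression variables.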

  16. Strain-induced macroscopic magnetic anisotropy from smectic liquid-crystalline elastomer-maghemite nanoparticle hybrid nanocomposites.

    PubMed

    Haberl, Johannes M; Sánchez-Ferrer, Antoni; Mihut, Adriana M; Dietsch, Hervé; Hirt, Ann M; Mezzenga, Raffaele

    2013-06-21

    We combine tensile strength analysis and X-ray scattering experiments to establish a detailed understanding of the microstructural coupling between liquid-crystalline elastomer (LCE) networks and embedded magnetic core-shell ellipsoidal nanoparticles (NPs). We study the structural and magnetic re-organization at different deformations and NP loadings, and the associated shape and magnetic memory features. In the quantitative analysis of a stretching process, the effect of the incorporated NPs on the smectic LCE is found to be prominent during the reorientation of the smectic domains and the softening of the nanocomposite. Under deformation, the soft response of the nanocomposite material allows the organization of the nanoparticles to yield a permanent macroscopically anisotropic magnetic material. Independent of the particle loading, the shape-memory properties and the smectic phase of the LCEs are preserved. Detailed studies on the magnetic properties demonstrate that the collective ensemble of individual particles is responsible for the macroscopic magnetic features of the nanocomposite.

  17. Mergers of Non-spinning Black-hole Binaries: Gravitational Radiation Characteristics

    NASA Technical Reports Server (NTRS)

    Baker, John G.; Boggs, William D.; Centrella, Joan; Kelly, Bernard J.; McWilliams, Sean T.; vanMeter, James R.

    2008-01-01

    We present a detailed descriptive analysis of the gravitational radiation from black-hole binary mergers of non-spinning black holes, based on numerical simulations of systems varying from equal-mass to a 6:1 mass ratio. Our primary goal is to present relatively complete information about the waveforms, including all the leading multipolar components, to interested researchers. In our analysis, we pursue the simplest physical description of the dominant features in the radiation, providing an interpretation of the waveforms in terms of an implicit rotating source. This interpretation applies uniformly to the full wavetrain, from inspiral through ringdown. We emphasize strong relationships among the l = m modes that persist through the full wavetrain. Exploring the structure of the waveforms in more detail, we conduct detailed analytic fitting of the late-time frequency evolution, identifying a key quantitative feature shared by the l = m modes among all mass-ratios. We identify relationships, with a simple interpretation in terms of the implicit rotating source, among the evolution of frequency and amplitude, which hold for the late-time radiation. These detailed relationships provide sufficient information about the late-time radiation to yield a predictive model for the late-time waveforms, an alternative to the common practice of modeling by a sum of quasinormal mode overtones. We demonstrate an application of this in a new effective-one-body-based analytic waveform model.

  18. Mergers of nonspinning black-hole binaries: Gravitational radiation characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, John G.; Centrella, Joan; Kelly, Bernard J.

    2008-08-15

    We present a detailed descriptive analysis of the gravitational radiation from black-hole binary mergers of nonspinning black holes, based on numerical simulations of systems varying from equal mass to a 6:1 mass ratio. Our primary goal is to present relatively complete information about the waveforms, including all the leading multipolar components, to interested researchers. In our analysis, we pursue the simplest physical description of the dominant features in the radiation, providing an interpretation of the waveforms in terms of an implicit rotating source. This interpretation applies uniformly to the full wave train, from inspiral through ringdown. We emphasize strong relationships among the l=m modes that persist through the full wave train. Exploring the structure of the waveforms in more detail, we conduct detailed analytic fitting of the late-time frequency evolution, identifying a key quantitative feature shared by the l=m modes among all mass ratios. We identify relationships, with a simple interpretation in terms of the implicit rotating source, among the evolution of frequency and amplitude, which hold for the late-time radiation. These detailed relationships provide sufficient information about the late-time radiation to yield a predictive model for the late-time waveforms, an alternative to the common practice of modeling by a sum of quasinormal mode overtones. We demonstrate an application of this in a new effective-one-body-based analytic waveform model.

  19. Meta-analysis is not an exact science: Call for guidance on quantitative synthesis decisions.

    PubMed

    Haddaway, Neal R; Rytwinski, Trina

    2018-05-01

    Meta-analysis is becoming increasingly popular in the field of ecology and environmental management. It increases the effective power of analyses relative to single studies, and allows researchers to investigate effect modifiers and sources of heterogeneity that could not be easily examined within single studies. Many systematic reviewers will set out to conduct a meta-analysis as part of their synthesis, but meta-analysis requires a niche set of skills that are not widely held by the environmental research community. Each step in the process of carrying out a meta-analysis requires decisions that have both scientific and statistical implications. Reviewers are likely to be faced with a plethora of decisions over which effect size to choose, how to calculate variances, and how to build statistical models. Some of these decisions may be simple based on appropriateness of the options. At other times, reviewers must choose between equally valid approaches given the information available to them. This presents a significant problem when reviewers are attempting to conduct a reliable synthesis, such as a systematic review, where subjectivity is minimised and all decisions are documented and justified transparently. We propose three urgent, necessary developments within the evidence synthesis community. Firstly, we call on quantitative synthesis experts to improve guidance on how to prepare data for quantitative synthesis, providing explicit detail to support systematic reviewers. Secondly, we call on journal editors and evidence synthesis coordinating bodies (e.g. CEE) to ensure that quantitative synthesis methods are adequately reported in a transparent and repeatable manner in published systematic reviews. Finally, where faced with two or more broadly equally valid alternative methods or actions, reviewers should conduct multiple analyses, presenting all options, and discussing the implications of the different analytical approaches. 
We believe it is vital to tackle the possible subjectivity in quantitative synthesis described herein to ensure that the extensive efforts expended in producing systematic reviews and other evidence synthesis products are not wasted because of a lack of rigour or reliability in the final synthesis step. Copyright © 2018 Elsevier Ltd. All rights reserved.
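As a concrete instance of the effect-size and variance decisions the authors describe, here is the standard Hedges' g calculation from two group summaries, using one common variance approximation; the input numbers are invented.

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g and its sampling variance from two group summaries.

    Uses the common approximation Vg = J**2 * Vd; other equally valid
    variance formulas exist, which is exactly the kind of analyst choice
    the review above is concerned with.
    """
    df = n1 + n2 - 2
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    d = (m1 - m2) / s_pooled                  # Cohen's d
    J = 1.0 - 3.0 / (4.0 * df - 1.0)          # small-sample correction factor
    g = J * d
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2.0 * (n1 + n2))
    return g, J**2 * var_d

# made-up group summaries: mean, SD, n for treatment and control
g, vg = hedges_g(10.0, 2.0, 12, 8.0, 2.5, 15)
```

Here g comes out near 0.85 with variance near 0.15; a meta-analysis would then weight each study by the inverse of `vg`.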

  20. Preserving elemental content in adherent mammalian cells for analysis by synchrotron-based x-ray fluorescence microscopy

    DOE PAGES

    Jin, Qiaoling; Paunesku, Tatjana; Lai, Barry; ...

    2016-08-31

    Trace metals play important roles in biological function, and x-ray fluorescence microscopy (XFM) provides a way to quantitatively image their distribution within cells. The faithfulness of these measurements is dependent on proper sample preparation. Using mouse embryonic fibroblast NIH/3T3 cells as an example, we compare various approaches to the preparation of adherent mammalian cells for XFM imaging under ambient temperature. Direct side-by-side comparison shows that plunge-freezing-based cryoimmobilization provides more faithful preservation than conventional chemical fixation for most biologically important elements including P, S, Cl, K, Fe, Cu, Zn and possibly Ca in adherent mammalian cells. Although cells rinsed with fresh media had a great deal of extracellular background signal for Cl and Ca, this approach maintained cells at the best possible physiological status before rapid freezing and it does not interfere with XFM analysis of other elements. If chemical fixation has to be chosen, the combination of 3% paraformaldehyde and 1.5% glutaraldehyde preserves S, Fe, Cu and Zn better than either fixative alone. Lastly, when chemically fixed cells were subjected to a variety of dehydration processes, air drying was proved to be more suitable than other drying methods such as graded ethanol dehydration and freeze drying. This first detailed comparison for x-ray fluorescence microscopy shows how detailed quantitative conclusions can be affected by the choice of cell preparation method.

  21. Preserving elemental content in adherent mammalian cells for analysis by synchrotron-based x-ray fluorescence microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Qiaoling; Paunesku, Tatjana; Lai, Barry

    Trace metals play important roles in biological function, and x-ray fluorescence microscopy (XFM) provides a way to quantitatively image their distribution within cells. The faithfulness of these measurements is dependent on proper sample preparation. Using mouse embryonic fibroblast NIH/3T3 cells as an example, we compare various approaches to the preparation of adherent mammalian cells for XFM imaging under ambient temperature. Direct side-by-side comparison shows that plunge-freezing-based cryoimmobilization provides more faithful preservation than conventional chemical fixation for most biologically important elements including P, S, Cl, K, Fe, Cu, Zn and possibly Ca in adherent mammalian cells. Although cells rinsed with fresh media had a great deal of extracellular background signal for Cl and Ca, this approach maintained cells at the best possible physiological status before rapid freezing and it does not interfere with XFM analysis of other elements. If chemical fixation has to be chosen, the combination of 3% paraformaldehyde and 1.5% glutaraldehyde preserves S, Fe, Cu and Zn better than either fixative alone. Lastly, when chemically fixed cells were subjected to a variety of dehydration processes, air drying was proved to be more suitable than other drying methods such as graded ethanol dehydration and freeze drying. This first detailed comparison for x-ray fluorescence microscopy shows how detailed quantitative conclusions can be affected by the choice of cell preparation method.

  1. Kβ Mainline X-ray Emission Spectroscopy as an Experimental Probe of Metal–Ligand Covalency

    PubMed Central

    2015-01-01

    The mainline feature in metal Kβ X-ray emission spectroscopy (XES) has long been recognized as an experimental marker for the spin state of the metal center. However, even within a series of metal compounds with the same nominal oxidation and spin state, significant changes are observed that cannot be explained on the basis of overall spin. In this work, the origin of these effects is explored, both experimentally and theoretically, in order to develop the chemical information content of Kβ mainline XES. Ligand field expressions are derived that describe the behavior of Kβ mainlines for first row transition metals with any dn count, allowing for a detailed analysis of the factors governing mainline shape. Further, due to limitations associated with existing computational approaches, we have developed a new methodology for calculating Kβ mainlines using restricted active space configuration interaction (RAS–CI) calculations. This approach eliminates the need for empirical parameters and provides a powerful tool for investigating the effects that chemical environment exerts on the mainline spectra. On the basis of a detailed analysis of the intermediate and final states involved in these transitions, we confirm the known sensitivity of Kβ mainlines to metal spin state via the 3p–3d exchange coupling. Further, a quantitative relationship between the splitting of the Kβ mainline features and the metal–ligand covalency is established. Thus, this study furthers the quantitative electronic structural information that can be extracted from Kβ mainline spectroscopy. PMID:24914450

  2. A method for evaluating the murine pulmonary vasculature using micro-computed tomography.

    PubMed

    Phillips, Michael R; Moore, Scott M; Shah, Mansi; Lee, Clara; Lee, Yueh Z; Faber, James E; McLean, Sean E

    2017-01-01

    Significant mortality and morbidity are associated with alterations in the pulmonary vasculature. While techniques have been described for quantitative morphometry of whole-lung arterial trees in larger animals, no methods have been described in mice. We report a method for the quantitative assessment of murine pulmonary arterial vasculature using high-resolution computed tomography scanning. Mice were harvested at 2 weeks, 4 weeks, and 3 months of age. The pulmonary artery vascular tree was pressure perfused to maximal dilation with a radio-opaque casting material with viscosity and pressure set to prevent capillary transit and venous filling. The lungs were fixed and scanned on a specimen computed tomography scanner at 8-μm resolution, and the vessels were segmented. Vessels were grouped into categories based on lumen diameter and branch generation. Robust high-resolution segmentation was achieved, permitting detailed quantitation of pulmonary vascular morphometrics. As expected, postnatal lung development was associated with progressive increase in small-vessel number and arterial branching complexity. These methods for quantitative analysis of the pulmonary vasculature in postnatal and adult mice provide a useful tool for the evaluation of mouse models of disease that affect the pulmonary vasculature. Copyright © 2016 Elsevier Inc. All rights reserved.
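The grouping step described above, in which segmented vessels are binned by lumen diameter and branch generation, reduces to a small cross-tabulation. The sketch below uses hypothetical diameters, generations and bin edges, not the authors' data:

```python
import numpy as np

# Hypothetical per-vessel measurements from a segmented arterial tree:
# lumen diameter in micrometers and branch generation (0 = main trunk).
diameters = np.array([12.0, 18.0, 35.0, 60.0, 110.0, 9.0, 44.0, 75.0])
generation = np.array([5, 5, 4, 3, 1, 6, 3, 2])

# Assumed diameter classes (um); real studies would justify these edges.
edges = np.array([0.0, 20.0, 50.0, 100.0, np.inf])
diam_class = np.digitize(diameters, edges) - 1     # 0-based class index

# Count vessels per (generation, diameter-class) cell.
n_gen = generation.max() + 1
table = np.zeros((n_gen, edges.size - 1), dtype=int)
np.add.at(table, (generation, diam_class), 1)
```

Row sums of `table` give vessels per generation; column sums give the small-vessel counts whose postnatal increase the study reports.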

  3. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger.

    PubMed

    Moray, Neville; Groeger, John; Stanton, Neville

    2017-02-01

    This paper shows how to combine field observations, experimental data and mathematical modelling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example, we consider a major railway accident. In 1999, a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, 'black box' data, and accident and engineering reports to construct a case history of the accident. We show how to combine field data with mathematical modelling to estimate the probability that the driver observed and identified the state of the signals, and checked their status. Our methodology can explain the SPAD ('Signal Passed At Danger'), generate recommendations about signal design and placement, and provide quantitative guidance for the design of safer railway systems, speed limits and the location of signals. Practitioner Summary: Detailed ergonomic analysis of railway signals and rail infrastructure reveals problems of signal identification at this location. A record of driver eye movements measures attention, from which a quantitative model for signal placement and permitted speeds can be derived. The paper is an example of how to combine field data, basic research and mathematical modelling to solve ergonomic design problems.

  4. Multipurpose contrast enhancement on epiphyseal plates and ossification centers for bone age assessment

    PubMed Central

    2013-01-01

    Background The high variation in background luminance, low contrast and excessively enhanced contrast of hand bone radiographs often impede bone age assessment rating systems in evaluating the degree of epiphyseal plate and ossification center development. Global histogram equalization (GHE) has been the most frequently adopted contrast enhancement technique, but its performance is not satisfactory. A histogram equalization method that preserves brightness and detail while providing good contrast enhancement has been a goal of much recent research. Nevertheless, producing a histogram-equalized radiograph that is well balanced in terms of brightness preservation, detail preservation and contrast enhancement remains a daunting task. Method In this paper, we propose a novel framework of histogram equalization that takes several desirable properties into account, namely the Multipurpose Beta Optimized Bi-Histogram Equalization (MBOBHE). This method optimizes the two sub-histograms separately after segmenting the histogram at an optimized separating point determined by a regularization function constituted by three components. The result is then assessed by qualitative and quantitative analyses of the essential aspects of the histogram-equalized image, using a total of 160 hand radiographs acquired from an online hand bone database. Result From the qualitative analysis, we found that basic bi-histogram equalizations are not capable of displaying the small features in the image because the separating point is selected incorrectly, focusing on only a certain metric without considering contrast enhancement and detail preservation. From the quantitative analysis, we found that MBOBHE correlates well with human visual perception, and this improvement shortens the time taken by an inspector to assess bone age. 
Conclusions The proposed MBOBHE outperforms existing methods in the comprehensive performance of histogram equalization. All features pertinent to bone age assessment are more prominent than with other methods, which shortens the evaluation time required in manual bone age assessment using the TW method, while accuracy remains unaffected or is slightly better than with the unprocessed original image. The holistic properties of brightness preservation, detail preservation and contrast enhancement are taken into consideration simultaneously, so the visual effect is conducive to manual inspection. PMID:23565999
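The bi-histogram idea underlying MBOBHE can be sketched in a few lines. The following is a simplified stand-in, not the authors' method: it splits the histogram at a caller-supplied separating point (defaulting to the mean gray level, as in classic BBHE) and equalizes each sub-histogram within its own output range. MBOBHE itself chooses the separating point by optimizing a three-component regularization function, which this sketch does not implement.

```python
import numpy as np

def bi_histogram_equalize(img, sep=None):
    """Equalize the sub-histograms on either side of `sep` independently.

    Classic BBHE uses the mean gray level as the separating point and so
    roughly preserves mean brightness; here `sep` may also be supplied by
    the caller (e.g. by an external optimizer, as in MBOBHE).
    """
    img = np.asarray(img, dtype=np.uint8)
    if sep is None:
        sep = int(img.mean())
    out = np.empty_like(img)
    for lo, hi, mask in ((0, sep, img <= sep), (sep + 1, 255, img > sep)):
        vals = img[mask]
        if vals.size == 0:
            continue
        hist = np.bincount(vals, minlength=256).astype(float)
        cdf = np.cumsum(hist) / vals.size
        # map each sub-range onto itself, so pixels never cross `sep`
        lut = (lo + cdf * (hi - lo)).astype(np.uint8)
        out[mask] = lut[vals]
    return out

# tiny synthetic "radiograph": dark background with bright regions
img = np.array([[10, 10, 200], [200, 10, 50]], dtype=np.uint8)
out = bi_histogram_equalize(img)
```

Because each sub-histogram is confined to its own output range, dark and bright populations cannot swap sides, which is the brightness-preservation property the record discusses.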

  5. Image Analysis of DNA Fiber and Nucleus in Plants.

    PubMed

    Ohmido, Nobuko; Wako, Toshiyuki; Kato, Seiji; Fukui, Kiichi

    2016-01-01

    Advances in cytology have led to the application of a wide range of visualization methods in plant genome studies. Image analysis methods are indispensable tools where morphology, density, and color play important roles in biological systems. Visualization and image analysis are useful techniques for analyzing the detailed structure and function of extended DNA fibers (EDFs) and interphase nuclei. The EDF has the highest spatial resolving power for revealing genome structure and can be used for physical mapping, especially of closely located genes and tandemly repeated sequences. On the other hand, analyzing nuclear DNA and proteins reveals nuclear structure and functions. In this chapter, we describe the image analysis protocol for quantitatively analyzing two types of plant genome material: EDFs and interphase nuclei.

  6. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry

    PubMed Central

    Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D.; Baranov, Vladimir I.; Nitz, Mark; Winnik, Mitchell A.

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping. PMID:19122859

  7. Mastitomics, the integrated omics of bovine milk in an experimental model of Streptococcus uberis mastitis: 2. Label-free relative quantitative proteomics† †Electronic supplementary information (ESI) available. See DOI: 10.1039/c6mb00290k Click here for additional data file.

    PubMed Central

    Mudaliar, Manikhandan; Tassi, Riccardo; Thomas, Funmilola C.; McNeilly, Tom N.; Weidt, Stefan K.; McLaughlin, Mark; Wilson, David; Burchmore, Richard; Herzyk, Pawel; Eckersall, P. David

    2016-01-01

    Mastitis, inflammation of the mammary gland, is the most common and costly disease of dairy cattle in the western world. It is primarily caused by bacteria, with Streptococcus uberis as one of the most prevalent causative agents. To characterize the proteome during Streptococcus uberis mastitis, an experimentally induced model of intramammary infection was used. Milk whey samples obtained from 6 cows at 6 time points were processed using label-free relative quantitative proteomics. This proteomic analysis complements clinical, bacteriological and immunological studies as well as peptidomic and metabolomic analysis of the same challenge model. A total of 2552 non-redundant bovine peptides were identified, and from these, 570 bovine proteins were quantified. Hierarchical cluster analysis and principal component analysis showed clear clustering of results by stage of infection, with similarities between pre-infection and resolution stages (0 and 312 h post challenge), early infection stages (36 and 42 h post challenge) and late infection stages (57 and 81 h post challenge). Ingenuity pathway analysis identified upregulation of acute phase protein pathways over the course of infection, with dominance of different acute phase proteins at different time points based on differential expression analysis. Antimicrobial peptides, notably cathelicidins and peptidoglycan recognition protein, were upregulated at all time points post challenge and peaked at 57 h, which coincided with 10 000-fold decrease in average bacterial counts. The integration of clinical, bacteriological, immunological and quantitative proteomics and other-omic data provides a more detailed systems level view of the host response to mastitis than has been achieved previously. PMID:27412694

  8. Smartphone based visual and quantitative assays on upconversional paper sensor.

    PubMed

    Mei, Qingsong; Jing, Huarong; Li, You; Yisibashaer, Wuerzha; Chen, Jian; Nan Li, Bing; Zhang, Yong

    2016-01-15

    The integration of smartphones with paper sensors has recently gained increasing attention because it enables rapid, quantitative analysis. However, smartphone-based upconversional paper sensors have been restricted by the lack of effective methods to acquire luminescence signals on test paper. Herein, by virtue of 3D printing technology, we developed an auxiliary reusable device, which assembles a 980nm mini-laser, optical filter and mini-cavity together, for digitally imaging the luminescence variations on test paper and quantitatively analyzing the pesticide thiram by smartphone. In detail, copper ion-decorated NaYF4:Yb/Tm upconversion nanoparticles were fixed onto filter paper to form the test paper, and their blue luminescence was quenched after addition of thiram through a luminescence resonance energy transfer mechanism. These variations could be monitored by the smartphone camera, and the blue channel intensities of the captured images were then calculated to quantify the amount of thiram through a self-written Android program installed on the smartphone, offering a reliable and accurate detection limit of 0.1μM for the system. This work provides an initial demonstration of integrating upconversion nanosensors with smartphone digital imaging for point-of-care analysis on a paper-based platform. Copyright © 2015 Elsevier B.V. All rights reserved.
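The quantification chain, blue-channel extraction followed by a quenching calibration, can be sketched as follows. This is an illustrative reconstruction, not the authors' Android code; the Stern-Volmer form and the constant `ksv` are assumptions standing in for a calibration curve fitted from standards.

```python
import numpy as np

def blue_channel_intensity(rgb):
    """Mean blue-channel value of an H x W x 3 uint8 RGB image array."""
    return float(np.asarray(rgb)[:, :, 2].mean())

def thiram_from_quenching(I, I0, ksv=2.1):
    """Stern-Volmer-style estimate: I0/I = 1 + ksv * [thiram] (uM).

    `ksv` is a hypothetical calibration constant; a real assay would fit
    it from standards imaged under the same 980 nm excitation geometry.
    """
    return max(0.0, (I0 / I - 1.0) / ksv)

# synthetic blank and sample frames: uniform blue luminescence,
# the sample dimmer because thiram quenches the upconversion emission
blank = np.full((8, 8, 3), (0, 0, 200), dtype=np.uint8)
sample = np.full((8, 8, 3), (0, 0, 120), dtype=np.uint8)
conc = thiram_from_quenching(blue_channel_intensity(sample),
                             blue_channel_intensity(blank))
```

Averaging the blue channel over a fixed region of interest, as the mini-cavity geometry allows, keeps the reading insensitive to single-pixel noise.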

  9. Find Pairs: The Module for Protein Quantification of the PeakQuant Software Suite

    PubMed Central

    Eisenacher, Martin; Kohl, Michael; Wiese, Sebastian; Hebeler, Romano; Meyer, Helmut E.

    2012-01-01

    Abstract Accurate quantification of proteins is one of the major tasks in current proteomics research. To address this issue, a wide range of stable isotope labeling techniques have been developed, allowing one to quantitatively study thousands of proteins by means of mass spectrometry. In this article, the FindPairs module of the PeakQuant software suite is detailed. It facilitates the automatic determination of protein abundance ratios based on the automated analysis of stable isotope-coded mass spectrometric data. Furthermore, it implements statistical methods to determine outliers due to biological as well as technical variance of proteome data obtained in replicate experiments. This provides an important means to evaluate the significance of the obtained protein expression data. To demonstrate the high applicability of FindPairs, we focused on the quantitative analysis of proteome data acquired in 14N/15N labeling experiments. We further provide a comprehensive overview of the features of the FindPairs software, and compare these with existing quantification packages. The software presented here supports a wide range of proteomics applications, allowing one to quantitatively assess data derived from different stable isotope labeling approaches, such as 14N/15N labeling, SILAC, and iTRAQ. The software is publicly available at http://www.medizinisches-proteom-center.de/software and free for academic use. PMID:22909347
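A simplified sketch of the two steps named above, ratio determination and outlier screening, might look like this. The median/MAD rule and the intensities are illustrative assumptions, not the published FindPairs algorithm:

```python
import numpy as np

def protein_ratio(light, heavy, k=3.0):
    """Median heavy/light abundance ratio with MAD-based outlier flags.

    Per-peptide ratios are taken in log2 space; peptides whose log-ratio
    lies more than k robust standard deviations from the median are
    flagged. Intensities here are invented, not real 14N/15N data.
    """
    r = np.log2(np.asarray(heavy, float) / np.asarray(light, float))
    med = np.median(r)
    mad = np.median(np.abs(r - med))
    # 1.4826 * MAD approximates the standard deviation for normal data
    outlier = np.abs(r - med) > k * 1.4826 * max(mad, 1e-12)
    return 2.0 ** med, outlier

# four peptides of one protein; the last intensity pair is aberrant
ratio, flags = protein_ratio([100, 110, 90, 105], [205, 220, 175, 2000])
```

The robust median keeps the protein-level ratio near 2 even though one peptide pair is wildly discordant, which is the kind of replicate-variance screening the record describes.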

  10. Exploiting induced variation to dissect quantitative traits in barley.

    PubMed

    Druka, Arnis; Franckowiak, Jerome; Lundqvist, Udda; Bonar, Nicola; Alexander, Jill; Guzy-Wrobelska, Justyna; Ramsay, Luke; Druka, Ilze; Grant, Iain; Macaulay, Malcolm; Vendramin, Vera; Shahinnia, Fahimeh; Radovic, Slobodanka; Houston, Kelly; Harrap, David; Cardle, Linda; Marshall, David; Morgante, Michele; Stein, Nils; Waugh, Robbie

    2010-04-01

    The identification of genes underlying complex quantitative traits such as grain yield by means of conventional genetic analysis (positional cloning) requires the development of several large mapping populations. However, it is possible that phenotypically related, but more extreme, allelic variants generated by mutational studies could provide a means for more efficient cloning of QTLs (quantitative trait loci). In barley (Hordeum vulgare), with the development of high-throughput genome analysis tools, efficient genome-wide identification of genetic loci harbouring mutant alleles has recently become possible. Genotypic data from NILs (near-isogenic lines) that carry induced or natural variants of genes that control aspects of plant development can be compared with the location of QTLs to potentially identify candidate genes for development-related traits such as grain yield. As yield itself can be divided into a number of allometric component traits such as tillers per plant, kernels per spike and kernel size, mutant alleles that both affect these traits and are located within the confidence intervals for major yield QTLs may represent extreme variants of the underlying genes. In addition, the development of detailed comparative genomic models based on the alignment of a high-density barley gene map with the rice and sorghum physical maps, has enabled an informed prioritization of 'known function' genes as candidates for both QTLs and induced mutant genes.

  11. X-ray Topographic Methods and Application to Analysis of Electronic Materials

    NASA Technical Reports Server (NTRS)

    Mayo, W. E.; Liu, H. Y.; Chaudhuri, J.

    1984-01-01

    Three supplementary X-ray techniques new to semiconductor applications are discussed: the Computer Aided Rocking Curve Analyzer, the Divergent Beam Method, and a new method based on enhanced X-ray fluorescence. The first is used for quantitative mapping of an elastic or plastic strain field, while the other two measure only elastic strains. The divergent beam method measures the full strain tensor, while the microfluorescence method is useful for monitoring strain uniformity. These methods are discussed in detail and examples of their application are presented, including determination of the full strain ellipsoid in state-of-the-art liquid-phase-epitaxy-deposited III-V epitaxial films, mapping of plastic strain concentrations in tensile-deformed Si, and quantitative determination of damage in V3Si due to ion implantation.

  12. Camera, Hand Lens, and Microscope Probe (CHAMP): An Instrument Proposed for the 2009 MSL Rover Mission

    NASA Technical Reports Server (NTRS)

    Mungas, Greg S.; Beegle, Luther W.; Boynton, John E.; Lee, Pascal; Shidemantle, Ritch; Fisher, Ted

    2004-01-01

    The Camera, Hand Lens, and Microscope Probe (CHAMP) will allow examination of martian surface features and materials (terrain, rocks, soils, samples) on spatial scales ranging from kilometers to micrometers, thus enabling both microscopy and context imaging with high operational flexibility. CHAMP is designed to allow the detailed and quantitative investigation of a wide range of geologic features and processes on Mars, leading to a better quantitative understanding of the evolution of the martian surface environment through time. In particular, CHAMP will provide key data that will help understand the local region explored by Mars Surface Laboratory (MSL) as a potential habitat for life. CHAMP will also support other anticipated MSL investigations, in particular by helping identify and select the highest priority targets for sample collection and analysis by the MSL's analytical suite.

  13. Noninvasive Dry Eye Assessment Using High-Technology Ophthalmic Examination Devices.

    PubMed

    Yamaguchi, Masahiko; Sakane, Yuri; Kamao, Tomoyuki; Zheng, Xiaodong; Goto, Tomoko; Shiraishi, Atsushi; Ohashi, Yuichi

    2016-11-01

    Recently, the number of dry eye cases has dramatically increased. Thus, it is important that easy screening, exact diagnoses, and suitable treatments be available. We developed 3 original and noninvasive assessments for this disorder. First, a DR-1 dry eye monitor was used to determine the tear meniscus height quantitatively by capturing a tear meniscus digital image that was analyzed by Meniscus Processor software. The DR-1 meniscus height value significantly correlated with the fluorescein meniscus height (r = 0.06, Bland-Altman analysis). At a cutoff value of 0.22 mm, sensitivity of the dry eye diagnosis was 84.1% with 90.9% specificity. Second, the Tear Stability Analysis System was used to quantitatively measure tear film stability using a topographic modeling system corneal shape analysis device. Tear film stability was objectively and quantitatively evaluated every second during sustained eye openings. The Tear Stability Analysis System is currently installed in an RT-7000 autorefractometer and topographer to automate the diagnosis of dry eye. Third, the Ocular Surface Thermographer uses ophthalmic thermography for diagnosis. The decrease in ocular surface temperature in dry eyes was significantly greater than that in normal eyes (P < 0.001) at 10 seconds after eye opening. Decreased corneal temperature correlated significantly with the tear film breakup time (r = 0.572; P < 0.001). When changes in the ocular surface temperature of the cornea were used as indicators for dry eye, sensitivity was 0.83 and specificity was 0.80 after 10 seconds. This article describes the details and potential of these 3 noninvasive dry eye assessment systems.
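    The cutoff-based screening step described above (dry eye flagged when the DR-1 meniscus height falls below 0.22 mm, then sensitivity and specificity computed against the clinical diagnosis) can be sketched as follows; the heights below are synthetic illustrations, not the study's measurements.

```python
def classify(height_mm, cutoff=0.22):
    """Flag dry eye when the tear meniscus height falls below the cutoff."""
    return height_mm < cutoff

def sens_spec(cases, controls, cutoff=0.22):
    """Sensitivity over diagnosed dry eyes, specificity over normals."""
    tp = sum(classify(h, cutoff) for h in cases)         # dry eyes flagged
    tn = sum(not classify(h, cutoff) for h in controls)  # normals cleared
    return tp / len(cases), tn / len(controls)

cases = [0.12, 0.18, 0.20, 0.25, 0.15]     # hypothetical dry-eye heights (mm)
controls = [0.30, 0.28, 0.21, 0.35, 0.26]  # hypothetical normal heights (mm)
sens, spec = sens_spec(cases, controls)
print(sens, spec)  # 0.8 0.8
```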

  14. Detection, monitoring, and quantitative analysis of wildfires with the BIRD satellite

    NASA Astrophysics Data System (ADS)

    Oertel, Dieter A.; Briess, Klaus; Lorenz, Eckehard; Skrbek, Wolfgang; Zhukov, Boris

    2004-02-01

    Increasing concern about the environment and interest in avoiding losses have led to growing demands on spaceborne fire detection, monitoring and quantitative parameter estimation of wildfires. The global change research community intends to quantify the amount of gaseous and particulate matter emitted from vegetation fires, peat fires and coal seam fires. The DLR Institute of Space Sensor Technology and Planetary Exploration (Berlin-Adlershof) developed a small satellite called BIRD (Bi-spectral Infrared Detection) which carries a sensor package specially designed for fire detection. BIRD was launched as a piggy-back satellite on October 22, 2001 with ISRO's Polar Satellite Launch Vehicle (PSLV). It circles the Earth in a polar, sun-synchronous orbit at an altitude of 572 km, providing unique data for detailed analysis of high temperature events on the Earth's surface. The BIRD sensor package is dedicated to high resolution and reliable fire recognition, and active fire analysis is possible in the sub-pixel domain. The leading channel for fire detection and monitoring is the MIR channel at 3.8 μm. The rejection of false alarms is based on procedures using MIR/NIR (Middle Infra Red/Near Infra Red) and MIR/TIR (Middle Infra Red/Thermal Infra Red) radiance ratio thresholds. Unique results of BIRD wildfire detection and analysis over fire prone regions in Australia and Asia are presented. BIRD successfully demonstrates innovative fire recognition technology for small satellites, permitting retrieval of quantitative characteristics of active burning wildfires, such as the equivalent fire temperature, fire area, radiative energy release, fire front length and fire front strength.
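    The radiance-ratio false-alarm rejection described above can be sketched as a simple per-pixel test; the threshold values and radiances below are illustrative assumptions, not the operational BIRD settings.

```python
def is_fire(mir, nir, tir, mir_nir_min=2.0, mir_tir_min=1.5):
    """Confirm a MIR (3.8 um) hot-spot candidate via channel radiance ratios."""
    if nir <= 0 or tir <= 0:
        return False
    return (mir / nir) > mir_nir_min and (mir / tir) > mir_tir_min

# Sun glint and warm surfaces are relatively bright in NIR or TIR, so the
# ratio tests reject them while a genuine fire pixel passes:
print(is_fire(mir=6.0, nir=1.0, tir=2.0))  # True  (fire-like spectrum)
print(is_fire(mir=6.0, nir=5.0, tir=2.0))  # False (glint-like: NIR too high)
```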

  15. Direct and ultrasonic measurements of macroscopic piezoelectricity in sintered hydroxyapatite

    NASA Astrophysics Data System (ADS)

    Tofail, S. A. M.; Haverty, D.; Cox, F.; Erhart, J.; Hána, P.; Ryzhenko, V.

    2009-03-01

    Macroscopic piezoelectricity in hydroxyapatite (HA) ceramic was measured by a direct quasistatic method and an ultrasonic interference technique. The effective symmetry of the polycrystalline aggregate was established, and a detailed theoretical analysis was carried out to determine the shear piezoelectric coefficient, d14, of HA by these two methods. The piezoelectric nature of HA was demonstrated qualitatively, although a specific quantitative value for the d14 coefficient could not be established. The ultrasonic method was also employed to measure anisotropic elastic constants, which agreed well with those calculated from first principles.

  16. Planktic foraminifer census data from Northwind Ridge Core 5, Arctic Ocean

    USGS Publications Warehouse

    Foley, Kevin M.; Poore, Richard Z.

    1991-01-01

    The U.S. Geological Survey recovered 9 piston cores from the Northwind Ridge in the Canada Basin of the Arctic Ocean from a cruise of the USCGC Polar Star during 1988. Preliminary analysis of the cores suggests sediments deposited on Northwind Ridge preserve a detailed record of glacial and interglacial cycles for the last few hundred-thousand to one million years. This report includes quantitative data on foraminifers and selected sediment size-fraction data in samples from Northwind Ridge core PI-88AR P5.

  17. Electrochemistry in hollow-channel paper analytical devices.

    PubMed

    Renault, Christophe; Anderson, Morgan J; Crooks, Richard M

    2014-03-26

    In the present article we provide a detailed analysis of fundamental electrochemical processes in a new class of paper-based analytical devices (PADs) having hollow channels (HCs). Voltammetry and amperometry were applied under flow and no-flow conditions, yielding reproducible electrochemical signals that can be described by classical electrochemical theory as well as finite-element simulations. The results shown here provide new and quantitative insights into the flow within HC-PADs. The interesting new result is that, despite their remarkable simplicity, these HC-PADs exhibit electrochemical and hydrodynamic behavior similar to that of traditional microelectrochemical devices.

  18. Involvement of GABA Transporters in Atropine-Treated Myopic Retina As Revealed by iTRAQ Quantitative Proteomics

    PubMed Central

    2015-01-01

    Atropine, a muscarinic antagonist, is known to inhibit myopia progression in several animal models and in humans; however, its mode of action is not yet established. In this study, we performed quantitative iTRAQ proteomic analysis of retinas collected from control and lens-induced myopic (LIM) mouse eyes treated with atropine. The myopic group received a −15D spectacle lens over the right eye on postnatal day 10, with or without atropine eye drops starting on postnatal day 24. Axial length was measured by optical low coherence interferometry (OLCI, AC-Master), and refraction was measured by automated infrared photorefractor on postnatal days 24, 38, and 52. Retinal tissue samples were pooled from six eyes for each group. The experiments were repeated twice, and technical replicates were also performed for liquid chromatography–tandem mass spectrometry (LC–MS/MS) analysis. MetaCore was used to perform protein profiling for pathway analysis. We identified a total of 3882 unique proteins with <1% FDR by analyzing the samples in replicates for two independent experiments, the largest mouse retina proteome reported to date. Thirty proteins were up-regulated (myopia/control ratio > global mean ratio + 1 standard deviation) and 28 proteins were down-regulated (myopia/control ratio < global mean ratio − 1 standard deviation) in myopic eyes as compared with control retinas. Pathway analysis using MetaCore revealed regulation of γ-aminobutyric acid (GABA) levels in the myopic eyes. Detailed analysis of the quantitative proteomics data showed that levels of GABA transporter 1 (GAT-1) were elevated in myopic retina and significantly reduced after atropine treatment. These results were further validated with immunohistochemistry and Western blot analysis. In conclusion, this study provides a comprehensive quantitative proteomic analysis of atropine-treated mouse retina and suggests the involvement of GABAergic signaling in the antimyopic effects of atropine in mouse eyes. GABAergic transmission in the neural retina plays a pivotal role in the maintenance of axial eye growth in mammals. PMID:25211393
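    The mean ± 1 SD regulation rule applied to the iTRAQ ratios can be sketched as follows; the protein names other than GAT-1 and all ratio values are hypothetical placeholders, not the study's data.

```python
from statistics import mean, stdev

def regulation_calls(ratios):
    """Split proteins by the global mean +/- 1 SD of myopia/control ratios."""
    values = list(ratios.values())
    m, s = mean(values), stdev(values)
    up = [p for p, r in ratios.items() if r > m + s]
    down = [p for p, r in ratios.items() if r < m - s]
    return up, down

ratios = {"GAT-1": 2.1, "ProtA": 1.0, "ProtB": 0.9, "ProtC": 1.1, "ProtD": 0.2}
up, down = regulation_calls(ratios)
print(up, down)  # ['GAT-1'] ['ProtD']
```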

  19. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  20. Polymer Analysis by Liquid Chromatography/Electrospray Ionization Time-of-Flight Mass Spectrometry.

    PubMed

    Nielen, M W; Buijtenhuijs, F A

    1999-05-01

    Hyphenation of liquid chromatography (LC) techniques with electrospray ionization (ESI) orthogonal acceleration time-of-flight (oa-TOF) mass spectrometry (MS) provides both MS-based structural information and LC-based quantitative data in polymer analysis. In one experimental setup, three different LC modes are interfaced with MS:  size-exclusion chromatography (SEC/MS), gradient polymer elution chromatography (GPEC/MS), and liquid chromatography at the critical point of adsorption (LCCC/MS). In SEC/MS, both absolute mass calibration of the SEC column based on the polymer itself and determination of monomers and end groups from the mass spectra are achieved. GPEC/MS shows detailed chemical heterogeneity of the polymer and the chemical composition distribution within oligomer groups. In LCCC/MS, the retention behavior is primarily governed by chemical heterogeneities, such as different end group functionalities, and quantitative end group calculations can be easily made. The potential of these methods and the benefit of time-of-flight analyzers in polymer analysis are discussed using SEC/MS of a polydisperse poly(methyl methacrylate) sample, GPEC/MS of dipropoxylated bisphenol A/adipic acid polyester resin, LCCC/MS of alkylated poly(ethylene glycol), and LCCC/MS of terephthalic acid/neopentyl glycol polyester resin.
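    The end-group calculation enabled by resolved oligomer series can be sketched as follows: consecutive peaks are spaced by the repeat-unit mass, and the residual mass left after removing whole repeat units corresponds to the combined end groups (plus any adduct). The peak list is a synthetic PMMA-like series, not data from the paper.

```python
def end_group_mass(peaks):
    """Repeat-unit mass from peak spacing; residual mass = end groups + adduct."""
    spacings = [b - a for a, b in zip(peaks, peaks[1:])]
    repeat = sum(spacings) / len(spacings)   # average oligomer spacing
    residual = peaks[0] % repeat             # mass not explained by repeats
    return repeat, residual

peaks = [618.3, 718.35, 818.4, 918.45]  # hypothetical [M+Na]+ oligomer series
repeat, residual = end_group_mass(peaks)
print(round(repeat, 2), round(residual, 2))  # 100.05 18.0
```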

  1. Preparation, quantitative surface analysis, intercalation characteristics and industrial implications of low temperature expandable graphite

    NASA Astrophysics Data System (ADS)

    Peng, Tiefeng; Liu, Bin; Gao, Xuechao; Luo, Liqun; Sun, Hongjuan

    2018-06-01

    Expandable graphite is widely used as a new functional carbon material, especially as a fire retardant; however, its practical application is limited by its high expansion temperature. In this work, a preparation process for low-temperature, highly expandable graphite was studied, using natural flake graphite as the raw material and KMnO4/HClO4/NH4NO3 as oxidative intercalation agents. The structure, morphology, functional groups and thermal properties were characterized during the expansion process by Fourier transform infrared spectroscopy (FTIR), Raman spectroscopy, thermogravimetry-differential scanning calorimetry (TG-DSC), X-ray diffraction (XRD), and scanning electron microscopy (SEM). The analysis showed that oxidative intercalation grafted oxygen-containing groups onto the edges and within the graphite layers, and that the intercalation reagent entered the graphite layers, increasing the interlayer spacing. After expansion, the original flaky expandable graphite was completely transformed into worm-like expanded graphite. Based on quantitative XRD peak analysis, the stage of the graphite intercalation compounds (GICs) was determined to be 3 for the prepared expandable graphite, and detailed intercalation mechanisms were proposed. This comprehensive investigation provides a benchmark for the industrial application of such sulfur-free expanded graphite.
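    The stage assignment from XRD can be sketched with the standard GIC relation I_c = d_i + (n − 1) · 3.35 Å, where I_c is the c-axis repeat distance, d_i the intercalate gallery height, and n the stage; the numerical values below are illustrative assumptions, not the paper's refined parameters.

```python
D_GRAPHITE = 3.35  # A, interlayer spacing of pristine graphite

def gic_stage(repeat_distance, gallery_height):
    """Nearest-integer stage n from I_c = d_i + (n - 1) * d_graphite."""
    return round((repeat_distance - gallery_height) / D_GRAPHITE + 1)

# Hypothetical: gallery height 7.9 A and observed repeat 14.6 A -> stage 3
print(gic_stage(14.6, 7.9))  # 3
```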

  2. Quantitative Chemical Imaging and Unsupervised Analysis Using Hyperspectral Coherent Anti-Stokes Raman Scattering Microscopy

    PubMed Central

    2013-01-01

    In this work, we report a method to acquire and analyze hyperspectral coherent anti-Stokes Raman scattering (CARS) microscopy images of organic materials and biological samples resulting in an unbiased quantitative chemical analysis. The method employs singular value decomposition on the square root of the CARS intensity, providing an automatic determination of the components above noise, which are retained. Complex CARS susceptibility spectra, which are linear in the chemical composition, are retrieved from the CARS intensity spectra using the causality of the susceptibility by two methods, and their performance is evaluated by comparison with Raman spectra. We use non-negative matrix factorization applied to the imaginary part and the nonresonant real part of the susceptibility with an additional concentration constraint to obtain absolute susceptibility spectra of independently varying chemical components and their absolute concentration. We demonstrate the ability of the method to provide quantitative chemical analysis on known lipid mixtures. We then show the relevance of the method by imaging lipid-rich stem-cell-derived mouse adipocytes as well as differentiated embryonic stem cells with a low density of lipids. We retrieve and visualize the most significant chemical components with spectra given by water, lipid, and proteins, segmenting the image into the cell surroundings, lipid droplets, cytosol, and nucleus, and we reveal the chemical structure of the cells, with details visualized by the projection of the chemical contrast into a few relevant channels. PMID:24099603
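    The automatic determination of the number of components above noise can be sketched at the level of the singular-value spectrum: values are compared against a noise floor estimated from the tail of the spectrum. The spectrum and threshold rule below are illustrative assumptions; the paper's actual criterion may differ.

```python
from statistics import mean, stdev

def components_above_noise(singular_values, tail=4, k=3.0):
    """Keep singular values above a noise floor: tail mean + k * tail SD."""
    noise = singular_values[-tail:]            # smallest values ~ noise level
    threshold = mean(noise) + k * stdev(noise)
    return [s for s in singular_values if s > threshold]

svals = [120.0, 35.0, 9.0, 1.2, 1.1, 1.0, 0.9]  # hypothetical SVD spectrum
print(components_above_noise(svals))  # [120.0, 35.0, 9.0]
```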

  3. [Myocardial perfusion scintigraphy - short form of the German guideline].

    PubMed

    Lindner, O; Burchert, W; Hacker, M; Schaefer, W; Schmidt, M; Schober, O; Schwaiger, M; vom Dahl, J; Zimmermann, R; Schäfers, M

    2013-01-01

    This guideline is a short summary of the guideline for myocardial perfusion scintigraphy published by the Association of the Scientific Medical Societies in Germany (AWMF). The purpose of this guideline is to provide practical assistance for indication and examination procedures as well as image analysis, and to present the state of the art of myocardial perfusion scintigraphy. After a short introduction on the fundamentals of imaging, precise and detailed information is given on the indications, patient preparation, stress testing, radiopharmaceuticals, examination protocols and techniques, radiation exposure, and data reconstruction, as well as on visual and quantitative image analysis and interpretation. In addition, possible pitfalls, artefacts and key elements of reporting are described.

  4. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    PubMed

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.

  5. Experimental study of oscillating plates in viscous fluids: Qualitative and quantitative analysis of the flow physics and hydrodynamic forces

    NASA Astrophysics Data System (ADS)

    Shrestha, Bishwash; Ahsan, Syed N.; Aureli, Matteo

    2018-01-01

    In this paper, we present a comprehensive experimental study on harmonic oscillations of a submerged rigid plate in a quiescent, incompressible, Newtonian, viscous fluid. The fluid-structure interaction problem is analyzed from both qualitative and quantitative perspectives via a detailed particle image velocimetry (PIV) experimental campaign conducted over a broad range of oscillation frequency and amplitude parameters. Our primary goal is to identify the effect of the oscillation characteristics on the mechanisms of fluid-structure interaction and on the dynamics of vortex shedding and convection and to elucidate the behavior of hydrodynamic forces on the oscillating structure. Towards this goal, we study the flow in terms of qualitative aspects of its pathlines, vortex shedding, and symmetry breaking phenomena and identify distinct hydrodynamic regimes in the vicinity of the oscillating structure. Based on these experimental observations, we produce a novel phase diagram detailing the occurrence of distinct hydrodynamic regimes as a function of relevant governing nondimensional parameters. We further study the hydrodynamic forces associated with each regime using both PIV and direct force measurement via a load cell. Our quantitative results on experimental estimation of hydrodynamic forces show good agreement against predictions from the literature, where numerical and semi-analytical models are available. The findings and observations in this work shed light on the relationship between flow physics, vortex shedding, and convection mechanisms and the hydrodynamic forces acting on a rigid oscillating plate and, as such, have relevance to various engineering applications, including energy harvesting devices, biomimetic robotic systems, and micro-mechanical sensors and actuators.

  6. Development of quantitative security optimization approach for the picture archives and carrying system between a clinic and a rehabilitation center

    NASA Astrophysics Data System (ADS)

    Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko

    2002-05-01

    The target of our study is to analyze the level of necessary security requirements, to search for suitable security measures, and to optimize the distribution of security across every portion of medical practice. Where possible, quantitative expression is introduced to enable simplified follow-up security procedures and easy evaluation of security outcomes. Using fault tree analysis (FTA), system analysis showed that subdividing system elements into detailed groups yields a much more accurate analysis; such subdivided composition factors depend greatly on staff behavior, interactive terminal devices, the kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified methods for determining the required level of security and proposed security measures for each medical information system, along with the basic events, and combinations of events, that comprise the threat composition factors. Risk factors for each basic event, the number of elements for each composition factor, and potential security measures were identified. Finally, methods to optimize the security measures for each medical information system were proposed, developing the most efficient distribution of risk factors over basic events.
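    The quantitative FTA step, combining estimated failure probabilities of basic events through AND/OR gates up to a top event, can be sketched as follows; the gate structure and probabilities are illustrative, not the paper's actual tree.

```python
def p_and(*ps):
    """All inputs occur (independent basic events assumed)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """At least one input occurs (independent basic events assumed)."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

# Hypothetical tree: top = (staff error AND missing confirmation step)
#                        OR terminal fault OR network-route fault
top = p_or(p_and(0.10, 0.50), 0.02, 0.01)
print(round(top, 4))  # 0.0783
```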

  7. Fuzzy fault tree assessment based on improved AHP for fire and explosion accidents for steel oil storage tanks.

    PubMed

    Shi, Lei; Shuai, Jian; Xu, Kui

    2014-08-15

    Fire and explosion accidents of steel oil storage tanks (FEASOST) occur occasionally during petroleum and chemical industry production and storage processes and often have a devastating impact on lives, the environment and property. To contribute towards the development of a quantitative approach for assessing the occurrence probability of FEASOST, a fault tree of FEASOST is constructed that identifies various potential causes. Traditional fault tree analysis (FTA) can achieve quantitative evaluation if failure data for all of the basic events (BEs) are available, which is almost impossible due to the lack of detailed data, as well as other uncertainties. This paper performs FTA of FEASOST through a hybrid application of an expert-elicitation-based improved analytic hierarchy process (AHP) and fuzzy set theory, and the occurrence possibility of FEASOST is estimated for an oil depot in China. A comparison between statistical data and data calculated using fuzzy fault tree analysis (FFTA) based on traditional and improved AHP is also made. Sensitivity and importance analyses have been performed to identify the most crucial BEs leading to FEASOST, providing insights into how managers should focus effective mitigation. Copyright © 2014 Elsevier B.V. All rights reserved.
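    The fuzzy aggregation step can be sketched with triangular fuzzy numbers weighted by AHP-derived expert weights and defuzzified by the centroid; the ratings and weights below are illustrative assumptions, not the paper's elicited values.

```python
def weighted_tfn(tfns, weights):
    """Weighted sum of triangular fuzzy numbers given as (l, m, u) tuples."""
    return tuple(sum(w * t[i] for t, w in zip(tfns, weights)) for i in range(3))

def centroid(tfn):
    """Defuzzify a triangular fuzzy number to a crisp value."""
    return sum(tfn) / 3.0

experts = [(0.1, 0.2, 0.3), (0.2, 0.3, 0.4)]  # hypothetical expert ratings
weights = [0.6, 0.4]                          # hypothetical AHP weights
agg = weighted_tfn(experts, weights)
print(round(centroid(agg), 3))  # 0.24
```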

  8. Culturally Sensitive Parent Education: A Critical Review of Quantitative Research.

    ERIC Educational Resources Information Center

    Gorman, Jean Cheng; Balter, Lawrence

    1997-01-01

    Critically reviews the quantitative literature on culturally sensitive parent education programs, discussing issues of research methodology and program efficacy in producing change among ethnic minority parents and their children. Culturally sensitive programs for African American and Hispanic families are described in detail. Methodological flaws…

  9. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS?s Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  10. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    Correct fall risk assessment is becoming ever more critical as the population ages. In spite of available approaches that allow quantitative analysis of the performance of the human movement control system, clinical assessment and diagnosis of fall risk still rely mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method for assessing balance control abilities through a system implementing automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, and to obtain objective scores and more detailed information for predicting fall risk. We used Microsoft Kinect to record subjects' movements while they performed challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).

  11. Partial differential equation techniques for analysing animal movement: A comparison of different methods.

    PubMed

    Wang, Yi-Shan; Potts, Jonathan R

    2017-03-07

    Recent advances in animal tracking have allowed us to uncover the drivers of movement in unprecedented detail. This has enabled modellers to construct ever more realistic models of animal movement, which aid in uncovering detailed patterns of space use in animal populations. Partial differential equations (PDEs) provide a popular tool for mathematically analysing such models. However, their construction often relies on simplifying assumptions which may greatly affect the model outcomes. Here, we analyse the effect of various PDE approximations on the analysis of some simple movement models, including a biased random walk, central-place foraging processes and movement in heterogeneous landscapes. Perhaps the most commonly-used PDE method dates back to a seminal paper of Patlak from 1953. However, our results show that this can be a very poor approximation in even quite simple models. On the other hand, more recent methods, based on transport equation formalisms, can provide more accurate results, as long as the kernel describing the animal's movement is sufficiently smooth. When the movement kernel is not smooth, we show that both the older and newer methods can lead to quantitatively misleading results. Our detailed analysis will aid future researchers in the appropriate choice of PDE approximation for analysing models of animal movement. Copyright © 2017 Elsevier Ltd. All rights reserved.
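    The simplest case in such comparisons, a 1-D biased random walk whose coarse-grained PDE is the advection-diffusion equation du/dt = −v du/dx + D d²u/dx² with drift v = (p_r − p_l)·dx/dt, can be checked by simulation; the parameters below are illustrative.

```python
import random

def walk_ensemble(n_walkers=5000, steps=100, p_right=0.55, dx=1.0, seed=1):
    """Final positions of independent 1-D biased random walks on a lattice."""
    random.seed(seed)
    finals = []
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(steps):
            x += dx if random.random() < p_right else -dx
        finals.append(x)
    return finals

finals = walk_ensemble()
mean_final = sum(finals) / len(finals)
drift_prediction = (0.55 - 0.45) * 1.0 * 100  # v * t = 10 lattice units
print(mean_final, drift_prediction)  # ensemble mean should be close to 10
```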

  12. Quantitative nanohistological investigation of scleroderma: an atomic force microscopy-based approach to disease characterization

    PubMed Central

    Strange, Adam P; Aguayo, Sebastian; Ahmed, Tarek; Mordan, Nicola; Stratton, Richard; Porter, Stephen R; Parekh, Susan; Bozec, Laurent

    2017-01-01

    Scleroderma (or systemic sclerosis, SSc) is a disease caused by excess crosslinking of collagen. The skin stiffens and becomes painful, while internally, organ function can be compromised by the less elastic collagen. Diagnosis of SSc is often only possible in advanced cases, by which point the window for treatment is limited. A more detailed analysis of SSc may provide better future treatment options and information on disease progression. Recently, the histological stain picrosirius red, which reveals collagen register, has been combined with atomic force microscopy (AFM) to study SSc. Skin from healthy individuals and SSc patients was biopsied, stained, and studied using AFM. By investigating the crosslinking of collagen at a smaller hierarchical stage, the effects of SSc were more pronounced. Changes in morphology and Young's elastic modulus were observed and quantified, giving rise to a novel technique we have termed "quantitative nanohistology". An increase in nanoscale stiffness in the collagen for SSc compared with healthy individuals was seen as a significant increase in the Young's modulus profile. These markers of stiffer collagen in SSc parallel the symptoms experienced by patients, giving additional hope that in the future, nanohistology using AFM can be readily applied as a clinical tool, providing detailed information on the state of collagen. PMID:28138238

  13. Simulating observations with HARMONI: the integral field spectrograph for the European Extremely Large Telescope

    NASA Astrophysics Data System (ADS)

    Zieleniewski, Simon; Thatte, Niranjan; Kendrew, Sarah; Houghton, Ryan; Tecza, Matthias; Clarke, Fraser; Fusco, Thierry; Swinbank, Mark

    2014-07-01

    With the next generation of extremely large telescopes commencing construction, there is an urgent need for detailed quantitative predictions of the scientific observations that these new telescopes will enable. Most of these new telescopes will have adaptive optics fully integrated with the telescope itself, allowing unprecedented spatial resolution combined with enormous sensitivity. However, the adaptive optics point spread function will be strongly wavelength dependent, requiring detailed simulations that accurately model these variations. We have developed a simulation pipeline for the HARMONI integral field spectrograph, a first light instrument for the European Extremely Large Telescope. The simulator takes high-resolution input data-cubes of astrophysical objects and processes them with accurate atmospheric, telescope and instrumental effects, to produce mock observed cubes for chosen observing parameters. The output cubes represent the result of a perfect data reduction process, enabling a detailed analysis and comparison between input and output, showcasing HARMONI's capabilities. The simulations utilise a detailed knowledge of the telescope's wavelength dependent adaptive optics point spread function. We discuss the simulation pipeline and present an early example of the pipeline functionality for simulating observations of high redshift galaxies.

  14. Fixed Eigenvector Analysis of Thermographic NDE Data

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2011-01-01

    Principal Component Analysis (PCA) has been shown effective for reducing thermographic NDE data. This paper will discuss an alternative method of analysis that has been developed where a predetermined set of eigenvectors is used to process the thermal data from both reinforced carbon-carbon (RCC) and graphite-epoxy honeycomb materials. These eigenvectors can be generated either from an analytic model of the thermal response of the material system under examination, or from a large set of experimental data. This paper provides the details of the analytic model, an overview of the PCA process, as well as a quantitative signal-to-noise comparison of the results of performing both conventional PCA and fixed eigenvector analysis on thermographic data from two specimens, one Reinforced Carbon-Carbon with flat bottom holes and the second a sandwich construction with graphite-epoxy face sheets and aluminum honeycomb core.
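    The core of the fixed-eigenvector method, projecting each pixel's time history onto a predetermined orthonormal basis rather than a basis derived from the data themselves, can be sketched as follows; the basis vectors and signal are illustrative, not the paper's analytic eigenvectors.

```python
def project(signal, basis):
    """Coefficients of a signal in a fixed orthonormal basis (dot products)."""
    return [sum(s * b for s, b in zip(signal, vec)) for vec in basis]

basis = [
    [0.5, 0.5, 0.5, 0.5],    # mean-level component
    [0.5, 0.5, -0.5, -0.5],  # early-vs-late contrast component
]
signal = [4.0, 3.0, 2.0, 1.0]  # hypothetical 4-sample cooling curve
print(project(signal, basis))  # [5.0, 2.0]
```

Because the basis is fixed in advance, the coefficients are directly comparable across pixels and across inspections, unlike PCA scores recomputed per data set.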

  15. Enantioselective determination of metconazole in multi matrices by high-performance liquid chromatography.

    PubMed

    He, Rujian; Fan, Jun; Tan, Qi; Lai, Yecai; Chen, Xiaodong; Wang, Tai; Jiang, Ying; Zhang, Yaomou; Zhang, Weiguang

    2018-02-01

    A reliable and effective HPLC analytical method has been developed to stereoselectively quantify metconazole in soil and flour matrices. The effects of the polysaccharide chiral stationary phase and of the type and content of alcoholic modifier on the separation of racemic metconazole are discussed in detail. Resolution and quantitative determination of metconazole stereoisomers were performed using an Enantiopak OD column with an n-hexane-ethanol mobile phase (97:3, v/v) at a flow rate of 1.0 mL/min. Extraction and cleanup followed the modified QuEChERS (quick, easy, cheap, effective, rugged and safe) method for metconazole racemate in soil and flour matrices, and the residual analysis method was validated. Good linearity (R² ≥ 0.9997) and recoveries (94.98-104.89%, RSD ≤ 2.0%) were obtained for the four metconazole stereoisomers. In brief, the proposed method shows good accuracy and precision and may be applied to enantioselective determination, residual quantitative analysis, and degradation studies of metconazole in food and environmental matrices. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Australia’s first national level quantitative environmental justice assessment of industrial air pollution

    NASA Astrophysics Data System (ADS)

    Chakraborty, Jayajit; Green, Donna

    2014-04-01

    This study presents the first national level quantitative environmental justice assessment of industrial air pollution in Australia. Specifically, our analysis links the spatial distribution of sites and emissions associated with industrial pollution sources derived from the National Pollution Inventory, to Indigenous status and social disadvantage characteristics of communities derived from Australian Bureau of Statistics indicators. Our results reveal a clear national pattern of environmental injustice based on the locations of industrial pollution sources, as well as the volume and toxicity of air pollution released at these locations. Communities with the highest number of polluting sites, emission volumes, and toxicity-weighted air emissions show significantly greater proportions of Indigenous population and higher levels of socio-economic disadvantage. The quantities and toxicities of industrial air pollution are particularly high in communities with the lowest levels of educational attainment and occupational status. These findings emphasize the need for more detailed analysis in specific regions and communities where socially disadvantaged groups are disproportionately impacted by industrial air pollution. Our empirical findings also underscore the growing necessity to incorporate environmental justice considerations in environmental planning and policy-making in Australia.
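
One way to form the kind of toxicity-weighted emission indicator described above, before linking it to census-derived community characteristics, is a simple weighted aggregation per community. All records, names and weights below are hypothetical placeholders, not National Pollution Inventory data:

```python
from collections import defaultdict

# Hypothetical release records: (community, pollutant, kg released per year).
releases = [
    ("A", "benzene", 120.0),
    ("A", "lead", 5.0),
    ("B", "benzene", 40.0),
]

# Hypothetical unitless toxicity weights per kg of pollutant.
toxicity_weight = {"benzene": 10.0, "lead": 100.0}

# Per-community site counts and toxicity-weighted emission scores.
scores = defaultdict(float)
sites = defaultdict(int)
for community, pollutant, kg in releases:
    scores[community] += kg * toxicity_weight[pollutant]
    sites[community] += 1
```

Each community's score and site count could then be joined to socio-economic indicators for the statistical comparison the study performs.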

  17. Gap Gene Regulatory Dynamics Evolve along a Genotype Network

    PubMed Central

    Crombach, Anton; Wotton, Karl R.; Jiménez-Guri, Eva; Jaeger, Johannes

    2016-01-01

    Developmental gene networks implement the dynamic regulatory mechanisms that pattern and shape the organism. Over evolutionary time, the wiring of these networks changes, yet the patterning outcome is often preserved, a phenomenon known as “system drift.” System drift is illustrated by the gap gene network—involved in segmental patterning—in dipteran insects. In the classic model organism Drosophila melanogaster and the nonmodel scuttle fly Megaselia abdita, early activation and placement of gap gene expression domains show significant quantitative differences, yet the final patterning output of the system is essentially identical in both species. In this detailed modeling analysis of system drift, we use gene circuits which are fit to quantitative gap gene expression data in M. abdita and compare them with an equivalent set of models from D. melanogaster. The results of this comparative analysis show precisely how compensatory regulatory mechanisms achieve equivalent final patterns in both species. We discuss the larger implications of the work in terms of “genotype networks” and the ways in which the structure of regulatory networks can influence patterns of evolutionary change (evolvability). PMID:26796549

  18. Functional characterization and quantitative expression analysis of two GnRH-related peptide receptors in the mosquito, Aedes aegypti.

    PubMed

    Oryan, Alireza; Wahedi, Azizia; Paluzzi, Jean-Paul V

    2018-03-04

    To cope with stressful events such as flight, organisms have evolved various regulatory mechanisms, often involving control by endocrine-derived factors. In insects, two stress-related factors are the gonadotropin-releasing hormone-related peptides adipokinetic hormone (AKH) and corazonin (CRZ). AKH is a pleiotropic hormone best known as a liberator of protein, lipid, and carbohydrate substrates. Although a universal function has not yet been elucidated for CRZ, it has been shown to play roles in pigmentation and ecdysis, and to act as a cardiostimulatory factor. While both these neuropeptides and their respective receptors (AKHR and CRZR) have been characterized in several organisms, details of their specific roles within the disease vector Aedes aegypti remain largely unexplored. Here, we obtained three A. aegypti AKHR transcript variants and further identified the A. aegypti CRZR receptor. Receptor expression using a heterologous functional assay revealed that these receptors exhibit a highly specific response to their native ligands. Developmental quantitative expression analysis of CRZR revealed enrichment during the pupal and adult stages. In adults, quantitative spatial expression analysis revealed CRZR transcript in a variety of organs including the head, thoracic ganglia, primary reproductive organs (ovary and testis), and male carcass. This suggests CRZ may play a role in ecdysis, and neuronal expression of CRZR indicates a possible role for CRZ within the nervous system. Quantitative developmental expression analysis of AKHR identified significant transcript enrichment in early adult stages. AKHR transcript was observed in the head, thoracic ganglia, accessory reproductive tissues and carcass of adult females, while it was detected in the abdominal ganglia and significantly enriched in the carcass of adult males, which supports the known function of AKH in energy metabolism. Collectively, given the enrichment of CRZR and AKHR in the primary and secondary sex organs, respectively, of adult mosquitoes, these neuropeptides may play a role in regulating mosquito reproductive biology. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Counter tube window and X-ray fluorescence analyzer study

    NASA Technical Reports Server (NTRS)

    Hertel, R.; Holm, M.

    1973-01-01

    A study was performed to determine the best counter tube window and X-ray fluorescence analyzer design for quantitative analysis of Venusian dust and condensates. The principal objective of the project was to develop the best counter tube window geometry for the sensing element of the instrument, including formulation of a mathematical model of the window and optimization of its parameters. The proposed detector and instrument have several important features. The instrument will perform a near real-time analysis of dust in the Venusian atmosphere and is capable of measuring dust layers less than 1 micron thick. In addition, a wide dynamic measurement range will be provided to compensate for extreme variations in count rates. An integral pulse-height analyzer and memory accumulate data and read out spectra for detailed computer analysis on the ground.

  20. Studies on the factors modulating indole-3-acetic acid production in endophytic bacterial isolates from Piper nigrum and molecular analysis of ipdc gene.

    PubMed

    Jasim, B; Jimtha John, C; Shimil, V; Jyothis, M; Radhakrishnan, E K

    2014-09-01

    The study mainly aimed at quantitative analysis of IAA produced by endophytic bacteria under various conditions, including the presence of extract from Piper nigrum. The genetic basis of IAA production was also analysed by studying the presence and diversity of the ipdc gene among the selected isolates. Five endophytic bacteria previously isolated from P. nigrum were used for the study. The effects of temperature, pH, agitation, tryptophan concentration and plant extract on IAA production by the selected isolates were analysed by a colorimetric method. Comparative and quantitative analysis of IAA production by the isolates under optimal culture conditions was performed by HPTLC. The presence of the ipdc gene, and thereby the biosynthetic basis of IAA production, among the selected isolates was studied by PCR-based amplification and subsequent in silico analysis of the sequences obtained. Among the selected bacterial isolates from P. nigrum, isolate PnB 8 (Klebsiella pneumoniae) gave the maximum yield of IAA under the various optimized conditions, as confirmed by colorimetric, HPLC and HPTLC analysis. Very interestingly, the study showed a stimulating effect of phytochemicals from P. nigrum on IAA production by endophytic bacteria isolated from the same plant. The study is unique in its selection of endophytes from the same source for comparative and quantitative analysis of IAA production under various conditions, and its examination of the stimulatory effect of phytochemicals on bacterial IAA production is a novel approach. The molecular basis of IAA production was confirmed by sequence analysis of the ipdc gene. Although microbial production of IAA is well known, the present report on detailed optimization, the effect of plant extract, and molecular confirmation of IAA biosynthesis is comparatively novel in its approach. © 2014 The Society for Applied Microbiology.

  1. Bottom-up low molecular weight heparin analysis using liquid chromatography-Fourier transform mass spectrometry for extensive characterization.

    PubMed

    Li, Guoyun; Steppich, Julia; Wang, Zhenyu; Sun, Yi; Xue, Changhu; Linhardt, Robert J; Li, Lingyun

    2014-07-01

    Low molecular weight heparins (LMWHs) are heterogeneous, polydisperse, and highly negatively charged mixtures of glycosaminoglycan chains prescribed as anticoagulants. Detailed characterization of LMWHs is important for drug quality assurance and for new drug research and development. In this study, online hydrophilic interaction chromatography (HILIC) Fourier transform mass spectrometry (FTMS) was applied to analyze the oligosaccharide fragments of LMWHs generated by heparin lyase II digestion. More than 40 oligosaccharide fragments of LMWH were quantified and used to compare LMWHs prepared by three different manufacturers. The quantified fragment structures included unsaturated disaccharides/oligosaccharides arising from the prominent repeating units of these LMWHs, 3-O-sulfo containing tetrasaccharides arising from their antithrombin III binding sites, 1,6-anhydro ring-containing oligosaccharides formed during their manufacture, saturated uronic acid oligosaccharides coming from some chain nonreducing ends, and oxidized linkage region oligosaccharides coming from some chain reducing ends. This bottom-up approach provides rich, detailed structural and quantitative information with high accuracy and reproducibility. When combined with the top-down approach, HILIC LC-FTMS based analysis should be suitable for advanced quality control and quality assurance in LMWH production.

  2. Strategies for the profiling, characterisation and detailed structural analysis of N-linked oligosaccharides.

    PubMed

    Tharmalingam, Tharmala; Adamczyk, Barbara; Doherty, Margaret A; Royle, Louise; Rudd, Pauline M

    2013-02-01

    Many post-translational modifications, including glycosylation, are pivotal for the structural integrity, location and functional activity of glycoproteins. Sub-populations of proteins that are relocated or functionally changed by such modifications can change resting proteins into active ones, mediating specific effector functions, as in the case of monoclonal antibodies. To ensure safe and efficacious drugs it is essential to employ appropriate robust, quantitative analytical strategies that can (i) perform detailed glycan structural analysis, (ii) characterise specific subsets of glycans to assess known critical features of therapeutic activities, and (iii) rapidly profile glycan pools for at-line monitoring or high-level batch-to-batch screening. Here we focus on these aspects of glycan analysis, showing how state-of-the-art technologies are required at all stages during the production of recombinant glycotherapeutics. These data can provide insights into processing pathways and suggest markers for intervention at critical control points in bioprocessing and also critical decision points in disease and drug monitoring in patients. Importantly, these tools are now enabling the first glycome/genome studies in large populations, allowing the integration of glycomics into other 'omics platforms in a systems biology context.

  3. Stream network analysis from orbital and suborbital imagery, Colorado River Basin, Texas

    NASA Technical Reports Server (NTRS)

    Baker, V. R. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Orbital SL-2 imagery (earth terrain camera S-190B), received September 5, 1973, was subjected to quantitative network analysis and compared to 7.5-minute topographic mapping (scale 1:24,000) and conventional USDA black-and-white aerial photography (scale 1:22,200). Results can only be considered suggestive because detail on the SL-2 imagery was badly obscured by heavy cloud cover. The upper Bee Creek basin was chosen for analysis because it appeared in a relatively cloud-free portion of the orbital imagery. Drainage maps were drawn from the three sources, digitized into a computer-compatible format, and analyzed by the WATER system computer program. Even at its small scale (1:172,000) and with bad haze, the orbital photo showed much drainage detail. The contour-like character of the Glen Rose Formation's resistant limestone units allowed channel definition. The errors in pattern recognition can be attributed to local areas of dense vegetation and to other areas of very high albedo caused by surficial exposure of caliche. The latter effect caused particular difficulty in the determination of drainage divides.

  4. UV absorption in metal decorated boron nitride flakes: a theoretical analysis of excited states

    NASA Astrophysics Data System (ADS)

    Chopra, Siddheshwar; Plasser, Felix

    2017-10-01

    The excited states of single metal atom (X = Co, Al and Cu) doped boron nitride flake (MBNF) B15N14H14-X and pristine boron nitride (B15N15H14) are studied by time-dependent density functional theory. The immediate effect of metal doping is a red shift of the onset of absorption from about 220 nm for pristine BNF to above 300 nm for all metal-doped variants with the biggest effect for MBNF-Co, which shows appreciable intensity even above 400 nm. These energy shifts are analysed by detailed wavefunction analysis protocols using visualisation methods, such as the natural transition orbital analysis and electron-hole correlation plots, as well as quantitative analysis of the exciton size and electron-hole populations. The analysis shows that the Co and Cu atoms provide strong contributions to the relevant states whereas the aluminium atom is only involved to a lesser extent.

  5. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.

  6. Detection of isotype switch rearrangement in bulk culture by PCR.

    PubMed

    Max, E E; Mills, F C; Chu, C

    2001-05-01

    When a B lymphocyte changes from synthesizing IgM to synthesizing IgG, IgA, or IgE, this isotype switch is generally accompanied by a unique DNA rearrangement. The protocols in this unit describe two polymerase chain reaction (PCR)-based strategies for detecting switch rearrangements in bulk culture. The first involves direct PCR across the switch junctions, providing the opportunity for characterizing the recombination products by nucleotide sequence analysis; however, because of characteristics inherent to the PCR methodology this strategy cannot easily be used as a quantitative assay for recombination. A support protocol details the preparation of the 5' Su PCR probe for this protocol. The second basic protocol describes a method known as digestion-circularization PCR (DCPCR) that is more amenable to quantitation but yields no information on structure of the recombination products. Both techniques should be capable of detecting reciprocal deletion circles as well as functional recombination products remaining on the expressed chromosome.

  7. Quantitative imaging of heterogeneous dynamics in drying and aging paints

    PubMed Central

    van der Kooij, Hanne M.; Fokkink, Remco; van der Gucht, Jasper; Sprakel, Joris

    2016-01-01

    Drying and aging paint dispersions display a wealth of complex phenomena that make their study fascinating yet challenging. To meet the growing demand for sustainable, high-quality paints, it is essential to unravel the microscopic mechanisms underlying these phenomena. Visualising the governing dynamics is, however, intrinsically difficult because the dynamics are typically heterogeneous and span a wide range of time scales. Moreover, the high turbidity of paints precludes conventional imaging techniques from reaching deep inside the paint. To address these challenges, we apply a scattering technique, Laser Speckle Imaging, as a versatile and quantitative tool to elucidate the internal dynamics, with microscopic resolution and spanning seven decades of time. We present a toolbox of data analysis and image processing methods that allows a tailored investigation of virtually any turbid dispersion, regardless of the geometry and substrate. Using these tools we watch a variety of paints dry and age with unprecedented detail. PMID:27682840
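
Laser Speckle Imaging quantifies internal dynamics through the local speckle contrast K = σ/⟨I⟩ computed over a small sliding window: fast dynamics decorrelate and blur the speckle, lowering K, while a static, fully developed speckle pattern gives K near 1. A minimal sketch of that core computation (window size illustrative; this is not the authors' full toolbox):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(intensity, win=5):
    """Local speckle contrast K = std / mean over a sliding win x win window.

    intensity: 2D array of time-averaged speckle intensity.
    Returns an array of shape (H - win + 1, W - win + 1).
    """
    patches = sliding_window_view(intensity, (win, win))
    mean = patches.mean(axis=(-1, -2))
    std = patches.std(axis=(-1, -2))
    return std / mean
```

Mapping K frame by frame, at several exposure or lag times, is what lets the dynamics be resolved both spatially and across decades of time scale.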

  8. Evolution, Energy Landscapes and the Paradoxes of Protein Folding

    PubMed Central

    Wolynes, Peter G.

    2014-01-01

    Protein folding has been viewed as a difficult problem of molecular self-organization. The search problem involved in folding however has been simplified through the evolution of folding energy landscapes that are funneled. The funnel hypothesis can be quantified using energy landscape theory based on the minimal frustration principle. Strong quantitative predictions that follow from energy landscape theory have been widely confirmed both through laboratory folding experiments and from detailed simulations. Energy landscape ideas also have allowed successful protein structure prediction algorithms to be developed. The selection constraint of having funneled folding landscapes has left its imprint on the sequences of existing protein structural families. Quantitative analysis of co-evolution patterns allows us to infer the statistical characteristics of the folding landscape. These turn out to be consistent with what has been obtained from laboratory physicochemical folding experiments signalling a beautiful confluence of genomics and chemical physics. PMID:25530262

  9. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be weighed to find the optimum scheme. To address this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index and integrated qualitative-quantitative methodologies for the progress, quality, and safety indexes, and combines engineering economics, reliability theory, and information entropy theory into a new evaluation method for building construction projects. Using a practical case, the paper presents the detailed computing steps: selecting all order indexes, establishing the index matrix, computing score values for all order indexes, computing the synthesis score, sorting the candidate schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
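
The information-entropy component of such an evaluation is commonly realized as the entropy-weight method: column-normalize the scheme-by-index matrix, compute each index's entropy, weight indexes by their divergence, and combine into a synthesis score. The sketch below is the generic textbook formulation under the assumption that larger normalized index values are better, not necessarily the paper's exact steps:

```python
import numpy as np

def entropy_weights(matrix):
    """Entropy weights for a (schemes x indexes) decision matrix of
    positive, benefit-type normalized scores."""
    p = matrix / matrix.sum(axis=0)          # column-normalize to p_ij
    k = 1.0 / np.log(matrix.shape[0])        # so entropy lies in [0, 1]
    plogp = np.where(p > 0, p * np.log(p), 0.0)
    entropy = -k * plogp.sum(axis=0)         # entropy of each index
    divergence = 1.0 - entropy               # informative indexes diverge more
    return divergence / divergence.sum()     # weights sum to 1

def synthesis_scores(matrix, weights):
    # Weighted sum of index scores for each scheme.
    return matrix @ weights
```

The scheme with the highest synthesis score would be ranked first; cost, progress, quality and safety would each contribute one (or more) columns of the matrix.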

  10. QSAR study of anthranilic acid sulfonamides as inhibitors of methionine aminopeptidase-2 using LS-SVM and GRNN based on principal components.

    PubMed

    Shahlaei, Mohsen; Sabet, Razieh; Ziari, Maryam Bahman; Moeinifard, Behzad; Fassihi, Afshin; Karbakhsh, Reza

    2010-10-01

    Quantitative relationships between molecular structure and methionine aminopeptidase-2 inhibitory activity were discovered for a series of cytotoxic anthranilic acid sulfonamide derivatives. We demonstrate the detailed application of two efficient nonlinear methods for evaluating quantitative structure-activity relationships of the studied compounds. Components produced by principal component analysis were used as inputs to the developed nonlinear models, namely PC-GRNN and PC-LS-SVM, whose performance was tested by several validation methods. The resulting PC-LS-SVM model had high statistical quality (R² = 0.91 and R²CV = 0.81) for predicting the cytotoxic activity of the compounds. Comparison of the predictive ability of PC-GRNN and PC-LS-SVM indicates that the latter method has the higher ability to predict the activity of the studied molecules. Copyright (c) 2010 Elsevier Masson SAS. All rights reserved.
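
A PC-based nonlinear QSAR pipeline of this kind can be approximated in plain numpy: project the descriptor matrix onto its leading principal components, then fit a kernel regressor on the scores. Here kernel ridge regression with an RBF kernel stands in for LS-SVM (the two are closely related; the bias term and the paper's hyperparameters are omitted), so everything below is an illustrative sketch rather than the authors' model:

```python
import numpy as np

def pca_transform(X, n_components):
    # Principal component scores of the descriptor matrix X.
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:n_components].T

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelRidge:
    """RBF kernel ridge regression (LS-SVM-like, no bias term)."""
    def __init__(self, gamma=1.0, lam=1e-3):
        self.gamma, self.lam = gamma, lam
    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), y)
        return self
    def predict(self, Xnew):
        return rbf_kernel(Xnew, self.X, self.gamma) @ self.alpha
```

In practice the component count, gamma and regularization would be chosen by cross-validation, mirroring the validation methods the paper applies.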

  11. Purity assessment of organic calibration standards using a combination of quantitative NMR and mass balance.

    PubMed

    Davies, Stephen R; Jones, Kai; Goldys, Anna; Alamgir, Mahuiddin; Chan, Benjamin K H; Elgindy, Cecile; Mitchell, Peter S R; Tarrant, Gregory J; Krishnaswami, Maya R; Luo, Yawen; Moawad, Michael; Lawes, Douglas; Hook, James M

    2015-04-01

    Quantitative NMR spectroscopy (qNMR) has been examined for purity assessment using a range of organic calibration standards of varying structural complexities, certified using the traditional mass balance approach. Demonstrated equivalence between the two independent purity values confirmed the accuracy of qNMR and highlighted the benefit of using both methods in tandem to minimise the potential for hidden bias, thereby conferring greater confidence in the overall purity assessment. A comprehensive approach to purity assessment is detailed, utilising, where appropriate, multiple peaks in the qNMR spectrum, chosen on the basis of scientific reason and statistical analysis. Two examples are presented in which differences between the purity assignment by qNMR and mass balance are addressed in different ways depending on the requirement of the end user, affording fit-for-purpose calibration standards in a cost-effective manner.
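
The qNMR purity assignment rests on a single relation between an analyte peak and a peak of the internal calibration standard. A hedged sketch of that standard relation (symbol names are generic, not the authors'; I is integrated peak area, N the number of nuclei behind the peak, M molar mass, m weighed mass, and P_std the standard's known purity):

```python
def qnmr_purity(I_a, I_std, N_a, N_std, M_a, M_std, m_a, m_std, P_std):
    """Standard qNMR purity relation:
    P_a = (I_a/I_std) * (N_std/N_a) * (M_a/M_std) * (m_std/m_a) * P_std
    Returns the analyte purity as a mass fraction."""
    return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std
```

Evaluating this for several well-resolved peaks, as the abstract describes, gives independent purity estimates whose agreement can be checked statistically against the mass balance value.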

  12. Antisite defects in layered multiferroic CuCr0.9In0.1P2S6

    NASA Astrophysics Data System (ADS)

    He, Qian; Belianinov, Alex; Dziaugys, Andrius; Maksymovych, Petro; Vysochanskii, Yulian; Kalinin, Sergei V.; Borisevich, Albina Y.

    2015-11-01

    The CuCr1-xInxP2S6 system represents a large family of metal chalcogenophosphates that are unique and promising candidates for 2D materials with functionalities such as ferroelectricity. In this work, we carried out detailed microstructural and chemical characterization of these compounds using aberration-corrected STEM, in order to understand the origin of these different ordering phenomena. Quantitative STEM-HAADF imaging and analysis identified the stacking order of an 8-layer thin flake, which leads to the identification of anti-site In3+(Cu+) doping. We believe that these findings will pave the way towards understanding the ferroic coupling phenomena in van der Waals lamellar compounds, as well as their potential applications in 2-D electronics.

  13. Hyperspectral Imaging Using Intracellular Spies: Quantitative Real-Time Measurement of Intracellular Parameters In Vivo during Interaction of the Pathogenic Fungus Aspergillus fumigatus with Human Monocytes

    PubMed Central

    Mohebbi, Sara; Erfurth, Florian; Hennersdorf, Philipp; Brakhage, Axel A.; Saluz, Hans Peter

    2016-01-01

    Hyperspectral imaging (HSI) is a technique combining classical spectroscopy with conventional digital image processing. It is well suited to biological assays and quantitative real-time analysis, since it provides both spectral and spatial data, recording the entire spectrum in each pixel of the whole image. We applied HSI to quantify pH variation in a single infected apoptotic monocyte as a model system. Previously, we showed that conidia of the human-pathogenic fungus Aspergillus fumigatus interfere with the acidification of phagolysosomes. Here, we extended this finding to monocytes and obtained a more detailed analysis of this process. Our data indicate that melanised A. fumigatus conidia have the ability to interfere with apoptosis in human monocytes, as they enable the apoptotic cell to recover from mitochondrial acidification and to continue with the cell cycle. We also showed that this ability of A. fumigatus depends on the presence of melanin, since a non-pigmented mutant did not stop the progression of apoptosis and, consequently, the cell did not recover from the acidic pH. Using HSI, we measured the intracellular pH in an apoptotic infected human monocyte and followed the pattern of pH variation over 35 h of measurements. In conclusion, we showed the importance of melanin in determining the fate of intracellular pH in a single apoptotic cell. PMID:27727286

  14. Rapid recovery from aphasia after infarction of Wernicke's area.

    PubMed

    Yagata, Stephanie A; Yen, Melodie; McCarron, Angelica; Bautista, Alexa; Lamair-Orosco, Genevieve; Wilson, Stephen M

    2017-01-01

    Aphasia following infarction of Wernicke's area typically resolves to some extent over time. The nature of this recovery process and its time course have not been characterized in detail, especially in the acute/subacute period. The goal of this study was to document recovery after infarction of Wernicke's area in detail in the first 3 months after stroke. Specifically, we aimed to address two questions about language recovery. First, which impaired language domains improve over time, and which do not? Second, what is the time course of recovery? We used quantitative analysis of connected speech and a brief aphasia battery to document language recovery in two individuals with aphasia following infarction of the posterior superior temporal gyrus. Speech samples were acquired daily between 2 and 16 days post stroke, and also at 1 month and 3 months. Speech samples were transcribed and coded using the CHAT system, in order to quantify multiple language domains. A brief aphasia battery was also administered at a subset of five time points during the 3 months. Both patients showed substantial recovery of language function over this time period. Most, but not all, language domains showed improvements, including fluency, lexical access, phonological retrieval and encoding, and syntactic complexity. The time course of recovery was logarithmic, with the greatest gains taking place early in the course of recovery. There is considerable potential for amelioration of language deficits when damage is relatively circumscribed to the posterior superior temporal gyrus. Quantitative analysis of connected speech samples proved to be an effective, albeit time-consuming, approach to tracking day-by-day recovery in the acute/subacute post-stroke period.
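
A logarithmic time course of the kind reported here can be checked by regressing language scores against ln(days post stroke), since score(t) = a + b·ln(t) is linear in ln(t). A minimal sketch (the data below are synthetic, not the patients'):

```python
import numpy as np

def fit_log_recovery(days, scores):
    """Fit score(t) = a + b * ln(t) by least squares; returns (a, b)."""
    b, a = np.polyfit(np.log(days), scores, 1)  # slope first, then intercept
    return a, b
```

A good fit with b > 0 captures the pattern described: the greatest gains early, with progressively smaller improvements at 1 and 3 months.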

  15. Mapping montane vegetation in Southern California from color infrared imagery

    NASA Technical Reports Server (NTRS)

    Minnich, R. A.; Bowden, L. W.; Pease, R. W.

    1969-01-01

    Mapping a large area of California, the San Bernardino Mountains, demonstrated that color infrared photography is suitable for detailed mapping and offers potential for quantitative mapping. The level of information presented is comparable or superior to the most detailed mapping by ground survey.

  16. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    NASA Astrophysics Data System (ADS)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

    In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines microanalytical techniques that allow the analysis of major and minor elements at high spatial resolution (e.g., electron microbeam analysis) with 2D mapping of samples, providing unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large-area X-ray element maps obtained by energy-dispersive X-ray spectrometry (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used not only to accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, to enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps, and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
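
The end product of such a workflow, per-pixel phase assignment from quantified element maps followed by modal abundances (area fractions), can be sketched in a few lines of numpy. The phase names and thresholds below are purely illustrative placeholders, not Quack's actual classification logic:

```python
import numpy as np

def mineral_modes(ca_map, mg_map, si_map):
    """Toy phase classification from co-registered element maps.

    Each map is a 2D array of normalized element concentrations per pixel.
    Returns a dict of modal (area) fractions summing to 1.
    """
    # Illustrative, hypothetical thresholds -- real criteria would come
    # from the quantified compositions of the phases present.
    olivine = (mg_map > 0.5) & (si_map > 0.2)
    plagioclase = (ca_map > 0.5) & ~olivine
    n = ca_map.size
    modes = {
        "olivine": olivine.sum() / n,
        "plagioclase": plagioclase.sum() / n,
    }
    modes["other"] = 1.0 - modes["olivine"] - modes["plagioclase"]
    return modes
```

The same per-pixel masks also let compositional histograms be accumulated phase by phase, which is the "compositional distribution" part of the QACD output.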

  17. LIBS: a potential tool for industrial/agricultural waste water analysis

    NASA Astrophysics Data System (ADS)

    Karpate, Tanvi; K. M., Muhammed Shameem; Nayak, Rajesh; V. K., Unnikrishnan; Santhosh, C.

    2016-04-01

    Laser Induced Breakdown Spectroscopy (LIBS) is a multi-elemental analysis technique with various advantages, including the ability to detect any element in real time. The technique holds potential for environmental monitoring, and successful analyses of soil, glass, paint, water, plastic and other matrices confirm its robustness for such applications. Compared to currently available water quality monitoring methods and techniques, LIBS has several advantages: no need for sample preparation, fast and easy operation, and a chemical-free process. In LIBS, a powerful pulsed laser generates a plasma, which is then analyzed to obtain quantitative and qualitative details of the elements present in the sample. Another main advantage of the LIBS technique is that it can operate in standoff mode for real-time analysis. Water samples from industry and agriculture tend to contain many pollutants, making them harmful for consumption. The emphasis of this project is to determine such harmful pollutants present in trace amounts in industrial and agricultural wastewater. When a high-intensity laser is incident on the sample, a plasma is generated which gives multielemental emission spectra. LIBS analysis has shown outstanding success for solid samples. For liquid samples the analysis is challenging, as the liquid may splash under the high laser energy, making it difficult to generate a plasma. This project also deals with determining the most efficient method for testing water samples for qualitative as well as quantitative analysis using LIBS.

  18. A correlative and quantitative imaging approach enabling characterization of primary cell-cell communication: Case of human CD4+ T cell-macrophage immunological synapses.

    PubMed

    Kasprowicz, Richard; Rand, Emma; O'Toole, Peter J; Signoret, Nathalie

    2018-05-22

    Cell-to-cell communication engages signaling and spatiotemporal reorganization events driven by highly context-dependent and dynamic intercellular interactions, which are difficult to capture within heterogeneous primary cell cultures. Here, we present a straightforward correlative imaging approach utilizing commonly available instrumentation to sample large numbers of cell-cell interaction events, allowing qualitative and quantitative characterization of rare functioning cell-conjugates based on calcium signals. We applied this approach to examine a previously uncharacterized immunological synapse, investigating autologous human blood CD4+ T cells and monocyte-derived macrophages (MDMs) forming functional conjugates in vitro. Populations of signaling conjugates were visualized, tracked and analyzed by combining live imaging, calcium recording and multivariate statistical analysis. Correlative immunofluorescence was added to quantify endogenous molecular recruitments at the cell-cell junction. By analyzing a large number of rare conjugates, we were able to define calcium signatures associated with different states of CD4+ T cell-MDM interactions. Quantitative image analysis of immunostained conjugates detected the propensity of endogenous T cell surface markers and intracellular organelles to polarize towards cell-cell junctions with high and sustained calcium signaling profiles, hence defining immunological synapses. Overall, we developed a broadly applicable approach enabling detailed single cell- and population-based investigations of rare cell-cell communication events with primary cells.

  19. TECHNOLOGICAL INNOVATION IN NEUROSURGERY: A QUANTITATIVE STUDY

    PubMed Central

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-01-01

    Object Technological innovation within healthcare may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technologically intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical technique. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation, respectively. Methods A patent database was searched between 1960 and 2010 using the search terms “neurosurgeon” OR “neurosurgical” OR “neurosurgery”. The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top performing technology cluster was then selected as an exemplar for more detailed analysis of individual patents. Results In all, 11,672 patents and 208,203 publications relating to neurosurgery were identified. The top performing technology clusters over the 50 years were: image guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes and endoscopes. Image guidance and neuromodulation devices demonstrated a highly correlated rapid rise in patents and publications, suggesting they are areas of technology expansion. In-depth analysis of neuromodulation patents revealed that the majority of high performing patents were related to deep brain stimulation (DBS). Conclusions Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery. PMID:25699414

  20. Associative image analysis: a method for automated quantification of 3D multi-parameter images of brain tissue

    PubMed Central

    Bjornsson, Christopher S; Lin, Gang; Al-Kofahi, Yousef; Narayanaswamy, Arunachalam; Smith, Karen L; Shain, William; Roysam, Badrinath

    2009-01-01

    Brain structural complexity has confounded prior efforts to extract quantitative image-based measurements. We present a systematic ‘divide and conquer’ methodology for analyzing three-dimensional (3D) multi-parameter images of brain tissue to delineate and classify key structures, and compute quantitative associations among them. To demonstrate the method, thick (~100 μm) slices of rat brain tissue were labeled using 3–5 fluorescent signals, and imaged using spectral confocal microscopy and unmixing algorithms. Automated 3D segmentation and tracing algorithms were used to delineate cell nuclei, vasculature, and cell processes. From these segmentations, a set of 23 intrinsic and 8 associative image-based measurements was computed for each cell. These features were used to classify astrocytes, microglia, neurons, and endothelial cells. Associations among cells and between cells and vasculature were computed and represented as graphical networks to enable further analysis. The automated results were validated using a graphical interface that permits investigator inspection and corrective editing of each cell in 3D. Nuclear counting accuracy was >89%, and cell classification accuracy ranged from 81% to 92% depending on cell type. We present a software system named FARSIGHT implementing our methodology. Its output is a detailed XML file containing measurements that may be used for diverse quantitative hypothesis-driven and exploratory studies of the central nervous system. PMID:18294697

  1. Quantitative characterization of material surface — Application to Ni + Mo electrolytic composite coatings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kubisztal, J., E-mail: julian.kubisztal@us.edu.pl

    A new approach to numerical analysis of maps of material surfaces has been proposed and discussed in detail. It was concluded that the roughness factor RF and the root-mean-square roughness S{sub q} show a saturation effect with increasing size of the analysed maps, which allows the optimal map dimension representative of the examined material to be determined. A quantitative method of determining the predominant direction of the surface texture, based on the power spectral density function, is also proposed and discussed. The elaborated method was applied in surface analysis of Ni + Mo composite coatings. It was shown that co-deposition of molybdenum particles in the nickel matrix leads to an increase in surface roughness. In addition, a decrease in the size of the embedded Mo particles in the Ni matrix causes an increase in both the surface roughness and the surface texture. It was also found that the relation between the roughness factor and the double-layer capacitance C{sub dl} of the studied coatings is linear and allows the double-layer capacitance of the smooth nickel electrode to be determined. - Highlights: •Optimization of the procedure for the scanning of the material surface •Quantitative determination of the surface roughness and texture intensity •Proposition of the parameter describing the privileged direction of the surface texture •Determination of the double layer capacitance of the smooth electrode.

  2. Landslide inventories: The essential part of seismic landslide hazard analyses

    USGS Publications Warehouse

    Harp, E.L.; Keefer, D.K.; Sato, H.P.; Yagi, H.

    2011-01-01

    A detailed and accurate landslide inventory is an essential part of seismic landslide hazard analysis. An ideal inventory would cover the entire area affected by an earthquake and include all of the landslides that are possible to detect, down to sizes of 1-5 m in length. The landslides must also be located accurately and mapped as polygons depicting their true shapes. Such mapped landslide distributions can then be used to perform seismic landslide hazard analysis and other quantitative analyses. Detailed inventory maps of landslides triggered by earthquakes began in the early 1960s with the use of aerial photography. In recent years, advances in technology have resulted in the accessibility of satellite imagery with sufficiently high resolution to identify and map all but the smallest of landslides triggered by a seismic event. With this ability to view any area of the globe, we can acquire imagery for any earthquake that triggers significant numbers of landslides. However, a common problem of incomplete coverage of the full distributions of landslides has emerged along with the advent of high-resolution satellite imagery.

  3. Analysis of bias voltage dependent spectral response in Ga0.51In0.49P/Ga0.99In0.01As/Ge triple junction solar cell

    NASA Astrophysics Data System (ADS)

    Sogabe, Tomah; Ogura, Akio; Okada, Yoshitaka

    2014-02-01

    Spectral response measurement plays a great role in characterizing solar cell devices because it directly reflects the efficiency with which the device converts sunlight into an electrical current. Based on the spectral response results, the short circuit current of each subcell can be quantitatively determined. Although the spectral response dependence on wavelength, i.e., the well-known external quantum efficiency (EQE), has been widely used in characterizing multijunction solar cells and has been well interpreted, a detailed analysis of the spectral response dependence on bias voltage (SR-Vbias) has not been reported so far. In this work, we have performed experimental and numerical studies of the SR-Vbias for a Ga0.51In0.49P/Ga0.99In0.01As/Ge triple junction solar cell. A phenomenological description was given to clarify the mechanism of the variation of the current-matching operating point in SR-Vbias measurements. The profile of the SR-Vbias curve was explained in detail by solving the coupled two-diode current-voltage transcendental equations for each subcell.

  4. Donkey's milk detailed lipid composition.

    PubMed

    Gastaldi, Daniela; Bertino, Enrico; Monti, Giovanna; Baro, Cristina; Fabris, Claudio; Lezo, Antonela; Medana, Claudio; Baiocchi, Claudio; Mussap, Michele; Galvano, Fabio; Conti, Amedeo

    2010-01-01

    Donkey's milk (DM) has recently aroused scientific interest, above all among paediatric allergologists. A deeper knowledge of both the proteins and the fats in donkey's milk is necessary to evaluate its immunological, physiological and nutritional properties. Using the most refined techniques for fatty acid analysis, the paper offers a detailed comparative analysis of the lipid fractions of DM as well as of human and cow milk, also indicating the distribution of fatty-acid moieties among the sn-1/3 and sn-2 positions of the glycerol backbone. In DM the position of fatty acids on the glycerol backbone, above all of long-chain saturated fatty acids, is very similar to that of human milk: this fact, in conjunction with the relatively high content of medium-chain triglycerides, makes the lipids in DM, though quantitatively reduced, highly bioavailable. The high PUFA n-3 content of donkey's milk, and especially its low n-6/n-3 ratio, acquires particular interest for subjects affected by cow's milk protein allergy. Whole DM might also constitute the basis for formulas suitable for subjects in the first year of life.

  5. Detailed Quantitative Classifications of Galaxy Morphology

    NASA Astrophysics Data System (ADS)

    Nair, Preethi

    2018-01-01

    Understanding the physical processes responsible for the growth of galaxies is one of the key challenges in extragalactic astronomy. The assembly history of a galaxy is imprinted in a galaxy’s detailed morphology. The bulge-to-total ratio of galaxies and the presence or absence of bars, rings, spiral arms, tidal tails, etc., all have implications for the past merger, star formation, and feedback history of a galaxy. However, current quantitative galaxy classification schemes are only useful for broad binning. They cannot classify or exploit the wide variety of galaxy structures seen in nature. Therefore, comparisons of observations with theoretical predictions of secular structure formation have only been conducted on small samples of visually classified galaxies. However, large samples are needed to disentangle the complex physical processes of galaxy formation. With the advent of large surveys, like the Sloan Digital Sky Survey (SDSS) and the upcoming Large Synoptic Survey Telescope (LSST) and WFIRST, the problem of statistics will be resolved. However, the need for a robust quantitative classification scheme will still remain. Here I will present early results on promising machine learning algorithms that are providing detailed classifications, identifying bars, rings, multi-armed spiral galaxies, and Hubble type.

  6. Animation-Based Learning in Geology: Impact of Animations Coupled with Seductive Details

    ERIC Educational Resources Information Center

    Clayton, Rodney L.

    2016-01-01

    Research is not clear on how to address the difficulty students have conceptualizing geologic processes and phenomena. This study investigated how animations coupled with seductive details affect learners' situational interest and emotions. A quantitative quasi-experimental study employing an independent-measures factorial design was used. The…

  7. Estimating tree heights from shadows on vertical aerial photographs

    Treesearch

    Earl J. Rogers

    1947-01-01

    Aerial photographs are now being applied more and more to practical forestry - especially to forest survey. Many forest characteristics can be recognized on aerial photographs in greater detail than is possible through ground methods alone. The basic need is for tools and methods for interpreting the detail in quantitative terms.

  8. Combining conversation analysis and event sequencing to study health communication.

    PubMed

    Pecanac, Kristen E

    2018-06-01

    Good communication is essential in patient-centered care. The purpose of this paper is to describe conversation analysis and event sequencing and explain how integrating these methods strengthened the analysis in a study of communication between clinicians and surrogate decision makers in an intensive care unit. Conversation analysis was first used to determine how clinicians introduced the need for decision-making regarding life-sustaining treatment and how surrogate decision makers responded. Event sequence analysis was then used to determine the transitional probability (the probability of one event leading to another in the interaction) that a given type of clinician introduction would lead to surrogate resistance or alignment. Conversation analysis provides a detailed analysis of the interaction between participants in a conversation. When combined with a quantitative analysis of the patterns of communication in an interaction, these data add information on the communication strategies that produce positive outcomes. Researchers can apply this mixed-methods approach to identify beneficial conversational practices and design interventions to improve health communication. © 2018 Wiley Periodicals, Inc.
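
    A transitional probability of the kind described here is simply the conditional frequency with which one coded event follows another. A minimal sketch, with hypothetical event codes (C = clinician introduction, R = surrogate resistance, A = surrogate alignment) rather than the study's actual coding scheme:

```python
from collections import Counter, defaultdict

def transitional_probabilities(sequence):
    """Estimate P(next event | current event) from an ordered event sequence:
    count(A -> B) divided by count of A as an antecedent."""
    pair_counts = Counter(zip(sequence, sequence[1:]))
    antecedent_counts = Counter(sequence[:-1])
    probs = defaultdict(dict)
    for (a, b), n in pair_counts.items():
        probs[a][b] = n / antecedent_counts[a]
    return probs

# Hypothetical coded exchange: C = clinician introduction,
# R = surrogate resistance, A = surrogate alignment.
events = ["C", "R", "C", "A", "C", "A", "C", "R", "C", "A"]
p = transitional_probabilities(events)
```

    Here three of the five clinician introductions are followed by alignment, so p["C"]["A"] is 0.6 and p["C"]["R"] is 0.4.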

  9. Micro-Macro Analysis of Complex Networks

    PubMed Central

    Marchiori, Massimo; Possamai, Lino

    2015-01-01

    Complex systems have attracted considerable interest because of their wide range of applications, and are often studied via a “classic” approach: study a specific system, find a complex network behind it, and analyze the corresponding properties. This simple methodology has produced a great deal of interesting results, but relies on an often implicit underlying assumption: the level of detail on which the system is observed. However, in many situations, physical or abstract, the level of detail can be one out of many, and might also depend on intrinsic limitations in viewing the data with a different level of abstraction or precision. So, a fundamental question arises: do properties of a network depend on its level of observability, or are they invariant? If there is a dependence, then an apparently correct network modeling could in fact just be a bad approximation of the true behavior of a complex system. In order to answer this question, we propose a novel micro-macro analysis of complex systems that quantitatively describes how the structure of complex networks varies as a function of the detail level. To this end, we have developed a new telescopic algorithm that abstracts from the local properties of a system and reconstructs the original structure according to a fuzziness level. This way we can study what happens when passing from a fine level of detail (“micro”) to a different scale level (“macro”), and analyze the corresponding behavior in this transition, obtaining a deeper spectrum analysis. The obtained results show that many important properties are not universally invariant with respect to the level of detail, but instead strongly depend on the specific level on which a network is observed. Therefore, caution should be taken in every situation where a complex network is considered, if its context allows for different levels of observability. PMID:25635812
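
    The core claim, that network properties can depend on the level of observability, can be demonstrated with a toy coarse-graining sketch. The integer-bucketing merge rule below is a stand-in assumption, not the authors' telescopic algorithm; it merely shows a simple property (edge density) changing as nodes are merged into super-nodes.

```python
def coarse_grain(edges, level):
    """Merge node ids into super-nodes by integer bucketing (a stand-in for
    a proximity-based fuzziness rule) and rebuild the edge set."""
    merged = set()
    for u, v in edges:
        cu, cv = u // level, v // level
        if cu != cv:  # drop edges internal to a super-node
            merged.add((min(cu, cv), max(cu, cv)))
    return merged

def density(edges):
    """Edge density of the graph induced by an edge set."""
    nodes = {n for e in edges for n in e}
    n = len(nodes)
    return 2 * len(edges) / (n * (n - 1)) if n > 1 else 0.0

# A 6-node path graph observed at three levels of detail.
path = {(i, i + 1) for i in range(5)}
densities = {level: density(coarse_grain(path, level)) for level in (1, 2, 3)}
# Density rises as detail is lost: this property is not scale-invariant.
```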

  10. Characterization of the structural details of residual austenite in the weld metal of a 9Cr1MoNbV welded rotor

    NASA Astrophysics Data System (ADS)

    Liu, Xia; Ji, Hui-jun; Liu, Peng; Wang, Peng; Lu, Feng-gui; Gao, Yu-lai

    2014-06-01

    The existence of residual austenite in weld metal plays an important role in determining the properties and dimensional accuracy of welded rotors. An effective corrosive agent and the metallographic etching process were developed to clearly reveal the characteristics of residual austenite in the weld metal of a 9Cr1MoNbV welded rotor. Moreover, the details of the distribution, shape, length, length-to-width ratio, and the content of residual austenite were systematically characterized using the Image-Pro Plus image analysis software. The results revealed that the area fraction of residual austenite was approximately 6.3% in the observed weld seam; the average area, length, and length-to-width ratio of dispersed residual austenite were quantitatively evaluated to be (5.5 ± 0.1) μm2, (5.0 ± 0.1) μm, and (2.2 ± 0.1), respectively. The newly developed corrosive agent and etching method offer an appropriate approach to characterize residual austenite in the weld metal of welded rotors in detail.
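
    The kind of quantification described above (an area fraction plus per-region length-to-width ratios) can be sketched for a binary phase map. This is a generic connected-component measurement over an assumed 0/1 pixel mask, not the Image-Pro Plus implementation, and it uses bounding-box extents as a simple proxy for region length and width.

```python
def label_regions(mask):
    """4-connected component labeling of a binary image (list of 0/1 rows)."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:  # iterative flood fill
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                regions.append(pixels)
    return regions

def area_fraction(mask):
    """Fraction of pixels belonging to the phase of interest."""
    total = sum(len(row) for row in mask)
    return sum(v for row in mask for v in row) / total

def aspect_ratio(pixels):
    """Bounding-box length-to-width ratio of one region."""
    ys = [p[0] for p in pixels]
    xs = [p[1] for p in pixels]
    h, w = max(ys) - min(ys) + 1, max(xs) - min(xs) + 1
    return max(h, w) / min(h, w)

# Tiny illustrative mask: one 1x3 region and one isolated pixel.
mask = [[1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 0, 1, 0]]
frac = area_fraction(mask)
ratios = sorted(aspect_ratio(r) for r in label_regions(mask))
```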

  11. A model of blood-ammonia homeostasis based on a quantitative analysis of nitrogen metabolism in the multiple organs involved in the production, catabolism, and excretion of ammonia in humans.

    PubMed

    Levitt, David G; Levitt, Michael D

    2018-01-01

    Increased blood ammonia (NH3) is an important causative factor in hepatic encephalopathy, and clinical treatment of hepatic encephalopathy is focused on lowering NH3. Ammonia is a central element in intraorgan nitrogen (N) transport, and modeling the factors that determine blood-NH3 concentration is complicated by the need to account for a variety of reactions carried out in multiple organs. This review presents a detailed quantitative analysis of the major factors determining blood-NH3 homeostasis - the N metabolism of urea, NH3, and amino acids by the liver, gastrointestinal system, muscle, kidney, and brain - with the ultimate goal of creating a model that allows for prediction of blood-NH3 concentration. Although enormous amounts of NH3 are produced during normal liver amino-acid metabolism, this NH3 is completely captured by the urea cycle and does not contribute to blood NH3. While some systemic NH3 derives from renal and muscle metabolism, the primary site of blood-NH3 production is the gastrointestinal tract, as evidenced by portal vein-NH3 concentrations that are about three times that of systemic blood. Three mechanisms, in order of quantitative importance, release NH3 in the gut: 1) hydrolysis of urea by bacterial urease, 2) bacterial protein deamination, and 3) intestinal mucosal glutamine metabolism. Although the colon is conventionally assumed to be the major site of gut-NH3 production, evidence is reviewed indicating that the stomach (via Helicobacter pylori metabolism) and small intestine may be of greater importance. In healthy subjects, most of this gut NH3 is removed by the liver before reaching the systemic circulation. Using a quantitative model, loss of this "first-pass metabolism" due to portal collateral circulation can account for the hyperammonemia observed in chronic liver disease, and there is usually no need to implicate hepatocyte malfunction. In contrast, in acute hepatic necrosis, hyperammonemia results from damaged hepatocytes. Although muscle-NH3 uptake is normally negligible, it can become important in severe hyperammonemia. The NH3-lowering actions of intestinal antibiotics (rifaximin) and lactulose are discussed in detail, with particular emphasis on the seeming lack of importance of the frequently emphasized acidifying action of lactulose in the colon.
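
    The first-pass argument can be reduced to a toy mass balance: NH3 reaching systemic blood is gut production split between a shunted fraction that bypasses the liver and a hepatic fraction subject to first-pass extraction. The extraction and shunt values below are illustrative assumptions, not figures from the review.

```python
def systemic_ammonia_delivery(gut_production, extraction, shunt_fraction):
    """Toy mass balance for NH3 reaching systemic blood per unit time.
    A fraction `shunt_fraction` of portal flow bypasses the liver entirely;
    the remainder undergoes first-pass hepatic extraction `extraction`."""
    through_liver = (1 - shunt_fraction) * gut_production * (1 - extraction)
    bypassing = shunt_fraction * gut_production
    return through_liver + bypassing

# Illustrative values (not from the source): 90% hepatic extraction,
# comparing no shunting with 50% portosystemic shunting and the same
# intact hepatocytes -- hyperammonemia without hepatocyte malfunction.
healthy = systemic_ammonia_delivery(100.0, 0.9, 0.0)   # 10.0 units reach blood
shunted = systemic_ammonia_delivery(100.0, 0.9, 0.5)   # 55.0 units reach blood
```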

  12. Jahn-Teller versus quantum effects in the spin-orbital material LuVO3

    DOE PAGES

    Skoulatos, M.; Toth, S.; Roessli, B.; ...

    2015-04-13

    In this article, we report on combined neutron and resonant x-ray scattering results, identifying the nature of the spin-orbital ground state and magnetic excitations in LuVO3 as driven by the orbital parameter. In particular, we distinguish between models based on orbital-Peierls dimerization, taken as a signature of quantum effects in orbitals, and Jahn-Teller distortions, in favor of the latter. In order to solve this long-standing puzzle, polarized neutron beams were employed as a prerequisite to solving details of the magnetic structure, which allowed quantitative intensity analysis of extended magnetic-excitation data sets. The results of this detailed study enabled us to draw definite conclusions about the classical versus quantum behavior of orbitals in this system and to discard previous claims about quantum effects dominating the orbital physics of LuVO3 and similar systems.

  13. Quantitative Approach to Collaborative Learning: Performance Prediction, Individual Assessment, and Group Composition

    ERIC Educational Resources Information Center

    Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason

    2016-01-01

    The benefits of collaborative learning, although widely reported, lack the quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impacts on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…

  14. Quantitation of human milk proteins and their glycoforms using multiple reaction monitoring (MRM).

    PubMed

    Huang, Jincui; Kailemia, Muchena J; Goonatilleke, Elisha; Parker, Evan A; Hong, Qiuting; Sabia, Rocchina; Smilowitz, Jennifer T; German, J Bruce; Lebrilla, Carlito B

    2017-01-01

    Human milk plays a substantial role in child growth and development and determines the infant's nutritional and health status. Despite the importance of the proteins and glycoproteins in human milk, very little quantitative information, especially on their site-specific glycosylation, is known. As more functions of milk proteins and other components continue to emerge, fine-detailed quantitative information on them is becoming a key factor in milk research efforts. The present work utilizes a sensitive label-free MRM method to quantify seven milk proteins (α-lactalbumin, lactoferrin, secretory immunoglobulin A, immunoglobulin G, immunoglobulin M, α1-antitrypsin, and lysozyme) using their unique peptides while, at the same time, quantifying their site-specific N-glycosylation relative to the protein abundance. The method is highly reproducible, has a low limit of quantitation, and accounts for differences in glycosylation due to variations in protein amounts. The method described here expands our knowledge about human milk proteins and provides vital details that could be used in monitoring the health of the infant and even the mother. Graphical Abstract: Glycopeptide EICs generated from QQQ.

  15. Quantitation of human milk proteins and their glycoforms using multiple reaction monitoring (MRM)

    PubMed Central

    Huang, Jincui; Kailemia, Muchena J.; Goonatilleke, Elisha; Parker, Evan A.; Hong, Qiuting; Sabia, Rocchina; Smilowitz, Jennifer T.; German, J. Bruce

    2017-01-01

    Human milk plays a substantial role in child growth and development and determines the infant's nutritional and health status. Despite the importance of the proteins and glycoproteins in human milk, very little quantitative information, especially on their site-specific glycosylation, is known. As more functions of milk proteins and other components continue to emerge, fine-detailed quantitative information on them is becoming a key factor in milk research efforts. The present work utilizes a sensitive label-free MRM method to quantify seven milk proteins (α-lactalbumin, lactoferrin, secretory immunoglobulin A, immunoglobulin G, immunoglobulin M, α1-antitrypsin, and lysozyme) using their unique peptides while, at the same time, quantifying their site-specific N-glycosylation relative to the protein abundance. The method is highly reproducible, has a low limit of quantitation, and accounts for differences in glycosylation due to variations in protein amounts. The method described here expands our knowledge about human milk proteins and provides vital details that could be used in monitoring the health of the infant and even the mother. PMID:27796459

  16. Correlation analysis of the physiological factors controlling fundamental voice frequency.

    PubMed

    Atkinson, J E

    1978-01-01

    A technique has been developed to obtain a quantitative measure of correlation between electromyographic (EMG) activity of various laryngeal muscles, subglottal air pressure, and the fundamental frequency of vibration of the vocal folds (Fo). Data were collected and analyzed on one subject, a native speaker of American English. The results show that an analysis of this type can provide a useful measure of correlation between the physiological and acoustical events in speech and, furthermore, can yield detailed insights into the organization and nature of the speech production process. In particular, based on these results, a model is suggested of Fo control involving laryngeal state functions that seems to agree with present knowledge of laryngeal control and experimental evidence.
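
    The correlation measure at the heart of such an analysis can be sketched with a plain Pearson coefficient between two smoothed traces. The EMG and Fo values below are hypothetical, and a real analysis would also need time alignment between the signals.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical smoothed traces: laryngeal EMG activity vs. an Fo contour (Hz).
emg = [0.1, 0.3, 0.5, 0.7, 0.6, 0.4]
f0 = [110, 118, 127, 136, 132, 121]
r = pearson_r(emg, f0)  # close to +1 for these co-varying traces
```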

  17. Detection of multiple chemicals based on external cavity quantum cascade laser spectroscopy

    NASA Astrophysics Data System (ADS)

    Sun, Juan; Ding, Junya; Liu, Ningwu; Yang, Guangxiang; Li, Jingsong

    2018-02-01

    A laser spectroscopy system based on a broadband tunable external cavity quantum cascade laser (ECQCL) and a mini quartz crystal tuning fork (QCTF) detector was developed for standoff detection of volatile organic compounds (VOCs). A self-established spectral analysis model based on multiple algorithms for quantitative and qualitative analysis of VOC components (i.e. ethanol and acetone) was investigated in detail in both closed-cell and open-path configurations. A good agreement was obtained between the experimentally observed spectra and the standard reference spectra. For open-path detection of VOCs, the sensor system was demonstrated at a distance of 30 m. The preliminary laboratory results show that standoff detection of VOCs at distances of over 100 m is very promising.

  18. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    Ten urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features which often go undetected due to limitations in the particular analytical procedure being used have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt as to the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed, and the multiple technique approach has been evaluated as a whole.

  19. Research progress in Asia on methods of processing laser-induced breakdown spectroscopy data

    NASA Astrophysics Data System (ADS)

    Guo, Yang-Min; Guo, Lian-Bo; Li, Jia-Ming; Liu, Hong-Di; Zhu, Zhi-Hao; Li, Xiang-You; Lu, Yong-Feng; Zeng, Xiao-Yan

    2016-10-01

    Laser-induced breakdown spectroscopy (LIBS) has attracted much attention in terms of both scientific research and industrial application. An important branch of LIBS research in Asia, the development of data processing methods for LIBS, is reviewed. First, the basic principle of LIBS and the characteristics of spectral data are briefly introduced. Next, two aspects of research on and problems with data processing methods are described: i) the basic principles of data preprocessing methods are elaborated in detail on the basis of the characteristics of spectral data; ii) the performance of data analysis methods in qualitative and quantitative analysis of LIBS is described. Finally, a direction for future development of data processing methods for LIBS is also proposed.

  20. Experimental analysis of bruises in human volunteers using radiometric depth profiling and diffuse reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Vidovič, Luka; Milanič, Matija; Majaron, Boris

    2015-07-01

    We combine pulsed photothermal radiometry (PPTR) depth profiling with diffuse reflectance spectroscopy (DRS) measurements for a comprehensive analysis of bruise evolution in vivo. While PPTR enables extraction of detailed depth distribution and concentration profiles of selected absorbers (e.g. melanin, hemoglobin), DRS provides information in a wide range of visible wavelengths and thus offers an additional insight into dynamics of the hemoglobin degradation products. Combining the two approaches enables us to quantitatively characterize bruise evolution dynamics. Our results indicate temporal variations of the bruise evolution parameters in the course of bruise self-healing process. The obtained parameter values and trends represent a basis for a future development of an objective technique for bruise age determination.

  1. Coordinated Noninvasive Studies (CNS) Project

    NASA Astrophysics Data System (ADS)

    Lauter, Judith

    1988-11-01

    Research activities during this period include: data collection related to the interface between complex-sound production and perception, specifically studies on speech acoustics, including two experiments on voice-onset-time variability in productions by speakers of several languages and a series on acoustical characteristics of emotional expression; data collection regarding individual differences in the effect of stimulus characteristics on relative ear advantages; continuing data analysis and new collections documenting individual differences in auditory evoked potentials, with details related to auditory-system asymmetries; preliminary tests regarding the match between behavioral measures of relative ear advantages and quantitative-electroencephalographic asymmetries observed during auditory stimulation; and pilot testing using a combination of nuclear magnetic resonance (NMR) anatomical-imaging and chemical-spectral-analysis capabilities to study physiological activation in the human brain.

  2. NOTE: An innovative phantom for quantitative and qualitative investigation of advanced x-ray imaging technologies

    NASA Astrophysics Data System (ADS)

    Chiarot, C. B.; Siewerdsen, J. H.; Haycocks, T.; Moseley, D. J.; Jaffray, D. A.

    2005-11-01

    Development, characterization, and quality assurance of advanced x-ray imaging technologies require phantoms that are quantitative and well suited to such modalities. This note reports on the design, construction, and use of an innovative phantom developed for advanced imaging technologies (e.g., multi-detector CT and the numerous applications of flat-panel detectors in dual-energy imaging, tomosynthesis, and cone-beam CT) in diagnostic and image-guided procedures. The design addresses shortcomings of existing phantoms by incorporating criteria satisfied by no other single phantom: (1) inserts are fully 3D—spherically symmetric rather than cylindrical; (2) modules are quantitative, presenting objects of known size and contrast for quality assurance and image quality investigation; (3) features are incorporated in ideal and semi-realistic (anthropomorphic) contexts; and (4) the phantom allows devices to be inserted and manipulated in an accessible module (right lung). The phantom consists of five primary modules: (1) head, featuring contrast-detail spheres approximating brain lesions; (2) left lung, featuring contrast-detail spheres approximating lung nodules; (3) right lung, an accessible hull in which devices may be placed and manipulated; (4) liver, featuring contrast-detail spheres approximating metastases; and (5) abdomen/pelvis, featuring simulated kidneys, colon, rectum, bladder, and prostate. The phantom represents a two-fold evolution in design philosophy—from 2D (cylindrically symmetric) to fully 3D, and from exclusively qualitative or quantitative to a design accommodating quantitative study within an anatomical context. It has proven a valuable tool in investigations throughout our institution, including low-dose CT, dual-energy radiography, and cone-beam CT for image-guided radiation therapy and surgery.

  3. A simple estimate of funneling-assisted charge collection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edmonds, L.D.

    In this paper, funneling is discussed qualitatively in detail, and a quantitative analysis is given for the total (time-integrated) collected charge. It is shown that for an n+/p junction, the total collected charge Q_T is given by Q_T = (1 + μn/μp) Q_D + 2 Q_diff, where Q_D is the charge initially liberated in the depletion region, Q_diff is the charge collected by diffusion, and μn and μp are the electron and hole mobilities. This equation does not apply to very short ion tracks or to devices having a thin epilayer.
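The estimate above can be evaluated numerically as a sanity check; the mobility ratio below uses textbook silicon bulk values, and the two charge magnitudes are purely hypothetical placeholders, not values from the paper.

```python
# Total collected charge per the funneling estimate above:
# Q_T = (1 + mu_n/mu_p) * Q_D + 2 * Q_diff, for an n+/p junction.
mu_n_over_mu_p = 1350.0 / 480.0  # approximate Si bulk electron/hole mobilities
q_dep = 0.10    # pC, charge liberated in the depletion region (hypothetical)
q_diff = 0.05   # pC, charge collected by diffusion (hypothetical)

q_total = (1.0 + mu_n_over_mu_p) * q_dep + 2.0 * q_diff
print(f"Q_T = {q_total:.3f} pC")
```

Because the electron mobility exceeds the hole mobility, the prompt (funneling) component alone is already several times the depletion-region charge.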

  4. Total body calcium analysis. [neutron irradiation

    NASA Technical Reports Server (NTRS)

    Lewellen, T. K.; Nelp, W. B.

    1974-01-01

    A technique to quantitate total body calcium in humans is developed. Total body neutron irradiation is utilized to produce argon-37. The radioargon, which diffuses into the blood stream and is excreted through the lungs, is recovered from the exhaled breath and counted inside a proportional detector. Emphasis is placed on: (1) measurement of the rate of excretion of radioargon following total body neutron irradiation; (2) the development of the radioargon collection, purification, and counting systems; and (3) development of a patient irradiation facility using a 14 MeV neutron generator. Results and applications are discussed in detail.

  5. Problems in the use of interference filters for spectrophotometric determination of total ozone

    NASA Technical Reports Server (NTRS)

    Basher, R. E.; Matthews, W. A.

    1977-01-01

    An analysis of the use of ultraviolet narrow-band interference filters for total ozone determination is given with reference to the New Zealand filter spectrophotometer under the headings of filter monochromaticity, temperature dependence, orientation dependence, aging, and specification tolerances and nonuniformity. Quantitative details of each problem are given, together with the means used to overcome them in the New Zealand instrument. The tuning of the instrument's filter center wavelengths to a common set of values by tilting the filters is also described, along with a simple calibration method used to adjust and set these center wavelengths.

  6. Study on the Application of TOPSIS Method to the Introduction of Foreign Players in CBA Games

    NASA Astrophysics Data System (ADS)

    Zhongyou, Xing

    The TOPSIS method is a multiple-attribute decision-making method. This paper introduces the current situation of the introduction of foreign players in CBA games, presents the principles and calculation steps of the TOPSIS method in detail, and applies it to the quantitative evaluation of the comprehensive competitive ability of the foreign players being introduced. Through analysis of its practical application, we found that the TOPSIS method has relatively high rationality and applicability when used to evaluate the comprehensive competitive ability of foreign players.
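The calculation steps referred to above follow the standard TOPSIS recipe (normalize, weight, find ideal and negative-ideal solutions, rank by relative closeness). A minimal generic sketch, with invented player-statistics criteria and weights for illustration:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with the TOPSIS method.

    matrix  : (m alternatives x n criteria) decision matrix
    weights : criterion weights summing to 1
    benefit : True for benefit criteria, False for cost criteria
    """
    X = np.asarray(matrix, dtype=float)
    # 1. Vector-normalize each criterion column.
    norm = X / np.linalg.norm(X, axis=0)
    # 2. Apply criterion weights.
    V = norm * np.asarray(weights, dtype=float)
    # 3. Ideal and negative-ideal solutions per criterion.
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 4. Euclidean distances to both reference points.
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    # 5. Relative closeness: higher means a better alternative.
    return d_neg / (d_pos + d_neg)

# Hypothetical evaluation of three players on points, rebounds,
# and turnovers per game (turnovers are a cost criterion).
scores = topsis([[25, 10, 3.0],
                 [18, 12, 2.0],
                 [30,  6, 4.5]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, False])
print(scores.argsort()[::-1])  # player indices, best first
```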

  7. Chemical kinetics of homogeneous atmospheric oxidation of sulfur dioxide

    NASA Technical Reports Server (NTRS)

    Sander, S. P.; Seinfeld, J. H.

    1976-01-01

    A systematic evaluation of known homogeneous SO2 reactions which might be important in air pollution chemistry is carried out. A mechanism is developed to represent the chemistry of NOx/hydrocarbon/SO2 systems, and the mechanism is used to analyze available experimental data appropriate for quantitative analysis of SO2 oxidation kinetics. Detailed comparisons of observed and predicted concentration behavior are presented. In all cases, observed SO2 oxidation rates cannot be explained solely on the basis of those SO2 reactions for which rate constants have been measured. The role of ozone-olefin reactions in SO2 oxidation is elucidated.

  8. Mean intensity of the vortex Bessel-Gaussian beam in turbulent atmosphere

    NASA Astrophysics Data System (ADS)

    Lukin, Igor P.

    2017-11-01

    In this work the stability of vortex Bessel-Gaussian optical beams formed in turbulent atmosphere is considered theoretically. The spatial structure of the mean intensity distribution of vortex Bessel-Gaussian optical beams in turbulent atmosphere is analyzed in detail. A quantitative criterion for the possibility of forming vortex Bessel-Gaussian optical beams in turbulent atmosphere is derived. It is shown that the stability of the form of a vortex Bessel-Gaussian optical beam during propagation in turbulent atmosphere increases as the topological charge of the optical beam increases.

  9. Subsonic/transonic stall flutter investigation of a rotating rig

    NASA Technical Reports Server (NTRS)

    Jutras, R. R.; Fost, R. B.; Chi, R. M.; Beacher, B. F.

    1981-01-01

    Stall flutter is investigated by obtaining detailed quantitative steady and unsteady aerodynamic and aeromechanical measurements in a typical fan rotor. The experimental investigation is made with a 31.3 percent scale model of the Quiet Engine Program Fan C rotor system. Both subsonic/transonic (torsional mode) flutter and supersonic (flexural mode) flutter are investigated. Extensive steady and unsteady data on the blade deformations and aerodynamic properties surrounding the rotor are acquired while operating in both the steady and flutter modes. Analysis of these data shows that while there may be more than one traveling wave present during flutter, they are all forward traveling waves.

  10. Cyclic voltammetric analysis of C 1-C 4 alcohol electrooxidations with Pt/C and Pt-Ru/C microporous electrodes

    NASA Astrophysics Data System (ADS)

    Lee, Choong-Gon; Umeda, Minoru; Uchida, Isamu

    The effect of temperature on methanol, ethanol, 2-propanol, and 2-butanol electrooxidation is investigated with Pt/C and Pt-Ru/C microporous electrodes. Cyclic voltammetry is employed at temperatures ranging from 25 to 80 °C to provide quantitative and qualitative information on the kinetics of alcohol oxidation. Methanol displays the greatest activity among the alcohols. The addition of ruthenium reduces the poisoning effect, although it is ineffective with secondary alcohols. Secondary alcohols undergo a different oxidation mechanism at higher temperatures. Microporous electrodes provide detailed information on alcohol oxidation.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sokaras, D.; Andrianis, M.; Lagoyannis, A.

    The cascade L-shell x-ray emission of metallic Fe as incident polarized and unpolarized monochromatic radiation passes over the 1s ionization threshold is investigated by means of moderate-resolution, quantitative x-ray spectrometry. A full ab initio theoretical investigation of the L-shell x-ray emission processes is performed, based on a detailed, straightforward construction of the cascade decay trees within the Pauli-Fock approximation. The agreement obtained between the experiments and the presented theory is discussed with respect to the accuracy of advanced atomic models as well as its significance for the characterization capabilities of x-ray fluorescence (XRF) analysis.

  12. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength dependent variables - even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to the traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of the system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexing imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
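The kind of quantitative evaluation described (accounting simultaneously for source, filters, fluorophore, and detector spectra) can be illustrated with a toy calculation. The Gaussian curves and band-pass shapes below are invented stand-ins, not Semrock data, and the signal model is a much-simplified version of the referenced framework:

```python
import numpy as np

wl = np.arange(400.0, 701.0)  # wavelength grid, 1 nm spacing

def gauss(center, width):
    """Toy normalized spectral curve; a stand-in for measured spectra."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Illustrative system components (all shapes are hypothetical):
source = gauss(470, 15)                          # LED light engine output
ex_filt = (np.abs(wl - 470) < 20).astype(float)  # excitation band-pass
fluor_ex = gauss(490, 25)                        # fluorophore excitation
fluor_em = gauss(520, 25)                        # fluorophore emission
em_filt = (np.abs(wl - 525) < 25).astype(float)  # emission band-pass
qe = np.full_like(wl, 0.7)                       # detector quantum efficiency

# With a uniform 1 nm grid, a plain sum approximates the integral.
excitation = np.sum(source * ex_filt * fluor_ex)  # absorbed excitation light
collection = np.sum(fluor_em * em_filt * qe) / np.sum(fluor_em)
signal = excitation * collection                  # relative detected signal
print(f"relative signal: {signal:.3g}")
```

Swapping in a different source spectrum (e.g. a metal-halide curve for the LED) changes `excitation` directly, which is exactly the kind of source-dependent effect the abstract warns cannot be judged by eye from filter and fluorophore curves alone.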

  13. FLIPPER, a combinatorial probe for correlated live imaging and electron microscopy, allows identification and quantitative analysis of various cells and organelles.

    PubMed

    Kuipers, Jeroen; van Ham, Tjakko J; Kalicharan, Ruby D; Veenstra-Algra, Anneke; Sjollema, Klaas A; Dijk, Freark; Schnell, Ulrike; Giepmans, Ben N G

    2015-04-01

    Ultrastructural examination of cells and tissues by electron microscopy (EM) yields detailed information on subcellular structures. However, EM is typically restricted to small fields of view at high magnification; this makes quantifying events in multiple large-area sample sections extremely difficult. Even when combining light microscopy (LM) with EM (correlated LM and EM: CLEM) to find areas of interest, the labeling of molecules is still a challenge. We present a new genetically encoded probe for CLEM, named "FLIPPER", which facilitates quantitative analysis of ultrastructural features in cells. FLIPPER consists of a fluorescent protein (cyan, green, orange, or red) for LM visualization, fused to a peroxidase allowing visualization of targets at the EM level. The use of FLIPPER is straightforward and because the module is completely genetically encoded, cells can be optimally prepared for EM examination. We use FLIPPER to quantify cellular morphology at the EM level in cells expressing a normal and disease-causing point-mutant cell-surface protein called EpCAM (epithelial cell adhesion molecule). The mutant protein is retained in the endoplasmic reticulum (ER) and could therefore alter ER function and morphology. To reveal possible ER alterations, cells were co-transfected with color-coded full-length or mutant EpCAM and a FLIPPER targeted to the ER. CLEM examination of the mixed cell population allowed color-based cell identification, followed by an unbiased quantitative analysis of the ER ultrastructure by EM. Thus, FLIPPER combines bright fluorescent proteins optimized for live imaging with high sensitivity for EM labeling, thereby representing a promising tool for CLEM.

  14. Clustering and Network Analysis of Reverse Phase Protein Array Data.

    PubMed

    Byron, Adam

    2017-01-01

    Molecular profiling of proteins and phosphoproteins using a reverse phase protein array (RPPA) platform, with a panel of target-specific antibodies, enables the parallel, quantitative proteomic analysis of many biological samples in a microarray format. Hence, RPPA analysis can generate a high volume of multidimensional data that must be effectively interrogated and interpreted. A range of computational techniques for data mining can be applied to detect and explore data structure and to form functional predictions from large datasets. Here, two approaches for the computational analysis of RPPA data are detailed: the identification of similar patterns of protein expression by hierarchical cluster analysis and the modeling of protein interactions and signaling relationships by network analysis. The protocols use freely available, cross-platform software, are easy to implement, and do not require any programming expertise. Serving as data-driven starting points for further in-depth analysis, validation, and biological experimentation, these and related bioinformatic approaches can accelerate the functional interpretation of RPPA data.
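The hierarchical clustering step described above can be sketched in a few lines with SciPy; the matrix below is synthetic, with two planted sample groups standing in for real RPPA intensities:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Hypothetical RPPA data: 12 samples x 8 antibody targets,
# log2-normalized intensities with two planted sample groups
# (group B is shifted up on the first four targets).
group_a = rng.normal(0.0, 0.3, size=(6, 8))
group_b = rng.normal(0.0, 0.3, size=(6, 8)) + np.array([2.0] * 4 + [0.0] * 4)
data = np.vstack([group_a, group_b])

# Ward linkage on Euclidean distances, then cut the tree at two clusters.
tree = linkage(data, method="ward")
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)
```

In practice the linkage method and distance metric (e.g. correlation distance for expression profiles) are analysis choices that should be reported alongside the dendrogram.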

  15. Data Generated by Quantitative Liquid Chromatography-Mass Spectrometry Proteomics Are Only the Start and Not the Endpoint: Optimization of Quantitative Concatemer-Based Measurement of Hepatic Uridine-5'-Diphosphate-Glucuronosyltransferase Enzymes with Reference to Catalytic Activity.

    PubMed

    Achour, Brahim; Dantonio, Alyssa; Niosi, Mark; Novak, Jonathan J; Al-Majdoub, Zubida M; Goosen, Theunis C; Rostami-Hodjegan, Amin; Barber, Jill

    2018-06-01

    Quantitative proteomic methods require optimization at several stages, including sample preparation, liquid chromatography-tandem mass spectrometry (LC-MS/MS), and data analysis, with the final analysis stage being less widely appreciated by end-users. Previously reported measurements of eight uridine-5'-diphospho-glucuronosyltransferases (UGTs) generated by two laboratories [using stable isotope-labeled (SIL) peptides or quantitative concatemer (QconCAT)] reflected significant disparity between proteomic methods. Initial analysis of QconCAT data showed a lack of correlation with catalytic activity for several UGTs (1A4, 1A6, 1A9, 2B15) and moderate correlations for UGTs 1A1, 1A3, and 2B7 (Rs = 0.40-0.79, P < 0.05; R2 = 0.30); good correlations were demonstrated between cytochrome P450 activities and abundances measured in the same experiments. Consequently, a systematic review of data analysis, starting from unprocessed LC-MS/MS data, was undertaken with the aim of improving accuracy, defined by correlation against activity. Three main criteria were found to be important: choice of monitored peptides and fragments, correction for isotope-label incorporation, and abundance normalization using fractional protein mass. Upon optimization, abundance-activity correlations improved significantly for six UGTs (Rs = 0.53-0.87, P < 0.01; R2 = 0.48-0.73); UGT1A9 showed moderate correlation (Rs = 0.47, P = 0.02; R2 = 0.34). No spurious abundance-activity relationships were identified. However, methods remained suboptimal for UGT1A3 and UGT1A9; here the hydrophobicity of the standard peptides is believed to be limiting. This commentary provides a detailed data analysis strategy and indicates, using examples, the significance of systematic data processing following acquisition. The proposed strategy offers a significant improvement on existing guidelines applicable to clinically relevant proteins quantified using QconCAT. Copyright © 2018 by The American Society for Pharmacology and Experimental Therapeutics.

  16. Quantitative analysis of nitrogen containing compounds in microalgae based bio-oils using comprehensive two-dimensional gas-chromatography coupled to nitrogen chemiluminescence detector and time of flight mass spectrometer.

    PubMed

    Toraman, Hilal E; Franz, Kristina; Ronsse, Frederik; Van Geem, Kevin M; Marin, Guy B

    2016-08-19

    Insight into the composition of algae-derived bio-oils is crucial for the development of efficient conversion processes and better upgrading strategies for microalgae. Comprehensive two-dimensional gas chromatography (GC×GC) coupled to a nitrogen chemiluminescence detector (NCD) and a time-of-flight mass spectrometer (TOF-MS) makes it possible to obtain the detailed quantitative composition of the nitrogen-containing compounds in the aqueous and organic fractions of fast-pyrolysis bio-oils from microalgae. Normal-phase (apolar×mid-polar) and reverse-phase (polar×apolar) column combinations are investigated to optimize the separation of the detected nitrogen-containing compounds. The reverse-phase column combination gives the most detailed information on the nitrogen-containing compounds. The combined information from GC×GC-TOF-MS (qualitative) and GC×GC-NCD (quantitative), with the use of a well-chosen internal standard (caprolactam), enables the identification and quantification of nitrogen-containing compounds belonging to 13 different classes: amines, imidazoles, amides, imides, nitriles, pyrazines, pyridines, indoles, pyrazoles, pyrimidines, quinolines, pyrimidinediones, and other nitrogen-containing compounds not assigned to a specific class. The aqueous fraction mostly consists of amines (4.0 wt%) and imidazoles (2.8 wt%), corresponding to approximately 80 wt% of the total identified nitrogen-containing compounds. The organic fraction, on the other hand, shows a more diverse distribution of nitrogen-containing compounds, with the majority of the compounds quantified as amides (3.0 wt%), indoles (2.0 wt%), amines (1.7 wt%), and imides (1.3 wt%), corresponding to approximately 65 wt% of the total identified nitrogen-containing compounds. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Characterization of dilute species within CVD-grown silicon nanowires doped using trimethylboron: protected lift-out specimen preparation for atom probe tomography.

    PubMed

    Prosa, T J; Alvis, R; Tsakalakos, L; Smentkowski, V S

    2010-08-01

    Three-dimensional quantitative compositional analysis of nanowires is a challenge for standard techniques such as secondary ion mass spectrometry because of specimen size and geometry considerations; however, it is precisely the size and geometry of nanowires that makes them attractive candidates for analysis via atom probe tomography. The boron composition of various trimethylboron vapour-liquid-solid grown silicon nanowires was measured both with time-of-flight secondary ion mass spectrometry and pulsed-laser atom probe tomography. Both characterization techniques yielded similar results for relative composition. Specialized specimen preparation for pulsed-laser atom probe tomography was utilized and is described in detail, whereby individual silicon nanowires are first protected, then lifted out, trimmed, and finally wet etched to remove the protective layer for subsequent three-dimensional analysis.

  18. Analysis of X-Ray Line Spectra from a Transient Plasma Under Solar Flare Conditions - Part Three - Diagnostics for Measuring Electron Temperature and Density

    NASA Astrophysics Data System (ADS)

    Sylwester, J.; Mewe, R.; Schrijver, J.

    1980-06-01

    In this paper, the third in a series dealing with plasmas out of equilibrium, we present quantitative methods of analysis of non-stationary flare plasma parameters. The method is designed for the interpretation of spectra from the SMM XRP Bent Crystal Spectrometer. Our analysis is based on measurements of 11 specific lines in the 1.77-3.3 Å range. Using the proposed method we are able to derive information about the temperature, density, emission measure, and other related parameters of the flare plasma. It is shown that the measurements to be made by XRP can give detailed information on these parameters and their time evolution. The method is then tested on some artificial flares, and proves to be useful and accurate.

  19. Increased Depth and Breadth of Plasma Protein Quantitation via Two-Dimensional Liquid Chromatography/Multiple Reaction Monitoring-Mass Spectrometry with Labeled Peptide Standards.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Borchers, Christoph H

    2016-01-01

    Absolute quantitative strategies are emerging as a powerful and preferable means of deriving concentrations in biological samples for systems biology applications. Method development is driven by the need to establish new, and validate current, protein biomarkers of high-to-low abundance for clinical utility. In this chapter, we describe a methodology involving two-dimensional (2D) reversed-phase liquid chromatography (RPLC), operated under alkaline and acidic pH conditions, combined with multiple reaction monitoring (MRM)-mass spectrometry (MS) (also called selected reaction monitoring (SRM)-MS) and a complex mixture of stable isotope-labeled standard (SIS) peptides, to quantify a broad and diverse panel of 253 proteins in human blood plasma. The quantitation range spans 8 orders of magnitude, from 15 mg/mL (for vitamin D-binding protein) to 450 pg/mL (for protein S100-B), and includes 31 low-abundance proteins (defined as being <10 ng/mL) of potential disease relevance. The method is designed to assess candidates at the discovery and/or verification phases of the biomarker pipeline and can be adapted to examine smaller or alternate panels of proteins for higher sample throughput. Also detailed here is the application of our recently developed software tool, Qualis-SIS, for protein quantitation (via regression analysis of standard curves) and quality assessment of the resulting data. Overall, this chapter provides a blueprint for the replication of this quantitative proteomic method by proteomic scientists of all skill levels.
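The regression-of-standard-curves step that Qualis-SIS automates can be illustrated with a bare-bones version; the calibration points below are invented, and the real tool performs additional quality assessment and curve diagnostics not shown here:

```python
import numpy as np

# Hypothetical calibration data for one SIS peptide: known spiked
# concentrations (fmol/uL) and measured light/heavy MRM peak-area ratios.
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])
ratio = np.array([0.021, 0.098, 0.205, 1.01, 2.04, 9.9])

# Ordinary least-squares standard curve: ratio = slope * conc + intercept.
slope, intercept = np.polyfit(conc, ratio, 1)

def quantify(measured_ratio):
    """Back-calculate a sample concentration from its light/heavy ratio."""
    return (measured_ratio - intercept) / slope

unknown = quantify(0.5)
print(f"estimated concentration: {unknown:.2f} fmol/uL")
```

In real MRM workflows the fit is often weighted (e.g. 1/x or 1/x²) so that low-concentration calibrants are not swamped by the high end of the curve.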

  20. Teaching bioinformatics and neuroinformatics by using free web-based tools.

    PubMed

    Grisham, William; Schottler, Natalie A; Valli-Marill, Joanne; Beck, Lisa; Beatty, Jackson

    2010-01-01

    This completely computer-based module's purpose is to introduce students to bioinformatics resources. We present an easy-to-adopt module that weaves together several important bioinformatic tools so students can grasp how these tools are used in answering research questions. Students integrate information gathered from websites dealing with anatomy (Mouse Brain Library), quantitative trait locus analysis (WebQTL from GeneNetwork), bioinformatics and gene expression analyses (University of California, Santa Cruz Genome Browser, National Center for Biotechnology Information's Entrez Gene, and the Allen Brain Atlas), and information resources (PubMed). Instructors can use these various websites in concert to teach genetics from the phenotypic level to the molecular level, aspects of neuroanatomy and histology, statistics, quantitative trait locus analysis, and molecular biology (including in situ hybridization and microarray analysis), and to introduce bioinformatic resources. Students use these resources to discover 1) the region(s) of chromosome(s) influencing the phenotypic trait, 2) a list of candidate genes, narrowed by expression data, 3) the in situ pattern of a given gene in the region of interest, 4) the nucleotide sequence of the candidate gene, and 5) articles describing the gene. Teaching materials such as a detailed student/instructor's manual, PowerPoints, sample exams, and links to free Web resources can be found at http://mdcune.psych.ucla.edu/modules/bioinformatics.

  1. Methodology development for quantitative optimization of security enhancement in medical information systems -Case study in a PACS and a multi-institutional radiotherapy database-.

    PubMed

    Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari

    2002-01-01

    The target of our study is to establish a methodology for analyzing the level of security requirements, for finding suitable security measures, and for optimizing the distribution of security across every portion of medical practice. Quantitative expression is introduced wherever possible, for the purpose of easy follow-up of security procedures and easy evaluation of security outcomes or results. Results of system analysis by fault tree analysis (FTA) clarified that subdividing system elements in detail contributes to much more accurate analysis. The subdivided composition factors depended strongly on the behavior of staff, interactive terminal devices, kinds of service, and routes of network. In conclusion, we found methods to analyze the level of security requirements for each medical information system by employing FTA, with basic events for each composition factor and combinations of basic events. Methods for finding suitable security measures were established: risk factors for each basic event, the number of elements for each composition factor, and candidate security-measure elements were identified. A method to optimize the security measures for each medical information system was proposed: the optimum distribution of risk factors in terms of basic events was determined, making comparison between medical information systems possible.
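The FTA-based quantification can be sketched as follows, assuming independent basic events; the event names and probabilities are hypothetical and are not taken from the study:

```python
def or_gate(*p):
    """P(at least one basic event occurs), assuming independence."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """P(all basic events occur), assuming independence."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Hypothetical subtree for a top event "unauthorized record access":
# operator error OR (weak password AND no audit trail).
p_operator_error = 0.02
p_weak_password = 0.10
p_no_audit = 0.05
p_top = or_gate(p_operator_error, and_gate(p_weak_password, p_no_audit))
print(f"top-event probability: {p_top:.4f}")
```

Propagating estimated basic-event probabilities up the tree this way gives the kind of quantitative expression the study calls for, and lets alternative security measures be compared by how much they lower the top-event probability.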

  2. On-line multiple component analysis for efficient quantitative bioprocess development.

    PubMed

    Dietzsch, Christian; Spadiut, Oliver; Herwig, Christoph

    2013-02-20

    On-line monitoring devices for the precise determination of a multitude of components are a prerequisite for fast bioprocess quantification. On-line measured values have to be checked for quality and consistency in order to extract quantitative information from these data. In the present study we characterized a novel on-line sampling and analysis device comprising an automatic photometric robot. We connected this on-line device to a bioreactor and concomitantly measured six components (i.e. glucose, glycerol, ethanol, acetate, phosphate and ammonium) during different batch cultivations of Pichia pastoris. The on-line measured data did not show significant deviations from off-line taken samples and were consequently used for incremental rate and yield calculations. In this respect we highlighted the importance of data quality and discussed the phenomenon of error propagation. On-line calculated rates and yields depicted the physiological responses of the P. pastoris cells in unlimited and limited cultures. A more detailed analysis of the physiological state was possible by considering the off-line determined biomass dry weight and the calculation of specific rates. Here we present a novel device for on-line monitoring of bioprocesses, which ensures high data quality in real-time and therefore represents a valuable tool for Process Analytical Technology (PAT). Copyright © 2012 Elsevier B.V. All rights reserved.
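The incremental rate and specific-rate calculations mentioned above can be sketched as follows; the glucose and biomass time series are invented for illustration:

```python
import numpy as np

# Hypothetical on-line glucose readings (g/L) at 0.5 h intervals,
# plus off-line biomass dry weights (g/L) at the same times.
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
glucose = np.array([10.0, 8.9, 7.6, 6.1, 4.4])
biomass = np.array([1.0, 1.3, 1.7, 2.2, 2.9])

# Incremental volumetric uptake rate r_S = -dS/dt (g/(L*h)),
# evaluated on the interval midpoints.
r_s = -np.diff(glucose) / np.diff(t)
# Specific uptake rate q_S = r_S / X, using mid-interval biomass.
x_mid = (biomass[:-1] + biomass[1:]) / 2
q_s = r_s / x_mid
print(q_s)
```

Because each incremental rate divides a small concentration difference by a short time step, measurement noise is amplified at this stage, which is the error-propagation concern the abstract raises.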

  3. Proteomic characterization of EL4 lymphoma-derived tumors upon chemotherapy treatment reveals potential roles for lysosomes and caspase-6 during tumor cell death in vivo.

    PubMed

    Kramer, David A; Eldeeb, Mohamed A; Wuest, Melinda; Mercer, John; Fahlman, Richard P

    2017-06-01

    The murine lymphoblastic lymphoma cell line (EL4) tumor model is an established in vivo apoptosis model for the investigation of novel cancer imaging agents and immunological treatments due to the rapid and significant response of the EL4 tumors to cyclophosphamide and etoposide combination chemotherapy. Despite the utility of this model system in cancer research, little is known regarding the molecular details of in vivo tumor cell death. Here, we report the first in-depth quantitative proteomic analysis of the changes that occur in these tumors upon cyclophosphamide and etoposide treatment in vivo. Using a label-free quantitative proteomic approach, a total of 5838 proteins were identified in the treated and untreated tumors, of which 875 were determined to change in abundance with statistical significance. Initial analysis of the data reveals changes that might have been predicted, such as the downregulation of ribosomes, demonstrating the robustness of the dataset. Analysis of the dataset also reveals the unexpected downregulation of caspase-3 and an upregulation of caspase-6, in addition to a global upregulation of lysosomal proteins in the bulk of the tumor. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Theoretical Analysis of an Iron Mineral-Based Magnetoreceptor Model in Birds

    PubMed Central

    Solov'yov, Ilia A.; Greiner, Walter

    2007-01-01

    Sensing the magnetic field has been established as an essential part of navigation and orientation of various animals for many years. Only recently has the first detailed receptor concept for magnetoreception been published based on histological and physical results. The considered mechanism involves two types of iron minerals (magnetite and maghemite) that were found in subcellular compartments within sensory dendrites of the upper beak of several bird species. But so far a quantitative evaluation of the proposed receptor is missing. In this article, we develop a theoretical model to quantitatively and qualitatively describe the magnetic field effects among particles containing iron minerals. The analysis of forces acting between these subcellular compartments shows a particular dependence on the orientation of the external magnetic field. The iron minerals in the beak are found in the form of crystalline maghemite platelets and assemblies of magnetite nanoparticles. We demonstrate that the pull or push to the magnetite assemblies, which are connected to the cell membrane, may reach a value of 0.2 pN—sufficient to excite specific mechanoreceptive membrane channels in the nerve cell. The theoretical analysis of the assumed magnetoreceptor system in the avian beak skin clearly shows that it might indeed be a sensitive biological magnetometer providing an essential part of the magnetic map for navigation. PMID:17496012

  5. Four Forms of the Fourier Transform - for Freshmen, using Matlab

    NASA Astrophysics Data System (ADS)

    Simons, F. J.; Maloof, A. C.

    2016-12-01

    In 2015, a Fall "Freshman Seminar" at Princeton University (http://geoweb.princeton.edu/people/simons/FRS-SESC.html) taught students to combine field observations of the natural world with quantitative modeling and interpretation, to answer questions like: "How have Earth and human histories been recorded in the geology of Princeton, the Catskills, France and Spain?" (where we took the students on a data-gathering field trip during Fall Break), and "What experiments and analysis can a first-year (possibly non-future-major) do to query such archives of the past?" In the classroom, through problem sets, and around campus, students gained practical experience collecting geological and geophysical data in a geographic context, and analyzing these data using statistical techniques such as regression, time-series and image analysis, with the programming language Matlab. In this presentation I will detail how we instilled basic Matlab skills for quantitative geoscience data analysis through a 6-week progression of topics and exercises. In the 6 weeks after the Fall Break trip, we strengthened these competencies to make our students fully proficient for further learning, as evidenced by their end-of-term independent research work. The particular case study is focused on introducing power-spectral analysis to Freshmen, in a way that even the least quantitative among them could functionally understand. Not counting (0) "inspection", the four ways by which we have successfully instilled the concept of power-spectral analysis in a hands-on fashion are (1) "correlation", (2) "inversion", (3) "stacking", and formal (4) "Fourier transformation". These four provide the main "mappings". Along the way, of course, we also make sure that the students understand that "power-spectral density estimation" is not the same as "Fourier transformation", nor that every Fourier transform has to be "Fast".
Hence, concepts from analysis-of-variance techniques, regression, and hypothesis testing, arise in this context, and will be discussed.
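
    The distinction drawn above, that power-spectral density estimation is not the same as Fourier transformation, can be illustrated with a minimal sketch (in Python rather than the Matlab used in the course; the signal and sampling parameters are invented for illustration):

```python
import numpy as np

# Synthetic record: a 5 Hz oscillation plus noise, sampled at 100 Hz.
fs = 100.0                                  # sampling rate (Hz), invented
t = np.arange(0, 10, 1 / fs)                # 10 s of samples
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 5.0 * t) + 0.1 * rng.standard_normal(t.size)

# The Fourier transform yields complex coefficients; the power-spectral
# density is a different quantity, built from their squared magnitudes.
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(x.size, d=1 / fs)
psd = np.abs(X) ** 2 / (fs * x.size)        # one-sided periodogram estimate

peak_hz = freqs[np.argmax(psd)]             # recovers the 5 Hz spectral line
```

    The periodogram is only the simplest such estimate; the course's "correlation", "inversion" and "stacking" routes arrive at the same quantity by different hands-on paths.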

  6. Quantitative spectral and orientational analysis in surface sum frequency generation vibrational spectroscopy (SFG-VS)

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Fei; Gan, Wei; Lu, Rong; Rao, Yi; Wu, Bao-Hua

    Sum frequency generation vibrational spectroscopy (SFG-VS) has been proven to be a uniquely effective spectroscopic technique in the investigation of molecular structure and conformations, as well as the dynamics of molecular interfaces. However, the ability to apply SFG-VS to complex molecular interfaces has been limited by the difficulty of extracting quantitative information from SFG-VS experiments. In this review, we assess the limitations, issues, techniques and methodologies in quantitative orientational and spectral analysis with SFG-VS. Based on these assessments, we also summarize recent methodological developments in quantitative orientational and spectral analysis in SFG-VS, and their applications to detailed analysis of SFG-VS data of various vapour/neat liquid interfaces. A rigorous formulation of the polarization null angle (PNA) method is given for accurate determination of the orientational parameter D = ⟨cos θ⟩/⟨cos³ θ⟩, and a comparison between the PNA method and the commonly used polarization intensity ratio (PIR) method is discussed. The polarization and incident angle dependencies of the SFG-VS intensity are also reviewed, in the light of how experimental arrangements can be optimized to effectively extract crucial information from the SFG-VS experiments. The values and models of the local field factors in the molecular layers are discussed. In order to examine the validity and limitations of the bond polarizability derivative model, the general expressions for molecular hyperpolarizability tensors and their expressions in the bond polarizability derivative model for C3v, C2v and C∞v molecular groups are given in the two appendices. We show that the bond polarizability derivative model can quantitatively describe many aspects of the intensities observed in the SFG-VS spectra of the vapour/neat liquid interfaces in different polarizations.
Using polarization analysis in SFG-VS, polarization selection rules or guidelines are developed for assignment of the SFG-VS spectrum. Using these selection rules, SFG-VS spectra of vapour/diol and vapour/normal alcohol (n = 1-8) interfaces are assigned, and some ambiguities and confusions in previous IR and Raman assignments, as well as their implications, are discussed. The ability to assign a SFG-VS spectrum using the polarization selection rules makes SFG-VS not only an effective and useful vibrational spectroscopy technique for interface studies, but also a complementary vibrational spectroscopy method for general condensed-phase studies. These developments put quantitative orientational and spectral analysis in SFG-VS on a more solid foundation. The formulations, concepts and issues discussed in this review are expected to find broad application in investigations of molecular interfaces in the future.

  7. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  8. SWATH-MS Quantitative Analysis of Proteins in the Rice Inferior and Superior Spikelets during Grain Filling

    PubMed Central

    Zhu, Fu-Yuan; Chen, Mo-Xian; Su, Yu-Wen; Xu, Xuezhong; Ye, Neng-Hui; Cao, Yun-Ying; Lin, Sheng; Liu, Tie-Yuan; Li, Hao-Xuan; Wang, Guan-Qun; Jin, Yu; Gu, Yong-Hai; Chan, Wai-Lung; Lo, Clive; Peng, Xinxiang; Zhu, Guohui; Zhang, Jianhua

    2016-01-01

    Modern rice cultivars have large panicles, but their yield potential is often not fully achieved due to poor grain-filling of late-flowering inferior spikelets (IS). Our earlier work suggested a broad transcriptional reprogramming during grain filling and showed a difference in gene expression between IS and earlier-flowering superior spikelets (SS). However, the links between the abundances of transcripts and their corresponding proteins are unclear. In this study, a SWATH-MS (sequential window acquisition of all theoretical spectra-mass spectrometry)-based quantitative proteomic analysis has been applied to investigate the SS and IS proteomes. A total of 304 proteins of widely differing functionality were observed to be differentially expressed between IS and SS. Detailed gene ontology analysis indicated that several biological processes, including photosynthesis, protein metabolism, and energy metabolism, are differentially regulated. Further correlation analysis revealed that the abundances of most of the differentially expressed proteins are not correlated with the respective transcript levels, indicating that an extra layer of gene regulation may exist during rice grain filling. Our findings raise the intriguing possibility that these candidate proteins may be crucial in determining the poor grain-filling of IS. Therefore, we hypothesize that the regulation of proteome changes occurs not only at the transcriptional but also at the post-transcriptional level during grain filling in rice. PMID:28066479
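
    The transcript-protein correlation analysis described above can be sketched as follows. This is an illustrative Python example, not the authors' pipeline: the fold-change values are invented, and the `spearman` helper assumes no tied values.

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation (assumes no ties in x or y)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

# Hypothetical IS-vs-SS abundance fold changes for five genes:
transcript_fc = np.array([2.1, 0.5, 1.8, 0.9, 3.0])   # transcript level
protein_fc    = np.array([0.9, 1.6, 1.1, 1.0, 0.8])   # protein level

# A weak or negative rho indicates transcript and protein levels
# are decoupled, consistent with post-transcriptional regulation.
rho = spearman(transcript_fc, protein_fc)
```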

  9. Mixed methods research.

    PubMed

    Halcomb, Elizabeth; Hickman, Louise

    2015-04-08

    Mixed methods research involves the use of qualitative and quantitative data in a single research project. It represents an alternative methodological approach that combines qualitative and quantitative methods, enabling nurse researchers to explore complex phenomena in detail. This article provides a practical overview of mixed methods research and its application in nursing, to guide the novice researcher considering a mixed methods research project.

  10. Contrast-detail phantom scoring methodology.

    PubMed

    Thomas, Jerry A; Chakrabarti, Kish; Kaczmarek, Richard; Romanyukha, Alexander

    2005-03-01

    Published results of medical imaging studies which make use of contrast detail mammography (CDMAM) phantom images for analysis are difficult to compare, since data are often not analyzed in the same way. In order to address this situation, the concept of ideal contrast detail curves is suggested. The ideal contrast detail curves are constructed based on the requirement of having the same product of the diameter and contrast (disk thickness) of the minimal correctly determined object for every row of the CDMAM phantom image. A correlation and comparison of five different quality parameters of the CDMAM phantom image determined for the obtained ideal contrast detail curves is performed. The image quality parameters compared include: (1) contrast detail curve--a graph correlating the "minimal correct reading" diameter with disk thickness; (2) correct observation ratio--the ratio of the number of correctly identified objects to the actual total number of objects, multiplied by 100; (3) image quality figure--the sum of the products of the diameter of the smallest scored object and its relative contrast; (4) figure-of-merit--the value obtained from extrapolation of the contrast detail curve to the origin (i.e., zero disk diameter); and (5) k-factor--the product of the thickness and the diameter of the smallest correctly identified disks. The analysis carried out showed the existence of a nonlinear relationship between the above parameters, which means that use of different parameters of CDMAM image quality can potentially lead to different conclusions about changes in image quality. Construction of the ideal contrast detail curves for the CDMAM phantom is an attempt to determine the quantitative limits of the CDMAM phantom as employed for image quality evaluation. These limits are determined by the relationship between certain parameters of a digital mammography system and the set of gold disk sizes in the CDMAM phantom.
Recommendations are made on which CDMAM phantom regions should be used for scoring at different image quality levels and on which scoring methodology may be most appropriate. Special attention is also paid to the use of the CDMAM phantom for image quality assessment of digital mammography systems, particularly in the vicinity of the Nyquist frequency.
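
    Several of the image quality parameters defined above reduce to simple arithmetic once per-row readings are available. A hedged Python sketch, with invented reading values and using disk thickness as a stand-in for relative contrast:

```python
# Hypothetical "minimal correct reading" per CDMAM row:
# (disk diameter in mm, disk thickness in um) of the smallest disk
# correctly identified in that row. Values are invented.
readings = [(0.10, 1.6), (0.13, 1.2), (0.16, 1.0), (0.20, 0.8)]

# (2) Correct observation ratio: correctly identified / total objects * 100.
correct, total = 182, 205
cor = 100.0 * correct / total

# (3) Image quality figure: sum over rows of diameter * contrast
# (here approximated by thickness).
iqf = sum(d * th for d, th in readings)

# (5) k-factor: thickness * diameter of the smallest correctly
# identified disk (the row with the smallest diameter).
d_min, th_min = min(readings)
k_factor = d_min * th_min
```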

  11. Assessing the detail needed to capture rainfall-runoff dynamics with physics-based hydrologic response simulation

    USGS Publications Warehouse

    Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.

    2011-01-01

    Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. 
Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.

  12. Complexity-aware simple modeling.

    PubMed

    Gómez-Schiavon, Mariana; El-Samad, Hana

    2018-02-26

    Mathematical models continue to be essential for deepening our understanding of biology. On one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models as well as their assumption of modularity and insulation make them inaccurate for describing quantitative features. On the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, making them often difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach, complexity-aware simple modeling, that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni Marie K.; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2017-03-01

    This paper proposes a model for assessing the risk posed by natural hazards to infrastructures, with a focus on the indirect losses and loss of stability for the population relying on the infrastructure. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. The focus is on a methodology for semi-quantitative analyses to be performed at the second level. The purpose of this type of analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures, identifying the most critical scenarios and investigating the need for further analyses (third level). The proposed semi-quantitative methodology considers the frequency of the natural hazard, different aspects of vulnerability, including the physical vulnerability of the infrastructure itself, and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale according to pre-defined ranking criteria. The proposed indicators, which characterise conditions that influence the probability of an infrastructure malfunctioning caused by a natural event, are defined as (1) robustness and buffer capacity, (2) level of protection, (3) quality/level of maintenance and renewal, (4) adaptability and quality of operational procedures and (5) transparency/complexity/degree of coupling. Further indicators describe conditions influencing the socio-economic consequences of the infrastructure malfunctioning, such as (1) redundancy and/or substitution, (2) cascading effects and dependencies, (3) preparedness and (4) early warning, emergency response and measures. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard, the potential duration of the infrastructure malfunctioning (e.g. 
depending on the required restoration effort) and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented for demonstration purposes, where risk posed by adverse weather and natural hazards to primary road, water supply and power networks is assessed. The application examples show that the proposed model provides a useful tool for screening of potential undesirable events, contributing to a targeted reduction of the risk.
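
    One way to picture the second-level aggregation described above is as a weighted indicator score combined with the quantitative estimates. The ranks, weights, and aggregation formula below are illustrative assumptions, not the paper's prescribed model:

```python
# Indicator ranks on a relative 1-5 scale (1 = favourable, 5 = unfavourable);
# indicator names follow the abstract, but values and weights are invented.
vulnerability = {
    "robustness_and_buffer_capacity": 4,
    "level_of_protection": 3,
    "maintenance_and_renewal": 2,
    "operational_procedures": 3,
    "complexity_and_coupling": 4,
}
weights = {k: 1.0 for k in vulnerability}   # equal weighting (assumption)

v_score = sum(weights[k] * v for k, v in vulnerability.items()) / sum(weights.values())

# Quantitative estimates (illustrative values):
frequency_per_year = 0.1    # expected natural-hazard events per year
duration_days = 3.0         # expected malfunction duration per event
users = 20_000              # population relying on the infrastructure

# One possible aggregation: expected user-days of disruption per year,
# scaled by the normalized vulnerability score.
risk = frequency_per_year * (v_score / 5.0) * duration_days * users
```

    Such a score is only useful for screening and ranking scenarios; the absolute number carries no meaning beyond comparison, which matches the stated purpose of the second-level analysis.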

  14. Detecting Autophagy and Autophagy Flux in Chronic Myeloid Leukemia Cells Using a Cyto-ID Fluorescence Spectrophotometric Assay.

    PubMed

    Guo, Sujuan; Pridham, Kevin J; Sheng, Zhi

    2016-01-01

    Autophagy is a catabolic process whereby cellular components are degraded to fuel cells for longer survival during stress. Hence, autophagy plays a vital role in determining cell fate and is central to homeostasis and the pathogenesis of many human diseases, including chronic myeloid leukemia (CML). It has been well established that autophagy is important for leukemogenesis as well as drug resistance in CML. Thus, autophagy is an intriguing therapeutic target. However, current approaches that detect autophagy lack reliability and often fail to provide quantitative measurements. To overcome this hurdle and facilitate the development of autophagy-related therapies, we have recently developed an autophagy assay termed the Cyto-ID fluorescence spectrophotometric assay. This method uses a cationic fluorescence dye, Cyto-ID, which specifically labels autophagic compartments and is detected by a spectrophotometer to permit a large-scale and quantitative analysis. As such, it allows rapid, reliable, and quantitative detection of autophagy and estimation of autophagy flux. In this chapter, we provide technical details of this method and step-by-step protocols for measuring autophagy or autophagy flux in CML cell lines as well as primary hematopoietic cells.

  15. Acoustic Emission Monitoring for Assessment of Steel Bridge Details

    NASA Astrophysics Data System (ADS)

    Kosnik, D. E.; Hopwood, T.; Corr, D. J.

    2011-06-01

    Acoustic emission (AE) testing was deployed on details of two large steel Interstate Highway bridges: one cantilever through-truss and one trapezoidal box girder bridge. Quantitative measurements of activity levels at known and suspected crack locations were made by monitoring AE under normal service loads (e.g., live traffic and wind). AE indications were used to direct application of radiography, resulting in identification of a previously unknown flaw, and to inform selection of a retrofit detail.

  16. A quantitative validated model reveals two phases of transcriptional regulation for the gap gene giant in Drosophila.

    PubMed

    Hoermann, Astrid; Cicin-Sain, Damjan; Jaeger, Johannes

    2016-03-15

    Understanding eukaryotic transcriptional regulation and its role in development and pattern formation is one of the big challenges in biology today. Most attempts at tackling this problem either focus on the molecular details of transcription factor binding, or aim at genome-wide prediction of expression patterns from sequence through bioinformatics and mathematical modelling. Here we bridge the gap between these two complementary approaches by providing an integrative model of cis-regulatory elements governing the expression of the gap gene giant (gt) in the blastoderm embryo of Drosophila melanogaster. We use a reverse-engineering method, where mathematical models are fit to quantitative spatio-temporal reporter gene expression data to infer the regulatory mechanisms underlying gt expression in its anterior and posterior domains. These models are validated through prediction of gene expression in mutant backgrounds. A detailed analysis of our data and models reveals that gt is regulated by domain-specific CREs at early stages, while a late element drives expression in both the anterior and the posterior domains. Initial gt expression depends exclusively on inputs from maternal factors. Later, gap gene cross-repression and gt auto-activation become increasingly important. We show that auto-regulation creates a positive feedback, which mediates the transition from early to late stages of regulation. We confirm the existence and role of gt auto-activation through targeted mutagenesis of Gt transcription factor binding sites. In summary, our analysis provides a comprehensive picture of spatio-temporal gene regulation by different interacting enhancer elements for an important developmental regulator. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Quantitative assessment of key parameters in qualitative vulnerability methods applied in karst systems based on an integrated numerical modelling approach

    NASA Astrophysics Data System (ADS)

    Doummar, Joanna; Kassem, Assaad

    2017-04-01

    In the framework of a three-year PEER (USAID/NSF) funded project, flow in a karst system in Lebanon (Assal) dominated by snow and semi-arid conditions was simulated and successfully calibrated using an integrated numerical model (MIKE-She 2016) based on high-resolution input data and detailed catchment characterization. Point-source infiltration and fast flow pathways were simulated by a bypass function and a highly conductive lens, respectively. The approach consisted of identifying all the factors used in qualitative vulnerability methods (COP, EPIK, PI, DRASTIC, GOD) applied in karst systems and assessing their influence on recharge signals in the different hydrological karst compartments (atmosphere, unsaturated zone and saturated zone) based on the integrated numerical model. These parameters are usually attributed different weights according to their estimated impact on groundwater vulnerability. The aim of this work is to quantify the importance of each of these parameters and to outline parameters that are not accounted for in standard methods but that might play a role in the vulnerability of a system. The spatial distribution of the detailed evapotranspiration, infiltration, and recharge signals from atmosphere to unsaturated zone to saturated zone was compared and contrasted among different surface settings and under varying flow conditions (e.g., varying slopes, land cover, precipitation intensity, and soil properties, as well as point-source infiltration). Furthermore, a sensitivity analysis of individual or coupled major parameters allows their impact on recharge, and indirectly on vulnerability, to be quantified. The preliminary analysis yields a new methodology that accounts for most of the factors influencing vulnerability while refining the weights attributed to each of them, based on a quantitative approach.

  18. Crater studies: Part A: lunar crater morphometry

    USGS Publications Warehouse

    Pike, Richard J.

    1973-01-01

    Morphometry, the quantitative study of shape, complements visual observation and photointerpretation in analyzing the most outstanding landforms of the Moon, its craters (refs. 32-1 and 32-2). All three of these interpretative tools, which were developed throughout the long history of telescopic lunar study preceding the Apollo Program, will continue to be applicable to crater analysis until detailed field work becomes possible. Although no large (>17.5 km diameter) craters were examined in situ on any of the Apollo landings, the photographs acquired from the command modules will markedly strengthen results of less direct investigations of the craters. For morphometry, the most useful materials are the orbital metric and panoramic photographs from the final three Apollo missions. These photographs permit preparation of contour maps, topographic profiles, and other numerical data that accurately portray for the first time the surface geometry of lunar craters of all sizes. Interpretations of craters no longer need be compromised by inadequate topographic data. In the pre-Apollo era, hypotheses for the genesis of lunar craters usually were constructed without any numerical descriptive data. Such speculations will have little credibility unless supported by accurate, quantitative data, especially those generated from Apollo orbital photographs. This paper presents a general study of the surface geometry of 25 far-side craters and a more detailed study of rim-crest evenness for 15 near-side and far-side craters. Analysis of this preliminary sample of Apollo 15 and 17 data, which includes craters between 1.5 and 275 km in diameter, suggests that most genetic interpretations of craters made from pre-Apollo topographic measurements may require no drastic revision. All measurements were made from topographic profiles generated on a stereoplotter at the Photogrammetric Unit of the U.S. Geological Survey, Center of Astrogeology, Flagstaff, Arizona.

  19. Rapid recovery from aphasia after infarction of Wernicke's area

    PubMed Central

    Yagata, Stephanie A.; Yen, Melodie; McCarron, Angelica; Bautista, Alexa; Lamair-Orosco, Genevieve

    2017-01-01

    Background Aphasia following infarction of Wernicke's area typically resolves to some extent over time. The nature of this recovery process and its time course have not been characterized in detail, especially in the acute/subacute period. Aims The goal of this study was to document recovery after infarction of Wernicke's area in detail in the first 3 months after stroke. Specifically, we aimed to address two questions about language recovery. First, which impaired language domains improve over time, and which do not? Second, what is the time course of recovery? Methods & Procedures We used quantitative analysis of connected speech and a brief aphasia battery to document language recovery in two individuals with aphasia following infarction of the posterior superior temporal gyrus. Speech samples were acquired daily between 2 and 16 days post stroke, and also at 1 month and 3 months. Speech samples were transcribed and coded using the CHAT system, in order to quantify multiple language domains. A brief aphasia battery was also administered at a subset of five time points during the 3 months. Outcomes & Results Both patients showed substantial recovery of language function over this time period. Most, but not all, language domains showed improvements, including fluency, lexical access, phonological retrieval and encoding, and syntactic complexity. The time course of recovery was logarithmic, with the greatest gains taking place early in the course of recovery. Conclusions There is considerable potential for amelioration of language deficits when damage is relatively circumscribed to the posterior superior temporal gyrus. Quantitative analysis of connected speech samples proved to be an effective, albeit time-consuming, approach to tracking day-by-day recovery in the acute/subacute post-stroke period. PMID:29051682
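
    The logarithmic time course reported above can be captured by fitting score = a + b·ln(t). The sketch below uses invented battery scores at the study's sampling days; a property of such a fit is that equal ratios of time yield equal gains, which is why early gains dominate:

```python
import numpy as np

# Invented aphasia-battery scores at sampled time points (days post stroke).
days  = np.array([2.0, 4.0, 8.0, 16.0, 30.0, 90.0])
score = np.array([20.0, 32.0, 44.0, 55.0, 66.0, 84.0])

# A logarithmic time course is linear in ln(t): score ~ a + b * ln(days).
b, a = np.polyfit(np.log(days), score, 1)

# Equal ratios of time give equal predicted gains:
gain_2_to_4   = b * np.log(4 / 2)    # gain over days 2-4 equals...
gain_45_to_90 = b * np.log(90 / 45)  # ...the gain over days 45-90
```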

  20. The quantitative assessment of epicardial fat distribution on human hearts: Implications for epicardial electrophysiology.

    PubMed

    Mattson, Alexander R; Soto, Mario J; Iaizzo, Paul A

    2018-07-01

    Epicardial electrophysiological procedures rely on dependable interfacing with the myocardial tissue. For example, epicardial pacing systems must generate sustainable chronic pacing capture, while epicardial ablations must effectively deliver energy to the target hyper-excitable myocytes. The human heart has a significant adipose layer which may impede epicardial procedures. The objective of this study was to quantitatively assess the relative location of epicardial adipose on the human heart, to define locations where epicardial therapies might be performed successfully. We studied perfusion-fixed human hearts (n = 105) in multiple isolated planes including: left ventricular margin, diaphragmatic surface, and anterior right ventricle. Relative adipose distribution was quantitatively assessed via planar images, using a custom-generated image analysis algorithm. In these specimens, 76.7 ± 13.8% of the left ventricular margin, 72.7 ± 11.3% of the diaphragmatic surface, and 92.1 ± 8.7% of the anterior right margin were covered with superficial epicardial adipose layers. Percent adipose coverage significantly increased with age (P < 0.001) and history of coronary artery disease (P < 0.05). No significant relationships were identified between relative percent adipose coverage and gender, body weight or height, BMI, history of hypertension, and/or history of congestive heart failure. Additionally, we describe two-dimensional probability distributions of epicardial adipose coverage for each of the three analysis planes. In this study, we detail the quantitative assessment and probabilistic mapping of the distribution of superficial epicardial adipose on the adult human heart. These findings have implications for performing epicardial procedures and/or designing tools to perform such treatments successfully. Clin. Anat. 31:661-666, 2018. © 2018 Wiley Periodicals, Inc.
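
    The study's image analysis algorithm is custom and not described in detail. As a rough illustration only, percent coverage from a planar grayscale image could be computed by thresholding within a heart region of interest; the synthetic image, fixed threshold, and bright-adipose assumption below are all invented:

```python
import numpy as np

# Synthetic grayscale plane (values 0-1): assume adipose pixels image
# brighter than myocardium -- a simplification for illustration.
rng = np.random.default_rng(1)
img = 0.3 + 0.05 * rng.random((100, 100))          # myocardium background
img[:30, :] = 0.8 + 0.05 * rng.random((30, 100))   # adipose band (30% of plane)

heart_mask = np.ones_like(img, dtype=bool)  # here the ROI is the whole plane
adipose = (img > 0.6) & heart_mask          # fixed threshold (assumption)

# Percent adipose coverage within the ROI.
pct_coverage = 100.0 * adipose.sum() / heart_mask.sum()
```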

  1. Exploring and Harnessing Haplotype Diversity to Improve Yield Stability in Crops.

    PubMed

    Qian, Lunwen; Hickey, Lee T; Stahl, Andreas; Werner, Christian R; Hayes, Ben; Snowdon, Rod J; Voss-Fels, Kai P

    2017-01-01

    In order to meet future food, feed, fiber, and bioenergy demands, global yields of all major crops need to be increased significantly. At the same time, the increasing frequency of extreme weather events such as heat and drought necessitates improvements in the environmental resilience of modern crop cultivars. Achieving sustainable yield increases implies rapid improvement of quantitative traits with a very complex genetic architecture and strong environmental interaction. The latest advances in genome analysis technologies provide molecular information at ultrahigh resolution, revolutionizing crop genomic research and paving the way for advanced quantitative genetic approaches. These include highly detailed assessment of population structure and genotypic diversity, facilitating the identification of selective sweeps and signatures of directional selection, dissection of genetic variants that underlie important agronomic traits, and genomic selection (GS) strategies that consider more than just major-effect genes. Single-nucleotide polymorphism (SNP) markers today represent the genotyping system of choice for crop genetic studies because they occur abundantly in plant genomes and are easy to detect. However, SNPs are typically biallelic, so their information content is low compared to multiallelic markers, limiting the resolution at which SNP-trait relationships can be delineated. An efficient way to overcome this limitation is to construct haplotypes based on linkage disequilibrium, one of the most important features influencing genetic analyses of crop genomes. Here, we give an overview of the latest advances in genomics-based haplotype analyses in crops, highlighting their importance in the context of polyploidy and genome evolution, linkage drag, and co-selection.
We provide examples of how haplotype analyses can complement well-established quantitative genetics frameworks, such as quantitative trait analysis and GS, ultimately providing an effective tool to equip modern crops with environment-tailored characteristics.
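
    A minimal version of LD-based haplotype blocking, of the kind surveyed above, groups adjacent SNPs whose pairwise r² exceeds a threshold. The greedy scheme and toy genotypes below are illustrative, not one of the published methods:

```python
import numpy as np

def r2(g1, g2):
    """Squared Pearson correlation between two SNP genotype vectors."""
    c = np.corrcoef(g1, g2)[0, 1]
    return float(c * c)

def ld_blocks(G, thresh=0.8):
    """Greedily group adjacent SNPs (columns of G) into haplotype blocks
    whenever neighbouring SNPs exceed the r^2 threshold."""
    blocks = [[0]]
    for j in range(1, G.shape[1]):
        if r2(G[:, j - 1], G[:, j]) >= thresh:
            blocks[-1].append(j)
        else:
            blocks.append([j])
    return blocks

# Toy genotypes (rows = individuals, columns = SNPs, 0/1/2 allele counts):
# SNPs 0-2 are in perfect LD; SNPs 3-4 form a second block.
G = np.array([
    [0, 0, 0, 2, 2],
    [1, 1, 1, 0, 0],
    [2, 2, 2, 1, 1],
    [0, 0, 0, 1, 1],
    [1, 1, 1, 0, 0],
    [2, 2, 2, 2, 2],
])
blocks = ld_blocks(G)   # -> [[0, 1, 2], [3, 4]]
```

    Treating each block as a single multiallelic marker is what restores the resolution that biallelic SNPs lack on their own.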

  2. Conceptual design of a crewed reusable space transportation system aimed at parabolic flights: stakeholder analysis, mission concept selection, and spacecraft architecture definition

    NASA Astrophysics Data System (ADS)

    Fusaro, Roberta; Viola, Nicole; Fenoglio, Franco; Santoro, Francesco

    2017-03-01

    This paper proposes a methodology to derive architectures and operational concepts for future earth-to-orbit and sub-orbital transportation systems. In particular, it first describes the activity flow, methods, and tools leading to the generation of a wide range of alternative solutions to meet the established goal. Subsequently, the methodology allows a small number of feasible options to be selected, among which the optimal solution can be found. For the sake of clarity, the first part of the paper describes the methodology from a theoretical point of view, while the second part proposes the selection of mission concepts and of a proper transportation system aimed at sub-orbital parabolic flights. Starting from a detailed analysis of the stakeholders and their needs, the major objectives of the mission have been derived. Then, following a system engineering approach, functional analysis tools as well as concept-of-operations techniques allowed a very high number of possible ways to accomplish the envisaged goals to be generated. After a preliminary pruning activity, aimed at assessing the feasibility of these concepts, more detailed analyses have been carried out. Moving through the procedure, the designer should shift from qualitative to quantitative evaluations, and for this reason, to support the trade-off analysis, ad hoc mission simulation software has been exploited. This support tool aims at estimating the major mission drivers (mass, heat loads, manoeuvrability, earth visibility, and volumetric efficiency) as well as proving the feasibility of the concepts. Other crucial, multi-domain mission drivers, such as complexity, innovation level, and safety, have been evaluated through other appropriate analyses. Eventually, one single mission concept has been selected and detailed in terms of layout, systems, and sub-systems, highlighting also logistic, safety, and maintainability aspects.

  3. iTRAQ-Based Quantitative Proteomic Analysis of the Antimicrobial Mechanism of Peptide F1 against Escherichia coli.

    PubMed

    Miao, Jianyin; Chen, Feilong; Duan, Shan; Gao, Xiangyang; Liu, Guo; Chen, Yunjiao; Dixon, William; Xiao, Hang; Cao, Yong

    2015-08-19

    Antimicrobial peptides have received increasing attention in the agricultural and food industries due to their potential to control pathogens. However, to facilitate the development of novel peptide-based antimicrobial agents, details regarding the molecular mechanisms of these peptides need to be elucidated. The aim of this study was to investigate the antimicrobial mechanism of peptide F1, a bacteriocin found in Tibetan kefir, against Escherichia coli at protein levels using iTRAQ-based quantitative proteomic analysis. In response to treatment with peptide F1, 31 of the 280 identified proteins in E. coli showed alterations in their expression, including 10 down-regulated proteins and 21 up-regulated proteins. These 31 proteins all possess different molecular functions and are involved in different molecular pathways, as is evident in referencing the Kyoto Encyclopedia of Genes and Genomes pathways. Specifically, pathways that were significantly altered in E. coli in response to peptide F1 treatment include the tricarboxylic acid cycle, oxidative phosphorylation, glycerophospholipid metabolism, and the cell cycle-caulobacter pathways, which was also associated with inhibition of the cell growth, induction of morphological changes, and cell death. The results provide novel insights into the molecular mechanisms of antimicrobial peptides.

  4. Pressure potential and stability analysis in an acoustical noncontact transportation

    NASA Astrophysics Data System (ADS)

    Li, J.; Liu, C. J.; Zhang, W. J.

    2017-01-01

    Near field acoustic traveling wave is one of the most popular principles in noncontact manipulations and transportations. The stability behavior is a key factor in the industrial applications of acoustical noncontact transportation. We present here an in-depth analysis of the transportation stability of a planar object levitated in near field acoustic traveling waves. To more accurately describe the pressure distributions on the radiation surface, a 3D nonlinear traveling wave model is presented. A closed form solution is derived based on the pressure potential to quantitatively calculate the restoring forces and moments under small disturbances. The physical explanations of the effects of fluid inertia and the effects of non-uniform pressure distributions are provided in detail. It is found that a vibration rail with tapered cross section provides more stable transportation than a rail with rectangular cross section. The present study sheds light on the issue of quantitative evaluation of stability in acoustic traveling waves and proposes three main factors that influence the stability: (a) vibration shape, (b) pressure distribution and (c) restoring force/moment. It helps to provide a better understanding of the physics behind the near field acoustic transportation and provide useful design and optimization tools for industrial applications.

  5. A Local Vision on Soil Hydrology (John Dalton Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Roth, K.

    2012-04-01

After looking back briefly at some research trails of the past decades, and touching on the role of soils in our environmental machinery, a vision of the future of soil hydrology is offered. It is local in the sense of being based on limited experience, as well as in the sense of focussing on local spatial scales, from 1 m to 1 km. Cornerstones of this vision are (i) rapid developments in quantitative observation technology, illustrated with the example of ground-penetrating radar (GPR), and (ii) the availability of ever more powerful computing facilities, which allow simulation of increasingly complicated model representations in unprecedented detail. Together, they open a powerful and flexible approach to the quantitative understanding of soil hydrology in which two lines are fitted: (i) potentially diverse measurements of the system of interest and their analysis and (ii) a comprehensive model representation, including architecture, material properties, forcings, and potentially unknown aspects, together with the same analysis as for (i). This approach pushes traditional inversion to operate on analyses, not on the underlying state variables, and to become flexible with respect to architecture and unknown aspects. The approach will be demonstrated for simple situations at test sites.

  6. Quantitative analysis of flavanones from citrus fruits by using mesoporous molecular sieve-based miniaturized solid phase extraction coupled to ultrahigh-performance liquid chromatography and quadrupole time-of-flight mass spectrometry.

    PubMed

    Cao, Wan; Ye, Li-Hong; Cao, Jun; Xu, Jing-Jing; Peng, Li-Qing; Zhu, Qiong-Yao; Zhang, Qian-Yun; Hu, Shuai-Shuai

    2015-08-07

An analytical procedure based on miniaturized solid phase extraction (SPE) and ultrahigh-performance liquid chromatography coupled with quadrupole time-of-flight tandem mass spectrometry was developed and validated for determination of six flavanones in Citrus fruits. The mesoporous molecular sieve SBA-15 as a solid sorbent was characterised by Fourier transform-infrared spectroscopy and scanning electron microscopy. Additionally, compared with reported extraction techniques, the mesoporous SBA-15 based SPE method possessed the advantages of shorter analysis time and higher sensitivity. Furthermore, considering the different nature of the tested compounds, all of the parameters, including the SBA-15 amount, solution pH, elution solvent, and the sorbent type, were investigated in detail. Under the optimum conditions, the instrumental detection and quantitation limits were less than 4.26 and 14.29 ng mL(-1), respectively. The recoveries obtained for all the analytes ranged from 89.22% to 103.46%. The experimental results suggested that SBA-15 was a promising material for the purification and enrichment of target flavanones from complex citrus fruit samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Computed tomography in hypersensitivity pneumonitis: main findings, differential diagnosis and pitfalls.

    PubMed

    Dias, Olívia Meira; Baldi, Bruno Guedes; Pennati, Francesca; Aliverti, Andrea; Chate, Rodrigo Caruso; Sawamura, Márcio Valente Yamada; Carvalho, Carlos Roberto Ribeiro de; Albuquerque, André Luis Pereira de

    2018-01-01

    Hypersensitivity pneumonitis (HP) is a disease with variable clinical presentation in which inflammation in the lung parenchyma is caused by the inhalation of specific organic antigens or low molecular weight substances in genetically susceptible individuals. Alterations of the acute, subacute and chronic forms may eventually overlap, and the diagnosis based on temporality and presence of fibrosis (acute/inflammatory HP vs. chronic HP) seems to be more feasible and useful in clinical practice. Differential diagnosis of chronic HP with other interstitial fibrotic diseases is challenging due to the overlap of the clinical history, and the functional and imaging findings of these pathologies in the terminal stages. Areas covered: This article reviews the essential features of HP with emphasis on imaging features. Moreover, the main methodological limitations of high-resolution computed tomography (HRCT) interpretation are discussed, as well as new perspectives with volumetric quantitative CT analysis as a useful tool for retrieving detailed and accurate information from the lung parenchyma. Expert commentary: Mosaic attenuation is a prominent feature of this disease, but air trapping in chronic HP seems overestimated. Quantitative analysis has the potential to estimate the involvement of the pulmonary parenchyma more accurately and could correlate better with pulmonary function results.

  8. Universality and diversity of folding mechanics for three-helix bundle proteins.

    PubMed

    Yang, Jae Shick; Wallin, Stefan; Shakhnovich, Eugene I

    2008-01-22

In this study we evaluate, at full atomic detail, the folding processes of two small helical proteins, the B domain of protein A and the Villin headpiece. Folding kinetics are studied by performing a large number of ab initio Monte Carlo folding simulations using a single transferable all-atom potential. Using these trajectories, we examine the relaxation behavior, secondary structure formation, and transition-state ensembles (TSEs) of the two proteins and compare our results with experimental data and previous computational studies. To obtain detailed structural information on the folding dynamics viewed as an ensemble process, we perform a clustering analysis procedure based on graph theory. Moreover, rigorous p(fold) analysis is used to obtain representative samples of the TSEs, and a good quantitative agreement between experimental and simulated Phi values is obtained for protein A. Phi values for Villin are also obtained and left as predictions to be tested by future experiments. Our analysis shows that the two-helix hairpin is a common partially stable structural motif that gets formed before entering the TSE in the studied proteins. These results together with our earlier study of Engrailed Homeodomain and recent experimental studies provide a comprehensive, atomic-level picture of folding mechanics of three-helix bundle proteins.

  9. Technical aspects and recommendations for single-cell qPCR.

    PubMed

    Ståhlberg, Anders; Kubista, Mikael

    2018-02-01

    Single cells are basic physiological and biological units that can function individually as well as in groups in tissues and organs. It is central to identify, characterize and profile single cells at molecular level to be able to distinguish different kinds, to understand their functions and determine how they interact with each other. During the last decade several technologies for single-cell profiling have been developed and used in various applications, revealing many novel findings. Quantitative PCR (qPCR) is one of the most developed methods for single-cell profiling that can be used to interrogate several analytes, including DNA, RNA and protein. Single-cell qPCR has the potential to become routine methodology but the technique is still challenging, as it involves several experimental steps and few molecules are handled. Here, we discuss technical aspects and provide recommendation for single-cell qPCR analysis. The workflow includes experimental design, sample preparation, single-cell collection, direct lysis, reverse transcription, preamplification, qPCR and data analysis. Detailed reporting and sharing of experimental details and data will promote further development and make validation studies possible. Efforts aiming to standardize single-cell qPCR open up means to move single-cell analysis from specialized research settings to standard research laboratories. Copyright © 2017 Elsevier Ltd. All rights reserved.
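The data-analysis step of such a qPCR workflow typically ends in relative quantification. As a minimal illustration (a standard approach, not necessarily the exact procedure discussed in this paper), the 2^-ΔΔCq method converts quantification cycles into fold changes; all Cq values below are hypothetical:

```python
# Relative expression by the 2^-ddCq method (a standard analysis, not
# necessarily the one used in this paper); all Cq values are hypothetical.
def fold_change(cq_target_sample, cq_ref_sample, cq_target_ctrl, cq_ref_ctrl):
    dcq_sample = cq_target_sample - cq_ref_sample   # normalize to reference gene
    dcq_ctrl = cq_target_ctrl - cq_ref_ctrl
    return 2.0 ** -(dcq_sample - dcq_ctrl)          # assumes ~100% PCR efficiency

# Target amplifies one cycle earlier in the treated cell => 2-fold up-regulation
print(fold_change(24.0, 18.0, 25.0, 18.0))  # → 2.0
```

The assumption of ~100% amplification efficiency is exactly the kind of experimental detail the authors recommend reporting.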

  10. An image-processing method to detect sub-optical features based on understanding noise in intensity measurements.

    PubMed

    Bhatia, Tripta

    2018-07-01

Accurate quantitative analysis of image data requires that we distinguish between fluorescence intensity (true signal) and the noise inherent to its measurements to the extent possible. We image multilamellar membrane tubes and beads that grow from defects in the fluid lamellar phase of the lipid 1,2-dioleoyl-sn-glycero-3-phosphocholine dissolved in water and water-glycerol mixtures by using a fluorescence confocal polarizing microscope. We quantify image noise and determine the noise statistics. Understanding the nature of image noise also helps in optimizing image processing to detect sub-optical features, which would otherwise remain hidden. We use an image-processing technique, "optimum smoothening," to improve the signal-to-noise ratio (SNR) of features of interest without smearing their structural details. A high SNR renders the positional accuracy needed to resolve features of interest with width below the optical resolution. Using optimum smoothening, the smallest and the largest core diameters detected are of width [Formula: see text] and [Formula: see text] nm, respectively, as discussed in this paper. The image-processing and analysis techniques and the noise modeling discussed in this paper can be used for detailed morphological analysis of features down to sub-optical length scales that are obtained by any kind of fluorescence intensity imaging in the raster mode.
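The trade-off described here, raising SNR without erasing structure, can be sketched with plain Gaussian smoothing of a noisy 1-D intensity profile. This toy example (numpy only, not the authors' "optimum smoothening" algorithm; signal and noise levels are made up) just demonstrates the SNR gain:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    # Discrete, normalized Gaussian smoothing kernel
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def snr(signal, measured):
    # Ratio of signal spread to residual-noise spread
    return signal.std() / (measured - signal).std()

rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0 * np.pi, 2000)
truth = np.sin(x)                              # smooth intensity profile
noisy = truth + rng.normal(0.0, 0.5, x.size)   # heavy additive noise

smoothed = np.convolve(noisy, gaussian_kernel(15.0, 45), mode="same")
print(round(snr(truth, noisy), 1), round(snr(truth, smoothed), 1))
```

Because the kernel width is narrow relative to the signal's variation, the noise variance shrinks while the underlying profile is barely distorted, which is the regime the authors' method targets.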

  11. Detailing the relation between renal T2* and renal tissue pO2 using an integrated approach of parametric magnetic resonance imaging and invasive physiological measurements.

    PubMed

    Pohlmann, Andreas; Arakelyan, Karen; Hentschel, Jan; Cantow, Kathleen; Flemming, Bert; Ladwig, Mechthild; Waiczies, Sonia; Seeliger, Erdmann; Niendorf, Thoralf

    2014-08-01

    This study was designed to detail the relation between renal T2* and renal tissue pO2 using an integrated approach that combines parametric magnetic resonance imaging (MRI) and quantitative physiological measurements (MR-PHYSIOL). Experiments were performed in 21 male Wistar rats. In vivo modulation of renal hemodynamics and oxygenation was achieved by brief periods of aortic occlusion, hypoxia, and hyperoxia. Renal perfusion pressure (RPP), renal blood flow (RBF), local cortical and medullary tissue pO2, and blood flux were simultaneously recorded together with T2*, T2 mapping, and magnetic resonance-based kidney size measurements (MR-PHYSIOL). Magnetic resonance imaging was carried out on a 9.4-T small-animal magnetic resonance system. Relative changes in the invasive quantitative parameters were correlated with relative changes in the parameters derived from MRI using Spearman analysis and Pearson analysis. Changes in T2* qualitatively reflected tissue pO2 changes induced by the interventions. T2* versus pO2 Spearman rank correlations were significant for all interventions, yet quantitative translation of T2*/pO2 correlations obtained for one intervention to another intervention proved not appropriate. The closest T2*/pO2 correlation was found for hypoxia and recovery. The interlayer comparison revealed closest T2*/pO2 correlations for the outer medulla and showed that extrapolation of results obtained for one renal layer to other renal layers must be made with due caution. For T2* to RBF relation, significant Spearman correlations were deduced for all renal layers and for all interventions. T2*/RBF correlations for the cortex and outer medulla were even superior to those between T2* and tissue pO2. The closest T2*/RBF correlation occurred during hypoxia and recovery. Close correlations were observed between T2* and kidney size during hypoxia and recovery and for occlusion and recovery. 
In both cases, kidney size correlated well with renal vascular conductance, as did renal vascular conductance with T2*. Our findings indicate that changes in T2* qualitatively mirror changes in renal tissue pO2 but are also associated with confounding factors including vascular volume fraction and tubular volume fraction. Our results demonstrate that MR-PHYSIOL is instrumental to detail the link between renal tissue pO2 and T2* in vivo. Unravelling the link between regional renal T2* and tissue pO2, including the role of the T2* confounding parameters vascular and tubular volume fraction and oxy-hemoglobin dissociation curve, requires further research. These explorations are essential before the quantitative capabilities of parametric MRI can be translated from experimental research to improved clinical understanding of hemodynamics/oxygenation in kidney disorders.
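The correlation machinery used throughout this study is standard; as a sketch (with hypothetical paired relative changes, not the study's data), Spearman's rank correlation is simply the Pearson correlation computed on ranks:

```python
import numpy as np

def spearman(x, y):
    # Spearman's rho is the Pearson correlation of the ranks (no tie handling)
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical paired relative changes (%) across interventions, not study data
t2star = np.array([-35.0, -20.0, -5.0, 4.0, 12.0])
po2 = np.array([-60.0, -30.0, -8.0, 10.0, 25.0])
print(round(spearman(t2star, po2), 3))  # → 1.0 (perfectly monotone pairing)
```

Rank-based correlation captures monotone T2*/pO2 relations without assuming linearity, which matters when the quantitative mapping differs between interventions, as the authors found.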

  12. Optimising hydrogen peroxide measurement in exhaled breath condensate.

    PubMed

    Brooks, Wendy M; Lash, Heath; Kettle, Anthony J; Epton, Michael J

    2006-01-01

Exhaled breath condensate (EBC) analysis has been proposed as a non-invasive method of assessing airway pathology. A number of substances, including hydrogen peroxide (H2O2), have been measured in EBC, without adequate published details of validation and optimisation. Our aim was to explore factors that affect accurate quantitation of H2O2 in EBC. H2O2 was measured in EBC samples using fluorometry with 4-hydroxyphenylacetic acid. A number of factors that might alter quantitation were studied, including pH and buffering conditions, reagent storage, and assay temperature. Standard curve slope was significantly altered by pH, leading to a potential difference in H2O2 quantification of up to 42%. These differences were resolved by increasing the buffering capacity of the reaction mix. H2O2 added to EBC remained stable for 1 h when stored on ice. The assay was unaffected by freezing assay reagents. The limit of detection for H2O2 ranged from 3.4 nM to 8.8 nM depending on the buffer used. The reagents required for this assay can be stored for several months, allowing valuable consistency in longitudinal studies. The quantitation of H2O2 in EBC is pH-dependent, but increasing assay buffering reduces this effect. Sensitive, reproducible quantitation of H2O2 in EBC requires rigorous optimisation.
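Detection limits like those reported here are commonly derived from the standard-curve fit. The sketch below uses the common 3.3·σ/slope estimate with hypothetical calibration data (the paper's exact procedure may differ):

```python
import numpy as np

# Hypothetical H2O2 calibration data: standards (nM) vs fluorescence (a.u.)
conc = np.array([0.0, 10.0, 25.0, 50.0, 100.0, 200.0])
fluor = np.array([2.1, 13.0, 29.5, 55.8, 108.9, 214.2])

slope, intercept = np.polyfit(conc, fluor, 1)   # linear standard curve
residuals = fluor - (slope * conc + intercept)
sigma = residuals.std(ddof=2)       # residual standard deviation (n - 2 dof)

lod = 3.3 * sigma / slope           # common IUPAC-style detection-limit estimate
print(round(slope, 3), round(lod, 2))
```

Because the detection limit scales inversely with the slope, a pH-induced 42% change in slope (as the authors observed) directly degrades the achievable LOD, which is why tight buffering matters.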

  13. Four Linked Genes Participate in Controlling Sporulation Efficiency in Budding Yeast

    PubMed Central

    Ben-Ari, Giora; Zenvirth, Drora; Sherman, Amir; David, Lior; Klutstein, Michael; Lavi, Uri; Hillel, Jossi; Simchen, Giora

    2006-01-01

Quantitative traits are conditioned by several genetic determinants. Since such genes influence many important complex traits in various organisms, the identification of quantitative trait loci (QTLs) is of major interest, but still encounters serious difficulties. We detected four linked genes within one QTL, which participate in controlling sporulation efficiency in Saccharomyces cerevisiae. Following the identification of single nucleotide polymorphisms by comparing the sequences of 145 genes between the parental strains SK1 and S288c, we analyzed the segregating progeny of the cross between them. Through reciprocal hemizygosity analysis, four genes, RAS2, PMS1, SWS2, and FKH2, located in a region of 60 kilobases on Chromosome 14, were found to be associated with sporulation efficiency. Three of the four “high” sporulation alleles are derived from the “low” sporulating strain. Two of these sporulation-related genes were verified through allele replacements. For RAS2, the causative variation was suggested to be a single nucleotide difference in the upstream region of the gene. This quantitative trait nucleotide accounts for sporulation variability among a set of ten closely related winery yeast strains. Our results provide a detailed view of the genetic complexity in one “QTL region” that controls a quantitative trait and report a single nucleotide polymorphism-trait association in wild strains. Moreover, these findings have implications for QTL identification in higher eukaryotes. PMID:17112318

  14. Technological innovation in neurosurgery: a quantitative study.

    PubMed

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-07-01

Technological innovation within health care may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technology-intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical techniques. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation, respectively. The authors searched a patent database for records published between 1960 and 2010 using the Boolean search term "neurosurgeon OR neurosurgical OR neurosurgery." The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top-performing technology cluster was then selected as an exemplar for a more detailed analysis of individual patents. In all, 11,672 patents and 208,203 publications related to neurosurgery were identified. The top-performing technology clusters during these 50 years were image-guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes, and endoscopes. In relation to image-guidance and neuromodulation devices, the authors found a highly correlated rapid rise in the numbers of patents and publications, which suggests that these are areas of technology expansion. An in-depth analysis of neuromodulation-device patents revealed that the majority of well-performing patents were related to deep brain stimulation. Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery.
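The "highly correlated rapid rise" of patents and publications can be sketched with a simple Pearson correlation of annual counts. All counts below are hypothetical stand-ins for one technology cluster, not the study's data:

```python
import numpy as np

# Hypothetical annual counts for one technology cluster; both series grow
# roughly geometrically, as the paper reports for expanding technologies.
patents      = np.array([5, 8, 12, 18, 27, 40, 61, 90, 135, 200])
publications = np.array([12, 18, 30, 44, 70, 104, 160, 245, 370, 560])

# Pearson correlation between the two growth curves
r = np.corrcoef(patents, publications)[0, 1]
print(round(r, 3))
```

Two nearly proportional growth curves yield a correlation close to 1, which is the signature the authors interpret as coupled technology development and clinical translation.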

  15. Dual-isotope Cryo-imaging Quantitative Autoradiography (CIQA): Investigating Antibody-Drug Conjugate Distribution and Payload Delivery Through Imaging.

    PubMed

    Ilovich, Ohad; Qutaish, Mohammed; Hesterman, Jacob; Orcutt, Kelly; Hoppin, Jack; Polyak, Ildiko; Seaman, Marc; Abu-Yousif, Adnan; Cvet, Donna; Bradley, Daniel

    2018-05-04

In vitro properties of antibody-drug conjugates (ADCs) such as binding, internalization, and cytotoxicity are often well characterized prior to in vivo studies. Interpretation of in vivo studies could be significantly enhanced by molecular imaging tools. We present here a dual-isotope cryo-imaging quantitative autoradiography (CIQA) methodology, combined with advanced 3D imaging and analysis, allowing for the simultaneous study of both antibody and payload distribution in tissues of interest in a preclinical setting. Methods: TAK-264, an investigational anti-guanylyl cyclase C (GCC) targeting ADC, was synthesized utilizing tritiated monomethyl auristatin E (MMAE). The tritiated ADC was then conjugated to DTPA, labeled with indium-111, and evaluated in vivo in GCC-positive and GCC-negative tumor-bearing animals. Results: CIQA reveals the time course of drug release from the ADC and its distribution into tumor regions that are less accessible to the antibody. For GCC-positive tumors, a representative section obtained 96 hours post tracer injection showed only 0.8% of the voxels with co-localized signal, versus over 15% of the voxels for a GCC-negative tumor section, suggesting successful and specific cleaving of the toxin in the antigen-positive lesions. Conclusion: The combination of an established autoradiography technology with advanced image analysis methodologies affords an experimental tool that can support detailed characterization of ADC tumor penetration and pharmacokinetics. Copyright © 2018 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  16. Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research, Enhanced Pearson eText with Loose-Leaf Version--Access Card Package. Fifth Edition

    ERIC Educational Resources Information Center

    Creswell, John W.

    2015-01-01

    "Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research" offers a truly balanced, inclusive, and integrated overview of the processes involved in educational research. This text first examines the general steps in the research process and then details the procedures for conducting specific types…

  17. 40 CFR Appendix C to Part 300 - Swirling Flask Dispersant Effectiveness Test, Revised Standard Dispersant Toxicity Test, and...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the SIM mode at a scan rate of 1.5 scans/second to maximize the linear quantitative range and... Research Group, Texas A&M University, 833 Graham Rd., College Station, TX, 77845, (409) 690-0095. 8... following information is contained in the detailed quantitative reports: average RRF derived from the...

  18. Mobile diagnostics: next-generation technologies for in vitro diagnostics.

    PubMed

    Shin, Joonchul; Chakravarty, Sudesna; Choi, Wooseok; Lee, Kyungyeon; Han, Dongsik; Hwang, Hyundoo; Choi, Jaekyu; Jung, Hyo-Il

    2018-03-26

The emergence of a wide range of applications of smartphones along with advances in 'liquid biopsy' has significantly propelled medical research, particularly in the field of in vitro diagnostics (IVD). Herein, we have presented a detailed analysis of IVD, its associated critical concerns and probable solutions. It also demonstrates the transition in terms of analytes from minimally invasive (blood) to non-invasive (urine, saliva and sweat) and depicts how the different features of a smartphone can be integrated for specific diagnostic purposes. This review highlights recent advances in the applications of smartphone-based biosensors in IVD taking into account the following factors: accuracy and portability; quantitative and qualitative analysis; and centralization and decentralization tests. Furthermore, the critical concerns and future direction of diagnostics based on smartphones are also discussed.

  19. Charge-displacement analysis for excited states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronca, Enrico, E-mail: enrico@thch.unipg.it; Tarantelli, Francesco, E-mail: francesco.tarantelli@unipg.it; Dipartimento di Chimica, Biologia e Biotecnologie, Università degli Studi di Perugia, via Elce di Sotto 8, I-06123 Perugia

    2014-02-07

We extend the Charge-Displacement (CD) analysis, already successfully employed to describe the nature of intermolecular interactions [L. Belpassi et al., J. Am. Chem. Soc. 132, 13046 (2010)] and various types of controversial chemical bonds [L. Belpassi et al., J. Am. Chem. Soc. 130, 1048 (2008); N. Salvi et al., Chem. Eur. J. 16, 7231 (2010)], to study the charge fluxes accompanying electron excitations, and in particular the all-important charge-transfer (CT) phenomena. We demonstrate the usefulness of the new approach through applications to exemplary excitations in a series of molecules, encompassing various typical situations from valence, to Rydberg, to CT excitations. The CD functions defined along various spatial directions provide a detailed and insightful quantitative picture of the electron displacements taking place.
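The CD function is, in essence, a running integral along a chosen axis of the change in electron density upon excitation. A toy numerical sketch (the two Gaussian density lobes are made up, not electronic-structure output):

```python
import numpy as np

# Charge-displacement function: cumulative integral along z of the density
# change, Dq(z) = integral of d_rho(z') for z' up to z. Lobes are hypothetical.
z = np.linspace(-10.0, 10.0, 2001)
drho = np.exp(-(z - 2.0) ** 2) - np.exp(-(z + 2.0) ** 2)  # gained right, lost left

dq = np.cumsum(drho) * (z[1] - z[0])   # rectangle-rule cumulative integral

# dq at the midpoint measures the charge transferred across z = 0;
# dq at the far end returns to ~0 because total charge is conserved.
print(round(dq[z.size // 2], 2), round(dq[-1], 6))
```

Reading off dq at a chosen plane between two fragments is what makes the analysis a quantitative, rather than pictorial, measure of charge transfer.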

  20. UV-laser microdissection and mRNA expression analysis of individual neurons from postmortem Parkinson's disease brains.

    PubMed

    Gründemann, Jan; Schlaudraff, Falk; Liss, Birgit

    2011-01-01

    Cell specificity of gene expression analysis is essential to avoid tissue sample related artifacts, in particular when the relative number of target cells present in the compared tissues varies dramatically, e.g., when comparing dopamine neurons in midbrain tissues from control subjects with those from Parkinson's disease (PD) cases. Here, we describe a detailed protocol that combines contact-free UV-laser microdissection and quantitative PCR of reverse-transcribed RNA of individual neurons from postmortem human midbrain tissue from PD patients and unaffected controls. Among expression changes in a variety of dopamine neuron marker, maintenance, and cell-metabolism genes, we found that α-synuclein mRNA levels were significantly elevated in individual neuromelanin-positive dopamine midbrain neurons from PD brains when compared to those from matched controls.

  1. Measurement of Galactic Logarithmic Spiral Arm Pitch Angle Using Two-dimensional Fast Fourier Transform Decomposition

    NASA Astrophysics Data System (ADS)

    Davis, Benjamin L.; Berrier, Joel C.; Shields, Douglas W.; Kennefick, Julia; Kennefick, Daniel; Seigar, Marc S.; Lacy, Claud H. S.; Puerari, Ivânio

    2012-04-01

    A logarithmic spiral is a prominent feature appearing in a majority of observed galaxies. This feature has long been associated with the traditional Hubble classification scheme, but historical quotes of pitch angle of spiral galaxies have been almost exclusively qualitative. We have developed a methodology, utilizing two-dimensional fast Fourier transformations of images of spiral galaxies, in order to isolate and measure the pitch angles of their spiral arms. Our technique provides a quantitative way to measure this morphological feature. This will allow comparison of spiral galaxy pitch angle to other galactic parameters and test spiral arm genesis theories. In this work, we detail our image processing and analysis of spiral galaxy images and discuss the robustness of our analysis techniques.
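The 2D FFT decomposition itself is involved, but the quantity it measures follows from the defining property of a logarithmic spiral, r = r0·e^(θ·tanφ): ln(r) is linear in θ, with slope tan(pitch angle). A minimal synthetic check of that relation (not the authors' pipeline; the spiral parameters are made up):

```python
import numpy as np

# A logarithmic spiral satisfies r = r0 * exp(theta * tan(phi)), so
# ln(r) is linear in theta and the slope of that line is tan(pitch angle).
phi_true = np.radians(15.0)                 # assumed pitch angle
theta = np.linspace(0.0, 4.0 * np.pi, 500)
r = 2.0 * np.exp(theta * np.tan(phi_true))  # synthetic spiral arm

slope, _ = np.polyfit(theta, np.log(r), 1)  # fit ln(r) vs theta
phi_measured = np.degrees(np.arctan(slope))
print(round(phi_measured, 2))  # → 15.0
```

The FFT approach effectively performs this slope measurement in the frequency domain over whole images, which makes it robust to noise and to arm segments that a direct point fit would miss.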

  2. Pathophysiology of Degenerative Mitral Regurgitation: New 3-Dimensional Imaging Insights.

    PubMed

    Antoine, Clemence; Mantovani, Francesca; Benfari, Giovanni; Mankad, Sunil V; Maalouf, Joseph F; Michelena, Hector I; Enriquez-Sarano, Maurice

    2018-01-01

    Despite its high prevalence, little is known about mechanisms of mitral regurgitation in degenerative mitral valve disease apart from the leaflet prolapse itself. Mitral valve is a complex structure, including mitral annulus, mitral leaflets, papillary muscles, chords, and left ventricular walls. All these structures are involved in physiological and pathological functioning of this valvuloventricular complex but up to now were difficult to analyze because of inherent limitations of 2-dimensional imaging. The advent of 3-dimensional echocardiography, computed tomography, and cardiac magnetic resonance imaging overcoming these limitations provides new insights into mechanistic analysis of degenerative mitral regurgitation. This review will detail the contribution of quantitative and qualitative dynamic analysis of mitral annulus and mitral leaflets by new imaging methods in the understanding of degenerative mitral regurgitation pathophysiology. © 2018 American Heart Association, Inc.

  3. Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.

    PubMed

    Xia, Jianguo; Wishart, David S

    2016-09-07

MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) for most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering, and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. Copyright © 2016 John Wiley & Sons, Inc.
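Of the methods listed, PCA is the easiest to sketch: after mean-centering, the components fall out of an SVD. The simulated two-group intensity matrix below is hypothetical and this is not MetaboAnalyst's code, just the underlying linear algebra:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical intensity matrix: 20 samples x 50 metabolites, two groups
X = rng.normal(size=(20, 50))
X[:10] += 2.0                          # group 1 shifted up in every metabolite

Xc = X - X.mean(axis=0)                # mean-centering (one common normalization)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                         # sample scores on the principal components
explained = s ** 2 / (s ** 2).sum()    # fraction of variance per component
print(explained[:3].round(3))
```

With a consistent group shift across metabolites, the first component captures the group difference, which is why PCA score plots are a standard first look at metabolomic data.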

  4. Synthesizing qualitative and quantitative evidence on non-financial access barriers: implications for assessment at the district level.

    PubMed

    O'Connell, Thomas S; Bedford, K Juliet A; Thiede, Michael; McIntyre, Di

    2015-06-09

A key element of the global drive to universal health coverage is ensuring access to needed health services for everyone, and to pursue this goal in an equitable way. This requires concerted efforts to reduce disparities in access through understanding and acting on barriers facing communities with the lowest utilisation levels. Financial barriers dominate the empirical literature on health service access. Unless the full range of access barriers are investigated, efforts to promote equitable access to health care are unlikely to succeed. This paper therefore focuses on exploring the nature and extent of non-financial access barriers. We draw upon two structured literature reviews on barriers to access and utilization of maternal, newborn and child health services in Ghana, Bangladesh, Vietnam and Rwanda. One review analyses access barriers identified in published literature using qualitative research methods; the other, barriers identified through quantitative analysis of household survey data. We then synthesised the key qualitative and quantitative findings through a conjoint iterative analysis. Five dominant themes on non-financial access barriers were identified: ethnicity; religion; physical accessibility; decision-making, gender and autonomy; and knowledge, information and education. The analysis highlighted that non-financial factors pose considerable barriers to access, many of which relate to the acceptability dimension of access and are challenging to address. Another key finding is that quantitative research methods, while yielding important findings, are inadequate for understanding non-financial access barriers in sufficient detail to develop effective responses. Qualitative research is critical in filling this gap. The analysis also indicates that the nature of non-financial access barriers varies considerably, not only between countries but also between different communities within individual countries. 
To adequately understand access barriers as a basis for developing effective strategies to address them, mixed-methods approaches are required. From an equity perspective, communities with the lowest utilisation levels should be prioritised and the access barriers specific to that community identified. It is, therefore, critical to develop approaches that can be used at the district level to diagnose and act upon access barriers if we are to pursue an equitable path to universal health coverage.

  5. Global analysis of the yeast lipidome by quantitative shotgun mass spectrometry.

    PubMed

    Ejsing, Christer S; Sampaio, Julio L; Surendranath, Vineeth; Duchoslav, Eva; Ekroos, Kim; Klemm, Robin W; Simons, Kai; Shevchenko, Andrej

    2009-02-17

    Although the transcriptome, proteome, and interactome of several eukaryotic model organisms have been described in detail, lipidomes remain relatively uncharacterized. Using Saccharomyces cerevisiae as an example, we demonstrate that automated shotgun lipidomics analysis enabled lipidome-wide absolute quantification of individual molecular lipid species by streamlined processing of a single sample of only 2 million yeast cells. By comparative lipidomics, we achieved the absolute quantification of 250 molecular lipid species covering 21 major lipid classes. This analysis provided approximately 95% coverage of the yeast lipidome, achieved with a 125-fold improvement in sensitivity compared with previous approaches. Comparative lipidomics demonstrated that growth temperature and defects in lipid biosynthesis induce ripple effects throughout the molecular composition of the yeast lipidome. This work serves as a resource for molecular characterization of eukaryotic lipidomes, and establishes shotgun lipidomics as a powerful platform for complementing biochemical studies and other systems-level approaches.

  6. The effect of mud therapy on pain relief in patients with knee osteoarthritis: a meta-analysis of randomized controlled trials.

    PubMed

    Liu, Hua; Zeng, Chao; Gao, Shu-guang; Yang, Tuo; Luo, Wei; Li, Yu-sheng; Xiong, Yi-lin; Sun, Jin-peng; Lei, Guang-hua

    2013-10-01

    A meta-analysis was conducted to examine the effect of mud therapy on pain relief in patients with knee osteoarthritis (OA). A detailed search of PubMed®/MEDLINE® was undertaken to identify randomized controlled trials and prospective comparative studies published before 9 March 2013 that compared mud therapy with control group treatments in patients with knee OA. A quantitative meta-analysis of seven studies (410 patients) was performed. There was a significant difference between the groups in the visual analogue scale pain score (standardized mean difference [SMD] -0.73) and Western Ontario and McMaster Universities Osteoarthritis Index pain score (SMD -0.30), with differences in favour of mud therapy. Mud therapy is a favourable option for pain relief in patients with knee OA. Additional high-quality randomized controlled trials need to be conducted to explore this issue further and to confirm this conclusion.
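    The standardized mean difference (SMD) statistics reported above can be illustrated from group summary data. A minimal sketch, with invented numbers and illustrative function names (not the meta-analysis authors' code or data):

```python
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Cohen's d) between two groups."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def fixed_effect_pooled_smd(smds, variances):
    """Inverse-variance (fixed-effect) pooled SMD across studies."""
    weights = [1.0 / v for v in variances]
    return sum(w * d for w, d in zip(weights, smds)) / sum(weights)

# Treatment mean 10 vs control mean 12 (a lower pain score favours treatment):
d = cohens_d(10.0, 2.0, 50, 12.0, 2.0, 50)                    # -1.0
pooled = fixed_effect_pooled_smd([-1.0, -0.5], [0.04, 0.04])  # -0.75
```

A negative pooled SMD, as in the abstract, indicates lower pain scores in the mud-therapy arm than in the control arm.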

  7. Multi-component quantitation of meso/nanostructural surfaces and its application to local chemical compositions of copper meso/nanostructures self-organized on silica

    NASA Astrophysics Data System (ADS)

    Huang, Chun-Yi; Chang, Hsin-Wei; Chang, Che-Chen

    2018-03-01

    Knowledge about the chemical compositions of meso/nanomaterials is fundamental to development of their applications in advanced technologies. Auger electron spectroscopy (AES) is an effective analysis method for the characterization of meso/nanomaterial structures. Although a few studies have reported the use of AES for the analysis of the local composition of these structures, none have explored in detail the validity of the meso/nanoanalysis results generated by the AES instrument. This paper addresses the limitations of AES and the corrections necessary to offset them for this otherwise powerful meso/nanoanalysis tool. The results of corrections made to the AES multi-point analysis of high-density copper-based meso/nanostructures provide major insights into their local chemical compositions and technological prospects, which the primitive composition output of the AES instrument failed to provide.

  8. Regimes of wrinkling in pressurized elastic shells

    PubMed Central

    2017-01-01

    We consider the point indentation of a pressurized elastic shell. It has previously been shown that such a shell is subject to a wrinkling instability as the indentation depth is quasi-statically increased. Here we present detailed analysis of this wrinkling instability using a combination of analytical techniques and finite-element simulations. In particular, we study how the number of wrinkles observed at the onset of instability grows with increasing pressurization. We also study how, for fixed pressurization, the number of wrinkles changes both spatially and with increasing indentation depth beyond onset. This ‘Far from threshold’ analysis exploits the largeness of the wrinkle wavenumber that is observed at high pressurization and leads to quantitative differences with the standard ‘Near threshold’ stability analysis. This article is part of the themed issue ‘Patterning through instabilities in complex media: theory and applications.’ PMID:28373387

  9. Water quality monitor (EMPAX instrument)

    NASA Technical Reports Server (NTRS)

    Kelliher, Warren C.; Clark, Ben; Thornton, Mike

    1991-01-01

    The impetus of the Viking Mission to Mars led to the first miniaturization of an X-ray Fluorescence Spectrometer (XRFS). Two units were flown on the Viking Mission and operated successfully for two years, analyzing the elemental composition of the Martian soil. Under a Bureau of Mines/NASA Technology Utilization project, this XRFS design was used to produce a battery-powered, portable unit for elemental analysis of geological samples. This paper details the design improvements and additional sampling capabilities incorporated into a second-generation portable XRFS funded by the EPA/NASA Technology Utilization project. The unit, Environment Monitoring with Portable Analysis by X-ray (EMPAX), was developed specifically for quantitative elemental determination. For the EPA and any industry affected by environmental concerns, the EMPAX fulfills a critical need for on-site, real-time analysis of toxic metal contamination. A patent was issued on EMPAX, but a commercial manufacturer is still being sought.

  10. Automated quantitative histology reveals vascular morphodynamics during Arabidopsis hypocotyl secondary growth.

    PubMed

    Sankar, Martial; Nieminen, Kaisa; Ragni, Laura; Xenarios, Ioannis; Hardtke, Christian S

    2014-02-11

    Among various advantages, their small size makes model organisms preferred subjects of investigation. Yet, even in model systems, detailed analysis of numerous developmental processes at the cellular level is severely hampered by their scale. For instance, secondary growth of Arabidopsis hypocotyls creates a radial pattern of highly specialized tissues comprising several thousand cells, starting from a few dozen. This dynamic process is difficult to follow because of its scale and because it can only be investigated invasively, precluding comprehensive understanding of the cell proliferation, differentiation, and patterning events involved. To overcome this limitation, we established an automated quantitative histology approach. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with automated cell type recognition through machine learning, we could establish a cellular-resolution atlas that reveals vascular morphodynamics during secondary growth, for example, equidistant phloem pole formation. DOI: http://dx.doi.org/10.7554/eLife.01567.001.
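    The custom segmentation pipeline mentioned above is not detailed in this abstract; purely as an illustration, the core of any such step, labelling connected foreground regions as candidate cells, can be sketched in plain Python (the toy grid and the 4-connectivity choice are assumptions, not the authors' implementation):

```python
from collections import deque

def label_components(mask):
    """Label 4-connected foreground regions in a binary grid.
    Returns (labels, count); each labelled region is a candidate cell."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not labels[r][c]:
                count += 1                       # start a new region
                queue = deque([(r, c)])
                labels[r][c] = count
                while queue:                     # breadth-first flood fill
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = count
                            queue.append((ny, nx))
    return labels, count

# Two separated bright regions -> two labelled "cells".
grid = [[1, 1, 0, 0],
        [1, 0, 0, 1],
        [0, 0, 1, 1]]
labels, n_cells = label_components(grid)  # n_cells == 2
```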

  11. Enamel microcracks in terms of orthodontic treatment: A novel method for their detection and evaluation.

    PubMed

    Dumbryte, Irma; Linkeviciene, Laura; Linkevicius, Tomas; Malinauskas, Mangirdas

    2017-07-26

    The study aimed to introduce currently available techniques for the detection of enamel microcracks (EMCs) and to present a method for direct quantitative analysis of an individual EMC. Measurements of detailed EMC characteristics (location, length, and width) were taken from reconstructed images of the buccal tooth surface (teeth extracted from two age groups of patients) employing scanning electron microscopy (SEM) and our derived formulas, before and after ceramic bracket removal. Measured EMC parameters for the younger age group were 2.41 µm (width) and 3.68 mm (length) before debonding and 2.73 µm and 3.90 mm after; for the older group, 4.03 µm and 4.35 mm before and 4.80 µm and 4.37 mm after bracket removal. Following debonding, EMCs increased for both groups; however, the changes in width and length were statistically insignificant. Regardless of the age group, the proposed method enabled precise detection of the same EMC before and after debonding, and quantitative examination of its characteristics.

  12. Is procrastination all that "bad"? A qualitative study of academic procrastination and self-worth in postgraduate university students.

    PubMed

    Abramowski, Anna

    2018-01-01

    Most of the existing literature has investigated the construct of procrastination using quantitative paradigms, primarily self-administered questionnaires. However, such approaches limit insight into, elaboration of, and deeper understanding of the central facets that might influence procrastination. The present qualitative study explored how a sample of postgraduate students from Cambridge University represented academic procrastination, framed within their personal perspectives and context, using semistructured interviews. This study extends the existing quantitative literature by adding students' personal narratives and voices. Ten postgraduate students were interviewed and the data were analyzed using thematic analysis. The preponderance of the literature on academic procrastination has described it as a maladaptive and detrimental behavior. However, the present study found evidence supporting the existence of a positive form of procrastination as well: procrastination can sometimes be worthwhile, allowing students further thinking time on a task and enabling them to give more attention to detail. This suggests a reconsideration of the negative image commonly associated with procrastination.

  14. On A Problem Of Propagation Of Shock Waves Generated By Explosive Volcanic Eruptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gusev, V. A.; Sobissevitch, A. L.

    2008-06-24

    Interdisciplinary study of flows of matter and energy in geospheres has become one of the most significant advances in Earth sciences, carried out by means of direct quantitative estimations based on detailed analysis of geological and geophysical observations and experimental data. The present contribution is an interdisciplinary study in nonlinear acoustics and physical volcanology dedicated to shock-wave propagation in a viscous and inhomogeneous medium. The equations governing the evolution of shock waves with an arbitrary initial profile and an arbitrary beam cross-section are obtained. For the case of a low-viscosity medium, an asymptotic solution for calculating the profile of a shock wave at an arbitrary point has been derived. The analytical solution of the problem of propagation of shock pulses from the atmosphere into a two-phase fluid-saturated geophysical medium is analysed. Quantitative estimations were carried out with respect to experimental results obtained in the course of real explosive volcanic eruptions.

  15. Quantitation of heavy ion damage to the mammalian brain - Some preliminary findings

    NASA Technical Reports Server (NTRS)

    Cox, A. B.; Kraft, L. M.

    1984-01-01

    For several years, studies have been conducted on the late effects of particulate radiations in mammalian tissues, in particular the brains of rodents and lagomorphs. Recently, it has become feasible to quantify pathological damage and morpho-physiologic alterations accurately in large numbers of histological specimens. New investigative procedures make use of computer-assisted automated image analysis systems. Details regarding the methodology employed are discussed along with the results obtained. Radiations of high linear energy transfer (LET) apparently cause earlier and more dramatic shrinkage of olfactory glomeruli in exposed rabbit brains than comparable doses of Co-60 gamma photons.

  16. Venom gland transcriptomics for identifying, cataloging, and characterizing venom proteins in snakes.

    PubMed

    Brahma, Rajeev Kungur; McCleary, Ryan J R; Kini, R Manjunatha; Doley, Robin

    2015-01-01

    Snake venoms are cocktails of protein toxins that play important roles in capture and digestion of prey. Significant qualitative and quantitative variation in snake venom composition has been observed among and within species. Understanding these variations in protein components is instrumental in interpreting clinical symptoms during human envenomation and in searching for novel venom proteins with potential therapeutic applications. In the last decade, transcriptomic analyses of venom glands have helped in understanding the composition of various snake venoms in great detail. Here we review transcriptomic analysis as a powerful tool for understanding venom profile, variation and evolution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Fluorescence Spectroscopy for the Monitoring of Food Processes.

    PubMed

    Ahmad, Muhammad Haseeb; Sahar, Amna; Hitzmann, Bernd

    Different analytical techniques have been used to examine the complexity of food samples. Among them, fluorescence spectroscopy cannot be ignored in the development of rapid and non-invasive analytical methodologies. It is one of the most sensitive spectroscopic approaches employed, in combination with different chemometric tools, in the identification, classification, authentication, quantification, and optimization of different parameters during food handling, processing, and storage. Chemometrics helps to retrieve useful information from the spectral data utilized in the characterization of food samples. This contribution discusses in detail the potential of fluorescence spectroscopy of different foods, such as dairy, meat, fish, eggs, edible oil, cereals, fruit, and vegetables, for qualitative and quantitative analysis with different chemometric approaches.

  18. Persistent stability of a chaotic system

    NASA Astrophysics Data System (ADS)

    Huber, Greg; Pradas, Marc; Pumir, Alain; Wilkinson, Michael

    2018-02-01

    We report that trajectories of a one-dimensional model for inertial particles in a random velocity field can remain stable for a surprisingly long time, despite the fact that the system is chaotic. We provide a detailed quantitative description of this effect by developing the large-deviation theory for fluctuations of the finite-time Lyapunov exponent of this system. Specifically, the determination of the entropy function for the distribution reduces to the analysis of a Schrödinger equation, which is tackled by semi-classical methods. The system has 'generic' instability properties, and we consider the broader implications of our observation of long-term stability in chaotic systems.
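    The large-deviation form invoked here is the standard one; as a generic statement of the theory (not an equation quoted from the paper), the distribution of the finite-time Lyapunov exponent λ over a time t behaves, for large t, as

```latex
P(\lambda, t) \sim \exp\!\left[ -t\, S(\lambda) \right], \qquad t \to \infty ,
```

    where the entropy (rate) function S(λ) is non-negative and vanishes at the long-time mean exponent. Determining S(λ) is the step that the abstract reduces to the analysis of a Schrödinger equation by semi-classical methods.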

  19. Optical actuators for fly-by-light applications

    NASA Astrophysics Data System (ADS)

    Chee, Sonny H. S.; Liu, Kexing; Measures, Raymond M.

    1993-04-01

    A review of optomechanical interfaces is presented. A detailed quantitative and qualitative analysis of the University of Toronto Institute for Aerospace Studies (UTIAS) box, optopneumatics, optical activation of a bimetal, optical activation of the shape memory effect, and optical activation of the pyroelectric effects is given. The UTIAS box is found to display a good conversion efficiency and a high bandwidth. A preliminary UTIAS box design has achieved a conversion efficiency of about 1/6 of the theoretical limit and a bandwidth of 2 Hz. In comparison to previous optomechanical interfaces, the UTIAS box has the highest pressure development to optical power ratio (at least an order of magnitude greater).

  20. Using databases in medical education research: AMEE Guide No. 77.

    PubMed

    Cleland, Jennifer; Scott, Neil; Harrild, Kirsten; Moffat, Mandy

    2013-05-01

    This AMEE Guide offers an introduction to the use of databases in medical education research. It is intended for those who are contemplating conducting research in medical education but are new to the field. The Guide is structured around the process of planning your research so that data collection, management and analysis are appropriate for the research question. Throughout, we consider contextual possibilities and constraints to educational research using databases, such as the resources available, and provide concrete examples of medical education research to illustrate many points. The first section of the Guide explains the difference between different types of data and how to classify data, and addresses the rationale for research using databases in medical education. We explain the difference between qualitative research and qualitative data, the difference between categorical and quantitative data, and the different types of data that fall into these categories. The Guide reviews the strengths and weaknesses of qualitative and quantitative research. The next section is structured around how to work with quantitative and qualitative databases and provides guidance on the many practicalities of setting up a database. This includes how to organise your database, including anonymising data and coding, as well as preparing and describing your data so it is ready for analysis. The critical matter of the ethics of using databases in medical education research, including using routinely collected data versus data collected for research purposes, and issues of confidentiality, is discussed. Core to the Guide is drawing out the similarities and differences in working with different types of data and different types of databases. Future AMEE Guides in the research series will address statistical analysis of data in more detail.

  1. Differential Mobility Spectrometry for Improved Selectivity in Hydrophilic Interaction Liquid Chromatography-Tandem Mass Spectrometry Analysis of Paralytic Shellfish Toxins

    NASA Astrophysics Data System (ADS)

    Beach, Daniel G.

    2017-08-01

    Paralytic shellfish toxins (PSTs) are neurotoxins produced by dinoflagellates and cyanobacteria that cause paralytic shellfish poisoning in humans. PST quantitation by LC-MS is challenging because of their high polarity, lability as gas-phase ions, and large number of potentially interfering analogues. Differential mobility spectrometry (DMS) has the potential to improve the performance of LC-MS methods for PSTs in terms of selectivity and limits of detection. This work describes a comprehensive investigation of the separation of 16 regulated PSTs by DMS and the development of highly selective LC-DMS-MS methods for PST quantitation. The effects of all DMS parameters on the separation of PSTs from one another were first investigated in detail. The labile nature of 11α-gonyautoxin epimers gave unique insight into fragmentation of labile analytes before, during, and after the DMS analyzer. Two sets of DMS parameters were identified that either optimized the resolution of PSTs from one another or transmitted them at a limited number of compensation voltage (CV) values corresponding to structural subclasses. These were used to develop multidimensional LC-DMS-MS/MS methods using existing HILIC-MS/MS parameters. In both cases, improved selectivity was observed when using DMS, and the quantitative capabilities of a rapid UPLC-DMS-MS/MS method were evaluated. Limits of detection of the developed method were similar to those without DMS, and differences were highly analyte-dependent. Analysis of shellfish matrix reference materials showed good agreement with established methods. The developed methods will be useful in cases where specific matrix interferences are encountered in the LC-MS/MS analysis of PSTs in complex biological samples.

  2. Quantitative Investigation of Protein-Nucleic Acid Interactions by Biosensor Surface Plasmon Resonance.

    PubMed

    Wang, Shuo; Poon, Gregory M K; Wilson, W David

    2015-01-01

    Biosensor-surface plasmon resonance (SPR) technology has emerged as a powerful label-free approach for the study of nucleic acid interactions in real time. The method provides simultaneous equilibrium and kinetic characterization for biomolecular interactions with low sample requirements and without the need for external probes. A detailed and practical guide for protein-DNA interaction analyses using biosensor-SPR methods is presented. Details of SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips and samples, experimental design, quantitative and qualitative data analyses and presentation. A specific example of the interaction of a transcription factor with DNA is provided with results evaluated by both kinetic and steady-state SPR methods.
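    The steady-state SPR analysis mentioned above conventionally fits the equilibrium response to a 1:1 Langmuir binding isotherm; a minimal sketch under that assumption (function and parameter names are illustrative, not from the guide):

```python
def steady_state_response(conc, r_max, k_d):
    """Equilibrium SPR response for 1:1 binding:
    R_eq = R_max * C / (K_D + C),
    where C is analyte concentration and K_D the dissociation constant."""
    return r_max * conc / (k_d + conc)

# At C == K_D the response is half of R_max, the usual graphical check
# when fitting steady-state affinity data.
half_max = steady_state_response(1e-9, 100.0, 1e-9)  # ~50.0
```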

  3. Quantitative Studies on the Propagation and Extinction of Near-Limit Premixed Flames Under Normal and Microgravity

    NASA Technical Reports Server (NTRS)

    Dong, Y.; Spedding, G. R.; Egolfopoulos, F. N.; Miller, F. J.

    2003-01-01

    The main objective of this research is to introduce accurate fluid-mechanics measurement diagnostics in the 2.2-s drop tower for determining the detailed flow field at the states of extinction. These results are important because they can then be compared with confidence against detailed numerical simulations, providing important insight into near-limit phenomena controlled by poorly understood kinetics and thermal radiation processes. Past qualitative studies did enhance our general understanding of the subject. However, quantitative studies are essential for validating existing models that can subsequently be used to describe near-limit phenomena capable of initiating catastrophic events in micro- and/or reduced-gravity environments.

  4. Let's talk about empathy!

    PubMed

    Robieux, Léonore; Karsenti, Lucille; Pocard, Marc; Flahault, Cécile

    2018-01-01

    Research faces a challenge to find a shared, adequate and scientific definition of empathy. Our work aimed to analyze what clinical empathy is in the specific context of cancer care and to identify the effect of empathy in it. This study gives voice to physicians with extensive experience in cancer care. This original research combines qualitative data collection and quantitative data analysis. Semi-structured individual interviews were conducted with 25 physicians. The content of the interviews was analyzed according to the Content Analysis Technique. Empathy is described according to six dimensions that give a strong role to interpersonal and cognitive skills. This description integrates previous and various conceptualizations of clinical empathy. Physicians detail the beneficial effects of clinical empathy on patients' outcomes and well-being as well as physicians' practices. Physician interviews also revealed the relationship between empathic concerns and physicians' emotional difficulties. Empathy in cancer care is a complex process and a multicomponent competence. This operational description of clinical empathy has three main implications: to draw up a training program for physicians, to detail recommendations for physicians' work-related quality of life and to develop new tools to measure empathy. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Modelling and analysis of creep deformation and fracture in a 1 Cr 1/2 Mo ferritic steel

    NASA Astrophysics Data System (ADS)

    Dyson, B. F.; Osgerby, D.

    A quantitative model, based upon a proposed new mechanism of creep deformation in particle-hardened alloys, has been validated by analysis of creep data from a 13CrMo 4 4 (1Cr 1/2 Mo) material tested under a range of stresses and temperatures. The methodology that has been used to extract the model parameters quantifies, as a first approximation, only the main degradation (damage) processes - in the case of the 1Cr 1/2 Mo steel, these are considered to be the parallel operation of particle coarsening and a progressively increasing stress due to a constant-load boundary condition. These 'global' model parameters can then be modified (only slightly) as required to obtain a detailed description and 'fit' to the rupture lifetime and strain/time trajectory of any individual test. The global model parameter approach may be thought of as predicting average behavior, and the detailed fits as taking account of uncertainties (scatter) due to variability in the material. Using the global parameter dataset, predictions have also been made of behavior under biaxial stressing, constant straining rate and constant total strain (stress relaxation), and of the likely success or otherwise of metallographic and mechanical remanent lifetime procedures.

  6. Analysis of Invasion Dynamics of Matrix-Embedded Cells in a Multisample Format.

    PubMed

    Van Troys, Marleen; Masuzzo, Paola; Huyck, Lynn; Bakkali, Karima; Waterschoot, Davy; Martens, Lennart; Ampe, Christophe

    2018-01-01

    In vitro tests of cancer cell invasion are the "first line" tools of preclinical researchers for screening the multitude of chemical compounds or cell perturbations that may aid in halting or treating cancer malignancy. In order to have predictive value or to contribute to designing personalized treatment regimes, these tests need to take into account the cancer cell environment and measure effects on invasion in sufficient detail. The in vitro invasion assays presented here are a trade-off between feasibility in a multisample format and mimicking the complexity of the tumor microenvironment. They allow testing multiple samples and conditions in parallel using 3D-matrix-embedded cells and deal with the heterogeneous behavior of an invading cell population in time. We describe the steps to take, the technical problems to tackle and useful software tools for the entire workflow: from the experimental setup to the quantification of the invasive capacity of the cells. The protocol is intended to guide researchers to standardize experimental set-ups and to annotate their invasion experiments in sufficient detail. In addition, it provides options for image processing and a solution for storage, visualization, quantitative analysis, and multisample comparison of acquired cell invasion data.

  7. Quantitative analysis of cell columns in the cerebral cortex.

    PubMed

    Buxhoeveden, D P; Switala, A E; Roy, E; Casanova, M F

    2000-04-01

    We present a quantified imaging method that describes the cell column in mammalian cortex. The minicolumn is an ideal template with which to examine cortical organization because it is a basic unit of function, complete in itself, which interacts with adjacent and distant columns to form more complex levels of organization. The subtle details of columnar anatomy should reflect physiological changes that have occurred in evolution, as well as those that might be caused by pathologies in the brain. In this semiautomatic method, images of Nissl-stained tissue are digitized or scanned into a computer imaging system. The software detects the presence of cell columns and describes details of their morphology and of the surrounding space. Columns are detected automatically on the basis of cell-poor and cell-rich areas using a Gaussian distribution. A line is fit to the cell centers by least-squares analysis; this line becomes the center of the column, from which the precise location of every cell can be measured. On this basis, several algorithms describe the distribution of cells about the center line and in relation to the available surrounding space. Other algorithms use cluster analyses to determine the spatial orientation of every column.
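    The least-squares fitting step described above can be sketched as follows (the coordinates are hypothetical and this is only an illustration of the fitting idea, not the study's software):

```python
import numpy as np

# Hypothetical cell-centre coordinates (x = horizontal position, y = depth)
# for one putative minicolumn; not data from the study.
centres = np.array([[0.10, 0.0],
                    [-0.20, 1.0],
                    [0.15, 2.0],
                    [0.00, 3.0]])

# Fit x as a linear function of depth y: the fitted line is the column axis.
slope, intercept = np.polyfit(centres[:, 1], centres[:, 0], 1)

# Horizontal offset of each cell from the axis; the spread of these offsets
# describes how cells distribute about the column centre.
offsets = centres[:, 0] - (slope * centres[:, 1] + intercept)
```

Because the fit includes an intercept, the offsets average to zero by construction; it is their spread, not their mean, that characterizes how tightly cells cluster about the column axis.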

  8. Bidirectional Retroviral Integration Site PCR Methodology and Quantitative Data Analysis Workflow.

    PubMed

    Suryawanshi, Gajendra W; Xu, Song; Xie, Yiming; Chou, Tom; Kim, Namshin; Chen, Irvin S Y; Kim, Sanggu

    2017-06-14

    Integration Site (IS) assays are a critical component of the study of retroviral integration sites and their biological significance. In recent retroviral gene therapy studies, IS assays, in combination with next-generation sequencing, have been used as a cell-tracking tool to characterize clonal stem cell populations sharing the same IS. For the accurate comparison of repopulating stem cell clones within and across different samples, the detection sensitivity, data reproducibility, and high-throughput capacity of the assay are among the most important assay qualities. This work provides a detailed protocol and data analysis workflow for bidirectional IS analysis. The bidirectional assay can simultaneously sequence both upstream and downstream vector-host junctions. Compared to conventional unidirectional IS sequencing approaches, the bidirectional approach significantly improves IS detection rates and the characterization of integration events at both ends of the target DNA. The data analysis pipeline described here accurately identifies and enumerates identical IS sequences through multiple steps of comparison that map IS sequences onto the reference genome and determine sequencing errors. Using an optimized assay procedure, we have recently published the detailed repopulation patterns of thousands of Hematopoietic Stem Cell (HSC) clones following transplant in rhesus macaques, demonstrating for the first time the precise time point of HSC repopulation and the functional heterogeneity of HSCs in the primate system. The following protocol describes the step-by-step experimental procedure and data analysis workflow that accurately identifies and quantifies identical IS sequences.
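    The enumeration step of such a pipeline, counting reads that map to identical genomic coordinates as one clone, can be sketched as follows (the coordinates are invented for illustration; this is not the published workflow's code):

```python
from collections import Counter

# Hypothetical mapped integration sites, one (chromosome, position, strand)
# tuple per sequencing read; values are made up for this example.
reads = [
    ("chr1", 1_000_230, "+"),
    ("chr1", 1_000_230, "+"),   # same site observed again
    ("chr7", 55_210_998, "-"),
    ("chr1", 1_000_230, "+"),
]

# Reads with identical coordinates are treated as one clone; the per-site
# counts give a sequencing-depth-dependent proxy for clone abundance.
clone_counts = Counter(reads)
```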

  9. Basal paravian functional anatomy illuminated by high-detail body outline

    PubMed Central

    Wang, Xiaoli; Pittman, Michael; Zheng, Xiaoting; Kaye, Thomas G.; Falk, Amanda R.; Hartman, Scott A.; Xu, Xing

    2017-01-01

Body shape is a fundamental expression of organismal biology, but its quantitative reconstruction in fossil vertebrates is rare. Due to the absence of fossilized soft tissue evidence, the functional consequences of basal paravian body shape and its implications for the origins of avians and flight are not yet fully understood. Here we reconstruct the quantitative body outline of a fossil paravian Anchiornis based on high-definition images of soft tissues revealed by laser-stimulated fluorescence. This body outline confirms patagia-bearing arms, drumstick-shaped legs and a slender tail, features that were probably widespread among paravians. Finely preserved details also reveal similarities in propatagial and footpad form between basal paravians and modern birds, extending their record to the Late Jurassic. The body outline and soft tissue details suggest significant functional decoupling between the legs and tail in at least some basal paravians. The number of seemingly modern propatagial traits hints that feathering was a significant factor in how basal paravians utilized arm, leg and tail function for aerodynamic benefit. PMID:28248287

  10. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    PubMed Central

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-01-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increases. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions. PMID:24957323

  11. New Methods for Analysis of Spatial Distribution and Coaggregation of Microbial Populations in Complex Biofilms

    PubMed Central

    Almstrand, Robert; Daims, Holger; Persson, Frank; Sörensson, Fred

    2013-01-01

    In biofilms, microbial activities form gradients of substrates and electron acceptors, creating a complex landscape of microhabitats, often resulting in structured localization of the microbial populations present. To understand the dynamic interplay between and within these populations, quantitative measurements and statistical analysis of their localization patterns within the biofilms are necessary, and adequate automated tools for such analyses are needed. We have designed and applied new methods for fluorescence in situ hybridization (FISH) and digital image analysis of directionally dependent (anisotropic) multispecies biofilms. A sequential-FISH approach allowed multiple populations to be detected in a biofilm sample. This was combined with an automated tool for vertical-distribution analysis by generating in silico biofilm slices and the recently developed Inflate algorithm for coaggregation analysis of microbial populations in anisotropic biofilms. As a proof of principle, we show distinct stratification patterns of the ammonia oxidizers Nitrosomonas oligotropha subclusters I and II and the nitrite oxidizer Nitrospira sublineage I in three different types of wastewater biofilms, suggesting niche differentiation between the N. oligotropha subclusters, which could explain their coexistence in the same biofilms. Coaggregation analysis showed that N. oligotropha subcluster II aggregated closer to Nitrospira than did N. oligotropha subcluster I in a pilot plant nitrifying trickling filter (NTF) and a moving-bed biofilm reactor (MBBR), but not in a full-scale NTF, indicating important ecophysiological differences between these phylogenetically closely related subclusters. By using high-resolution quantitative methods applicable to any multispecies biofilm in general, the ecological interactions of these complex ecosystems can be understood in more detail. PMID:23892743

  12. Quantitative comparison of cognitive behavioral therapy and music therapy research: a methodological best-practices analysis to guide future investigation for adult psychiatric patients.

    PubMed

    Silverman, Michael J

    2008-01-01

While the music therapy profession is relatively young and small in size, it can treat a variety of clinical populations and has established a diverse research base. However, although the profession originated in work with persons diagnosed with mental illnesses, there is a considerable lack of quantitative research concerning the effects of music therapy with this population. Music therapy clinicians and researchers have reported on this lack of evidence and the difficulty in conducting psychosocial research on their interventions (Choi, 1997; Silverman, 2003a). While published studies have provided suggestions for future research, no studies have provided detailed propositions for the methodology and design of meticulous, high-quality randomized controlled psychiatric music therapy research. How have other psychotherapies built their research bases, and could the music therapy field borrow from their rigorous "methodological best practices" to strengthen its own literature base? Because the National Institute of Mental Health identifies cognitive behavioral therapy (CBT) as the treatment of choice among evidence-based psychotherapies, aspects of that psychotherapy's literature base were analyzed. The purpose of this literature analysis was to (a) analyze and identify components of high-quality quantitative CBT research for adult psychiatric consumers, (b) analyze and identify the variables and other elements of existing quantitative psychiatric music therapy research for adult consumers, and (c) compare the two data sets to identify the best methodological designs and variables for future quantitative music therapy research with the mental health population. A table analyzing randomized and thoroughly controlled studies involving the use of CBT for persons with severe mental illnesses is included to determine the chief components of high-quality experimental research designs and the implementation of quantitative clinical research.
The table also shows the same analyzed components for existing quantitative psychiatric music therapy research with adult consumers, thus highlighting potential areas and elements for future investigations. A second table depicts a number of potential dependent measures and their sources to be evaluated in future music therapy studies. A third table providing suggestions for future research is derived from a synthesis of the tables and is included to guide researchers and encourage the advancement and expansion of the current literature base. The body of the paper is a discussion of the results of the literature analysis derived from the tables, meta-analyses, and reviews of literature. It is hoped that this report will lead to the addition of future high-quality quantitative research to the psychiatric music therapy literature base and thus provide evidence-based services to as many persons with mental illnesses as possible.

  13. Resistant Behaviors by People with Alzheimer Dementia and Traumatic Brain Injury

    DTIC Science & Technology

    2017-09-01

    participants has completed the information for the research team to have collected quantitative data on caregiver burden and family quality of life for...those adverse behaviors. The combined qualitative, quantitative , and economic analyses will also provide pertinent information regarding the general...other achievements. Include a discussion of stated goals not met. Description shall include pertinent data and graphs in sufficient detail to explain

  14. SPICE Module for the Satellite Orbit Analysis Program (SOAP)

    NASA Technical Reports Server (NTRS)

    Coggi, John; Carnright, Robert; Hildebrand, Claude

    2008-01-01

A SPICE module for the Satellite Orbit Analysis Program (SOAP) precisely represents complex motion and maneuvers in an interactive, 3D animated environment with support for user-defined quantitative outputs. (SPICE stands for Spacecraft, Planet, Instrument, Camera-matrix, and Events). This module enables the SOAP software to exploit NASA mission ephemeris represented in the JPL Navigation and Ancillary Information Facility (NAIF) SPICE formats. Ephemeris types supported include position, velocity, and orientation for spacecraft and planetary bodies including the Sun, planets, natural satellites, comets, and asteroids. Entire missions can now be imported into SOAP for 3D visualization, playback, and analysis. The SOAP analysis and display features can now leverage detailed mission files to offer the analyst both a numerically correct and aesthetically pleasing combination of results that can be varied to study many hypothetical scenarios. The software provides a modeling and simulation environment that can encompass a broad variety of problems using orbital prediction. For example, ground coverage analysis, communications analysis, power and thermal analysis, and 3D visualization that provide the user with insight into complex geometric relations are included. The SOAP SPICE module allows distributed science and engineering teams to share common mission models of known pedigree, which greatly reduces duplication of effort and the potential for error. The use of the software spans all phases of the space system lifecycle, from the study of future concepts to operations and anomaly analysis. It allows SOAP software to correctly position and orient all of the principal bodies of the Solar System within a single simulation session along with multiple spacecraft trajectories and the orientation of mission payloads. In addition to the 3D visualization, the user can define numeric variables and x-y plots to quantitatively assess metrics of interest.

  15. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    PubMed

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, the neutral atmospheric gas and the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, automatic data analysis software, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak-finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields isotope ratios with an accuracy of up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width T_s as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
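The peak-integration step that follows peak finding can be sketched on a synthetic two-isotope spectrum. The Gaussian peak shapes, window width, and grid below are illustrative assumptions; the published algorithm is more elaborate:

```python
import numpy as np

def integrate_peak(mass, intensity, center, half_width):
    """Trapezoidal integration of one mass peak in center +/- half_width;
    a simplified stand-in for peak finding plus consecutive numerical
    integration as described above."""
    sel = (mass >= center - half_width) & (mass <= center + half_width)
    m, y = mass[sel], intensity[sel]
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(m)))

# Synthetic two-isotope spectrum: two Gaussian peaks of equal width,
# amplitudes 3:1 (purely illustrative, not instrument data).
mass = np.linspace(9.5, 11.5, 4000)
spec = 3.0 * np.exp(-(mass - 10.0) ** 2 / (2 * 0.01 ** 2)) \
     + 1.0 * np.exp(-(mass - 11.0) ** 2 / (2 * 0.01 ** 2))
# Isotope ratio as the ratio of integrated peak areas (expected ~1/3 here)
ratio = integrate_peak(mass, spec, 11.0, 0.1) / integrate_peak(mass, spec, 10.0, 0.1)
```

With equal peak widths the area ratio reduces to the amplitude ratio, which is why baseline handling and window choice dominate the achievable accuracy in practice.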

  16. [Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].

    PubMed

    Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie

    2013-11-01

In order to improve the accuracy of quantitative AES analysis, we combined XPS with AES and studied how to reduce the error of AES quantification. Pt-Co, Cu-Au and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantitative results by adjusting the Auger sensitivity factors until the two techniques gave consistent compositions. The accuracy of AES quantification with the revised sensitivity factors was then verified on further samples of different composition ratios; the results showed that the corrected relative sensitivity factors reduce the error of quantitative AES analysis to less than 10%. Peak definition is difficult in the integral-spectrum form of AES analysis, since choosing the starting and ending points that bound the characteristic Auger peak area involves great uncertainty. To make the analysis easier, we also processed the data in differential-spectrum form, performed the quantification on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and again verified the accuracy on samples of different composition ratios. The analytical error of quantitative AES analysis was thereby reduced to less than 9%. These results show that the accuracy of quantitative AES analysis can be greatly improved by using XPS to correct the Auger sensitivity factors, since matrix effects are then taken into account. The good consistency obtained demonstrates the feasibility of this method.
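The relation behind both the peak-area and peak-to-peak-height variants is the standard relative-sensitivity-factor quantification; the element names and intensities below are hypothetical:

```python
def atomic_fractions(peak_intensities, sensitivity_factors):
    """Relative-sensitivity-factor quantification:
        x_i = (I_i / S_i) / sum_j (I_j / S_j),
    where I_i is the measured AES intensity (peak area or peak-to-peak
    height) and S_i the relative sensitivity factor of element i.
    Correcting the Auger sensitivity factors against XPS, as described
    above, amounts to substituting revised S_i values here."""
    corrected = {el: i / sensitivity_factors[el]
                 for el, i in peak_intensities.items()}
    total = sum(corrected.values())
    return {el: v / total for el, v in corrected.items()}
```

Because the same formula is used for both spectrum forms, only the intensities and the (corrected) sensitivity factors change between the integral and differential analyses.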

  17. Oestrus synchronisation and superovulation alter the cervicovaginal mucus proteome of the ewe.

    PubMed

    Maddison, Jessie W; Rickard, Jessica P; Bernecic, Naomi C; Tsikis, Guillaume; Soleilhavoup, Clement; Labas, Valerie; Combes-Soia, Lucie; Harichaux, Gregoire; Druart, Xavier; Leahy, Tamara; de Graaf, Simon P

    2017-02-23

    Although essential for artificial insemination (AI) and MOET (multiple ovulation and embryo transfer), oestrus synchronisation and superovulation are associated with increased female reproductive tract mucus production and altered sperm transport. The effects of such breeding practices on the ovine cervicovaginal (CV) mucus proteome have not been detailed. The aim of this study was to qualitatively and quantitatively investigate the Merino CV mucus proteome in naturally cycling (NAT) ewes at oestrus and mid-luteal phase, and quantitatively compare CV oestrus mucus proteomes of NAT, progesterone synchronised (P4) and superovulated (SOV) ewes. Quantitative analysis revealed 60 proteins were more abundant during oestrus and 127 were more abundant during the luteal phase, with 27 oestrus specific and 40 luteal specific proteins identified. The oestrus proteins most disparate in abundance compared to mid-luteal phase were ceruloplasmin (CP), chitinase-3-like protein 1 (CHI3L1), clusterin (CLU), alkaline phosphatase (ALPL) and mucin-16 (MUC16). Exogenous hormones greatly altered the proteome with 51 and 32 proteins more abundant and 98 and 53 proteins less abundant, in P4 and SOV mucus, respectively when compared to NAT mucus. Investigation of the impact of these proteomic changes on sperm motility and longevity within mucus may help improve sperm transport and fertility following cervical AI. This manuscript is the first to detail the proteome of ovine cervicovaginal mucus using qualitative and quantitative proteomic methods over the oestrous cycle in naturally cycling ewes, and also after application of common oestrus synchronisation and superovulation practices. 
The investigation of the mucus proteome throughout both the follicular and luteal periods of the oestrous cycle, and also after oestrous synchronisation and superovulation provides information about the endocrine control and the effects that exogenous hormones have on protein expression in the female reproductive tract. This information contributes to the field by providing important information on the changes that occur to the cervicovaginal mucus proteome after use of exogenous hormones in controlled breeding programs, which are commonly used on farm and also in a research setting. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Improved medical image fusion based on cascaded PCA and shift invariant wavelet transforms.

    PubMed

    Reena Benjamin, J; Jayasree, T

    2018-02-01

In the medical field, radiologists need more informative and high-quality medical images to diagnose diseases. Image fusion plays a vital role in the field of biomedical image analysis. It aims to integrate the complementary information from multimodal images, producing a new composite image which is expected to be more informative for visual perception than any of the individual input images. The main objective of this paper is to improve the information, to preserve the edges and to enhance the quality of the fused image using cascaded principal component analysis (PCA) and shift invariant wavelet transforms. A novel image fusion technique based on cascaded PCA and shift invariant wavelet transforms is proposed in this paper. PCA in the spatial domain extracts relevant information from the large dataset based on eigenvalue decomposition, and the wavelet transform operating in the complex domain with shift invariant properties brings out more directional and phase details of the image. The maximum fusion rule applied in the dual-tree complex wavelet transform domain enhances the average information and morphological details. The input images of the human brain of two different modalities (MRI and CT) are collected from the whole brain atlas data distributed by Harvard University. Both MRI and CT images are fused using the cascaded PCA and shift invariant wavelet transform method. The proposed method is evaluated on three key factors, namely structure preservation, edge preservation, and contrast preservation. The experimental results and comparison with other existing fusion methods show the superior performance of the proposed image fusion framework in terms of visual and quantitative evaluations. In this paper, a complex wavelet-based image fusion has been discussed. The experimental results demonstrate that the proposed method enhances the directional features as well as fine edge details, while also reducing redundant details, artifacts, and distortions.
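The spatial-domain PCA stage of such a scheme can be sketched as below. This is the generic textbook PCA fusion rule (leading-eigenvector weighting), not the authors' full cascaded PCA plus shift-invariant wavelet pipeline:

```python
import numpy as np

def pca_fuse(img_a, img_b):
    """Spatial-domain PCA fusion of two co-registered images: weight
    each source by the components of the leading eigenvector of their
    joint 2x2 covariance matrix, so the image carrying more variance
    contributes more to the composite."""
    data = np.vstack([img_a.ravel(), img_b.ravel()]).astype(float)
    cov = np.cov(data)              # 2 x 2 covariance of the two sources
    _, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    v = np.abs(vecs[:, -1])         # leading principal component
    w = v / v.sum()                 # normalized fusion weights
    return w[0] * img_a + w[1] * img_b
```

In the cascaded design described above, a rule like this would be combined with a maximum-selection rule applied to shift-invariant wavelet coefficients.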

  19. Change analysis in the United Arab Emirates: An investigation of techniques

    USGS Publications Warehouse

    Sohl, Terry L.

    1999-01-01

Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in levels of adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
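Univariate image differencing, the simplest of the techniques compared, can be sketched as follows. The k-sigma change threshold is a common convention assumed here rather than taken from the report; the "enhanced" variant adds further normalization:

```python
import numpy as np

def difference_change(band_t1, band_t2, k=2.0):
    """Univariate image differencing between two co-registered bands of
    the same sensor (e.g., Landsat TM) from two dates: subtract, then
    flag pixels whose difference lies more than k standard deviations
    from the mean difference as 'changed'."""
    diff = band_t2.astype(float) - band_t1.astype(float)
    mask = np.abs(diff - diff.mean()) > k * diff.std()
    return diff, mask
```

The signed difference image carries the quantitative magnitude of change, while the boolean mask gives the locational information; neither, by itself, describes the qualitative nature of the change, which is where change-vector analysis excels.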

  20. Spectrally And Temporally Resolved Low-Light Level Video Microscopy

    NASA Astrophysics Data System (ADS)

    Wampler, John E.; Furukawa, Ruth; Fechheimer, Marcus

    1989-12-01

The IDG low-light video microscope system was designed to aid studies of localization of subcellular luminescence sources and stimulus/response coupling in single living cells using luminescent probes. Much of the motivation for the design of this instrument system came from the pioneering efforts of Dr. Reynolds (Reynolds, Q. Rev. Biophys. 5, 295-347; Reynolds and Taylor, Bioscience 30, 586-592), who showed the value of intensified video camera systems for the detection and localization of fluorescence and bioluminescence signals from biological tissues. Our instrument system has essentially two roles: 1) localization and quantitation of very weak bioluminescence signals, and 2) quantitation of intracellular environmental characteristics such as pH and calcium ion concentrations using fluorescent and bioluminescent probes. The instrument system exhibits an operating range of over one million fold, allowing visualization and enhancement of quantum-limited images with quantum-limited response, spectral analysis of fluorescence signals, and transmitted-light imaging. The computer control of the system implements rapid switching between light regimes, spatially resolved spectral scanning, and digital data processing for spectral shape analysis and for detailed analysis of the statistical distribution of single-cell measurements. The system design and the software algorithms used by the system are summarized. These design criteria are illustrated with examples taken from studies of bioluminescence, applications of bioluminescence to study developmental processes and gene expression in single living cells, and applications of fluorescent probes to study stimulus/response coupling in living cells.

  1. Quantification of sugars and sugar phosphates in Arabidopsis thaliana tissues using porous graphitic carbon liquid chromatography-electrospray ionization mass spectrometry.

    PubMed

    Antonio, Carla; Larson, Tony; Gilday, Alison; Graham, Ian; Bergström, Ed; Thomas-Oates, Jane

    2007-11-23

This work reports the development and optimisation of a negative ion mode on-line LC-ESI-MS/MS method for the sensitive targeted analysis of the key glycolytic intermediates, sugars and sugar phosphates from plants, using a porous graphitic carbon (PGC) stationary phase and an MS-compatible mobile phase. Using this newly developed method, separation and detection of a solution of standard compounds is achieved in less than 20 min. Target metabolite compounds were identified in plant extracts from their characteristic retention times and product ion spectra. This on-line PGC-ESI-MS/MS method shows good linearity over the concentration range 0–100 µM, selectivity, short analysis time, and limits of detection of 0.1 µM for the disaccharides trehalose (Tre), sucrose (Suc), and maltose, and 1.5 µM for the hexose phosphates fructose-6-phosphate (Fru6P), glucose-1-phosphate (Glc1P), and glucose-6-phosphate (Glc6P), and for phosphoenolpyruvate (PEP). This paper describes details of our method and its application to the simultaneous quantitative analysis of soluble sugars and sugar phosphates from Arabidopsis thaliana tissues. We have demonstrated the utility of our method for the analysis of biological samples by applying it to the simultaneous quantitation of changes in soluble sugars and sugar phosphates in A. thaliana Columbia-0 (Col-0) and its starchless phosphoglucomutase (pgm) mutant over a 12-h light/12-h dark growth cycle.
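One common way to derive limits of detection like those reported for such a method is the ICH-style estimate of 3.3 times the residual standard deviation of a calibration line over its slope. This is an assumed convention for illustration; the abstract does not state how its limits were obtained, and the calibration points below are hypothetical:

```python
import numpy as np

def limit_of_detection(conc, signal):
    """LOD estimate from a linear calibration as 3.3 * s_residual / slope
    (ICH-style convention, assumed here for illustration)."""
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    s = residuals.std(ddof=2)  # two fitted parameters: slope, intercept
    return 3.3 * s / slope

# Hypothetical calibration points (concentration in µM vs. peak area)
conc = np.array([0.0, 10.0, 25.0, 50.0, 100.0])
signal = 5.0 * conc + np.array([0.12, -0.08, 0.05, -0.10, 0.03])
lod = limit_of_detection(conc, signal)
```

The same fitted line also serves as the quantitation curve over the stated linear range.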

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smentkowski, Vincent S., E-mail: smentkow@ge.com

Changes in the oxidation state of an element can result in significant changes in the ionization efficiency, and hence signal intensity, during secondary ion mass spectrometry (SIMS) analysis; this is referred to as the SIMS matrix effect [Secondary Ion Mass Spectrometry: A Practical Handbook for Depth Profiling and Bulk Impurity Analysis, edited by R. G. Wilson, F. A. Stevie, and C. W. Magee (Wiley, New York, 1990)]. The SIMS matrix effect complicates quantitative analysis. Quantification of SIMS data requires the determination of relative sensitivity factors (RSFs), which can be used to convert the measured intensity into concentration units [Secondary Ion Mass Spectrometry: A Practical Handbook for Depth Profiling and Bulk Impurity Analysis, edited by R. G. Wilson, F. A. Stevie, and C. W. Magee (Wiley, New York, 1990)]. In this manuscript, the authors report both the RSFs determined for quantification of B in Si and SiO2 matrices using a dual-beam time-of-flight secondary ion mass spectrometry (ToF-SIMS) instrument, and the protocol they use to provide quantitative ToF-SIMS images and line-scan traces. The authors also compare RSF values determined using oxygen and Ar ion beams for erosion, discuss the problems that can be encountered when bulk calibration samples are used to determine RSFs, and remind the reader that errors in the molecular details of the matrix (density, volume, etc.) used to convert from atoms/cm^3 to other concentration units will propagate into errors in the determined concentrations.
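The RSF conversion itself is the standard point-by-point relation from the cited handbook, sketched here with hypothetical values:

```python
def impurity_concentration(i_impurity, i_matrix, rsf):
    """Standard SIMS quantification from the Wilson/Stevie/Magee
    handbook cited above:
        C_impurity [atoms/cm^3] = RSF * (I_impurity / I_matrix).
    Passing NumPy arrays applies the relation element-wise, which is how
    depth profiles, line-scan traces, and images are converted from
    measured intensity to concentration units."""
    return rsf * (i_impurity / i_matrix)
```

Because the RSF is matrix-specific, the same measured B intensity maps to different concentrations in Si and SiO2, which is exactly the matrix effect the abstract describes.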

  3. On the connection between autoimmunity, tic disorders and obsessive-compulsive disorders: a meta-analysis on anti-streptolysin O titres.

    PubMed

    Pozzi, Marco; Pellegrino, Paolo; Carnovale, Carla; Perrone, Valentina; Antoniazzi, Stefania; Perrotta, Cristiana; Radice, Sonia; Clementi, Emilio

    2014-12-01

    Anti-streptolysin O (ASO) titration is useful in the context of autoimmune pathologies, including specific cases of tic and obsessive-compulsive disorders occurring after streptococcal infections. There is currently a lack of consensus on the use of ASO titres; therefore we performed a meta-analysis to systematise available data and clarify the role of ASO titres in the context of neuropsychiatric disorders. A meta-analysis was performed on ASO titration in neuropsychiatric patients, including tic disorders and obsessive-compulsive disorders. Included studies reported numbers of positive subjects, depending on a chosen threshold, or detailed ASO titrations. Three hundred and twenty nine studies were identified, of which 13 were eligible for meta-analysis. Due to limited available data, only tic disorders were evaluated. The odds ratio of finding an abnormal ASO titre in patients was 3.22 (95% C.I. 1.51-6.88) as compared to healthy controls and 16.14 (95% C.I. 8.11-32.11) as compared to non-psychiatric patients. Studies using different thresholds were generally concordant. ASO titres were also compared quantitatively, finding an overall difference of the means of 70.50 U/ml (95% C.I. 25.21-115.80) in favour of patients with tic disorders. Based on current evidence, tic disorders are associated with a significant increase in ASO titres, evident both in a threshold-level perspective and on a quantitative level. These results encourage the systematisation of ASO titration in the context of tic disorders.
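The per-study odds ratios pooled in such a meta-analysis follow the standard 2x2-table computation with a log-normal confidence interval. This is a generic sketch of one study's contribution; the meta-analytic pooling across studies is not shown, and the counts below are hypothetical:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and normal-approximation 95% CI from a 2x2 table:
    a/b = patients with/without an abnormal ASO titre,
    c/d = controls with/without an abnormal ASO titre."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

An interval whose lower bound exceeds 1 indicates a significant excess of abnormal titres in patients, which is the threshold-level perspective the abstract contrasts with the quantitative comparison of mean titres.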

  4. Fast Determination of Ingredients in Solid Pharmaceuticals by Microwave-Enhanced In-Source Decay of Microwave Plasma Torch Mass Spectrometry.

    PubMed

    Su, Rui; Wang, Xinchen; Hou, Changming; Yang, Meiling; Huang, Keke; Chen, Huanwen

    2017-09-01

Rapid qualitative and quantitative analysis of solid samples (e.g., pharmaceutical preparations) by using a small, low-resolution mass spectrometer without MS/MS function is still a challenge in ambient pressure ionization mass spectrometric analysis. Herein, a practically efficient method termed microwave-enhanced in-source decay (MEISD), using microwave plasma torch desorption ionization coupled with time-of-flight mass spectrometry (MPTDI-TOF MS), was developed for fast analysis of pharmaceutical tablets using a miniature TOF mass spectrometer without tandem mass function. The intensity of ISD fragmentation was evaluated under different microwave power values. Several factors that might affect the signal intensity and fragmentation, including desorption distance and time, were systematically investigated. Both the protonated molecular ions and the major fragment ions of the active ingredients in tablets could be found in the full-scan mass spectra in positive ion mode, and were comparable to those obtained by a commercial LTQ-XL ion trap mass spectrometer. The structures of the ingredients could be elucidated in detail using the MEISD method, which advances our understanding of the desorption/ionization processes in the microwave plasma torch (MPT). Quantitative analysis of 10 tablets was achieved by full-scan MPTDI-TOF MS with a low limit of detection (LOD, 0.763 mg/g), acceptable relative standard deviation (RSD < 7.33%, n = 10), and an analysis time of 10 s per tablet, showing promising applications in high-throughput screening of counterfeit drugs.

  5. Amyloid deposition in the hippocampus and entorhinal cortex: Quantitative analysis of a transgenic mouse model

    PubMed Central

    Reilly, John F.; Games, Dora; Rydel, Russell E.; Freedman, Stephen; Schenk, Dale; Young, Warren G.; Morrison, John H.; Bloom, Floyd E.

    2003-01-01

    Various transgenic mouse models of Alzheimer's disease (AD) have been developed that overexpress mutant forms of amyloid precursor protein in an effort to elucidate more fully the potential role of β-amyloid (Aβ) in the etiopathogenesis of the disease. The present study represents the first complete 3D reconstruction of Aβ in the hippocampus and entorhinal cortex of PDAPP transgenic mice. Aβ deposits were detected by immunostaining and thioflavin fluorescence, and quantified by using high-throughput digital image acquisition and analysis. Quantitative analysis of amyloid load in hippocampal subfields showed a dramatic increase between 12 and 15 months of age, with little or no earlier detectable deposition. Three-dimensional reconstruction in the oldest brains visualized previously unrecognized sheets of Aβ coursing through the hippocampus and cerebral cortex. In contrast with previous hypotheses, compact plaques form before significant deposition of diffuse Aβ, suggesting that different mechanisms are involved in the deposition of diffuse amyloid and the aggregation into plaques. The dentate gyrus was the hippocampal subfield with the greatest amyloid burden. Sublaminar distribution of Aβ in the dentate gyrus correlated most closely with the termination of afferent projections from the lateral entorhinal cortex, mirroring the selective vulnerability of this circuit in human AD. This detailed temporal and spatial analysis of Aβ and compact amyloid deposition suggests that specific corticocortical circuits express selective, but late, vulnerability to the pathognomonic markers of amyloid deposition, and can provide a basis for detecting prior vulnerability factors. PMID:12697936

  6. Fast Determination of Ingredients in Solid Pharmaceuticals by Microwave-Enhanced In-Source Decay of Microwave Plasma Torch Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Su, Rui; Wang, Xinchen; Hou, Changming; Yang, Meiling; Huang, Keke; Chen, Huanwen

    2017-09-01

    Rapid qualitative and quantitative analysis of solid samples (e.g., pharmaceutical preparations) using a small, low-resolution mass spectrometer without MS/MS function is still a challenge in ambient pressure ionization mass spectrometric analysis. Herein, a practically efficient method termed microwave-enhanced in-source decay (MEISD), using microwave plasma torch desorption ionization coupled with time-of-flight mass spectrometry (MPTDI-TOF MS), was developed for fast analysis of pharmaceutical tablets on a miniature TOF mass spectrometer without tandem mass capability. The intensity of ISD fragmentation was evaluated under different microwave power values. Several factors that might affect the signal intensity and fragmentation, including desorption distance and time, were systematically investigated. Both the protonated molecular ions and the major fragment ions from the active ingredients in tablets could be found in the full-scan mass spectra in positive ion mode, and were comparable to those obtained with a commercial LTQ-XL ion trap mass spectrometer. The structures of the ingredients could be elucidated in detail using the MEISD method, which advances our understanding of the desorption/ionization processes in the microwave plasma torch (MPT). Quantitative analysis of 10 tablets was achieved by full-scan MPTDI-TOF MS with a low limit of detection (LOD, 0.763 mg/g), acceptable relative standard deviation (RSD < 7.33%, n = 10), and an analysis time of 10 s per tablet, showing promising applications in high-throughput screening of counterfeit drugs.

  7. A biomechanical modeling guided simultaneous motion estimation and image reconstruction technique (SMEIR-Bio) for 4D-CBCT reconstruction

    NASA Astrophysics Data System (ADS)

    Huang, Xiaokun; Zhang, You; Wang, Jing

    2017-03-01

    Four-dimensional (4D) cone-beam computed tomography (CBCT) enables motion tracking of anatomical structures and removes artifacts introduced by motion. However, the imaging time/dose of 4D-CBCT is substantially longer/higher than that of traditional 3D-CBCT. We previously developed a simultaneous motion estimation and image reconstruction (SMEIR) algorithm to reconstruct high-quality 4D-CBCT from a limited number of projections, reducing the imaging time/dose. However, the accuracy of SMEIR is limited in reconstructing low-contrast regions with fine structural details. In this study, we incorporate biomechanical modeling into the SMEIR algorithm (SMEIR-Bio) to improve the reconstruction accuracy in low-contrast regions with fine details. The efficacy of SMEIR-Bio is evaluated using 11 lung patient cases and compared to that of the original SMEIR algorithm. Qualitative and quantitative comparisons showed that SMEIR-Bio greatly enhances the accuracy of the reconstructed 4D-CBCT volume in low-contrast regions, which can potentially benefit multiple clinical applications including treatment outcome analysis.

  8. Peer tutoring programs in health professions schools.

    PubMed

    Santee, Jennifer; Garavalia, Linda

    2006-06-15

    Peer tutoring programs may be one method of maintaining quality of pharmacy education in the face of growing student enrollment and a small faculty body. A critical review of the literature was performed to ascertain whether peer tutoring programs improve or maintain the academic performance of health care professional students. Various electronic databases and abstracts from past American Association of Colleges of Pharmacy's annual meetings were searched to identify pertinent research. Only those articles with quantitative data, an experimental design, and comparative statistical analysis were included for review. Most studies found that peer tutoring had a positive impact on academic performance. These results may not be readily generalizable as there were numerous methodological flaws and limited descriptions of the programs and participants. Studies with better designs and more detail are needed to answer definitively whether peer tutoring is of benefit. Details of what resources were required should be included in the study to allow the reader to determine the feasibility of the intervention.

  9. Peer Tutoring Programs in Health Professions Schools

    PubMed Central

    Garavalia, Linda

    2006-01-01

    Objective Peer tutoring programs may be one method of maintaining quality of pharmacy education in the face of growing student enrollment and a small faculty body. A critical review of the literature was performed to ascertain whether peer tutoring programs improve or maintain the academic performance of health care professional students. Methods Various electronic databases and abstracts from past American Association of Colleges of Pharmacy's annual meetings were searched to identify pertinent research. Only those articles with quantitative data, an experimental design, and comparative statistical analysis were included for review. Results Most studies found that peer tutoring had a positive impact on academic performance. These results may not be readily generalizable as there were numerous methodological flaws and limited descriptions of the programs and participants. Implications Studies with better designs and more detail are needed to answer definitively whether peer tutoring is of benefit. Details of what resources were required should be included in the study to allow the reader to determine the feasibility of the intervention. PMID:17136190

  10. Comprehensive two-dimensional gas chromatography for the analysis of Fischer-Tropsch oil products.

    PubMed

    van der Westhuizen, Rina; Crous, Renier; de Villiers, André; Sandra, Pat

    2010-12-24

    The Fischer-Tropsch (FT) process involves a series of catalysed reactions of carbon monoxide and hydrogen, originating from coal, natural gas or biomass, leading to a variety of synthetic chemicals and fuels. The benefits of comprehensive two-dimensional gas chromatography (GC×GC) compared to one-dimensional GC (1D-GC) for the detailed investigation of the oil products of low and high temperature FT processes are presented. GC×GC provides more accurate quantitative data to construct Anderson-Schultz-Flory (ASF) selectivity models that correlate the FT product distribution with reaction variables. On the other hand, the high peak capacity and sensitivity of GC×GC allow the detailed study of components present at trace level. Analyses of the aromatic and oxygenated fractions of a high temperature FT (HT-FT) process are presented. GC×GC data have been used to optimise or tune the HT-FT process by using a lab-scale micro-FT-reactor. Copyright © 2010 Elsevier B.V. All rights reserved.
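For readers unfamiliar with the ASF selectivity model mentioned above: it predicts the entire FT product spectrum from a single chain-growth probability α. The sketch below is illustrative only; α and the carbon-number cut are assumed values, not data from the paper.

```python
# Illustrative sketch of the ASF distribution that GC×GC data are fitted to.
# alpha is the chain-growth probability; W_n is the weight fraction of
# products with n carbon atoms: W_n = n * (1 - alpha)^2 * alpha^(n - 1).

def asf_weight_fraction(n: int, alpha: float) -> float:
    """Weight fraction of carbon number n for chain-growth probability alpha."""
    return n * (1.0 - alpha) ** 2 * alpha ** (n - 1)

alpha = 0.85  # assumed value, typical of low-temperature FT
dist = {n: asf_weight_fraction(n, alpha) for n in range(1, 101)}

# Weight fractions over all chain lengths sum to (nearly) 1.
total = sum(dist.values())

# Example: weight fraction falling in a hypothetical C5-C11 "gasoline" cut.
gasoline_cut = sum(asf_weight_fraction(n, alpha) for n in range(5, 12))
```

A single α cannot reproduce deviations such as the elevated methane yield seen in real FT products, which is one reason more accurate GC×GC quantitation helps refine the model.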

  11. Election Turnout Statistics in Many Countries: Similarities, Differences, and a Diffusive Field Model for Decision-Making

    PubMed Central

    Borghesi, Christian; Raynal, Jean-Claude; Bouchaud, Jean-Philippe

    2012-01-01

    We study in detail the turnout rate statistics for 77 elections in 11 different countries. We show that the empirical results established in a previous paper for French elections appear to hold much more generally. We find in particular that the spatial correlation of turnout rates decays logarithmically with distance in all cases. This result is quantitatively reproduced by a decision model that assumes that each voter makes up his or her mind as a result of three influence terms: one totally idiosyncratic component, one city-specific term with short-ranged fluctuations in space, and one long-ranged correlated field which propagates diffusively in space. A detailed analysis reveals several interesting features: for example, different countries have different degrees of local heterogeneity and seem to be characterized by a different propensity for individuals to conform to the cultural norm. We furthermore find clear signs of herding (i.e., strongly correlated decisions at the individual level) in some countries, but not in others. PMID:22615762

  12. Host–guest complexes between cryptophane-C and chloromethanes revisited

    PubMed Central

    Takacs, Z; Soltesova, M; Kowalewski, J; Lang, J; Brotin, T; Dutasta, J-P

    2013-01-01

    Cryptophane-C is composed of two nonequivalent cyclotribenzylene caps, one of which contains methoxy group substituents on the phenyl rings. The two caps are connected by three OCH2CH2O linkers in an anti arrangement. Host–guest complexes of cryptophane-C with dichloromethane and chloroform in solution were investigated in detail by nuclear magnetic resonance techniques and density functional theory (DFT) calculations. Variable temperature proton and carbon-13 spectra show a variety of dynamic processes, such as guest exchange and host conformational transitions. The guest exchange was studied quantitatively by exchange spectroscopy measurements or by line-shape analysis. The conformational preferences of the guest-containing host were interpreted through cross-relaxation measurements, providing evidence of the gauche+2 and gauche−2 conformations of the linkers. In addition, the mobility of the chloroform guest inside the cavity was studied by carbon-13 relaxation experiments. Combining different types of evidence led to a detailed picture of molecular recognition, interpreted in terms of conformational selection. Copyright © 2012 John Wiley & Sons, Ltd. PMID:23132654

  13. Phytochemical screening, anticancer and antioxidant activities of Origanum vulgare L. ssp. viride (Boiss.) Hayek, a plant of traditional usage.

    PubMed

    Koldaş, Serkan; Demirtas, Ibrahim; Ozen, Tevfik; Demirci, Mehmet Ali; Behçet, Lütfi

    2015-03-15

    A detailed phytochemical analysis of Origanum vulgare L. ssp. viride (Boiss.) Hayek was carried out and the antioxidant activities of five different crude extracts were determined. The antiproliferative activities of the extracts were determined using the xCELLigence system (Real Time Cell Analyzer). Differences between the essential oil and volatile organic compound profiles of the plant were shown. The main component of the essential oil was caryophyllene oxide, while the main volatile organic compounds were sabinene and eucalyptol, as determined by HS-GC/MS. Phenolic contents of the extracts were determined qualitatively and quantitatively by HPLC/TOF-MS. Ten phenolic compounds were found in the extracts from O. vulgare and Origanum acutidens: rosmarinic acid (in highest abundance), chicoric acid, caffeic acid, p-coumaric acid, gallic acid, quercetin, apigenin-7-glucoside, kaempferol, naringenin and 4-hydroxybenzaldehyde. This study provides the first results on the antiproliferative and antioxidant properties and a detailed phytochemical screening of O. vulgare ssp. viride (Boiss.) Hayek. © 2014 Society of Chemical Industry.

  14. GHRS observations and theoretical modeling of early type stars in R136a

    NASA Astrophysics Data System (ADS)

    de Koter, A.; Heap, S.; Hubeny, I.; Lanz, T.; Hutchings, J.; Lamers, H. J. G. L. M.; Maran, S.; Schmutz, W.

    1994-05-01

    We present the first spectroscopic observations of individual stars in R136a, the most dense part of the starburst cluster 30 Doradus in the LMC. Spectra of two stars are scheduled to be obtained with the GHRS on board the HST: R136a5, the brightest of the complex and R136a2, a Wolf-Rayet star of type WN. The 30 Doradus cluster is the only starburst region in which individual stars can be studied. Therefore, quantitative knowledge of the basic stellar parameters will yield valuable insight into the formation of massive stars in starbursts and into their subsequent evolution. Detailed modeling of the structure of the atmosphere and wind of these stars will also lead to a better understanding of the mechanism(s) that govern their dynamics. We present the first results of our detailed quantitative spectral analysis using state-of-the-art non-LTE model atmospheres for stars with extended and expanding atmospheres. The models are computed using the Improved-Sobolev Approximation wind code (ISA-WIND) of de Koter, Schmutz & Lamers (1993, A&A 277, 561), which has been extended to include C, N and Si. Our model computations are not based on the core-halo approximation, but use a unified treatment of the photosphere and wind. This approach is essential for Wolf-Rayet stars. Our synthetic spectra, dominated by the P Cygni profiles of the UV resonance lines, also account for the numerous weak metal lines of photospheric origin.

  15. Abort Trigger False Positive and False Negative Analysis Methodology for Threshold-Based Abort Detection

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon

    2015-01-01

    This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
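The FP/FN bounding idea above can be illustrated with a toy Gaussian sensor model: a false positive occurs when a nominal reading drifts past the abort threshold, a false negative when a true abort condition stays below it. This is a hedged sketch under invented numbers (threshold, means, and noise level are assumptions), not the paper's SLS methodology or data.

```python
# Toy model: one sensed parameter with Gaussian noise, one fixed abort
# threshold. P(FP) and P(FN) follow directly from the normal CDF.
from math import erf, sqrt

def norm_cdf(x: float, mu: float, sigma: float) -> float:
    """Cumulative distribution function of a normal distribution."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

threshold = 120.0                     # abort threshold (hypothetical units)
mu_nominal, mu_failure = 100.0, 140.0  # assumed means in each state
sigma = 5.0                            # assumed sensor noise, same in both states

p_fp = 1.0 - norm_cdf(threshold, mu_nominal, sigma)  # nominal read as abort
p_fn = norm_cdf(threshold, mu_failure, sigma)        # true failure missed
```

Moving the threshold trades one error against the other, which is why the paper's methodology bounds both probabilities jointly rather than minimizing either alone.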

  16. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  17. Quantitative laser diagnostic and modeling study of C2 and CH chemistry in combustion.

    PubMed

    Köhler, Markus; Brockhinke, Andreas; Braun-Unkhoff, Marina; Kohse-Höinghaus, Katharina

    2010-04-15

    Quantitative concentration measurements of CH and C(2) have been performed in laminar, premixed, flat flames of propene and cyclopentene with varying stoichiometry. A combination of cavity ring-down (CRD) spectroscopy and laser-induced fluorescence (LIF) was used to enable sensitive detection of these species with high spatial resolution. Previously, CH and C(2) chemistry had been studied, predominantly in methane flames, to understand potential correlations of their formation and consumption. For flames of larger hydrocarbon fuels, however, quantitative information on these small intermediates is scarce, especially under fuel-rich conditions. Also, the combustion chemistry of C(2) in particular has not been studied in detail, and although it has often been observed, its role in potential build-up reactions of higher hydrocarbon species is not well understood. The quantitative measurements performed here are the first to detect both species with good spatial resolution and high sensitivity in the same experiment in flames of C(3) and C(5) fuels. The experimental profiles were compared with results of combustion modeling to reveal details of the formation and consumption of these important combustion molecules, and the investigation was devoted to assist the further understanding of the role of C(2) and of its potential chemical interdependences with CH and other small radicals.

  18. Biotin Switch Assays for Quantitation of Reversible Cysteine Oxidation.

    PubMed

    Li, R; Kast, J

    2017-01-01

    Thiol groups in protein cysteine residues can be subjected to different oxidative modifications by reactive oxygen/nitrogen species. Reversible cysteine oxidation, including S-nitrosylation, S-sulfenylation, S-glutathionylation, and disulfide formation, modulates multiple biological functions, such as enzyme catalysis, antioxidant defense, and other signaling pathways. However, the biological relevance of reversible cysteine oxidation is typically underestimated, in part due to the low abundance and high reactivity of some of these modifications, and the lack of methods to enrich and quantify them. To facilitate future research efforts, this chapter describes detailed procedures to target the different modifications using mass spectrometry-based biotin switch assays. By switching the modification of interest to a biotin moiety, these assays leverage the high affinity between biotin and avidin to enrich the modification. The use of stable isotope labeling and a range of selective reducing agents facilitates the quantitation of individual as well as total reversible cysteine oxidation. The biotin switch assay has been widely applied to the quantitative analysis of S-nitrosylation in different disease models and is now also emerging as a valuable research tool for other oxidative cysteine modifications, highlighting its relevance as a versatile, robust strategy for carrying out in-depth studies in redox proteomics. © 2017 Elsevier Inc. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Wei; Sneed, Brian T.; Zhou, Lin

    Alnico alloys have long been used as strong permanent magnets because of their ferromagnetism and high coercivity. Understanding their structural details allows for better prediction of the resulting magnetic properties. However, quantitative three-dimensional characterization of the phase separation in these alloys is still challenged by the spatial quantification of nanoscale phases. Herein, we apply a dual tomography approach, where correlative scanning transmission electron microscopy (STEM) energy-dispersive X-ray spectroscopic (EDS) tomography and atom probe tomography (APT) are used to investigate the initial phase separation process of an alnico 8 alloy upon non-magnetic annealing. STEM-EDS tomography provides information on the morphology and volume fractions of Fe–Co-rich and Ni–Al-rich phases after spinodal decomposition, in addition to quantitative information on the composition of a nanoscale volume. Subsequent analysis of a portion of the same specimen by APT offers quantitative chemical information on each phase at the sub-nanometer scale. Furthermore, APT reveals small, 2–4 nm Fe-rich α1 phases that are nucleated in the Ni-rich α2 matrix. From this information, we show that phase separation of the alnico 8 alloy consists of both spinodal decomposition and nucleation-and-growth processes. Lastly, we discuss the complementary benefits and challenges associated with correlative STEM-EDS and APT.

  20. Quantitative mass spectrometry imaging of emtricitabine in cervical tissue model using infrared matrix-assisted laser desorption electrospray ionization

    PubMed Central

    Bokhart, Mark T.; Rosen, Elias; Thompson, Corbin; Sykes, Craig; Kashuba, Angela D. M.; Muddiman, David C.

    2015-01-01

    A quantitative mass spectrometry imaging (QMSI) technique using infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) is demonstrated for the antiretroviral (ARV) drug emtricitabine in incubated human cervical tissue. Method development of the QMSI technique leads to a gain in sensitivity and removal of interferences for several ARV drugs. Analyte response was significantly improved by a detailed evaluation of several cationization agents. Increased sensitivity and removal of an isobaric interference was demonstrated with sodium chloride in the electrospray solvent. Voxel-to-voxel variability was improved for the MSI experiments by normalizing analyte abundance to a uniformly applied compound with similar characteristics to the drug of interest. Finally, emtricitabine was quantified in tissue with a calibration curve generated from the stable isotope-labeled analog of emtricitabine followed by cross-validation using liquid chromatography tandem mass spectrometry (LC-MS/MS). The quantitative IR-MALDESI analysis proved to be reproducible with an emtricitabine concentration of 17.2±1.8 μg/g tissue. This amount corresponds to the detection of 7 fmol/voxel in the IR-MALDESI QMSI experiment. Adjacent tissue slices were analyzed using LC-MS/MS which resulted in an emtricitabine concentration of 28.4±2.8 μg/g tissue. PMID:25318460
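Quantifying an analyte against a co-applied stable isotope-labeled standard, as described above, follows the usual internal-standard pattern: normalize the analyte signal to the labeled standard, then convert the response ratio to concentration via the calibration curve. A minimal sketch with invented signal values and calibration parameters (not the authors' data):

```python
# Generic internal-standard quantitation: the analyte/internal-standard
# response ratio is mapped to concentration through a linear calibration.

def concentration(analyte_signal: float, istd_signal: float,
                  cal_slope: float, cal_intercept: float = 0.0) -> float:
    """Concentration from a response ratio and a linear calibration curve."""
    ratio = analyte_signal / istd_signal
    return (ratio - cal_intercept) / cal_slope

# Hypothetical numbers: ion abundances and a slope from a labeled-standard
# calibration series; output is in the calibration's units (e.g., ug/g tissue).
c = concentration(analyte_signal=4.2e5, istd_signal=2.1e5, cal_slope=0.116)
```

Normalizing per voxel to the labeled standard is also what suppresses the voxel-to-voxel variability mentioned in the abstract, since both signals share the same desorption/ionization fluctuations.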

  1. Quantitative Assessment of RNA-Protein Interactions with High Throughput Sequencing - RNA Affinity Profiling (HiTS-RAP)

    PubMed Central

    Ozer, Abdullah; Tome, Jacob M.; Friedman, Robin C.; Gheba, Dan; Schroth, Gary P.; Lis, John T.

    2016-01-01

    Because RNA-protein interactions play a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the High Throughput Sequencing-RNA Affinity Profiling (HiTS-RAP) assay, which couples sequencing on an Illumina GAIIx with the quantitative assessment of one or several proteins’ interactions with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of EGFP and NELF-E proteins with their corresponding canonical and mutant RNA aptamers. Here, we provide a detailed protocol for HiTS-RAP, which can be completed in about a month (8 days hands-on time), including the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, high-throughput sequencing and protein binding with GAIIx, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, RNA-MaP and RBNS. A successful HiTS-RAP experiment provides the sequence and binding curves for approximately 200 million RNAs in a single experiment. PMID:26182240
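For context, binding curves like those HiTS-RAP produces are commonly summarized with a simple 1:1 equilibrium binding model, in which the fraction of aptamer bound at protein concentration [P] is [P]/([P] + Kd). This sketch is a generic illustration with a hypothetical Kd and titration series, not the assay's actual fitting code:

```python
# 1:1 Langmuir binding isotherm: fraction bound rises hyperbolically with
# protein concentration and reaches one-half exactly at [P] = Kd.

def fraction_bound(protein_nM: float, kd_nM: float) -> float:
    """Equilibrium fraction of RNA bound at a given protein concentration."""
    return protein_nM / (protein_nM + kd_nM)

kd = 50.0                             # nM, hypothetical aptamer affinity
titration = [1, 10, 50, 250, 1000]    # protein concentrations in nM
curve = [fraction_bound(c, kd) for c in titration]
```

Fitting such a curve per cluster is what turns millions of raw intensity traces into per-sequence affinity estimates.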

  2. Qualitative research methods: key features and insights gained from use in infection prevention research.

    PubMed

    Forman, Jane; Creswell, John W; Damschroder, Laura; Kowalski, Christine P; Krein, Sarah L

    2008-12-01

    Infection control professionals and hospital epidemiologists are accustomed to using quantitative research. Although quantitative studies are extremely important in the field of infection control and prevention, often they cannot help us explain why certain factors affect the use of infection control practices and identify the underlying mechanisms through which they do so. Qualitative research methods, which use open-ended techniques, such as interviews, to collect data and nonstatistical techniques to analyze it, provide detailed, diverse insights of individuals, useful quotes that bring a realism to applied research, and information about how different health care settings operate. Qualitative research can illuminate the processes underlying statistical correlations, inform the development of interventions, and show how interventions work to produce observed outcomes. This article describes the key features of qualitative research and the advantages that such features add to existing quantitative research approaches in the study of infection control. We address the goal of qualitative research, the nature of the research process, sampling, data collection and analysis, validity, generalizability of findings, and presentation of findings. Health services researchers are increasingly using qualitative methods to address practical problems by uncovering interacting influences in complex health care environments. Qualitative research methods, applied with expertise and rigor, can contribute important insights to infection prevention efforts.

  3. Ultrahigh photoconductivity of bandgap-graded CdSxSe1-x nanowires probed by terahertz spectroscopy

    NASA Astrophysics Data System (ADS)

    Liu, Hongwei; Lu, Junpeng; Yang, Zongyin; Teng, Jinghua; Ke, Lin; Zhang, Xinhai; Tong, Limin; Sow, Chorng Haur

    2016-06-01

    Extremely high photoconductivity is desirable in optoelectronic materials and devices for information transmission and processing. Achieving high photoconductivity via bandgap engineering in a bandgap-graded semiconductor nanowire has been proposed as a potential strategy. In this work, we report the ultrahigh photoconductivity of bandgap-graded CdSxSe1-x nanowires and its detailed analysis by means of ultrafast optical-pump terahertz-probe (OPTP) spectroscopy. The recombination rates and carrier mobility are quantitatively obtained via investigation of the transient carrier dynamics in the nanowires. Analysis of the terahertz (THz) spectra provides insight into how the bandgap gradient and band alignment affect carrier transport along the nanowires. The demonstration of ultrahigh photoconductivity makes bandgap-graded CdSxSe1-x nanowires a promising candidate as building blocks for nanoscale electronic and photonic devices.

  4. Quantitative model of diffuse speckle contrast analysis for flow measurement.

    PubMed

    Liu, Jialin; Zhang, Hongchao; Lu, Jian; Ni, Xiaowu; Shen, Zhonghua

    2017-07-01

    Diffuse speckle contrast analysis (DSCA) is a noninvasive optical technique capable of monitoring deep tissue blood flow. However, a detailed study of the speckle contrast model for DSCA has yet to be presented. We deduced the theoretical relationship between speckle contrast and exposure time and further simplified it to a linear approximation model. The feasibility of this linear model was validated with liquid phantoms, which demonstrated that the slope of the linear approximation can rapidly determine the Brownian diffusion coefficient of the turbid media at multiple distances using multiexposure speckle imaging. Furthermore, we theoretically quantified the influence of the optical properties on measurements of the Brownian diffusion coefficient, a consequence of the fact that the slope of this linear approximation equals the inverse of the speckle correlation time.
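Operationally, the multiexposure scheme described above reduces to a straight-line fit. A minimal sketch, assuming for illustration (this is not the authors' exact expression) that the inverse-squared speckle contrast 1/K² grows linearly with exposure time, so the fitted slope tracks the flow-dependent quantity of interest:

```python
# Ordinary least-squares slope recovery from a synthetic multiexposure series.

def fit_slope(ts, ys):
    """Least-squares slope of y versus t."""
    n = len(ts)
    t_bar = sum(ts) / n
    y_bar = sum(ys) / n
    num = sum((t - t_bar) * (y - y_bar) for t, y in zip(ts, ys))
    den = sum((t - t_bar) ** 2 for t in ts)
    return num / den

true_slope, intercept = 3.0, 1.0        # arbitrary synthetic values
exposures = [0.5, 1.0, 2.0, 4.0, 8.0]   # exposure times (ms), multiexposure scheme
inv_k2 = [true_slope * t + intercept for t in exposures]

slope = fit_slope(exposures, inv_k2)    # recovers true_slope for noiseless data
```

Because only the slope carries the flow information in this picture, offsets from static scattering land in the intercept and drop out, which is what makes the linear model convenient in practice.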

  5. Diagnosis of Fanconi Anemia: Chromosomal Breakage Analysis

    PubMed Central

    Oostra, Anneke B.; Nieuwint, Aggie W. M.; Joenje, Hans; de Winter, Johan P.

    2012-01-01

    Fanconi anemia (FA) is a rare inherited syndrome with diverse clinical symptoms including developmental defects, short stature, bone marrow failure, and a high risk of malignancies. Fifteen genetic subtypes have been distinguished so far. The mode of inheritance for all subtypes is autosomal recessive, except for FA-B, which is X-linked. Cells derived from FA patients are—by definition—hypersensitive to DNA cross-linking agents, such as mitomycin C, diepoxybutane, or cisplatinum, which becomes manifest as excessive growth inhibition, cell cycle arrest, and chromosomal breakage upon cellular exposure to these drugs. Here we provide a detailed laboratory protocol for the accurate assessment of the FA diagnosis as based on mitomycin C-induced chromosomal breakage analysis in whole-blood cultures. The method also enables a quantitative estimate of the degree of mosaicism in the lymphocyte compartment of the patient. PMID:22693659

  6. Korean coastal water depth/sediment and land cover mapping (1:25,000) by computer analysis of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Park, K. Y.; Miller, L. D.

    1978-01-01

    Computer analysis was applied to single-date LANDSAT MSS imagery of a sample coastal area near Seoul, Korea, equivalent to a 1:50,000 topographic map. Supervised image processing yielded a test classification map from this sample image containing 12 classes: 5 water depth/sediment classes, 2 shoreline/tidal classes, and 5 coastal land cover classes, at a scale of 1:25,000 and with a training set accuracy of 76%. Unsupervised image classification was applied to a subportion of the site and produced spatially comparable classification maps. The results of this test indicated that it is feasible to produce such quantitative maps for detailed study of dynamic coastal processes, given a LANDSAT image data base at sufficiently frequent time intervals.

  7. A meta-analysis on progressive atrophy in intractable temporal lobe epilepsy

    PubMed Central

    Caciagli, Lorenzo; Bernasconi, Andrea; Wiebe, Samuel; Koepp, Matthias J.; Bernasconi, Neda

    2017-01-01

    Objective: It remains unclear whether drug-resistant temporal lobe epilepsy (TLE) is associated with cumulative brain damage, with no expert consensus and no quantitative syntheses of the available evidence. Methods: We conducted a systematic review and meta-analysis of MRI studies on progressive atrophy, searching PubMed and Ovid MEDLINE databases for cross-sectional and longitudinal quantitative MRI studies on drug-resistant TLE. Results: We screened 2,976 records and assessed eligibility of 248 full-text articles. Forty-two articles met the inclusion criteria for quantitative evaluation. We observed a predominance of cross-sectional studies, use of different clinical indices of progression, and high heterogeneity in age-control procedures. Meta-analysis of 18/1 cross-sectional/longitudinal studies on hippocampal atrophy (n = 979 patients) yielded a pooled effect size of r = −0.42 for ipsilateral atrophy related to epilepsy duration (95% confidence interval [CI] −0.51 to −0.32; p < 0.0001; I2 = 65.22%) and r = −0.35 related to seizure frequency (95% CI −0.47 to −0.22; p < 0.0001; I2 = 61.97%). Sensitivity analyses did not change the results. Narrative synthesis of 25/3 cross-sectional/longitudinal studies on whole brain atrophy (n = 1,504 patients) indicated that >80% of articles reported duration-related progression in extratemporal cortical and subcortical regions. Detailed analysis of study design features yielded low to moderate levels of evidence for progressive atrophy across studies, mainly due to dominance of cross-sectional over longitudinal investigations, use of diverse measures of seizure estimates, and absence of consistent age control procedures. Conclusions: While the neuroimaging literature is overall suggestive of progressive atrophy in drug-resistant TLE, published studies have employed rather weak designs to directly demonstrate it. Longitudinal multicohort studies are needed to unequivocally differentiate aging from disease progression. PMID:28687722
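Pooled correlation effect sizes like those quoted above are typically computed by Fisher z-transforming the per-study correlations and combining them with inverse-variance weights before back-transforming. A hedged sketch with invented study values (not the meta-analysis data):

```python
# Fixed-effect pooling of correlations via the Fisher z transform.
# For a correlation from n subjects, var(z) ≈ 1/(n - 3), so the
# inverse-variance weight is (n - 3).
from math import atanh, tanh, sqrt

studies = [(-0.45, 120), (-0.38, 80), (-0.50, 60)]  # (r, sample size), hypothetical

weights = [n - 3 for _, n in studies]
z_pooled = sum(w * atanh(r) for (r, _), w in zip(studies, weights)) / sum(weights)
se = 1.0 / sqrt(sum(weights))

r_pooled = tanh(z_pooled)                               # pooled correlation
ci_low, ci_high = tanh(z_pooled - 1.96 * se), tanh(z_pooled + 1.96 * se)
```

A random-effects model would widen the interval by adding between-study variance to each weight, which matters here given the high I² values reported.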

  8. Development of a quantitative morphological assessment of toxicant-treated zebrafish larvae using brightfield imaging and high-content analysis.

    PubMed

    Deal, Samantha; Wambaugh, John; Judson, Richard; Mosher, Shad; Radio, Nick; Houck, Keith; Padilla, Stephanie

    2016-09-01

    One of the rate-limiting procedures in a developmental zebrafish screen is the morphological assessment of each larva. Most researchers opt for a time-consuming, structured visual assessment by trained human observer(s). The present studies were designed to develop a more objective, accurate and rapid method for screening zebrafish for dysmorphology. Instead of the very detailed human assessment, we have developed the computational malformation index, which combines the use of high-content imaging with a very brief human visual assessment. Each larva was quickly assessed by a human observer (basic visual assessment), killed, fixed and assessed for dysmorphology with the Zebratox V4 BioApplication using the Cellomics® ArrayScan® V(TI) high-content image analysis platform. The basic visual assessment adds in-life parameters, and the high-content analysis assesses each individual larva for various features (total area, width, spine length, head-tail length, length-width ratio, perimeter-area ratio). In developing the computational malformation index, a training set of hundreds of embryos treated with hundreds of chemicals was visually assessed using the basic or detailed method. In the second phase, we assessed both the stability of these high-content measurements and their performance using a test set of zebrafish treated with a dose range of two reference chemicals (trans-retinoic acid or cadmium). We found the measures were stable for at least 1 week, and comparison of these automated measures to detailed visual inspection of the larvae showed excellent congruence. Our computational malformation index provides an objective manner for rapid phenotypic brightfield assessment of individual larvae in a developmental zebrafish assay. Copyright © 2016 John Wiley & Sons, Ltd.

  9. Quantitative Classification of Rice (Oryza sativa L.) Root Length and Diameter Using Image Analysis.

    PubMed

    Gu, Dongxiang; Zhen, Fengxian; Hannaway, David B; Zhu, Yan; Liu, Leilei; Cao, Weixing; Tang, Liang

    2017-01-01

    Quantitative study of plant root morphological characteristics is helpful for understanding the relationships between root morphology and function. However, few studies have reported detailed and accurate root characteristics for fine-rooted plants like rice (Oryza sativa L.). The aims of this study were to quantitatively classify fine lateral roots (FLRs), thick lateral roots (TLRs), and nodal roots (NRs) and to analyze the dynamics of their mean diameter (MD), length, and surface area percentage (SAP) across growth stages in rice plants. Pot experiments were carried out over three years with three rice cultivars, three nitrogen (N) rates, and three water regimes. In the cultivar experiment, the root length of 'Yangdao 6' was the longest and the MD of its FLRs the smallest, while its mean diameters for TLRs and NRs were the largest and the SAP of its TLRs (SAPT) the highest, indicating that 'Yangdao 6' has better nitrogen- and water-uptake ability. A high N rate increased the length of all root types and the MD of lateral roots, and decreased the SAP of FLRs (SAPF) and TLRs while increasing the SAP of NRs (SAPN). A moderate decrease in water supply increased root length and diameter; water stress increased SAPF and SAPT but decreased SAPN. These quantitative results indicate that rice plants tend to produce more lateral roots, gaining surface area for nitrogen and water uptake, when available assimilates are limiting under nitrogen and water stress.
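    The surface area percentages above follow from simple cylinder geometry. As an illustrative sketch (not the authors' image-analysis pipeline; function names, the cylinder approximation, and the example dimensions are assumptions), the SAP contributed by each root class can be computed from measured diameters and lengths:

```python
import math

def lateral_surface_mm2(diameter_mm, length_mm):
    """Lateral surface of a root segment approximated as a cylinder: pi * d * L."""
    return math.pi * diameter_mm * length_mm

def surface_area_percentages(roots):
    """roots: iterable of (root_class, diameter_mm, length_mm) tuples.

    Returns the surface area percentage (SAP) contributed by each root class."""
    areas = {}
    for root_class, d, length in roots:
        areas[root_class] = areas.get(root_class, 0.0) + lateral_surface_mm2(d, length)
    total = sum(areas.values())
    return {root_class: 100.0 * a / total for root_class, a in areas.items()}
```

    A thin, long FLR and a thick, short NR can contribute comparable surface area, which is why SAP rather than length alone is the informative uptake proxy.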

  10. Quantifying Golgi structure using EM: combining volume-SEM and stereology for higher throughput.

    PubMed

    Ferguson, Sophie; Steyer, Anna M; Mayhew, Terry M; Schwab, Yannick; Lucocq, John Milton

    2017-06-01

    Investigating organelles such as the Golgi complex depends increasingly on high-throughput quantitative morphological analyses from multiple experimental or genetic conditions. Light microscopy (LM) has been an effective tool for screening but fails to reveal fine details of Golgi structures such as vesicles, tubules and cisternae. Electron microscopy (EM) has sufficient resolution but traditional transmission EM (TEM) methods are slow and inefficient. Newer volume scanning EM (volume-SEM) methods now have the potential to speed up 3D analysis by automated sectioning and imaging. However, they produce large arrays of sections and/or images, which require labour-intensive 3D reconstruction for quantitation on limited cell numbers. Here, we show that the information storage, digital waste and workload involved in using volume-SEM can be reduced substantially using sampling-based stereology. Using the Golgi as an example, we describe how Golgi populations can be sensed quantitatively using single random slices and how accurate quantitative structural data on Golgi organelles of individual cells can be obtained using only 5-10 sections/images taken from a volume-SEM series (thereby sensing population parameters and cell-cell variability). The approach will be useful in techniques such as correlative LM and EM (CLEM) where small samples of cells are treated and where there may be variable responses. For Golgi study, we outline a series of stereological estimators that are suited to these analyses and suggest workflows, which have the potential to enhance the speed and relevance of data acquisition in volume-SEM.
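    As a hedged sketch of the sampling-based stereology advocated above, a Cavalieri-style point-counting estimator recovers organelle volume from a handful of systematically sampled sections; the function names and the counting-noise approximation are illustrative, not the authors' workflow:

```python
def cavalieri_volume(point_counts, section_spacing_um, area_per_point_um2):
    """Cavalieri estimator: V ~ T * a(p) * sum(P), where T is the spacing between
    sampled sections, a(p) the area associated with each grid point, and P the
    number of grid points hitting the organelle on each section."""
    return section_spacing_um * area_per_point_um2 * sum(point_counts)

def count_noise_ce(point_counts):
    """Rough coefficient-of-error contribution from counting noise, ~1/sqrt(sum(P))."""
    total = sum(point_counts)
    return total ** -0.5 if total > 0 else float("inf")
```

    With only 5-10 sections and a total count of a few hundred points, the counting-noise term alone already drops below ~10%, which is consistent with the low section budget the authors report.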

  11. Determination of ¹⁵N-incorporation into plant proteins and their absolute quantitation: a new tool to study nitrogen flux dynamics and protein pool sizes elicited by plant-herbivore interactions.

    PubMed

    Ullmann-Zeunert, Lynn; Muck, Alexander; Wielsch, Natalie; Hufsky, Franziska; Stanton, Mariana A; Bartram, Stefan; Böcker, Sebastian; Baldwin, Ian T; Groten, Karin; Svatoš, Aleš

    2012-10-05

    Herbivory leads to changes in the allocation of nitrogen among different pools and tissues; however, a detailed quantitative analysis of these changes has been lacking. Here, we demonstrate that a mass spectrometric data-independent acquisition approach known as LC-MS(E), combined with a novel algorithm to quantify heavy atom enrichment in peptides, is able to quantify elicited changes in protein amounts and (15)N flux in a high throughput manner. The reliable identification/quantitation of rabbit phosphorylase b protein spiked into leaf protein extract was achieved. The linear dynamic range, reproducibility of technical and biological replicates, and differences between measured and expected (15)N-incorporation into the small (SSU) and large (LSU) subunits of ribulose-1,5-bisphosphate-carboxylase/oxygenase (RuBisCO) and RuBisCO activase 2 (RCA2) of Nicotiana attenuata plants grown in hydroponic culture at different known concentrations of (15)N-labeled nitrate were used to further evaluate the procedure. The utility of the method for whole-plant studies in ecologically realistic contexts was demonstrated by using (15)N-pulse protocols on plants growing in soil under unknown (15)N-incorporation levels. Additionally, we quantified the amounts of lipoxygenase 2 (LOX2) protein, an enzyme important in antiherbivore defense responses, demonstrating that the approach allows for in-depth quantitative proteomics and (15)N flux analyses of the metabolic dynamics elicited during plant-herbivore interactions.
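    The enrichment read-out itself is algorithmic. A deliberately simplified sketch is shown below; the published algorithm deconvolutes full peptide isotope patterns, which this toy two-pool version does not, and the helper names are assumptions:

```python
NATURAL_15N_ABUNDANCE = 0.366  # atom % 15N in atmospheric N2

def atom_percent_15n(heavy_intensity, light_intensity):
    """Atom % 15N from summed heavy/light isotopologue intensities (toy model)."""
    return 100.0 * heavy_intensity / (heavy_intensity + light_intensity)

def atom_percent_excess(measured_atom_percent):
    """Enrichment above the natural-abundance baseline."""
    return measured_atom_percent - NATURAL_15N_ABUNDANCE
```

    The excess term matters because unlabeled plants are not at 0% heavy nitrogen; natural abundance must be subtracted before interpreting a pulse-label experiment.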

  12. Quantitative Classification of Rice (Oryza sativa L.) Root Length and Diameter Using Image Analysis

    PubMed Central

    Gu, Dongxiang; Zhen, Fengxian; Hannaway, David B.; Zhu, Yan; Liu, Leilei; Cao, Weixing; Tang, Liang

    2017-01-01

    Quantitative study of plant root morphological characteristics is helpful for understanding the relationships between root morphology and function. However, few studies have reported detailed and accurate root characteristics for fine-rooted plants like rice (Oryza sativa L.). The aims of this study were to quantitatively classify fine lateral roots (FLRs), thick lateral roots (TLRs), and nodal roots (NRs) and to analyze the dynamics of their mean diameter (MD), length, and surface area percentage (SAP) across growth stages in rice plants. Pot experiments were carried out over three years with three rice cultivars, three nitrogen (N) rates, and three water regimes. In the cultivar experiment, the root length of ‘Yangdao 6’ was the longest and the MD of its FLRs the smallest, while its mean diameters for TLRs and NRs were the largest and the SAP of its TLRs (SAPT) the highest, indicating that ‘Yangdao 6’ has better nitrogen- and water-uptake ability. A high N rate increased the length of all root types and the MD of lateral roots, and decreased the SAP of FLRs (SAPF) and TLRs while increasing the SAP of NRs (SAPN). A moderate decrease in water supply increased root length and diameter; water stress increased SAPF and SAPT but decreased SAPN. These quantitative results indicate that rice plants tend to produce more lateral roots, gaining surface area for nitrogen and water uptake, when available assimilates are limiting under nitrogen and water stress. PMID:28103264

  13. Vessel wall characterization using quantitative MRI: what's in a number?

    PubMed

    Coolen, Bram F; Calcagno, Claudia; van Ooij, Pim; Fayad, Zahi A; Strijkers, Gustav J; Nederveen, Aart J

    2018-02-01

    The past decade has witnessed the rapid development of new MRI technology for vessel wall imaging. Today, with advances in MRI hardware and pulse sequences, quantitative MRI of the vessel wall represents a real alternative to conventional qualitative imaging, which is hindered by significant intra- and inter-observer variability. Quantitative MRI can measure several important morphological and functional characteristics of the vessel wall. This review provides a detailed introduction to novel quantitative MRI methods for measuring vessel wall dimensions, plaque composition and permeability, endothelial shear stress and wall stiffness. Together, these methods show the versatility of non-invasive quantitative MRI for probing vascular disease at several stages. These quantitative MRI biomarkers can play an important role in the context of both treatment response monitoring and risk prediction. Given the rapid developments in scan acceleration techniques and novel image reconstruction, we foresee the possibility of integrating the acquisition of multiple quantitative vessel wall parameters within a single scan session.

  14. Self, Voices and Embodiment: A Phenomenological Analysis

    PubMed Central

    Rosen, C; Jones, N; Chase, KA; Grossman, LS; Gin, H; Sharma, RP

    2016-01-01

    Objective The primary aim of this study was to examine first-person phenomenological descriptions of the relationship between the self and Auditory Verbal Hallucinations (AVHs). Complex AVHs are frequently described as entities with clear interpersonal characteristics. Strikingly, investigations of first-person (subjective) descriptions of the phenomenology of the relationship are virtually absent from the literature. Method Twenty participants with psychosis and actively experiencing AVHs were recruited from the University of Illinois at Chicago. A mixed-methods design involving qualitative and quantitative components was utilized. Following a priority-sequence model of complementarity, quantitative analyses were used to test elements of emergent qualitative themes. Results The qualitative analysis identified three foundational constructs in the relationship between self and voices: ‘understanding of origin,’ ‘distinct interpersonal identities,’ and ‘locus of control.’ Quantitative analyses further supported the links among these constructs. Subjects experienced their AVHs as having identities distinct from the self, and those who actively engaged with their AVHs experienced a greater sense of autonomy and control over them. Discussion Given the clinical importance of AVHs and emerging strategies targeting the relationship between the hearer and voices, our findings highlight the importance of these relational constructs in the improvement and innovation of clinical interventions. Our analyses also underscore the value of detailed voice assessments, such as those provided by the Maastricht Interview, in the evaluation process. Subjects’ narratives show that the relational phenomena between hearer and AVH(s) are dynamic and can be influenced and changed through the hearers’ engagement, conversation, and negotiation with their voices. PMID:27099869

  15. Estimating raw material equivalents on a macro-level: comparison of multi-regional input-output analysis and hybrid LCI-IO.

    PubMed

    Schoer, Karl; Wood, Richard; Arto, Iñaki; Weinzettel, Jan

    2013-12-17

    The mass of material consumed by a population has become a useful proxy for measuring environmental pressure. The "raw material equivalents" (RME) metric of material consumption addresses the issue of including the full supply chain (including imports) when calculating national or product level material impacts. The RME calculation suffers from data availability, however, as quantitative data on production practices along the full supply chain (in different regions) is required. Hence, the RME is currently being estimated by three main approaches: (1) assuming domestic technology in foreign economies, (2) utilizing region-specific life-cycle inventories (in a hybrid framework), and (3) utilizing multi-regional input-output (MRIO) analysis to explicitly cover all regions of the supply chain. While the first approach has been shown to give inaccurate results, this paper focuses on the benefits and costs of the latter two approaches. We analyze results from two key (MRIO and hybrid) projects modeling raw material equivalents, adjusting the models in a stepwise manner in order to quantify the effects of individual conceptual elements. We attempt to isolate the MRIO gap, which denotes the quantitative impact of calculating the RME of imports by an MRIO approach instead of the hybrid model, focusing on the RME of EU external trade imports. While the models give quantitatively similar results, differences become more pronounced when tracking more detailed material flows. We assess the advantages and disadvantages of the two approaches and look forward to ways to further harmonize data and approaches.
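    The MRIO side of such a comparison rests on the standard Leontief model: total output x solves (I - A)x = y for the technical-coefficient matrix A and final demand y, and the raw material equivalents of y are m.x for a vector m of material-extraction coefficients. A minimal sketch with an illustrative two-sector economy (the numbers are invented, not from either project):

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting, for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]

def rme(A, y, m):
    """Raw material equivalents of final demand y under the Leontief model."""
    n = len(A)
    # Total output x satisfies (I - A) x = y.
    I_minus_A = [[(1.0 if i == j else 0.0) - A[i][j] for j in range(n)] for i in range(n)]
    x = solve(I_minus_A, y)
    # Material extraction per unit output, summed over sectors.
    return sum(mi * xi for mi, xi in zip(m, x))
```

    The "MRIO gap" discussed in the abstract is then the difference between this quantity computed with region-specific A matrices and the corresponding hybrid-LCI estimate.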

  16. Impact of pore space topology on permeability, cut-off frequencies and validity of wave propagation theories

    NASA Astrophysics Data System (ADS)

    Sarout, Joël.

    2012-04-01

    For the first time, a comprehensive and quantitative analysis of the domains of validity of popular wave propagation theories for porous/cracked media is provided. The case of a simple, yet versatile rock microstructure is detailed. The microstructural parameters controlling the applicability of the scattering theories, the effective medium theories, the quasi-static (Gassmann limit) and dynamic (inertial) poroelasticity are analysed in terms of pore/crack characteristic size, geometry and connectivity. To this end, a new permeability model is devised combining the hydraulic radius and percolation concepts. The predictions of this model are compared to published micromechanical models of permeability for the limiting cases of capillary tubes and penny-shaped cracks. It is also compared to published experimental data on natural rocks in these limiting cases. It explicitly accounts for pore space topology around the percolation threshold and far above it. Thanks to this permeability model, the scattering, squirt-flow and Biot cut-off frequencies are quantitatively compared. This comparison leads to an explicit mapping of the domains of validity of these wave propagation theories as a function of the rock's actual microstructure. How this mapping impacts seismic, geophysical and ultrasonic wave velocity data interpretation is discussed. The methodology demonstrated here and the outcomes of this analysis are meant to constitute a quantitative guide for the selection of the most suitable modelling strategy to be employed for prediction and/or interpretation of rocks' elastic properties in laboratory- or field-scale applications when information regarding the rock's microstructure is available.

  17. A quantitative and high-throughput assay of human papillomavirus DNA replication.

    PubMed

    Gagnon, David; Fradet-Turcotte, Amélie; Archambault, Jacques

    2015-01-01

    Replication of the human papillomavirus (HPV) double-stranded DNA genome is accomplished by the two viral proteins E1 and E2 in concert with host DNA replication factors. HPV DNA replication is an established model of eukaryotic DNA replication and a potential target for antiviral therapy. Assays to measure the transient replication of HPV DNA in transfected cells have been developed, which rely on a plasmid carrying the viral origin of DNA replication (ori) together with expression vectors for E1 and E2. Replication of the ori-plasmid is typically measured by Southern blotting or PCR analysis of newly replicated DNA (i.e., DpnI digested DNA) several days post-transfection. Although extremely valuable, these assays have been difficult to perform in a high-throughput and quantitative manner. Here, we describe a modified version of the transient DNA replication assay that circumvents these limitations by incorporating a firefly luciferase expression cassette in cis of the ori. Replication of this ori-plasmid by E1 and E2 results in increased levels of firefly luciferase activity that can be accurately quantified and normalized to those of Renilla luciferase expressed from a control plasmid, thus obviating the need for DNA extraction, digestion, and analysis. We provide a detailed protocol for performing the HPV type 31 DNA replication assay in a 96-well plate format suitable for small-molecule screening and EC50 determinations. The quantitative and high-throughput nature of the assay should greatly facilitate the study of HPV DNA replication and the identification of inhibitors thereof.
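    The read-out described above reduces to a ratio of ratios plus a dose-response fit. A sketch of the normalization and EC50 step follows; the helper names and numbers are illustrative, and a real screen would fit a four-parameter logistic curve rather than interpolate linearly:

```python
def normalized_replication(firefly, renilla):
    """Firefly signal (ori-plasmid replication) normalized to the Renilla control."""
    return firefly / renilla

def fold_replication(sample_ff, sample_rl, control_ff, control_rl):
    """Fold increase over a control transfection lacking functional E1/E2."""
    return normalized_replication(sample_ff, sample_rl) / normalized_replication(control_ff, control_rl)

def ec50_interpolated(doses, inhibition):
    """Dose giving 50% inhibition, by linear interpolation between bracketing points.

    `inhibition` is a monotonically increasing list of fractions in [0, 1]."""
    pairs = list(zip(doses, inhibition))
    for (d0, i0), (d1, i1) in zip(pairs, pairs[1:]):
        if i0 <= 0.5 <= i1:
            return d0 + (0.5 - i0) * (d1 - d0) / (i1 - i0)
    return None  # 50% inhibition not reached within the tested range
```

    Normalizing to Renilla cancels well-to-well differences in transfection efficiency and cell number, which is what removes the need for DNA extraction and DpnI analysis.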

  18. Functional quantitative susceptibility mapping (fQSM).

    PubMed

    Balla, Dávid Z; Sanchez-Panchuelo, Rosa M; Wharton, Samuel J; Hagberg, Gisela E; Scheffler, Klaus; Francis, Susan T; Bowtell, Richard

    2014-10-15

    Blood oxygenation level dependent (BOLD) functional magnetic resonance imaging (fMRI) is a powerful technique, typically based on the statistical analysis of the magnitude component of the complex time-series. Here, we additionally interrogated the phase data of the fMRI time-series and used quantitative susceptibility mapping (QSM) in order to investigate the potential of functional QSM (fQSM) relative to standard magnitude BOLD fMRI. High spatial resolution data (1 mm isotropic) were acquired every 3 seconds using zoomed multi-slice gradient-echo EPI collected at 7 T in single orientation (SO) and multiple orientation (MO) experiments, the latter involving 4 repetitions with the subject's head rotated relative to B0. Statistical parametric maps (SPMs) were reconstructed for magnitude, phase and QSM time-series and each was subjected to detailed analysis. Several fQSM pipelines were evaluated and compared based on the relative number of voxels that were coincidentally found to be significant in QSM and magnitude SPMs (common voxels). We found that sensitivity and spatial reliability of fQSM relative to the magnitude data depended strongly on the arbitrary significance threshold defining "activated" voxels in SPMs, and on the efficiency of spatio-temporal filtering of the phase time-series. Sensitivity and spatial reliability depended slightly on whether MO or SO fQSM was performed and on the QSM calculation approach used for SO data. Our results present the potential of fQSM as a quantitative method of mapping BOLD changes. We also critically discuss the technical challenges and issues linked to this intriguing new technique. Copyright © 2014 Elsevier Inc. All rights reserved.
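    The "common voxels" comparison reduces to a set intersection over supra-threshold voxel indices. A minimal sketch, with voxel identifiers and the choice of denominator as illustrative assumptions:

```python
def common_voxel_fraction(qsm_significant, magnitude_significant):
    """Fraction of magnitude-significant voxels that are also significant in the QSM SPM."""
    qsm, mag = set(qsm_significant), set(magnitude_significant)
    return len(qsm & mag) / len(mag) if mag else 0.0
```

    Because this fraction is computed after thresholding, it inherits the threshold dependence the authors emphasize: lowering the significance cut-off grows both sets and can change the overlap fraction substantially.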

  19. High Concentrations of Atmospheric Ammonia Induce Alterations in the Hepatic Proteome of Broilers (Gallus gallus): An iTRAQ-Based Quantitative Proteomic Analysis

    PubMed Central

    Zhang, Jize; Li, Cong; Tang, Xiangfang; Lu, Qingping; Sa, Renna; Zhang, Hongfu

    2015-01-01

    With the development of the poultry industry, ammonia, as a main contaminant in the air, is causing increasing problems with broiler health. To date, most studies of ammonia toxicity have focused on the nervous system and the gastrointestinal tract in mammals. However, few detailed studies have been conducted on the hepatic response to ammonia toxicity in poultry. The molecular mechanisms that underlie these effects remain unclear. In the present study, our group applied isobaric tags for relative and absolute quantitation (iTRAQ)-based quantitative proteomic analysis to investigate changes in the protein profile of hepatic tissue of broilers exposed to high concentrations of atmospheric ammonia, with the goal of characterizing the molecular mechanisms of chronic liver injury from exposure to high ambient levels of ammonia. Overall, 30 differentially expressed proteins that are involved in nutrient metabolism (energy, lipid, and amino acid), immune response, transcriptional and translational regulation, stress response, and detoxification were identified. In particular, two of these proteins, beta-1 galactosidase (GLB1) and A kinase (PRKA) anchor protein 8-like (AKAP8L), were previously suggested to be potential biomarkers of chronic liver injury. In addition to the changes in the protein profile, serum parameters and histochemical analyses of hepatic tissue also showed extensive hepatic damage in ammonia-exposed broilers. Altogether, these findings suggest that long-term exposure to high concentrations of atmospheric ammonia can trigger chronic hepatic injury in broilers via different mechanisms, providing new information that can be used for intervention using nutritional strategies in the future. PMID:25901992

  20. Quantitative HPLC Analysis of an Analgesic/Caffeine Formulation: Determination of Caffeine

    NASA Astrophysics Data System (ADS)

    Ferguson, Glenda K.

    1998-04-01

    A modern high performance liquid chromatography (HPLC) laboratory experiment which entails the separation of acetaminophen, aspirin, and caffeine and the quantitative assay of caffeine in commercial mixtures of these compounds has been developed. Our HPLC protocol resolves these compounds in only three minutes with a straightforward chromatographic apparatus which consists of a C-18 column, an isocratic mobile phase, UV detection at 254 nm, and an integrator; an expensive, sophisticated system is not required. The separation is both repeatable and rapid. Moreover, the experiment can be completed in a single three-hour period. The experiment is appropriate for any chemistry student who has completed a minimum of one year of general chemistry and is ideal for an analytical or instrumental analysis course. The experiment detailed herein involves the determination of caffeine in Goody's Extra Strength Headache Powders, a commercially available medication which contains acetaminophen, aspirin, and caffeine as active ingredients. However, the separation scheme is not limited to this brand of medication nor is it limited to caffeine as the analyte. With only minor procedural modifications, students can simultaneously quantitate all of these compounds in a commercial mixture. In our procedure, students prepare a series of four caffeine standard solutions as well as a solution from a pharmaceutical analgesic/caffeine mixture, chromatographically analyze each solution in quadruplicate, and plot relative average caffeine standard peak area versus concentration. From the mathematical relationship that results, the concentration of caffeine in the commercial formulation is obtained. Finally, the absolute standard deviation of the mean concentration is calculated.
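    The data reduction the students perform (external-standard calibration by least squares, back-calculation of the unknown, and the standard deviation of the mean of the replicates) can be sketched as follows; the peak areas and concentrations are illustrative, not values from the experiment:

```python
def linfit(x, y):
    """Ordinary least-squares slope and intercept for y = slope * x + intercept."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def concentration(peak_area, slope, intercept):
    """Back-calculate analyte concentration from a measured peak area."""
    return (peak_area - intercept) / slope

def mean_and_sdev(values):
    """Mean and sample standard deviation of replicate determinations."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / (len(values) - 1)
    return m, var ** 0.5

# Illustrative calibration: four caffeine standards (mg/L) vs. relative peak area.
slope, intercept = linfit([50.0, 100.0, 150.0, 200.0], [101.0, 201.0, 301.0, 401.0])
```

    With quadruplicate injections per solution, the standard deviation of the mean (sd divided by sqrt(4)) is the figure of merit the write-up asks for.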

  1. Effects of atrazine in fish, amphibians, and reptiles: an analysis based on quantitative weight of evidence.

    PubMed

    Van Der Kraak, Glen J; Hosmer, Alan J; Hanson, Mark L; Kloas, Werner; Solomon, Keith R

    2014-12-01

    A quantitative weight of evidence (WoE) approach was developed to evaluate studies used for regulatory purposes, as well as those in the open literature, that report the effects of the herbicide atrazine on fish, amphibians, and reptiles. The methodology for WoE analysis incorporated a detailed assessment of the relevance of the responses observed to apical endpoints directly related to survival, growth, development, and reproduction, as well as the strength and appropriateness of the experimental methods employed. Numerical scores were assigned for strength and relevance. The means of the scores for relevance and strength were then used to summarize and weigh the evidence for atrazine contributing to ecologically significant responses in the organisms of interest. The summary was presented graphically in a two-dimensional graph which showed the distributions of all the reports for a response. Over 1290 individual responses from studies in 31 species of fish, 32 amphibians, and 8 reptiles were evaluated. Overall, the WoE showed that atrazine might affect biomarker-type responses, such as expression of genes and/or associated proteins, concentrations of hormones, and biochemical processes (e.g. induction of detoxification responses), at concentrations sometimes found in the environment. However, these effects were not translated to adverse outcomes in terms of apical endpoints. The WoE approach provided a quantitative, transparent, reproducible, and robust framework that can be used to assist the decision-making process when assessing environmental chemicals. In addition, the process allowed easy identification of uncertainty and inconsistency in observations, and thus clearly identified areas where future investigations can be best directed.
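    The scoring step reduces to averaging the strength and relevance scores assigned to each response, giving that response's coordinates in the two-dimensional WoE plot. A minimal sketch with invented scores; the scale and the quadrant rule are illustrative assumptions, not the published scoring scheme:

```python
def woe_point(scores):
    """Mean (strength, relevance) across the studies reporting one response.

    `scores`: list of (strength, relevance) pairs, e.g. on a 0-4 scale."""
    n = len(scores)
    mean_strength = sum(s for s, _ in scores) / n
    mean_relevance = sum(r for _, r in scores) / n
    return mean_strength, mean_relevance

def quadrant(point, threshold=2.0):
    """Illustrative classification of a response by its position in the WoE plot."""
    s, r = point
    if s >= threshold and r >= threshold:
        return "strong, relevant evidence"
    return "weak or low-relevance evidence"
```

    Plotting every response this way is what makes outliers and inconsistent observations easy to spot, as the abstract notes.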

  2. 3D MR flow analysis in realistic rapid-prototyping model systems of the thoracic aorta: comparison with in vivo data and computational fluid dynamics in identical vessel geometries.

    PubMed

    Canstein, C; Cachot, P; Faust, A; Stalder, A F; Bock, J; Frydrychowicz, A; Küffer, J; Hennig, J; Markl, M

    2008-03-01

    The knowledge of local vascular anatomy and function in the human body is of high interest for the diagnosis and treatment of cardiovascular disease. A comprehensive analysis of the hemodynamics in the thoracic aorta is presented based on the integration of flow-sensitive 4D MRI with state-of-the-art rapid prototyping technology and computational fluid dynamics (CFD). Rapid prototyping was used to transform aortic geometries as measured by contrast-enhanced MR angiography into realistic vascular models with large anatomical coverage. Integration into a flow circuit with patient-specific pulsatile in-flow conditions and application of flow-sensitive 4D MRI permitted detailed analysis of local and global 3D flow dynamics in a realistic vascular geometry. Visualization of characteristic 3D flow patterns and quantitative comparisons of the in vitro experiments with in vivo data and CFD simulations in identical vascular geometries were performed to evaluate the accuracy of vascular model systems. The results indicate the potential of such patient-specific model systems for detailed experimental simulation of realistic vascular hemodynamics. Further studies are warranted to examine the influence of refined boundary conditions of the human circulatory system such as fluid-wall interaction and their effect on normal and pathological blood flow characteristics associated with vascular geometry. (c) 2008 Wiley-Liss, Inc.

  3. 7 CFR 301.1-2 - Criteria for special need requests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... must contain the following: (1) Data drawn from a scientifically sound detection survey, showing that... State or political subdivision. The request should contain detailed information, including quantitative...

  4. 7 CFR 301.1-2 - Criteria for special need requests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... must contain the following: (1) Data drawn from a scientifically sound detection survey, showing that... State or political subdivision. The request should contain detailed information, including quantitative...

  5. 7 CFR 301.1-2 - Criteria for special need requests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... must contain the following: (1) Data drawn from a scientifically sound detection survey, showing that... State or political subdivision. The request should contain detailed information, including quantitative...

  6. 7 CFR 301.1-2 - Criteria for special need requests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... must contain the following: (1) Data drawn from a scientifically sound detection survey, showing that... State or political subdivision. The request should contain detailed information, including quantitative...

  7. 7 CFR 301.1-2 - Criteria for special need requests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... must contain the following: (1) Data drawn from a scientifically sound detection survey, showing that... State or political subdivision. The request should contain detailed information, including quantitative...

  8. Prediction of tautomer ratios by embedded-cluster integral equation theory

    NASA Astrophysics Data System (ADS)

    Kast, Stefan M.; Heil, Jochen; Güssregen, Stefan; Schmidt, K. Friedemann

    2010-04-01

    The "embedded cluster reference interaction site model" (EC-RISM) approach combines statistical-mechanical integral equation theory and quantum-chemical calculations for predicting thermodynamic data for chemical reactions in solution. The electronic structure of the solute is determined self-consistently with the structure of the solvent that is described by 3D RISM integral equation theory. The continuous solvent-site distribution is mapped onto a set of discrete background charges ("embedded cluster") that represent an additional contribution to the molecular Hamiltonian. The EC-RISM analysis of the SAMPL2 challenge set of tautomers proceeds in three stages. Firstly, the group of compounds for which quantitative experimental free energy data was provided was taken to determine appropriate levels of quantum-chemical theory for geometry optimization and free energy prediction. Secondly, the resulting workflow was applied to the full set, allowing for chemical interpretations of the results. Thirdly, disclosure of experimental data for parts of the compounds facilitated a detailed analysis of methodical issues and suggestions for future improvements of the model. Without specifically adjusting parameters, the EC-RISM model yields the smallest value of the root mean square error for the first set (0.6 kcal/mol) as well as for the full set of quantitative reaction data (2.0 kcal/mol) among the SAMPL2 participants.
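    Free energy predictions translate into tautomer ratios through the Boltzmann relation K = exp(-dG/RT), so an error of about 1.4 kcal/mol at 298 K shifts the predicted ratio by roughly a factor of ten. A minimal sketch of that conversion (not part of the EC-RISM code itself):

```python
import math

R_KCAL = 1.987204e-3  # molar gas constant, kcal mol^-1 K^-1

def tautomer_ratio(delta_g_kcal, temp_k=298.15):
    """Equilibrium population ratio [T2]/[T1] for dG = G(T2) - G(T1)."""
    return math.exp(-delta_g_kcal / (R_KCAL * temp_k))
```

    This exponential sensitivity is why a 0.6 kcal/mol RMSE (ratios right to within about a factor of 3) is a strong result, while 2.0 kcal/mol already spans a ~30-fold uncertainty in the ratio.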

  9. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
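    The aggregation described above is a probability-weighted sum over fire scenarios, with scenario-state probabilities evolved by a Markov chain. A minimal sketch; the transition probabilities and consequence values are illustrative, not taken from the case study:

```python
def scenario_risk(probabilities, consequences):
    """Total fire risk as the probability-weighted sum of scenario consequences."""
    return sum(p * c for p, c in zip(probabilities, consequences))

def markov_step(state_probs, transition):
    """Advance scenario-state occurrence probabilities one time step: p' = p @ P."""
    n = len(state_probs)
    return [sum(state_probs[i] * transition[i][j] for i in range(n)) for j in range(n)]

# Illustrative two-state chain: "suppressed" can transition to "spreading",
# and "spreading" is absorbing over this time step.
P = [[0.9, 0.1],
     [0.0, 1.0]]
```

    Iterating `markov_step` over the event-tree branches yields the time-dependent occurrence probabilities the abstract describes, which then weight the consequence of each scenario.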

  10. Simultaneous determination of eleven preservatives in cosmetics by micellar electrokinetic chromatography.

    PubMed

    Wang, Ping; Ding, Xiaojing; Li, Yun; Yang, Yuanyuan

    2012-01-01

    A new method for the simultaneous quantitation of 11 preservatives (imidazolidinyl urea, benzyl alcohol, dehydroacetic acid, sorbic acid, phenoxyethanol, benzoic acid, salicylic acid, and four parabens: methyl, ethyl, propyl, and butyl) in cosmetics by micellar electrokinetic capillary chromatography was established and validated. The separation was performed using an uncoated fused-silica capillary (50 μm i.d. × 60.2 cm, effective length 50 cm) with a running buffer consisting of 15 mmol/L sodium tetraborate, 60 mmol/L boric acid, and 100 mmol/L sodium dodecyl sulfate. A 1:10 dilution of the running buffer was used as the sample buffer to extract the cosmetic samples. The key factors, such as the concentration and pH of the running and sample buffers, that influence quantitative analysis of the above 11 preservatives in cosmetic samples were investigated in detail. The linear ranges of the calibration curves for imidazolidinyl urea and the other 10 preservatives were 50-1000 and 10-200 mg/L, respectively. The correlation coefficients of the standard curves were all higher than 0.999. The recoveries at the concentrations studied ranged from 93.0 to 102.7%. RSDs were all less than 5%. The new method, with simple sample pretreatment, met the needs for routine analysis of the 11 preservatives in cosmetics.

  11. Perpetuating stigma? Differences between advertisements for psychiatric and non-psychiatric medication in two professional journals.

    PubMed

    Foster, Juliet L H

    2010-02-01

    Debates continue regarding advertising and the pharmaceutical industry, while others detail the continued stigmatization of mental health problems. This study aimed to establish whether there are any differences between advertisements for psychiatric and non-psychiatric medication aimed at health professionals. Quantitative (t-tests, chi-squared) and qualitative analysis of all unique advertisements for medication that appeared in two professional journals (the British Medical Journal and the British Journal of Psychiatry) between October 2005 and September 2006 was undertaken. Close attention was paid to both the images and the text used in the advertisements. Significant differences were found between advertisements for psychiatric and non-psychiatric medication in both the quantitative and the qualitative analysis: advertisements for psychiatric medication contain less text and are less likely to include specific information about the actual drug than non-psychiatric medication advertisements; images used in advertisements for psychiatric medication are more negative than those used for non-psychiatric medication, and are less likely to portray people in everyday situations. A distinction between mental health problems and other forms of ill health is clearly being maintained in medication advertisements; this has potentially stigmatizing consequences for both professional and public perceptions. There are also troubling implications in light of the debates surrounding Direct to Consumer Advertising.

  12. Wide-scale quantitative phosphoproteomic analysis reveals that cold treatment of T cells closely mimics soluble antibody stimulation

    PubMed Central

    Ji, Qinqin; Salomon, Arthur R.

    2015-01-01

    The activation of T-lymphocytes through antigen-mediated T-cell receptor (TCR) clustering is vital in regulating the adaptive immune response. Although T-cell receptor signaling has been extensively studied, the fundamental mechanisms of signal initiation are not fully understood. Reduced temperature initiated some of the hallmarks of TCR signaling, such as increased phosphorylation and activation of ERK and calcium release from the endoplasmic reticulum, as well as coalescence of T-cell membrane microdomains. The precise mechanism of TCR signaling initiation due to temperature change remains obscure. One critical question is whether signaling initiated by cold treatment of T cells differs from signaling initiated by crosslinking of the T-cell receptor. To address this uncertainty, a wide-scale, quantitative mass spectrometry-based phosphoproteomic analysis was performed on T cells stimulated either by temperature shift or through crosslinking of the TCR. Careful statistical comparison between the two stimulations revealed a striking level of identity within the subset of 339 sites that changed significantly with both stimulations. This study demonstrates for the first time, at unprecedented detail, that cold treatment of T cells was sufficient to initiate signaling patterns nearly identical to those of soluble antibody stimulation, shedding new light on the mechanism of activation of these critically important immune cells. PMID:25839225

  13. Quantitative 4D analyses of epithelial folding during Drosophila gastrulation.

    PubMed

    Khan, Zia; Wang, Yu-Chiun; Wieschaus, Eric F; Kaschube, Matthias

    2014-07-01

    Understanding the cellular and mechanical processes that underlie the shape changes of individual cells and their collective behaviors in a tissue during dynamic and complex morphogenetic events is currently one of the major frontiers in developmental biology. The advent of high-speed time-lapse microscopy and its use in monitoring the cellular events in fluorescently labeled developing organisms demonstrate tremendous promise in establishing detailed descriptions of these events and could potentially provide a foundation for subsequent hypothesis-driven research strategies. However, obtaining quantitative measurements of dynamic shapes and behaviors of cells and tissues in a rapidly developing metazoan embryo using time-lapse 3D microscopy remains technically challenging, with the main hurdle being the shortage of robust imaging processing and analysis tools. We have developed EDGE4D, a software tool for segmenting and tracking membrane-labeled cells using multi-photon microscopy data. Our results demonstrate that EDGE4D enables quantification of the dynamics of cell shape changes, cell interfaces and neighbor relations at single-cell resolution during a complex epithelial folding event in the early Drosophila embryo. We expect this tool to be broadly useful for the analysis of epithelial cell geometries and movements in a wide variety of developmental contexts. © 2014. Published by The Company of Biologists Ltd.

  14. Endosomal Interactions during Root Hair Growth

    PubMed Central

    von Wangenheim, Daniel; Rosero, Amparo; Komis, George; Šamajová, Olga; Ovečka, Miroslav; Voigt, Boris; Šamaj, Jozef

    2016-01-01

    The dynamic localization of endosomal compartments labeled with targeted fluorescent protein tags is routinely followed by time-lapse fluorescence microscopy approaches and single-particle tracking algorithms. In this way trajectories of individual endosomes can be mapped and linked to physiological processes such as cell growth. However, other aspects of dynamic behavior, including endosomal interactions, are difficult to follow in this manner. Therefore, we characterized the localization and dynamic properties of early and late endosomes throughout the entire course of root hair formation by means of spinning disc time-lapse imaging and post-acquisition automated multitracking and quantitative analysis. Our results show differential motile behavior of early and late endosomes and interactions of late endosomes that may be specific to particular root hair domains. Detailed data analysis revealed a particular transient interaction between late endosomes, termed herein dancing-endosomes, which does not culminate in vesicular fusion. Endosomes preferentially located in the root hair tip interacted as dancing-endosomes and traveled short distances during this interaction. Finally, the sizes of early and late endosomes were addressed by means of super-resolution structured illumination microscopy (SIM) to corroborate measurements on the spinning disc. This is the first study providing quantitative microscopic data on dynamic spatio-temporal interactions of endosomes during root hair tip growth. PMID:26858728

  15. Endosomal Interactions during Root Hair Growth.

    PubMed

    von Wangenheim, Daniel; Rosero, Amparo; Komis, George; Šamajová, Olga; Ovečka, Miroslav; Voigt, Boris; Šamaj, Jozef

    2015-01-01

    The dynamic localization of endosomal compartments labeled with targeted fluorescent protein tags is routinely followed by time-lapse fluorescence microscopy approaches and single-particle tracking algorithms. In this way trajectories of individual endosomes can be mapped and linked to physiological processes such as cell growth. However, other aspects of dynamic behavior, including endosomal interactions, are difficult to follow in this manner. Therefore, we characterized the localization and dynamic properties of early and late endosomes throughout the entire course of root hair formation by means of spinning disc time-lapse imaging and post-acquisition automated multitracking and quantitative analysis. Our results show differential motile behavior of early and late endosomes and interactions of late endosomes that may be specific to particular root hair domains. Detailed data analysis revealed a particular transient interaction between late endosomes, termed herein dancing-endosomes, which does not culminate in vesicular fusion. Endosomes preferentially located in the root hair tip interacted as dancing-endosomes and traveled short distances during this interaction. Finally, the sizes of early and late endosomes were addressed by means of super-resolution structured illumination microscopy (SIM) to corroborate measurements on the spinning disc. This is the first study providing quantitative microscopic data on dynamic spatio-temporal interactions of endosomes during root hair tip growth.

  16. Effects of Gas-Phase Radiation and Detailed Kinetics on the Burning and Extinction of a Solid Fuel

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer L.

    2001-01-01

    This is the first attempt to analyze both radiation and detailed kinetics on the burning and extinction of a solid fuel in a stagnation-point diffusion flame. We present a detailed and comparatively accurate computational model of a solid fuel flame along with a quantitative study of the kinetics mechanism, radiation interactions, and the extinction limits of the flame. A detailed kinetics model for the burning of solid trioxane (a trimer of formaldehyde) is coupled with a narrowband radiation model, with carbon dioxide, carbon monoxide, and water vapor as the gas-phase participating media. The solution of the solid trioxane diffusion flame over the flammable regime is presented in some detail, as this is the first solution of a heterogeneous trioxane flame. We identify high-temperature and low-temperature reaction paths for the heterogeneous trioxane flame. We then compare the adiabatic solution to solutions that include surface radiation only, and to solutions that include both gas-phase and surface radiation using a black surface model. The analysis includes discussion of detailed flame chemistry over the flammable regime and, in particular, at the low-stretch extinction limit. We emphasize the low-stretch regime of the radiatively participating flame, since this is the region representative of microgravity flames. When only surface radiation is included, two extinction limits exist (the blow-off limit and the low-stretch radiative limit), and the burning rate and maximum flame temperatures are lower, as expected. With the inclusion of surface and gas-phase radiation, results show that, while flame temperatures are lower, the burning rate of the trioxane diffusion flame may actually increase at low stretch rates due to radiative feedback from the flame to the surface.

  17. Guidelines for evaluating fish habitat in Wisconsin streams.

    Treesearch

    Timothy D. Simonson; John Lyons; Paul D. Kanehl

    1993-01-01

    Describes procedures for evaluating the quality and quantity of habitat for fish in small and medium streams of Wisconsin. Provides detailed guidelines for collecting and analyzing specific quantitative habitat information.

  18. Colorimetric Determination of pH.

    ERIC Educational Resources Information Center

    Tucker, Sheryl; And Others

    1989-01-01

    Presented is an activity in which the pH of a solution can be quantitatively measured using a spectrophotometer. The theory, experimental details, sample preparation and selection, instrumentation, and results are discussed. (CW)

  19. Quantitative x-ray diffraction mineralogy of Los Angeles basin core samples

    USGS Publications Warehouse

    Hein, James R.; McIntyre, Brandie R.; Edwards, Brian D.; Lakota, Orion I.

    2006-01-01

    This report contains X-ray diffraction (XRD) analysis of mineralogy for 81 sediment samples from cores taken from three drill holes in the Los Angeles Basin in 2000-2001. We analyzed 26 samples from Pier F core, 29 from Pier C core, and 26 from the Webster core. These three sites provide an offshore-onshore record across the Southern California coastal zone. This report is designed to be a data repository; these data will be used in further studies, including geochemical modeling as part of the CABRILLO project. Summary tables quantify the major mineral groups, whereas detailed mineralogy is presented in three appendices. The rationale, methodology, and techniques are described in the following paper.

  20. Maximum power point tracker for photovoltaic power plants

    NASA Astrophysics Data System (ADS)

    Arcidiacono, V.; Corsi, S.; Lambri, L.

    The paper describes two different closed-loop control criteria for the maximum power point tracking of the voltage-current characteristic of a photovoltaic generator. The two criteria are discussed and compared, inter alia, with regard to the setting-up problems that they pose. Although a detailed analysis is not embarked upon, the paper also provides some quantitative information on the energy advantages obtained by using electronic maximum power point tracking systems, as compared with the situation in which the point of operation of the photovoltaic generator is not controlled at all. Lastly, the paper presents two high-efficiency MPPT converters for experimental photovoltaic plants of the stand-alone and the grid-interconnected type.
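    The paper's two closed-loop criteria are not detailed in the abstract, so as a generic illustration of electronic MPPT, the widely used perturb-and-observe criterion climbs the photovoltaic power-voltage curve by reversing the perturbation direction whenever output power drops. The PV current model in the usage example is a made-up stand-in, not data from the paper:

```python
def perturb_and_observe(measure_current, v, step=0.1, iterations=300):
    """Hill-climb toward the voltage that maximizes P = V * I(V).

    measure_current: callable returning the PV generator current at voltage v.
    The tracker oscillates around the maximum power point with amplitude ~step.
    """
    p_prev = v * measure_current(v)
    direction = 1
    for _ in range(iterations):
        v += direction * step
        p = v * measure_current(v)
        if p < p_prev:          # power fell: reverse the perturbation
            direction = -direction
        p_prev = p
    return v
```

The step size trades tracking speed against steady-state oscillation, which is one reason different control criteria are compared in practice.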

  1. Mathematical modeling of gene expression: a guide for the perplexed biologist

    PubMed Central

    Ay, Ahmet; Arnosti, David N.

    2011-01-01

    The detailed analysis of transcriptional networks holds a key for understanding central biological processes, and interest in this field has exploded due to new large-scale data acquisition techniques. Mathematical modeling can provide essential insights, but the diversity of modeling approaches can be a daunting prospect to investigators new to this area. For those interested in beginning a transcriptional mathematical modeling project we provide here an overview of major types of models and their applications to transcriptional networks. In this discussion of recent literature on thermodynamic, Boolean and differential equation models we focus on considerations critical for choosing and validating a modeling approach that will be useful for quantitative understanding of biological systems. PMID:21417596
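    Of the model classes the review surveys, a differential-equation model is the most direct to sketch: mRNA production driven by a Hill-function response to a transcription factor, balanced by first-order decay. All parameter values here are illustrative assumptions, not drawn from the review:

```python
def hill(tf, K, n):
    """Fractional promoter activation at transcription-factor level tf."""
    return tf ** n / (K ** n + tf ** n)

def simulate_mrna(tf, alpha, delta, K, n, t_end=50.0, dt=0.01):
    """Forward-Euler integration of dm/dt = alpha*hill(tf) - delta*m, m(0)=0."""
    m = 0.0
    for _ in range(int(t_end / dt)):
        m += dt * (alpha * hill(tf, K, n) - delta * m)
    return m
```

At steady state the model predicts m* = alpha*hill(tf)/delta, which is the kind of quantitative relationship such models are validated against.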

  2. Functional-analytical capabilities of GIS technology in the study of water use risks

    NASA Astrophysics Data System (ADS)

    Nevidimova, O. G.; Yankovich, E. P.; Yankovich, K. S.

    2015-02-01

    Regional security aspects of economic activities are of great importance for legal regulation in environmental management. This has become a critical issue due to climate change, especially in regions where severe climate conditions have a great impact on almost all types of natural resource uses. A detailed analysis of climate and hydrological situation in Tomsk Oblast considering water use risks was carried out. Based on developed author's techniques an informational and analytical database was created using ArcGIS software platform, which combines statistical (quantitative) and spatial characteristics of natural hazards and socio-economic factors. This system was employed to perform areal zoning according to the degree of water use risks involved.

  3. Working toward resilience: a retrospective report of actions taken in support of a New York school crisis team following 9/11.

    PubMed

    Johnson, Kendall; Luna, Joanne M Tortorici

    2011-01-01

    A retrospective report details external support rendered to a Lower Manhattan school crisis team following the 9/11/01 terrorist attack on the World Trade Center. This analysis occasions an opportunity to consider working assumptions, the formative use of data to plan support actions, and the subsequent emergence of a collaborative approach to post-disaster team support in school settings. The nature of the assessment and of the subsequent service delivery illustrates a community resilience-based approach to school crisis management. Recommendations for such work are based upon mixed qualitative and quantitative data gathered from on-scene team members as part of the ongoing support effort.

  4. Variation of BMP3 Contributes to Dog Breed Skull Diversity

    PubMed Central

    Schoenebeck, Jeffrey J.; Hutchinson, Sarah A.; Byers, Alexandra; Beale, Holly C.; Carrington, Blake; Faden, Daniel L.; Rimbault, Maud; Decker, Brennan; Kidd, Jeffrey M.; Sood, Raman; Boyko, Adam R.; Fondon, John W.; Wayne, Robert K.; Bustamante, Carlos D.; Ciruna, Brian; Ostrander, Elaine A.

    2012-01-01

    Since the beginnings of domestication, the craniofacial architecture of the domestic dog has morphed and radiated to human whims. By beginning to define the genetic underpinnings of breed skull shapes, we can elucidate mechanisms of morphological diversification while presenting a framework for understanding human cephalic disorders. Using intrabreed association mapping with museum specimen measurements, we show that skull shape is regulated by at least five quantitative trait loci (QTLs). Our detailed analysis using whole-genome sequencing uncovers a missense mutation in BMP3. Validation studies in zebrafish show that Bmp3 function in cranial development is ancient. Our study reveals the causal variant for a canine QTL contributing to a major morphologic trait. PMID:22876193

  5. Lipid membranes and single ion channel recording for the advanced physics laboratory

    NASA Astrophysics Data System (ADS)

    Klapper, Yvonne; Nienhaus, Karin; Röcker, Carlheinz; Ulrich Nienhaus, G.

    2014-05-01

    We present an easy-to-handle, low-cost, and reliable setup to study various physical phenomena on a nanometer-thin lipid bilayer using the so-called black lipid membrane technique. The apparatus allows us to precisely measure optical and electrical properties of free-standing lipid membranes, to study the formation of single ion channels, and to gain detailed information on the ion conduction properties of these channels using statistical physics and autocorrelation analysis. The experiments are well suited as part of an advanced physics or biophysics laboratory course; they interconnect physics, chemistry, and biology and will be appealing to students of the natural sciences who are interested in quantitative experimentation.
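    The autocorrelation analysis of single-channel current traces mentioned above can be sketched as a normalized sample autocorrelation function; the trace in the usage example is synthetic, not a recording:

```python
def autocorrelation(x, max_lag):
    """Normalized sample autocorrelation of a 1-D signal for lags 0..max_lag.

    acf[0] is 1 by construction; the decay of acf[k] reflects the
    characteristic dwell times of channel open/closed states.
    """
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    return [
        sum((x[i] - mean) * (x[i + k] - mean) for i in range(n - k)) / ((n - k) * var)
        for k in range(max_lag + 1)
    ]
```

Fitting an exponential to the decay of this function is one standard way to extract channel kinetics from the recording.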

  6. Impact of Relativistic Electron Beam on Hole Acoustic Instability in Quantum Semiconductor Plasmas

    NASA Astrophysics Data System (ADS)

    Siddique, M.; Jamil, M.; Rasheed, A.; Areeb, F.; Javed, Asif; Sumera, P.

    2018-01-01

    We studied the influence of a classical relativistic beam of electrons on the hole acoustic wave (HAW) instability excited in semiconductor quantum plasmas. We conducted this study using the quantum-hydrodynamic model of dense plasmas, incorporating the quantum effects of the semiconductor plasma species, which include degeneracy pressure, the exchange-correlation potential, and the Bohm potential. The effects of the quantum characteristics of the semiconductor plasma species, along with the relativistic effect of the beam electrons, on the dispersion relation of the HAW are analyzed in detail, both qualitatively and through numerical plots. It is worth mentioning that the relativistic electron beam (REB) stabilises the HAWs excited in the semiconductor (GaAs) degenerate plasma.

  7. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    PubMed

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

    Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
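    The first workflow step, normalization by total intensity sums, amounts to rescaling each sample so that its summed fragment intensity matches a common target. A minimal sketch of the idea (not mapDIA's actual C++ implementation, and omitting its retention-time-local variant):

```python
def normalize_total_intensity(samples):
    """Scale each sample's fragment intensities so all totals equal the mean total.

    samples: list of per-sample intensity lists (assumed nonzero totals).
    Removes sample-to-sample loading differences before downstream statistics.
    """
    totals = [sum(s) for s in samples]
    target = sum(totals) / len(totals)
    return [[x * target / t for x in s] for s, t in zip(samples, totals)]
```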

  8. Development of silicon photonic microring resonator biosensors for multiplexed cytokine assays and in vitro diagnostics

    NASA Astrophysics Data System (ADS)

    Luchansky, Matthew Sam

    In order to guide critical care therapies that are personalized to a patient's unique disease state, a diagnostic or theranostic medical device must quickly provide a detailed biomolecular understanding of disease onset and progression. This detailed molecular understanding of cellular processes and pathways requires the ability to measure multiple analytes in parallel. Though many traditional sensing technologies for biomarker analysis and fundamental biological studies (i.e. enzyme-linked immunosorbent assays, real-time polymerase chain reaction, etc.) rely on single-parameter measurements, it has become increasingly clear that the inherent complexity of many human illnesses and pathways necessitates quantitative and multiparameter analysis of biological samples. Currently used analytical methods are deficient in that they often provide either highly quantitative data for a single biomarker or qualitative data for many targets, but methods that simultaneously provide highly quantitative analysis of many targets have yet to be adequately developed. Fields such as medical diagnostics and cellular biology would benefit greatly from a technology that enables rapid, quantitative and reproducible assays for many targets within a single sample. In an effort to fill this unmet need, this doctoral dissertation describes the development of a clinically translational biosensing technology based on silicon photonics and developed in the chemistry research laboratory of Ryan C. Bailey. Silicon photonic microring resonators, a class of high-Q optical sensors, represent a promising platform for rapid, multiparameter in vitro measurements. The original device design utilizes 32-ring arrays for real-time biomolecular sensing without fluorescent labels, and these optical biosensors display great potential for more highly multiplexed (100s-1000s) measurements based on the impressive scalability of silicon device fabrication. 
Though this technology can be used to detect a variety of molecules, this dissertation establishes the utility of microring resonator chips for multiparameter analysis of several challenging protein targets in cell cultures, human blood sera, and other clinical samples such as cerebrospinal fluid. Various sandwich immunoassay formats for diverse protein analytes are described herein, but the bulk of this dissertation focuses on applying the technology to cytokine analysis. Cytokines are small signaling proteins that are present in serum and cell secretomes at concentrations in the pg/mL or ng/mL range. Cytokines are very challenging to quantitate due to their low abundance and small size, but play important roles in a variety of immune response and inflammatory pathways; cytokine quantitation is thus important in fundamental biological studies and diagnostics, and complex and overlapping cytokine roles make multiplexed measurements especially vital. In a typical experiment, microfluidics are used to spatially control chip functionalization by directing capture antibodies against a variety of protein targets to groups of microring sensors. In each case, binding of analytes to the rings causes a change in the local refractive index that is transduced into a real-time, quantitative optical signal. This photonic sensing modality is based on the interaction of the propagating evanescent field with molecules near the ring surface. Since each microring sensor in the array is monitored independently, this technology allows multiple proteins to be quantified in parallel from a single sample. This dissertation describes the fabrication, characterization, development, and application of silicon photonic microring resonator technology to multiplexed protein measurements in a variety of biological systems. Chapter 1 introduces the field of high-Q optical sensors and places microring resonator technology within the broader context of related whispering gallery mode devices. 
The final stages of cleanroom device fabrication, in which 8" silicon wafers that contain hundreds of ring resonator arrays are transformed into individual functional chips, are described in Chapter 2. Chapter 3 characterizes the physical and optical properties of the microring resonator arrays, especially focusing on the evanescent field profile and mass sensitivity metrics. Chapter 4 demonstrates the ability to apply ring resonator technology to cytokine detection and T cell secretion analysis. Chapter 5 builds on the initial cytokine work to demonstrate the simultaneous detection of multiple cytokines with higher throughput to enable studies of T cell differentiation. In preparation for reaching the goal of cytokine analysis in clinical samples, Chapter 6 describes magnetic bead-based signal enhancement of sandwich immunoassays for serum analysis. Additional examples of the utility of nanoparticles and sub-micron beads for signal amplification are described in Chapter 7, also demonstrating the ability to monitor single bead binding events. Chapter 8 describes an alternative cytokine signal enhancement strategy based on enzymatic amplification for human cerebrospinal fluid (CSF) analysis. Chapter 9 adds work with other CSF protein targets that are relevant to the continuing development of a multiparameter Alzheimer's Disease diagnostic chip. Future directions for multiplexed protein analysis as it pertains to important immunological studies and in vitro diagnostic applications are defined in Chapter 10. (Abstract shortened by UMI.).

  9. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivates integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed.
The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard.
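    The SUV normalization step mentioned in the Methods has a standard body-weight form: tissue activity concentration divided by injected dose per unit body weight, with the dose decay-corrected to scan time. A sketch assuming an 18F tracer (half-life ≈ 109.77 min); it is not the paper's specific implementation:

```python
import math

F18_HALF_LIFE_S = 6586.2  # 18F half-life, ~109.77 minutes

def suv_bw(activity_bq_per_ml, injected_dose_bq, body_weight_g,
           delay_s=0.0, half_life_s=F18_HALF_LIFE_S):
    """Body-weight-normalized SUV.

    Decay-corrects the injected dose over the injection-to-scan delay,
    then divides tissue activity concentration by dose per gram of body weight.
    """
    decayed_dose = injected_dose_bq * math.exp(-math.log(2) * delay_s / half_life_s)
    return activity_bq_per_ml * body_weight_g / decayed_dose
```

In the DICOM encoding described above, exactly this kind of mapping from stored pixel values to SUV is what a Real World Value Mapping object captures.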

  10. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

    Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions (flood protection structures) using risk analysis methods. The application of a methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  11. Range Process Simulation Tool

    NASA Technical Reports Server (NTRS)

    Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga

    2005-01-01

    Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule- and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
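    As a hedged illustration of the discrete-event, finite-capacity process simulation that RPST's model-based approach relies on (this sketch is not RPST code; the job data and server count are hypothetical), a minimal single-stage simulator can be written as:

```python
import heapq

def simulate(jobs, n_servers):
    """Toy discrete-event simulation of one processing stage with finite
    capacity. `jobs` is a list of (arrival_time, service_time) pairs.
    Returns the completion time of each job, in arrival order."""
    free_at = [0.0] * n_servers              # next time each server is free
    heapq.heapify(free_at)
    done = []
    for arrival, service in sorted(jobs):
        start = max(arrival, heapq.heappop(free_at))  # wait if all servers busy
        finish = start + service
        heapq.heappush(free_at, finish)
        done.append(finish)
    return done

# Two servers, four jobs: the later jobs queue until a server frees up.
print(simulate([(0, 5), (0, 5), (1, 2), (1, 2)], n_servers=2))  # → [5, 5, 7, 7]
```

    Queueing delay of this kind is exactly the throughput effect that a finite-capacity schedule analysis quantifies before a proposed facility upgrade is approved.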

  12. Photochemical Reactions of Tris (Oxalato) Iron (III): A First-Year Chemistry Experiment.

    ERIC Educational Resources Information Center

    Baker, A. D.; And Others

    1980-01-01

    Describes a first-year chemistry experiment that illustrates the fundamental concepts of a photoinduced reaction. Qualitative and quantitative parts of the photoreduction of potassium ferrioxalate are detailed. (CS)

  13. On the frequency-magnitude distribution of converging boundaries

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Laura, S.; Heuret, A.; Funiciello, F.

    2011-12-01

    The occurrence of the recent mega-thrust earthquake in Japan has starkly highlighted the risk that such events pose to society in terms of social and economic losses, even at large spatial scales. The primary component of a balanced and objective mitigation of the impact of these earthquakes is a correct forecast of where such events may occur in the future. To date, there is a wide range of opinions about where mega-thrust earthquakes can occur. Here, we present a detailed statistical analysis of a database of worldwide interplate earthquakes occurring at current subduction zones. The database has recently been published in the framework of the EURYI Project 'Convergent margins and seismogenesis: defining the risk of great earthquakes by using statistical data and modelling', and it provides a unique opportunity to explore in detail the seismogenic process in subducting lithosphere. In particular, statistical analysis of this database allows us to explore many interesting scientific issues, such as the existence of different frequency-magnitude distributions across trenches, the quantitative characterization of the subduction zones most likely to produce mega-thrust earthquakes, and the prominent features that characterize converging boundaries with different seismic activity. Beyond their scientific importance, these issues may lead to improved mega-thrust earthquake forecasting capability.

  14. Investigation of the multiplet features of SrTiO3 in X-ray absorption spectra based on configuration interaction calculations

    DOE PAGES

    Wu, M.; Xin, Houlin L.; Wang, J. O.; ...

    2018-04-24

    Synchrotron-based L2,3-edge absorption spectra are strongly sensitive to the local electronic structure and chemical environment; however, detailed physical information cannot be extracted easily without computational aids. In this study, using the experimental Ti L2,3-edge absorption spectrum of SrTiO3 as a fingerprint and considering full multiplet effects, calculations yield the energy parameters characterizing local ground-state properties. The peak splittings and intensity ratios of the L3 and L2 sets of peaks are carefully analyzed quantitatively, giving rise to a small hybridization energy of around 1.2 eV, and the differing hybridization energy values reported in the literature are further addressed. Absorption spectra with different linearly polarized photons under various tetragonal crystal fields are then investigated, revealing a non-linear orbital–lattice interaction and offering theoretical guidance for materials engineering of SrTiO3-based thin films and heterostructures. Finally, detailed analysis of spectral shifts with different tetragonal crystal fields suggests that the e_g crystal field splitting is a necessary parameter for a thorough analysis of the spectra, even though it is not relevant for the ground-state properties.

  15. Investigation of the multiplet features of SrTiO3 in X-ray absorption spectra based on configuration interaction calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, M.; Xin, Houlin L.; Wang, J. O.

    Synchrotron-based L2,3-edge absorption spectra are strongly sensitive to the local electronic structure and chemical environment; however, detailed physical information cannot be extracted easily without computational aids. In this study, using the experimental Ti L2,3-edge absorption spectrum of SrTiO3 as a fingerprint and considering full multiplet effects, calculations yield the energy parameters characterizing local ground-state properties. The peak splittings and intensity ratios of the L3 and L2 sets of peaks are carefully analyzed quantitatively, giving rise to a small hybridization energy of around 1.2 eV, and the differing hybridization energy values reported in the literature are further addressed. Absorption spectra with different linearly polarized photons under various tetragonal crystal fields are then investigated, revealing a non-linear orbital–lattice interaction and offering theoretical guidance for materials engineering of SrTiO3-based thin films and heterostructures. Finally, detailed analysis of spectral shifts with different tetragonal crystal fields suggests that the e_g crystal field splitting is a necessary parameter for a thorough analysis of the spectra, even though it is not relevant for the ground-state properties.

  16. Quantifying the Contribution of the Liver to Glucose Homeostasis: A Detailed Kinetic Model of Human Hepatic Glucose Metabolism

    PubMed Central

    König, Matthias; Bulik, Sascha; Holzhütter, Hermann-Georg

    2012-01-01

    Despite the crucial role of the liver in glucose homeostasis, a detailed mathematical model of human hepatic glucose metabolism has been lacking so far. Here we present a detailed kinetic model of glycolysis, gluconeogenesis and glycogen metabolism in human hepatocytes, integrated with the hormonal control of these pathways by insulin, glucagon and epinephrine. Model simulations are in good agreement with experimental data on (i) the quantitative contributions of glycolysis, gluconeogenesis, and glycogen metabolism to hepatic glucose production and hepatic glucose utilization under varying physiological states; (ii) the time courses of postprandial glycogen storage as well as glycogen depletion in overnight fasting and short-term fasting; (iii) the switch from net hepatic glucose production under hypoglycemia to net hepatic glucose utilization under hyperglycemia, which is essential for glucose homeostasis; and (iv) hormone perturbations of hepatic glucose metabolism. Response analysis reveals an extra high capacity of the liver to counteract changes of plasma glucose level below 5 mM (hypoglycemia) and above 7.5 mM (hyperglycemia). Our model may serve as an important module of a whole-body model of human glucose metabolism and as a valuable tool for understanding the role of the liver in glucose homeostasis under normal conditions and in diseases like diabetes or glycogen storage diseases. PMID:22761565
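    The switch from net hepatic glucose production under hypoglycemia to net utilization under hyperglycemia can be illustrated with a deliberately simplified toy: this is not the published kinetic model; the sigmoid form, set point and rate scale below are invented for illustration only.

```python
import math

def net_hepatic_glucose_flux(G, set_point=6.0, steepness=1.5, vmax=2.0):
    """Toy net hepatic glucose balance (arbitrary flux units) vs plasma
    glucose G (mM): positive = net production (hypoglycemia side),
    negative = net utilization (hyperglycemia side). Parameters are
    illustrative assumptions, not values from the published model."""
    return vmax * math.tanh((set_point - G) / steepness)

def find_set_point(lo=3.0, hi=10.0, tol=1e-6):
    """Bisection for the glucose level where the net flux crosses zero."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if net_hepatic_glucose_flux(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(find_set_point(), 2))  # → 6.0 with these toy parameters
```

    In the full kinetic model the crossover emerges from hormone-controlled enzyme kinetics rather than from a single fitted sigmoid, but the zero crossing is the quantity that response analysis probes.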

  17. Interpreting comprehensive two-dimensional gas chromatography using peak topography maps with application to petroleum forensics.

    PubMed

    Ghasemi Damavandi, Hamidreza; Sen Gupta, Ananya; Nelson, Robert K; Reddy, Christopher M

    2016-01-01

    Comprehensive two-dimensional gas chromatography (GC×GC) provides high-resolution separations across hundreds of compounds in a complex mixture, thus unlocking unprecedented information for intricate quantitative interpretation. We exploit this compound diversity across the GC×GC topography to provide quantitative, compound-cognizant interpretation beyond target compound analysis, with petroleum forensics as a practical application. We focus on the GC×GC topography of biomarker hydrocarbons, hopanes and steranes, as they are generally recalcitrant to weathering. We introduce peak topography maps (PTM) and topography partitioning techniques that consider a notably broader and more diverse range of target and non-target biomarker compounds compared to traditional approaches that consider approximately 20 biomarker ratios. Specifically, we consider a range of 33-154 target and non-target biomarkers, with highest-to-lowest peak ratios within an injection ranging from 4.86 to 19.6 (precise numbers depend on the biomarker diversity of individual injections). We also provide a robust quantitative measure for directly determining a "match" between samples, without necessitating training data sets. We validate our methods across 34 GC×GC injections from a diverse portfolio of petroleum sources, and provide quantitative comparison of performance against established statistical methods such as principal components analysis (PCA). Our data set includes a wide range of samples collected following the 2010 Deepwater Horizon disaster, which released approximately 160 million gallons of crude oil from the Macondo well (MW). Samples collected following this disaster exhibit a statistically significant match under PTM-based interpretation against other closely related sources. PTM-based interpretation also provides higher differentiation between closely correlated but distinct sources than obtained using PCA-based statistical comparisons. In addition to results based on this experimental field data, we provide extensive perturbation analysis of the PTM method over numerical simulations that introduce random variability of peak locations over the GC×GC biomarker ROI image of the MW pre-spill sample (Additional file 4: Table S1). We compare the robustness of the cross-PTM score against peak location variability in both dimensions, and compare the results against PCA analysis over the same set of simulated images. A detailed description of the simulation experiment and a discussion of results are provided in Additional file 1: Section S8. We provide a peak-cognizant informational framework for quantitative interpretation of GC×GC topography. The proposed topographic analysis enables GC×GC forensic interpretation across target petroleum biomarkers, while including the nuances of lesser-known non-target biomarkers clustered around the target peaks. This allows potential discovery of hitherto unknown connections between target and non-target biomarkers.
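    The PCA baseline against which PTM is benchmarked can be sketched in a few lines: a minimal SVD-based PCA on a samples-by-peaks intensity matrix. The peak table below is invented for illustration, not data from the study.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA via SVD on a samples-by-peaks intensity matrix: center each
    peak (column), then project onto the top principal components."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Hypothetical biomarker peak table: 4 injections x 5 hopane/sterane peaks.
rng = np.random.default_rng(0)
source_a = rng.normal([10, 8, 6, 4, 2], 0.1, size=(2, 5))
source_b = rng.normal([2, 4, 6, 8, 10], 0.1, size=(2, 5))
scores = pca_scores(np.vstack([source_a, source_b]))

# Same-source injections should sit closer in PC space than cross-source pairs.
within = np.linalg.norm(scores[0] - scores[1])
between = np.linalg.norm(scores[0] - scores[2])
print(within < between)
```

    The paper's point is that PTM sharpens exactly this separation for closely correlated sources where score-space distances from PCA become ambiguous.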

  18. Quantitative evaluation of orbital hybridization in carbon nanotubes under radial deformation using π-orbital axis vector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohnishi, Masato, E-mail: masato.ohnishi@rift.mech.tohoku.ac.jp; Suzuki, Ken; Miura, Hideo, E-mail: hmiura@rift.mech.tohoku.ac.jp

    2015-04-15

    When a radial strain is applied to a carbon nanotube (CNT), the increase in local curvature induces orbital hybridization. The effect of this curvature-induced orbital hybridization on the electronic properties of CNTs, however, has not been evaluated quantitatively. In this study, the strength of orbital hybridization in CNTs under homogeneous radial strain was evaluated quantitatively. Our analyses reveal in detail how the electronic structure of CNTs changes under deformation. In addition, the dihedral angle, the angle between the π-orbital axis vectors of adjacent atoms, was found to effectively predict the strength of local orbital hybridization in deformed CNTs.

  19. Quantitative evaluation of the requirements for the promotion as associate professor at German medical faculties.

    PubMed

    Sorg, Heiko; Knobloch, Karsten

    2012-01-01

    This is the first quantitative evaluation of the requirements for promotion to associate professor (AP) at German medical faculties. We analyzed the AP regulations of German medical faculties according to a validated scoring system adapted to this study. The overall score for the AP requirements at 35 German medical faculties was 13.5±0.6 of 20 possible points (95% confidence interval 12.2-14.7). More than 88% of the AP regulations demand sufficient performance in teaching and research with adequate scientific publication. Furthermore, 83% of the faculties expect an expert review of the candidate's performance. Conference presentations given as an assistant professor, as well as reduction of the minimum time spent as an assistant professor, play only minor roles. The requirements for assistant professors to be nominated as associate professor at German medical faculties are high, with only a small range. In detail, however, there still exists large heterogeneity, which hinders equal opportunities and career possibilities. These data might be used for the ongoing objective discussion.

  20. A simple two-stage design for quantitative responses with application to a study in diabetic neuropathic pain.

    PubMed

    Whitehead, John; Valdés-Márquez, Elsa; Lissmats, Agneta

    2009-01-01

    Two-stage designs offer substantial advantages for early phase II studies. The interim analysis following the first stage allows the study to be stopped for futility, or more positively, it might lead to early progression to the trials needed for late phase II and phase III. If the study is to continue to its second stage, then there is an opportunity for a revision of the total sample size. Two-stage designs have been implemented widely in oncology studies in which there is a single treatment arm and patient responses are binary. In this paper the case of two-arm comparative studies in which responses are quantitative is considered. This setting is common in therapeutic areas other than oncology. It will be assumed that observations are normally distributed, but that there is some doubt concerning their standard deviation, motivating the need for sample size review. The work reported has been motivated by a study in diabetic neuropathic pain, and the development of the design for that trial is described in detail. Copyright 2008 John Wiley & Sons, Ltd.
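    The sample-size review at the interim analysis can be sketched with the standard normal-approximation formula for a two-arm comparison of normally distributed responses. This is a generic textbook calculation, not the specific design rules of the paper, and the effect size and standard deviations below are illustrative.

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(delta, sigma, alpha=0.05, power=0.80):
    """Per-arm sample size for a two-arm comparison of normal responses:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * (sigma / delta)^2.
    Re-running this at the interim analysis with the observed standard
    deviation gives the revised total sample size."""
    z = NormalDist()
    n = 2 * (z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) ** 2 * (sigma / delta) ** 2
    return ceil(n)

# Planned with sigma = 1.0; interim data suggest sigma = 1.3.
print(n_per_arm(delta=0.5, sigma=1.0))  # → 63 per arm at the design stage
print(n_per_arm(delta=0.5, sigma=1.3))  # → 107 per arm after the review
```

    A larger observed standard deviation at the interim look therefore translates directly into a larger revised second-stage sample size, which is the motivation for building the review into the design.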

  1. In vivo NMR microscopy allows short-term serial assessment of multiple skeletal implications of corticosteroid exposure

    PubMed Central

    Takahashi, Masaya; Wehrli, Felix W.; Hilaire, Luna; Zemel, Babette S.; Hwang, Scott N.

    2002-01-01

    Corticosteroids are in widespread clinical use but are known to have adverse skeletal side effects. Moreover, it is not known how soon these effects become apparent. Here, we describe a longitudinal approach to evaluate the short-term implications of excess corticosteroid exposure by quantitative in vivo magnetic resonance imaging and spectroscopy in conjunction with digital image processing and analysis in a rabbit model. Two-week treatment with dexamethasone induced a significant reduction in trabecular bone volume, which occurred at the expense of uniform trabecular thinning without affecting network architecture. Paralleling the loss in bone volume was conversion of hematopoietic to yellow marrow in the femoral metaphysis and atrophy of the femoral epiphyseal growth plate. This work demonstrates that detailed quantitative morphometric and physiological information can be obtained noninvasively at multiple skeletal locations. The method is likely to eventually replace invasive histomorphometry in that it obviates the need to sacrifice groups of animals at multiple time points. Finally, this work, which was performed on a clinical scanner, has implications for evaluating patients on high-dose steroid treatment. PMID:11904367

  2. A series of strategies for solving the shortage of reference standards for multi-components determination of traditional Chinese medicine, Mahoniae Caulis as a case.

    PubMed

    Wang, Wenguang; Ma, Xiaoli; Guo, Xiaoyu; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong

    2015-09-18

    In order to overcome the bottleneck caused by the shortage of reference standards for the comprehensive quality control of traditional Chinese medicines (TCMs), a series of strategies, including single reference standard to determine multi-compounds (SSDMC), quantitative analysis by standardized reference extract (QASRE), and quantitative nuclear magnetic resonance spectroscopy (qNMR), were proposed, and Mahoniae Caulis was selected as an example to develop and validate these methods for the simultaneous determination of four alkaloids: columbamine, jatrorrhizine, palmatine, and berberine. Comprehensive comparisons among these methods and with the conventional external standard method (ESM) were carried out, and the relative expanded uncertainty of measurement was used for the first time to compare their credibility. The results showed that all three newly developed methods can accurately accomplish the quantification using only one purified reference standard, but each has its own advantages and disadvantages as well as a specific application scope, which are discussed in detail in this paper. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Changes in cell morphology due to plasma membrane wounding by acoustic cavitation

    PubMed Central

    Schlicher, Robyn K.; Hutcheson, Joshua D.; Radhakrishna, Harish; Apkarian, Robert P.; Prausnitz, Mark R.

    2010-01-01

    Acoustic cavitation-mediated wounding (i.e., sonoporation) has great potential to improve medical and laboratory applications requiring intracellular uptake of exogenous molecules; however, the field lacks detailed understanding of cavitation-induced morphological changes in cells and their relative importance. Here, we present an in-depth study of the effects of acoustic cavitation on cells using electron and confocal microscopy coupled with quantitative flow cytometry. High resolution images of treated cells show that morphologically different types of blebs can occur after wounding conditions caused by ultrasound exposure as well as by mechanical shear and strong laser ablation. In addition, these treatments caused wound-induced non-lytic necrotic death resulting in cell bodies we call wound-derived perikarya (WD-P). However, only cells exposed to acoustic cavitation experienced ejection of intact nuclei and nearly instant lytic necrosis. Quantitative analysis by flow cytometry indicates that wound-derived perikarya are the dominant morphology of nonviable cells, except at the strongest wounding conditions, where nuclear ejection accounts for a significant portion of cell death after ultrasound exposure. PMID:20350691

  4. Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.

    PubMed

    Counsell, Alyssa; Harlow, Lisa L

    2017-05-01

    With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of, and areas needing improvement in, the reporting of quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
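    The reporting the authors recommend, an effect size with a confidence interval alongside the significance test, can be sketched for two independent groups. This uses a standard large-sample normal approximation for the standard error of Cohen's d; the group summaries below are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

def cohens_d_ci(m1, m2, s1, s2, n1, n2, alpha=0.05):
    """Cohen's d for two independent groups with a normal-approximation
    confidence interval (large-sample standard error of d)."""
    s_pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    se = sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return d, (d - z * se, d + z * se)

# Hypothetical group summaries: means 10 vs 9, SD 2, n = 50 per group.
d, (lo, hi) = cohens_d_ci(m1=10.0, m2=9.0, s1=2.0, s2=2.0, n1=50, n2=50)
print(round(d, 2), round(lo, 2), round(hi, 2))
```

    Reporting the interval makes the precision of the estimated effect visible, which a bare p-value cannot convey.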

  5. Investigation of EMIC wave scattering as the cause for the BARREL 17 January 2013 relativistic electron precipitation event: A quantitative comparison of simulation with observations

    DOE PAGES

    Li, Zan; Millan, Robyn M.; Hudson, Mary K.; ...

    2014-12-23

    Electromagnetic ion cyclotron (EMIC) waves were observed at multiple observatory locations for several hours on 17 January 2013. During the wave activity period, a duskside relativistic electron precipitation (REP) event was observed by one of the Balloon Array for Radiation belt Relativistic Electron Losses (BARREL) balloons and was magnetically mapped close to Geostationary Operational Environmental Satellite (GOES) 13. We simulate the relativistic electron pitch angle diffusion caused by gyroresonant interactions with EMIC waves using wave and particle data measured by multiple instruments on board GOES 13 and the Van Allen Probes. We show that the count rate, the energy distribution, and the time variation of the simulated precipitation all agree very well with the balloon observations, suggesting that EMIC wave scattering was likely the cause of the precipitation event. The event reported here is the first balloon REP event with closely conjugate EMIC wave observations, and our study employs the most detailed quantitative analysis to date of the link between EMIC waves and observed REP.

  6. Choice of Illumination System & Fluorophore for Multiplex Immunofluorescence on FFPE Tissue Sections

    PubMed Central

    Kishen, Ria E. B.; Kluth, David C.; Bellamy, Christopher O. C.

    2016-01-01

    The recent availability of novel dyes and alternative light sources to facilitate complex tissue immunofluorescence studies such as multiplex labelling has not been matched by reports critically evaluating the considerations and relative benefits of these new tools, particularly in combination. Product information is often limited to the wavelengths used for older fluorophores (FITC, TRITC and the corresponding Alexa dye families). Consequently, novel agents such as quantum dots are not widely appreciated or used, despite highly favourable properties including extremely bright emission, stability, and potentially reduced tissue autofluorescence at the excitation wavelength. Using spectral analysis, we report here a detailed critical appraisal and comparative evaluation of different light sources and fluorophores in multiplex immunofluorescence of clinical biopsy sections. The comparison includes mercury light, metal halide and 3 different LED-based systems, using 7 Qdots (525, 565, 585, 605, 625, 705), Cy3 and Cy5. We discuss the considerations relevant to achieving the best combination of light source and fluorophore for accurate multiplex fluorescence quantitation, and highlight practical limitations and confounders to quantitation with filter-based approaches. PMID:27632367

  7. [Comparisons and analysis of the spectral response functions' difference between FY-2E's and FY2C's split window channels].

    PubMed

    Zhang, Yong; Li, Yuan; Rong, Zhi-Guo

    2010-06-01

    A remote sensor's channel spectral response function (SRF) is one of the key factors influencing the inversion algorithms and accuracy of quantitative products, and hence the retrieved geophysical characteristics. To assess the adjustments made to FY-2E's split-window channel SRFs, detailed comparisons between the corresponding FY-2E and FY-2C channel SRFs were carried out based on three data collections: the calibration look-up tables of the corresponding NOAA AVHRR channels, field-measured water-surface radiance and atmospheric profiles at Lake Qinghai, and radiance calculated from the Planck function over the full dynamic range of FY-2E/C. The results showed that the adjustments to FY-2E's split-window channel SRFs shift the spectral range and influence the inversion algorithms of some ground quantitative products. On the other hand, these SRF adjustments increase the brightness temperature differences between FY-2E's two split-window channels over the full dynamic range relative to FY-2C's, which would improve the inversion ability of FY-2E's split-window channels.
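    The role of the SRF in such a comparison can be illustrated by integrating the Planck function over a channel response and inverting numerically to a brightness temperature. The boxcar SRFs below are hypothetical stand-ins, not the real FY-2E/C response functions.

```python
import math

H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

def planck(wl_m, T):
    """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    return (2 * H * C**2 / wl_m**5) / math.expm1(H * C / (wl_m * KB * T))

def band_radiance(T, srf):
    """SRF-weighted radiance for a channel given as (wavelength_um, weight) pairs."""
    return sum(w * planck(wl * 1e-6, T) for wl, w in srf) / sum(w for _, w in srf)

def brightness_temp(L, srf, lo=150.0, hi=400.0):
    """Invert the band radiance by bisection to get the channel
    brightness temperature (band radiance increases with T)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if band_radiance(mid, srf) < L:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical boxcar SRFs for two split-window channels.
ch1 = [(wl, 1.0) for wl in (10.3, 10.8, 11.3)]
ch2 = [(wl, 1.0) for wl in (11.5, 12.0, 12.5)]
L1 = band_radiance(300.0, ch1)
print(round(brightness_temp(L1, ch1), 2))  # → 300.0, recovering the scene temperature
```

    Because the two channels sit at different wavelengths, the same scene temperature yields different band radiances in each; shifting either SRF therefore changes the split-window brightness temperature difference, which is the effect the paper quantifies.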

  8. Investigation of EMIC wave scattering as the cause for the BARREL 17 January 2013 relativistic electron precipitation event: A quantitative comparison of simulation with observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zan; Millan, Robyn M.; Hudson, Mary K.

    Electromagnetic ion cyclotron (EMIC) waves were observed at multiple observatory locations for several hours on 17 January 2013. During the wave activity period, a duskside relativistic electron precipitation (REP) event was observed by one of the Balloon Array for Radiation belt Relativistic Electron Losses (BARREL) balloons and was magnetically mapped close to Geostationary Operational Environmental Satellite (GOES) 13. We simulate the relativistic electron pitch angle diffusion caused by gyroresonant interactions with EMIC waves using wave and particle data measured by multiple instruments on board GOES 13 and the Van Allen Probes. We show that the count rate, the energy distribution, and the time variation of the simulated precipitation all agree very well with the balloon observations, suggesting that EMIC wave scattering was likely the cause of the precipitation event. The event reported here is the first balloon REP event with closely conjugate EMIC wave observations, and our study employs the most detailed quantitative analysis to date of the link between EMIC waves and observed REP.

  9. Assessing Microneurosurgical Skill with Medico-Engineering Technology.

    PubMed

    Harada, Kanako; Morita, Akio; Minakawa, Yoshiaki; Baek, Young Min; Sora, Shigeo; Sugita, Naohiko; Kimura, Toshikazu; Tanikawa, Rokuya; Ishikawa, Tatsuya; Mitsuishi, Mamoru

    2015-10-01

    Most methods currently used to assess surgical skill are rather subjective or not adequate for microneurosurgery. Objective and quantitative microneurosurgical skill assessment systems capable of accurate measurements are necessary for the further development of microneurosurgery. Infrared optical motion tracking markers, an inertial measurement unit, and strain gauges were mounted on tweezers to measure many parameters related to instrument manipulation. We then recorded the activity of 23 neurosurgeons. The task completion time, tool path, and needle-gripping force were evaluated for three stitches made in an anastomosis of 0.7-mm artificial blood vessels. Videos of the activity were evaluated by three blinded expert surgeons. Surgeons who had recently performed many bypass procedures demonstrated better skills: they completed the anastomosis in a shorter time, with a shorter tool path, and with a lower force when extracting the needle. These results show the potential contribution of the system to microsurgical skill assessment. Quantitative and detailed analysis of surgical tasks helps surgeons better understand the key features of the required skills. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Heavy metal concentrations in soils as determined by laser-induced breakdown spectroscopy (LIBS), with special emphasis on chromium.

    PubMed

    Senesi, G S; Dell'Aglio, M; Gaudiuso, R; De Giacomo, A; Zaccone, C; De Pascale, O; Miano, T M; Capitelli, M

    2009-05-01

    Soil is unanimously considered one of the most important sinks for heavy metals released by human activities. Heavy metal analysis of natural and polluted soils is generally conducted by atomic absorption spectroscopy (AAS) or inductively coupled plasma optical emission spectroscopy (ICP-OES) on suitably prepared soil extracts. Although in recent years the emerging technique of laser-induced breakdown spectroscopy (LIBS) has been applied widely and with increasing success to the qualitative and quantitative analysis of a number of heavy metals in soil matrices, with relevant simplification of the conventional methodologies, the technique still requires further confirmation before it can be applied fully successfully in soil analyses. The main objective of this work was to demonstrate that new developments in the LIBS technique can provide reliable qualitative and quantitative analytical evaluation of several heavy metals in soils, with special focus on chromium (Cr), and with reference to the concentrations measured by conventional ICP spectroscopy. A preliminary qualitative LIBS analysis of five soil samples and one sewage sludge sample allowed the detection of a number of elements, including Al, Ca, Cr, Cu, Fe, Mg, Mn, Pb, Si, Ti, V and Zn. Of these, a quantitative analysis was also possible for Cr, Cu, Pb, V and Zn, based on the linearity of the calibration curves constructed for each heavy metal, i.e., the proportionality between the intensity of the LIBS emission peaks and the concentration of each heavy metal in the sample as measured by ICP. In particular, a triplet of emission lines could be used for the quantitative measurement of Cr. The consistency of experiments made on various samples was supported by the same characteristics of the laser-induced plasma (LIP), i.e., the typical linear distribution confirming the existence of the local thermodynamic equilibrium (LTE) condition, and the similar excitation temperatures and comparable electron number densities measured for all samples. An index of the anthropogenic contribution of Cr in polluted soils was calculated in comparison with a non-polluted reference soil. Thus, the intensity ratios of heavy metal emission lines can be used to detect, within a few minutes, the polluted areas for which more detailed sampling and analysis may be useful.
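    The calibration-curve step can be sketched as follows. The intensity and concentration values are invented for illustration; a real curve would use certified standards cross-checked against ICP measurements.

```python
import numpy as np

# Hypothetical calibration data: ICP-measured Cr concentrations (mg/kg)
# versus LIBS emission-line intensities (arbitrary units) for the standards.
conc = np.array([10.0, 50.0, 100.0, 200.0, 400.0])
intensity = np.array([120.0, 610.0, 1190.0, 2410.0, 4790.0])

# Linear calibration: intensity = slope * concentration + intercept.
slope, intercept = np.polyfit(conc, intensity, 1)
r = np.corrcoef(conc, intensity)[0, 1]

def concentration_from_intensity(i):
    """Invert the calibration curve for an unknown sample."""
    return (i - intercept) / slope

print(round(concentration_from_intensity(1800.0), 1))  # estimated Cr, mg/kg
print(r**2 > 0.99)                                     # calibration linearity
```

    The quantitative claim of the paper rests on exactly this proportionality: once linearity against ICP-measured standards is established, an unknown soil's peak intensity maps directly to a concentration.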

  11. Morphological Properties of Siloxane-Hydrogel Contact Lens Surfaces.

    PubMed

    Stach, Sebastian; Ţălu, Ştefan; Trabattoni, Silvia; Tavazzi, Silvia; Głuchaczka, Alicja; Siek, Patrycja; Zając, Joanna; Giovanzana, Stefano

    2017-04-01

    The aim of this study was to quantitatively characterize the micromorphology of contact lens (CL) surfaces using atomic force microscopy (AFM) and multifractal analysis. AFM and multifractal analysis were used to characterize the topography of new and worn siloxane-hydrogel CLs made of Filcon V (FDA group I). CL surface roughness was studied by AFM in intermittent-contact mode, in air, on square areas of 25 and 100 μm², using a Nanoscope V MultiMode (Bruker). Detailed characterization of the surface topography was obtained using statistical parameters of 3-D (three-dimensional) surface roughness, in accordance with ISO 25178-2:2012. Before wear, the surface was characterized by out-of-plane, sharp structures, whereas after 8 h of wear two typical morphologies were observed: one morphology (sharp type) has an aspect similar to the unworn CLs, and the other (smooth type) is characterized by troughs and bumpy structures. Analysis of the AFM images revealed a multifractal geometry. The generalized dimension D_q and the singularity spectrum f(α) provided quantitative values that characterize the local scale properties of CL surface geometry at the nanometer scale. Surface statistical parameters deduced by multifractal analysis can be used to assess CL micromorphology and can be used by manufacturers in developing CLs with improved surface characteristics. These parameters can also aid understanding of the tribological interactions of the back surface of the CL with the corneal surface, and of the front surface of the CL with the under-surface of the eyelid (friction, wear, and micro-elastohydrodynamic lubrication at the nanometer scale).
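    The generalized dimension D_q used in such multifractal analyses can be estimated by box counting over a normalized height map. A minimal sketch follows; the flat test surface is synthetic, chosen because its expected dimension is known (a plane-filling measure gives D_q = 2 for all q), whereas real lens topographies would give q-dependent values.

```python
import numpy as np

def generalized_dimension(height_map, q=2.0, box_sizes=(2, 4, 8, 16)):
    """Estimate the generalized (Renyi) dimension D_q of a 2-D measure by
    box counting: D_q = (1/(q-1)) * slope of log(sum p_i^q) vs log(eps)."""
    m = height_map / height_map.sum()        # normalize to a probability measure
    n = m.shape[0]
    log_eps, log_sum = [], []
    for s in box_sizes:
        # Partition the n x n grid into (n/s)^2 boxes of side s.
        p = m.reshape(n // s, s, n // s, s).sum(axis=(1, 3)).ravel()
        p = p[p > 0]
        log_eps.append(np.log(s / n))
        log_sum.append(np.log((p ** q).sum()))
    slope = np.polyfit(log_eps, log_sum, 1)[0]
    return slope / (q - 1)

# A flat surface fills the plane uniformly, so D_q should be ~2.
flat = np.ones((64, 64))
print(round(generalized_dimension(flat, q=2.0), 3))  # → 2.0
```

    On a real AFM image, sweeping q traces out the spectrum of dimensions whose spread quantifies how multifractal the surface is.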

  12. Modeling and analysis of cell membrane systems with probabilistic model checking

    PubMed Central

    2011-01-01

Background Recently there has been growing interest in the application of Probabilistic Model Checking (PMC) to the formal specification of biological systems. PMC is able to exhaustively explore all states of a stochastic model and can provide valuable insights into its behavior that are more difficult to obtain using only traditional methods of system analysis such as deterministic and stochastic simulation. In this work we propose a stochastic model for the description and analysis of the sodium-potassium exchange pump. The sodium-potassium pump is a membrane transport system present in all animal cells and capable of moving sodium and potassium ions against their concentration gradients. Results We present a quantitative formal specification of the pump mechanism in the PRISM language, taking into consideration a discrete-chemistry approach and the Law of Mass Action. We also present an analysis of the system using quantitative properties in order to verify the pump's reversibility, and we study the pump's behavior using trend labels for the transition rates of the pump reactions. Conclusions Probabilistic model checking can be used alongside other well-established approaches, such as simulation and differential equations, to better understand pump behavior. Using PMC we can determine whether specific events happen, such as whether the potassium outside the cell runs out in all model traces. We can also gain a more detailed perspective on its behavior, such as determining its reversibility and why its normal operation slows over time. This knowledge can be used to direct experimental research and make it more efficient, leading to faster and more accurate scientific discoveries. PMID:22369714
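PRISM itself builds and exhaustively checks the full state space of such a model. As a purely illustrative complement (plain stochastic simulation rather than model checking, with invented rate constants and no ATP coupling), the discrete-chemistry, mass-action view of a single reversible transport step can be sketched as a Gillespie simulation:

```python
import random

def gillespie_exchange(na_in, na_out, k_fwd, k_rev, t_end, seed=1):
    """Minimal Gillespie simulation of one reversible transport step,
    Na_in <-> Na_out, with mass-action propensities."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        a_fwd = k_fwd * na_in        # propensity: inward -> outward
        a_rev = k_rev * na_out       # propensity: outward -> inward
        a_tot = a_fwd + a_rev
        if a_tot == 0.0:
            break
        t += rng.expovariate(a_tot)  # exponential waiting time to next event
        if rng.random() * a_tot < a_fwd:
            na_in, na_out = na_in - 1, na_out + 1
        else:
            na_in, na_out = na_in + 1, na_out - 1
    return na_in, na_out

# hypothetical rates; start with all 100 ions inside the cell
n_in, n_out = gillespie_exchange(100, 0, k_fwd=1.0, k_rev=0.2, t_end=5.0)
```

One simulation run explores a single trace; the point of PMC is that an invariant such as "the total ion count is conserved" can be verified over every reachable state rather than spot-checked per run.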

  13. Metabolomic Fingerprinting of Romaneschi Globe Artichokes by NMR Spectroscopy and Multivariate Data Analysis.

    PubMed

    de Falco, Bruna; Incerti, Guido; Pepe, Rosa; Amato, Mariana; Lanzotti, Virginia

    2016-09-01

Globe artichoke (Cynara cardunculus L. var. scolymus L. Fiori) and cardoon (Cynara cardunculus L. var. altilis DC) are sources of nutraceuticals and bioactive compounds. The objective was to apply an NMR metabolomic fingerprinting approach to Cynara cardunculus heads to obtain simultaneous identification and quantitation of the major classes of organic compounds. The edible parts of 14 globe artichoke populations, belonging to the Romaneschi varietal group, were extracted to obtain apolar and polar organic extracts. The analysis was also extended to one species of cultivated cardoon for comparison. The ¹H-NMR spectra of the extracts allowed simultaneous identification of the bioactive metabolites, whose quantitation was obtained by spectral integration followed by principal component analysis (PCA). Apolar organic extracts were based mainly on highly unsaturated long-chain lipids. Polar organic extracts contained organic acids, amino acids, sugars (mainly inulin), caffeoyl derivatives (mainly cynarin), flavonoids, and terpenes. The level of nutraceuticals was found to be highest in the Italian landraces Bianco di Pertosa zia E and Natalina, while cardoon showed the lowest content of all metabolites, thus confirming the genetic distance between artichokes and cardoon. A metabolomic approach coupling NMR spectroscopy with multivariate data analysis allowed a detailed metabolite profile of the artichoke and cardoon varieties to be obtained. Relevant differences in the relative content of the metabolites were observed for the species analysed. This work is the first application of ¹H-NMR with multivariate statistics to provide a metabolomic fingerprinting of Cynara scolymus. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Investigation of a Nonlinear Control System

    NASA Technical Reports Server (NTRS)

    Flugge-Lotz, I; Taylor, C F; Lindberg, H E

    1958-01-01

    A discontinuous variation of coefficients of the differential equation describing the linear control system before nonlinear elements are added is studied in detail. The nonlinear feedback is applied to a second-order system. Simulation techniques are used to study performance of the nonlinear control system and to compare it with the linear system for a wide variety of inputs. A detailed quantitative study of the influence of relay delays and of a transport delay is presented.

  15. The Microphenotron: a robotic miniaturized plant phenotyping platform with diverse applications in chemical biology.

    PubMed

    Burrell, Thomas; Fozard, Susan; Holroyd, Geoff H; French, Andrew P; Pound, Michael P; Bigley, Christopher J; James Taylor, C; Forde, Brian G

    2017-01-01

Chemical genetics provides a powerful alternative to conventional genetics for understanding gene function. However, its application to plants has been limited by the lack of a technology that allows detailed phenotyping of whole-seedling development in the context of a high-throughput chemical screen. We have therefore sought to develop an automated micro-phenotyping platform that would allow both root and shoot development to be monitored under conditions where the phenotypic effects of large numbers of small molecules can be assessed. The 'Microphenotron' platform uses 96-well microtitre plates to deliver chemical treatments to seedlings of Arabidopsis thaliana L. and is based around four components: (a) the 'Phytostrip', a novel seedling growth device that enables chemical treatments to be combined with the automated capture of images of developing roots and shoots; (b) an illuminated robotic platform that uses a commercially available robotic manipulator to capture images of developing shoots and roots; (c) software to control the sequence of robotic movements and integrate these with the image capture process; (d) purpose-made image analysis software for automated extraction of quantitative phenotypic data. Imaging of each plate (representing 80 separate assays) takes 4 min and can easily be performed daily for time-course studies. As currently configured, the Microphenotron has a capacity of 54 microtitre plates in a growth room footprint of 2.1 m², giving a potential throughput of up to 4320 chemical treatments in a typical 10-day experiment. The Microphenotron has been validated by using it to screen a collection of 800 natural compounds for qualitative effects on root development and to perform a quantitative analysis of the effects of a range of concentrations of nitrate and ammonium on seedling development.
The Microphenotron is an automated screening platform that for the first time is able to combine large numbers of individual chemical treatments with a detailed analysis of whole-seedling development, and particularly root system development. The Microphenotron should provide a powerful new tool for chemical genetics and for wider chemical biology applications, including the development of natural and synthetic chemical products for improved agricultural sustainability.
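The capacity figures quoted above follow from simple arithmetic on the numbers given in the abstract (the daily imaging-time total is derived, not stated):

```python
# Constants taken from the abstract; the imaging-time total is derived.
ASSAYS_PER_PLATE = 80          # separate assays per 96-well plate
PLATE_CAPACITY = 54            # plates per 2.1 m^2 growth room
IMAGING_MIN_PER_PLATE = 4      # minutes to image one plate

max_treatments = PLATE_CAPACITY * ASSAYS_PER_PLATE               # 54 * 80
daily_imaging_hours = PLATE_CAPACITY * IMAGING_MIN_PER_PLATE / 60
```

At full capacity this gives 4320 simultaneous treatments, with the whole plate stack imaged in 3.6 h, which is what makes daily time-course imaging practical.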

  16. Super-resolution mapping using multi-viewing CHRIS/PROBA data

    NASA Astrophysics Data System (ADS)

    Dwivedi, Manish; Kumar, Vinay

    2016-04-01

High-spatial-resolution Remote Sensing (RS) data provide detailed information that enables high-definition visual image analysis of earth surface features. These data sets also support improved information extraction at a fine scale. To improve the spatial resolution of coarser-resolution RS data, Super Resolution Reconstruction (SRR) techniques based on multi-angular image sequences have become widely acknowledged. In this study, multi-angle CHRIS/PROBA data of the Kutch area were used for SR image reconstruction to enhance the spatial resolution from 18 m to 6 m, with the aim of obtaining a better land cover classification. Several SR approaches were chosen for this study: Projection onto Convex Sets (POCS), Robust, Iterative Back Projection (IBP), Non-Uniform Interpolation, and Structure-Adaptive Normalized Convolution (SANC). Subjective assessment through visual interpretation showed substantial improvement in land cover detail. Quantitative measures, including peak signal-to-noise ratio and structural similarity, were used to evaluate image quality. The SANC SR technique, using the Vandewalle algorithm for low-resolution image registration, outperformed the other techniques. An SVM-based classifier was then used to classify both the SRR output and data resampled to 6 m spatial resolution using bi-cubic interpolation. A comparative analysis between the classified bicubic-interpolated and SR-derived images of CHRIS/PROBA showed a significant improvement of 10-12% in overall accuracy for the SR-derived classification. The results demonstrate that SR methods can improve the spatial detail of multi-angle images as well as the classification accuracy.
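Peak signal-to-noise ratio, one of the two quality measures used above, takes only a few lines to compute. The pixel values below are invented purely for illustration:

```python
import math

def psnr(reference, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-size images,
    given here as flat lists of pixel intensities."""
    mse = sum((r - x) ** 2
              for r, x in zip(reference, reconstructed)) / len(reference)
    if mse == 0:
        return float('inf')  # identical images
    return 10.0 * math.log10(peak ** 2 / mse)

ref = [10, 20, 30, 40]
rec = [12, 18, 30, 44]   # small reconstruction errors
score = psnr(ref, rec)
```

Higher values mean a closer match to the reference; in an SRR evaluation the "reference" would be a held-out high-resolution image and each candidate reconstruction would be scored the same way.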

  17. 75 FR 54117 - Building Energy Standards Program: Preliminary Determination Regarding Energy Efficiency...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... Response to Comments on Previous Analysis C. Summary of the Comparative Analysis 1. Quantitative Analysis 2... preliminary quantitative analysis are specific building designs, in most cases with specific spaces defined... preliminary determination. C. Summary of the Comparative Analysis DOE carried out both a broad quantitative...

  18. Minimally-invasive Laser Ablation Inductively Coupled Plasma Mass Spectrometry analysis of model ancient copper alloys

    NASA Astrophysics Data System (ADS)

    Walaszek, Damian; Senn, Marianne; Wichser, Adrian; Faller, Markus; Wagner, Barbara; Bulska, Ewa; Ulrich, Andrea

    2014-09-01

This work describes an evaluation of a strategy for multi-elemental analysis of typical ancient bronzes (copper, lead bronze and tin bronze) by means of laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS). The samples, originating from archeological experiments on ancient metal smelting processes using direct reduction in a ‘bloomery’ furnace as well as historical casting techniques, were investigated using the previously proposed analytical procedure, including metallurgical observation and a preliminary visual estimation of the homogeneity of the samples. The results of the LA-ICPMS analysis were compared to the bulk compositions obtained by X-ray fluorescence spectrometry (XRF) and by inductively coupled plasma mass spectrometry (ICPMS) after acid digestion. These results were coherent for most of the elements, confirming the usefulness of the proposed analytical procedure; however, the reliability of the quantitative information on the most heterogeneously distributed elements is also discussed in more detail.

  19. Use of valence band Auger electron spectroscopy to study thin film growth: oxide and diamond-like carbon films

    NASA Astrophysics Data System (ADS)

    Steffen, H. J.

    1994-12-01

    It is demonstrated how Auger line shape analysis with factor analysis (FA), least-squares fitting and even simple peak height measurements may provide detailed information about the composition, different chemical states and also defect concentration or crystal order. Advantage is taken of the capability of Auger electron spectroscopy to give valence band structure information with high surface sensitivity and the special aspect of FA to identify and discriminate quantitatively unknown chemical species. Valence band spectra obtained from Ni, Fe, Cr and NiFe40Cr20 during oxygen exposure at room temperature reveal the oxidation process in the initial stage of the thin layer formation. Furthermore, the carbon chemical states that were formed during low energy C(+) and Ne(+) ion irradiation of graphite are delineated and the evolution of an amorphous network with sp3 bonds is disclosed. The analysis represents a unique method to quantify the fraction of sp3-hybridized carbon in diamond-like materials.

  20. Proteomic data analysis of glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke

    2016-05-01

Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy, and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these experimental data. The emphasis of this paper lies in the application of novel approaches for visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However, the extracted molecular information remains insufficient for classifying GSCs and for paving the way toward improved therapeutics for the heterogeneous glioma.

  1. Quantitative gel electrophoresis: new records in precision by elaborated staining and detection protocols.

    PubMed

    Deng, Xi; Schröder, Simone; Redweik, Sabine; Wätzig, Hermann

    2011-06-01

Gel electrophoresis (GE) is a very common analytical technique for proteome research and protein analysis. Although it was developed decades ago, there is still a considerable need to improve its precision. Using the fluorescence of Colloidal Coomassie Blue-stained proteins in the near-infrared (NIR), the major error source caused by unpredictable background staining is strongly reduced. This result was generalized for various types of detectors. Since GE is a multi-step procedure, standardization of every single step is required. After a detailed analysis of all steps, staining and destaining were identified as the major sources of the remaining variation. By employing standardized protocols, pooled percent relative standard deviations of 1.2-3.1% for band intensities were achieved for one-dimensional separations in repetitive experiments. The analysis of variance suggests that the same batch of staining solution should be used for all gels of one experimental series to minimize day-to-day variation and to obtain high precision. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
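The pooled percent relative standard deviation reported above is a standard repeatability metric. A short sketch follows; note that pooling conventions vary, so the root-mean-square pooling and the band intensities below are assumptions for illustration, not the authors' protocol:

```python
import math

def percent_rsd(values):
    """Percent relative standard deviation: sample std / mean * 100."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * math.sqrt(var) / m

def pooled_percent_rsd(bands):
    """Pool per-band %RSDs as their root mean square
    (one common convention; an assumption here)."""
    rsds = [percent_rsd(b) for b in bands]
    return math.sqrt(sum(r * r for r in rsds) / len(rsds))

# hypothetical repeat intensities for three bands across three gels
bands = [[100, 102, 101], [55, 54, 56], [201, 198, 200]]
pooled = pooled_percent_rsd(bands)
```

With measurements this tight the pooled value lands near the low end of the 1.2-3.1% range the abstract reports for standardized protocols.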

  2. Methods for heat transfer and temperature field analysis of the insulated diesel

    NASA Technical Reports Server (NTRS)

    Morel, T.; Blumberg, P. N.; Fort, E. F.; Keribar, R.

    1984-01-01

Work done during phase 1 of a three-year program aimed at developing a comprehensive heat transfer and thermal analysis methodology oriented specifically to the design requirements of insulated diesel engines is reported. The technology developed in this program makes possible a quantitative analysis of the low heat rejection concept. The program is comprehensive in that it addresses all the heat transfer issues that are critical to the successful development of the low heat rejection diesel engine: (1) in-cylinder convective and radiative heat transfer; (2) cyclic transient heat transfer in thin solid layers at component surfaces adjacent to the combustion chamber; and (3) steady-state heat conduction in the overall engine structure. The Integral Technologies, Inc. (ITI) program comprises a set of integrated analytical and experimental tasks. A detailed review of the ITI program approach is provided, including the technical issues which underlie it and a summary of the methods that were developed.

  3. qPIPSA: Relating enzymatic kinetic parameters and interaction fields

    PubMed Central

    Gabdoulline, Razif R; Stein, Matthias; Wade, Rebecca C

    2007-01-01

    Background The simulation of metabolic networks in quantitative systems biology requires the assignment of enzymatic kinetic parameters. Experimentally determined values are often not available and therefore computational methods to estimate these parameters are needed. It is possible to use the three-dimensional structure of an enzyme to perform simulations of a reaction and derive kinetic parameters. However, this is computationally demanding and requires detailed knowledge of the enzyme mechanism. We have therefore sought to develop a general, simple and computationally efficient procedure to relate protein structural information to enzymatic kinetic parameters that allows consistency between the kinetic and structural information to be checked and estimation of kinetic constants for structurally and mechanistically similar enzymes. Results We describe qPIPSA: quantitative Protein Interaction Property Similarity Analysis. In this analysis, molecular interaction fields, for example, electrostatic potentials, are computed from the enzyme structures. Differences in molecular interaction fields between enzymes are then related to the ratios of their kinetic parameters. This procedure can be used to estimate unknown kinetic parameters when enzyme structural information is available and kinetic parameters have been measured for related enzymes or were obtained under different conditions. The detailed interaction of the enzyme with substrate or cofactors is not modeled and is assumed to be similar for all the proteins compared. The protein structure modeling protocol employed ensures that differences between models reflect genuine differences between the protein sequences, rather than random fluctuations in protein structure. 
Conclusion Provided that the experimental conditions and the protein structural models refer to the same protein state or conformation, correlations between interaction fields and kinetic parameters can be established for sets of related enzymes. Outliers may arise due to variation in the importance of different contributions to the kinetic parameters, such as protein stability and conformational changes. The qPIPSA approach can assist in the validation as well as estimation of kinetic parameters, and provide insights into enzyme mechanism. PMID:17919319

  4. Fatigue crack growth in an aluminum alloy-fractographic study

    NASA Astrophysics Data System (ADS)

    Salam, I.; Muhammad, W.; Ejaz, N.

    2016-08-01

A two-fold approach was adopted to understand the fatigue crack growth process in an aluminum alloy: fatigue crack growth testing of samples and analysis of the fractured surfaces. Fatigue crack growth tests were conducted on middle tension M(T) samples prepared from an aluminum alloy cylinder. The tests were conducted under constant-amplitude loading at an R ratio of 0.1. The applied stress was 20, 30 and 40 per cent of the yield stress of the material. The fatigue crack growth data were recorded. After fatigue testing, the samples were subjected to detailed scanning electron microscopic (SEM) analysis. The resulting fracture surfaces were subjected to qualitative and quantitative fractographic examinations. The quantitative fracture analysis included an estimation of the crack growth rate (CGR) in different regions. The effect of microstructural features on fatigue crack growth was examined. It was observed that in stage II (the crack growth region), the failure mode changes from intergranular to transgranular as the stress level increases. In the region of intergranular failure, localized brittle failure was observed and fatigue striations are difficult to reveal. However, in the region of transgranular failure the crack path is independent of the microstructural features. In this region, a localized ductile failure mode was observed and well-defined fatigue striations were present in the wake of the fatigue crack. The effect of the interaction of the growing fatigue crack with microstructural features was not substantial. The final fracture (stage III) was ductile in all cases.

  5. Quantitative proteomic analysis of paired colorectal cancer and non-tumorigenic tissues reveals signature proteins and perturbed pathways involved in CRC progression and metastasis.

    PubMed

    Sethi, Manveen K; Thaysen-Andersen, Morten; Kim, Hoguen; Park, Cheol Keun; Baker, Mark S; Packer, Nicolle H; Paik, Young-Ki; Hancock, William S; Fanayan, Susan

    2015-08-03

Modern proteomics has proven instrumental in our understanding of the molecular deregulations associated with the development and progression of cancer. Herein, we profile the membrane-enriched proteomes of tumor and adjacent normal tissues from eight CRC patients using label-free nanoLC-MS/MS-based quantitative proteomics and advanced pathway analysis. Of the 948 identified proteins, 184 were differentially expressed (P < 0.05, fold change > 1.5) between the tumor and non-tumor tissues (69 up-regulated and 115 down-regulated in tumor tissues). The CRC tumor and non-tumor tissues clustered tightly in separate groups in a hierarchical cluster analysis of the differentially expressed proteins, indicating a strong CRC association of this proteome subset. Specifically, cancer-associated proteins such as FN1, TNC, DEFA1, ITGB2, MLEC, CDH17 and EZR, and pathways including actin cytoskeleton and RhoGDI signaling, were deregulated. Stage-specific proteome signatures were identified, including up-regulated ribosomal proteins and down-regulated annexin proteins in early-stage CRC. Finally, EGFR(+) CRC tissues showed an EGFR-dependent down-regulation of cell adhesion molecules relative to EGFR(-) tissues. Taken together, this study provides a detailed map of the altered proteome and associated protein pathways in CRC, which enhances our mechanistic understanding of CRC biology and opens avenues for a knowledge-driven search for candidate CRC protein markers. Copyright © 2015 Elsevier B.V. All rights reserved.
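The differential-expression criteria quoted above (P < 0.05, fold change > 1.5) amount to a simple two-way filter once per-protein statistics are in hand. A hypothetical sketch: the protein names are borrowed from the abstract but every number is invented, and the fold-change direction convention (tumor/normal, down-regulation below 1/1.5) is an assumption:

```python
def differentially_expressed(proteins, p_cut=0.05, fc_cut=1.5):
    """Split proteins into up-/down-regulated lists using
    P < p_cut and fold change > fc_cut (or < 1/fc_cut for down)."""
    up, down = [], []
    for name, fold_change, p_value in proteins:
        if p_value >= p_cut:
            continue  # not statistically significant
        if fold_change > fc_cut:
            up.append(name)
        elif fold_change < 1.0 / fc_cut:
            down.append(name)
    return up, down

# (name, tumor/normal fold change, p-value) -- all values invented
candidates = [
    ("FN1",   2.4, 0.01),  # up in tumor
    ("ANXA3", 0.4, 0.02),  # down in tumor
    ("GAPDH", 1.1, 0.50),  # unchanged
    ("TNC",   1.8, 0.30),  # large fold change but not significant
]
up, down = differentially_expressed(candidates)
```

In practice the p-values would come from a paired test across the eight patients, usually with multiple-testing correction applied before filtering.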

  6. Time-resolved measurements of supersonic fuel sprays using synchrotron X-rays.

    PubMed

    Powell, C F; Yue, Y; Poola, R; Wang, J

    2000-11-01

A time-resolved radiographic technique has been developed for probing the fuel distribution close to the nozzle of a high-pressure single-hole diesel injector. The measurement was made using X-ray absorption of monochromatic synchrotron-generated radiation, allowing quantitative determination of the fuel distribution in this optically impenetrable region with a time resolution of better than 1 μs. These quantitative measurements constitute the most detailed near-nozzle study of a fuel spray to date.

  7. Too Hard to Control: Compromised Pain Anticipation and Modulation in Mild Traumatic Brain Injury

    DTIC Science & Technology

    2014-01-07

modulation) will be able to answer these questions. In a related prior study, quantitative sensory testing was conducted in moderate to severe TBI and...found significant loss of thermal and touch sensibility compared with healthy controls.67 Although detailed quantitative sensory testing was not...IA. Pain and post traumatic stress disorder – Review of clinical and experimental evidence. Neuropharmacology 2012; 62: 586–597. 36 First MB, Spitzer

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, W. James; Albertson, R Craig; Jacob, Rick E.

Here we present a re-description of Abudefduf luridus and reassign it to the genus Similiparma. We supplement traditional diagnoses and descriptions of this species with quantitative anatomical data collected from a family-wide geometric morphometric analysis of head morphology (44 species representing all 30 damselfish genera) and data from cranial micro-CT scans of fishes in the genus Similiparma. The use of geometric morphometric analyses (and other methods of shape analysis) permits detailed comparisons between the morphology of specific taxa and the anatomical diversity that has arisen in an entire lineage. This provides a particularly useful supplement to traditional description methods and we recommend the use of such techniques by systematists. Similiparma and its close relatives constitute a branch of the damselfish phylogenetic tree that predominantly inhabits rocky reefs in the Atlantic and Eastern Pacific, as opposed to the more commonly studied damselfishes that constitute a large portion of the ichthyofauna on all coral-reef communities.

  9. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion system mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  10. Approaches to Observe Anthropogenic Aerosol-Cloud Interactions.

    PubMed

    Quaas, Johannes

Anthropogenic aerosol particles exert an effective radiative forcing, quantitatively very uncertain, due to aerosol-cloud interactions: via an immediate alteration of cloud albedo on the one hand, and via rapid adjustments through alteration of cloud processes and changes in thermodynamic profiles on the other. Large variability in cloud cover and properties, and the consequently low signal-to-noise ratio for aerosol-induced perturbations, hamper the identification of effects in observations. Six approaches are discussed as means to isolate the impact of anthropogenic aerosol on clouds from natural cloud variability and to estimate or constrain the effective forcing. These are (i) intentional cloud modification, (ii) ship tracks, (iii) differences between the hemispheres, (iv) trace gases, (v) weekly cycles and (vi) trends. Ship-track analysis is recommended for detailed process understanding, while the analysis of weekly cycles and long-term trends is most promising for deriving estimates of, or constraints on, the effective radiative forcing.

  11. A method to correct coordinate distortion in EBSD maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y.B., E-mail: yubz@dtu.dk; Elbrønd, A.; Lin, F.X.

    2014-10-15

Drift during electron backscatter diffraction mapping leads to coordinate distortions in the resulting orientation maps, which affects, in some cases significantly, the accuracy of analysis. A method, thin plate spline, is introduced and tested to correct such coordinate distortions in the maps after the electron backscatter diffraction measurements. The accuracy of the correction, as well as theoretical and practical aspects of using the thin plate spline method, is discussed in detail. By comparing with other correction methods, it is shown that the thin plate spline method is most efficient at correcting different local distortions in the electron backscatter diffraction maps. Highlights: • A new method is suggested to correct nonlinear spatial distortion in EBSD maps. • The method corrects EBSD maps more precisely than presently available methods. • Errors less than 1–2 pixels are typically obtained. • Direct quantitative analysis of dynamic data is available after this correction.
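A thin plate spline interpolates known anchor points exactly while bending as little as possible elsewhere, which is what makes it suitable for correcting smooth, locally varying drift. A minimal self-contained sketch follows (the anchor coordinates are invented; a real EBSD correction would fit the x and y displacement fields separately from matched features in the distorted and reference maps):

```python
import math

def tps_kernel(r2):
    # U(r) = r^2 * log(r), written via r^2 to avoid a sqrt; U(0) = 0
    return 0.0 if r2 == 0.0 else 0.5 * r2 * math.log(r2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def fit_tps(points, values):
    """Thin plate spline f(x, y) with f(points[i]) = values[i] exactly:
    f = a0 + a1*x + a2*y + sum_i w_i * U(|p - p_i|)."""
    n = len(points)
    A = [[0.0] * (n + 3) for _ in range(n + 3)]
    b = list(values) + [0.0, 0.0, 0.0]
    for i, (xi, yi) in enumerate(points):
        for j, (xj, yj) in enumerate(points):
            A[i][j] = tps_kernel((xi - xj) ** 2 + (yi - yj) ** 2)
        A[i][n], A[i][n + 1], A[i][n + 2] = 1.0, xi, yi   # affine part
        A[n][i], A[n + 1][i], A[n + 2][i] = 1.0, xi, yi   # side conditions
    coef = solve(A, b)
    w, (a0, a1, a2) = coef[:n], coef[n:]
    def f(x, y):
        s = a0 + a1 * x + a2 * y
        for wi, (xi, yi) in zip(w, points):
            s += wi * tps_kernel((x - xi) ** 2 + (y - yi) ** 2)
        return s
    return f

# invented anchors: map positions with known true (undistorted) x-coordinate
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.2)]
vals = [0.0, 1.1, -0.1, 0.9, 0.55]
f = fit_tps(pts, vals)
```

Once fitted, evaluating `f` at every pixel of the distorted map yields its corrected coordinate; away from the anchors the spline varies smoothly, which is why the method handles the different local distortions mentioned in the highlights.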

  12. health communication.

    PubMed

    Mullany, Louise; Smith, Catherine; Harvey, Kevin; Adolphs, Svenja

    2015-01-01

    This article explores the communicative choices of adolescents seeking advice from an internet-based health forum run by medical professionals. Techniques from the disciplines of sociolinguistics and corpus linguistics are integrated to examine the strategies used in adolescents’ health questions. We focus on the emergent theme of Weight and Eating, a concern which features prominently in adolescents’ requests to medical practitioners. The majority of advice requests are authored by adolescent girls, with queries peaking at age 12. A combined quantitative and qualitative analysis provides detailed insights into adolescents’ communicative strategies. Examinations of question types, register and a discourse-based analysis draw attention to dominant discourses of the body, including a ‘discourse of slenderness’ and a ‘discourse of normality’, which exercise negative influences on adolescents’ dietary behaviours. The findings are of applied linguistic relevance to health practitioners and educators, as they provide them with access to adolescents’ health queries in their own language.

  13. volBrain: An Online MRI Brain Volumetry System

    PubMed Central

    Manjón, José V.; Coupé, Pierrick

    2016-01-01

The amount of medical image data produced in clinical and research settings is growing rapidly, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372

  14. Logistic Regression in the Identification of Hazards in Construction

    NASA Astrophysics Data System (ADS)

    Drozd, Wojciech

    2017-10-01

The construction site and its elements create circumstances that are conducive to safety risks during the execution of works. Analysis indicates the critical importance of these factors in the set of characteristics that describe the causes of accidents in the construction industry. This article attempts to analyse the characteristics related to the construction site in order to indicate their importance in defining the circumstances of accidents at work. The study includes sites inspected in 2014-2016 by employees of the District Labour Inspectorate in Krakow (Poland). The analysed set of detailed (disaggregated) data includes both quantitative and qualitative characteristics. The substantive task focused on classification modelling for the identification of hazards in construction and on identifying those of the analysed characteristics that are important in an accident. In terms of methodology, the data were analysed with statistical classifiers in the form of logistic regression.
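As an illustration of the classification modelling described above, a single-feature logistic regression can be fitted by plain gradient descent. This is a sketch, not the authors' model: the hazard counts and accident labels below are invented, and a real analysis would use many site characteristics at once:

```python
import math

def train_logistic(xs, ys, lr=0.5, epochs=2000):
    """Single-feature logistic regression fitted by batch gradient
    descent on the log-loss: P(accident) = sigmoid(w*x + b)."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x   # d(log-loss)/dw for this sample
            gb += (p - y)       # d(log-loss)/db
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# invented data: x = number of hazard factors on site, y = accident occurred
xs = [0, 1, 1, 2, 3, 4, 5, 6]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = train_logistic(xs, ys)

def predict(x):
    return 1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5
```

The fitted coefficient w is directly interpretable: each additional hazard factor multiplies the odds of an accident by exp(w), which is the kind of per-characteristic importance statement the article aims at.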

  15. Fracture related-fold patterns analysis and hydrogeological implications: Insight from fault-propagation fold in Northwestern of Tunisia

    NASA Astrophysics Data System (ADS)

    Sanai, L.; Chenini, I.; Ben Mammou, A.; Mercier, E.

    2015-01-01

    The spatial distribution of fracturing in hard rocks is closely related to the structural profile and reflects the kinematic evolution. Quantitative and qualitative analysis of fracturing, combined with GIS techniques, is essential and efficient for the geometric characterization of lineament networks and for reconstructing the relative timing and interaction of the folding and fracturing histories. A detailed study of the area's geology, lithology, and tectonics is likewise essential for any hydrogeological study. For that purpose we used a structural approach that consists of comparing fracture sets before and after unfolding, complemented by aerospace data and a DEM generated from a topographic map. Applied to J. Rebia, in northwestern Tunisia, this methodology demonstrated the heterogeneity of the fracture network, its relation to fold growth through time, and its importance for groundwater flow.

  16. Residual transglutaminase in collagen - effects, detection, quantification, and removal.

    PubMed

    Schloegl, W; Klein, A; Fürst, R; Leicht, U; Volkmer, E; Schieker, M; Jus, S; Guebitz, G M; Stachel, I; Meyer, M; Wiggenhorn, M; Friess, W

    2012-02-01

    In the present study, we developed an enzyme-linked immunosorbent assay (ELISA) for microbial transglutaminase (mTG) from Streptomyces mobaraensis to overcome the lack of a quantification method for mTG. We further performed a detailed follow-on analysis of insoluble porcine collagen type I enzymatically modified with mTG, focusing primarily on residuals of mTG. Repeated washing (4 ×) reduced mTG levels in the washing fluids but did not quantitatively remove mTG from the material (p < 0.000001). Substantial amounts of up to 40% of the enzyme utilized in the crosslinking mixture remained associated with the modified collagen. Binding was non-covalent, as demonstrated by Western blot analysis. Acidic and alkaline dialysis of mTG-treated collagen material enabled complete removal of the enzyme. Treatment with guanidinium chloride, urea, or sodium chloride was less effective in reducing the mTG content.

  17. volBrain: An Online MRI Brain Volumetry System.

    PubMed

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is growing rapidly, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that provides accurate volumetric information at different levels of detail in a short time. The method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.
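
The core idea of multi-atlas label fusion can be illustrated with a toy majority vote: several registered atlases each propose a label map, and the fused segmentation takes the most frequent label per voxel. volBrain's actual pipeline uses more sophisticated patch-based fusion; this sketch, with invented data, only shows the voting principle.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth binary segmentation: a square "structure" on an 8x8 grid.
truth = np.zeros((8, 8), dtype=int)
truth[2:6, 2:6] = 1

# Each "atlas" is the truth with a little label noise after warping.
atlases = []
for _ in range(5):
    noisy = truth.copy()
    flip = rng.random(truth.shape) < 0.1   # 10% of voxels mislabeled
    noisy[flip] = 1 - noisy[flip]
    atlases.append(noisy)

stack = np.stack(atlases)                  # shape (5, 8, 8)
fused = (stack.sum(axis=0) >= 3).astype(int)   # majority of 5 votes

# Dice overlap between the fused result and the ground truth.
dice = 2 * (fused & truth).sum() / (fused.sum() + truth.sum())
```

Because a voxel is only mislabeled in the fused map when at least three of the five atlases err simultaneously, the fused segmentation is markedly more accurate than any single atlas.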

  18. An ameliorative protocol for the quantification of purine 5',8-cyclo-2'-deoxynucleosides in oxidized DNA

    NASA Astrophysics Data System (ADS)

    Terzidis, Michael; Chatgilialoglu, Chryssostomos

    2015-07-01

    5',8-Cyclo-2'-deoxyadenosine (cdA) and 5',8-cyclo-2'-deoxyguanosine (cdG) are lesions resulting from hydroxyl radical (HO•) attack on the 5'H of the nucleoside sugar moiety and exist in both 5'R and 5'S diastereomeric forms. Increased levels of cdA and cdG are linked to Nucleotide Excision Repair mechanism deficiency and mutagenesis. Discrepancies in the damage measurements reported over recent years indicated the weakness of the actual protocols, in particular for ensuring the quantitative release of these lesions from the DNA sample and the appropriate method for their analysis. Herein we report the detailed revision leading to a cost-effective and efficient protocol for the DNA damage measurement, consisting of the nuclease benzonase and nuclease P1 enzymatic combination for DNA digestion followed by liquid chromatography isotope dilution tandem mass spectrometry analysis.

  19. Measurement of Galactic Logarithmic Spiral Arm Pitch Angle Using Two-Dimensional Fast Fourier Transform Decomposition

    NASA Astrophysics Data System (ADS)

    Davis, Benjamin L.; Berrier, J. C.; Shields, D. W.; Kennefick, J.; Kennefick, D.; Seigar, M. S.; Lacy, C. H. S.; Puerari, I.

    2012-01-01

    A logarithmic spiral is a prominent feature appearing in a majority of observed galaxies. This feature has long been associated with the traditional Hubble classification scheme, but historical estimates of the pitch angles of spiral galaxies have been almost exclusively qualitative. We have developed a methodology, utilizing two-dimensional Fast Fourier Transforms of images of spiral galaxies, to isolate and measure the pitch angles of their spiral arms. Our technique provides a quantitative way to measure this morphological feature, allowing precise comparison of spiral galaxy evolution with other galactic parameters and tests of spiral arm genesis theories. In this work, we detail our image processing and analysis of spiral galaxy images and discuss the robustness of our analysis techniques. The authors gratefully acknowledge support for this work from NASA Grant NNX08AW03A.
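
The geometry underlying any such measurement is that a logarithmic spiral r = r0·exp(θ·tan φ) has ln r linear in θ with slope tan φ, where φ is the pitch angle. This sketch recovers φ from synthetic spiral points by a linear fit; the paper's method instead extracts φ from galaxy images via a 2D FFT decomposition.

```python
import numpy as np

# Synthetic logarithmic spiral with a known pitch angle of 15 degrees.
phi_true_deg = 15.0
theta = np.linspace(0, 4 * np.pi, 500)
r = 1.0 * np.exp(theta * np.tan(np.radians(phi_true_deg)))

# ln r is linear in theta; the slope of the fit is tan(phi).
slope, intercept = np.polyfit(theta, np.log(r), 1)
phi_measured_deg = np.degrees(np.arctan(slope))
```

A tightly wound spiral has a small pitch angle (slope near zero), while open-armed spirals have large pitch angles, which is why φ tracks position along the Hubble sequence.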

  20. Population and Star Formation Histories from the Outer Limits Survey

    NASA Astrophysics Data System (ADS)

    Brondel, Brian Joseph; Saha, Abhijit; Olszewski, Edward

    2015-08-01

    The Outer Limits Survey (OLS) is a deep survey of selected fields in the outlying areas of the Magellanic Clouds based on the MOSAIC-II instrument on the Blanco 4-meter Telescope at CTIO. OLS is designed to probe the outer disk and halo structures of the Magellanic System. The survey comprises ~50 fields obtained in Landolt R, I and Washington C, M and DDO51 filters, extending to a depth of about 24th magnitude in I. While qualitative examination of the resulting data has yielded interesting published results, we report here on quantitative analysis through matching of Hess diagrams to theoretical isochrones. We present analysis based on techniques developed by Dolphin (e.g., 2002, MNRAS, 332, 91) for fields observed by OLS. Our results broadly match those found by qualitative examination of the CMDs, but interesting details emerge from isochrone fitting.
